979 results for Array Analysis


Relevance:

40.00%

Publisher:

Abstract:

In the present study, the effect of hot air recirculation is studied under suitable assumptions. The study identifies the pressure drop across the tile as a dominant parameter governing recirculation. The rack suction pressure of the hardware, together with the pressure drop across the tile, determines the point of recirculation in the cold aisle, and the positioning of hardware in the racks plays an important role in controlling that point. The study is thus helpful in the design of data centre air flow based on the theory of jets: the air flow can be modelled both quantitatively and qualitatively from the results.
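To make the governing comparison concrete, here is a minimal Python sketch using invented numbers and a deliberately crude criterion (the paper derives the recirculation point from jet theory, not from this simple comparison): it converts a tile pressure drop into an ideal jet exit velocity via Bernoulli and flags recirculation when rack suction dominates.

```python
import math

RHO_AIR = 1.2  # air density, kg/m^3

def tile_exit_velocity(dp_tile_pa: float) -> float:
    """Ideal jet velocity implied by the tile pressure drop (Bernoulli), m/s."""
    return math.sqrt(2.0 * dp_tile_pa / RHO_AIR)

def recirculation_risk(dp_tile_pa: float, rack_suction_pa: float) -> bool:
    """Hypothetical criterion for illustration only: flag recirculation when
    the rack suction pressure exceeds the tile pressure drop driving the jet."""
    return rack_suction_pa > dp_tile_pa

print(tile_exit_velocity(20.0))        # ~5.8 m/s for a 20 Pa tile drop
print(recirculation_risk(20.0, 35.0))  # True: strong suction pulls hot air in
```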

Relevance:

40.00%

Publisher:

Abstract:

In this thesis, different techniques for image analysis of high-density microarrays have been investigated. Most existing image analysis techniques require prior knowledge of image-specific parameters and direct user intervention for microarray image quantification. The objective of this research work was to develop a fully automated image analysis method capable of accurately quantifying the intensity information from high-density microarray images. The method should be robust against the noise and contamination that commonly occur at different stages of microarray development.
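As a toy illustration of what fully automated quantification means in practice, the following Python sketch (synthetic data; the thresholding rule and all numbers are assumptions, not the thesis method) segments bright spots by connected components and reports background-corrected mean intensities:

```python
import numpy as np
from scipy import ndimage

def quantify_spots(img: np.ndarray) -> np.ndarray:
    """Toy automated quantification: global thresholding, connected-component
    spot segmentation, and background-corrected mean intensity per spot."""
    threshold = img.mean() + 3 * img.std()        # crude foreground cut
    mask = img > threshold
    labels, n_spots = ndimage.label(mask)         # one label per spot
    background = np.median(img[~mask])            # global background estimate
    means = ndimage.mean(img, labels, index=np.arange(1, n_spots + 1))
    return means - background                     # corrected spot intensities

rng = np.random.default_rng(0)
fake = rng.normal(100.0, 5.0, (64, 64))           # synthetic slide background
fake[20:24, 20:24] += 400.0                       # one synthetic spot
print(quantify_spots(fake))                       # roughly [400.]
```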

Relevance:

40.00%

Publisher:

Abstract:

We examine the efficacy of two-volume spatial registration of pre- and postoperative clinical computed tomography (CT) imaging to verify postoperative electrode array placement in cochlear implant (CI) patients. To measure the degree of accuracy with which the composite image predicts in vivo placement of the array, we replicate the CI surgical process in cadaver heads. Preoperative, postoperative, and micro-CT imaging, together with histology, are used for verification.
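The core of such two-volume registration is a rigid transform mapping coordinates from one CT frame into the other. A minimal sketch with an invented rotation and translation (real pipelines estimate these by intensity-based registration of the two volumes):

```python
import numpy as np

def to_preop_frame(points_post: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map post-operative electrode coordinates (mm) into the pre-operative
    CT frame with a rigid transform (rotation R, translation t)."""
    return points_post @ R.T + t

# Invented transform and electrode positions, purely for illustration.
theta = np.deg2rad(3.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.5, -0.8, 0.2])                       # mm
electrodes = np.array([[10.0, 4.0, 2.0], [11.2, 4.6, 2.1]])
print(to_preop_frame(electrodes, R, t))
```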

Relevance:

40.00%

Publisher:

Abstract:

The automatic transformation of sequential programs for efficient execution on parallel computers involves a number of analyses and restructurings of the input. Some of these analyses are based on computing array sections, a compact description of a range of array elements. Array sections describe the set of array elements that are either read or written by program statements. These sections can be compactly represented using shape descriptors such as regular sections, simple sections, or generalized convex regions. However, binary operations such as Union performed on these representations do not satisfy a straightforward closure property: for example, if the operands to Union are convex, the result may be nonconvex. Analyses therefore resort to approximations in order to restore closure. These approximations introduce imprecision, and the imprecision from successive operations accumulates. Delayed merging, a technique suggested and used in some existing analyses, minimizes the effects of approximation but does not guarantee an exact solution in a general setting. This article presents a generalized technique to compute Union precisely, overcoming these imprecisions.
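The closure failure and the resulting imprecision are easy to see in one dimension. The sketch below uses an invented, simplified rule rather than the article's technique: it over-approximates Union with a single covering regular section and prints the spurious elements the approximation introduces.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class RegularSection:
    """One-dimensional regular section A[lo:hi:step] (hi inclusive),
    a convex shape descriptor in the sense used above."""
    lo: int
    hi: int
    step: int

    def elements(self) -> set[int]:
        return set(range(self.lo, self.hi + 1, self.step))

def union_approx(a: RegularSection, b: RegularSection) -> RegularSection:
    """Closure-preserving over-approximation of Union: the smallest regular
    section, under this simplified rule, that covers both operands."""
    g = math.gcd(a.step, b.step)
    step = g if (a.lo - b.lo) % g == 0 else 1     # fall back to a dense hull
    return RegularSection(min(a.lo, b.lo), max(a.hi, b.hi), step)

a = RegularSection(0, 4, 2)                       # {0, 2, 4}
b = RegularSection(7, 9, 1)                       # {7, 8, 9}
hull = union_approx(a, b)                         # [0:9:1] = {0, ..., 9}
print(hull.elements() - (a.elements() | b.elements()))  # extras: {1, 3, 5, 6}
```

Feeding such an approximated result into further Unions compounds the error, which is the cumulative effect the article sets out to eliminate.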

Relevance:

40.00%

Publisher:

Abstract:

A fast and robust analytical method for amperometric determination of hydrogen peroxide (H₂O₂) based on batch injection analysis (BIA) on an array of gold microelectrodes modified with platinum is proposed. The gold microelectrode array (n = 14) was obtained from electronic chips developed for surface-mounted device (SMD) technology, whose size offers advantages for adapting them to batch cells. The effects of the dispensing rate, the injected volume, the distance between the platinum microelectrodes and the pipette tip, and the volume of solution in the cell on the analytical response were evaluated. The method allows amperometric H₂O₂ determination in the concentration range from 0.8 µmol L⁻¹ to 100 µmol L⁻¹. The analytical frequency can reach 300 determinations per hour, and the detection limit was estimated at 0.34 µmol L⁻¹ (3σ). The anodic current peaks obtained after a series of 23 successive injections of 50 µL of 25 µmol L⁻¹ H₂O₂ showed an RSD < 0.9%. To ensure good selectivity for H₂O₂, its determination was performed in a differential mode, with selective destruction of the H₂O₂ by catalase in 10 mmol L⁻¹ phosphate buffer solution. A practical application of the analytical procedure involved H₂O₂ determination in rainwater of São Paulo City. A comparison of the results obtained by the proposed amperometric method with those of a method combining flow injection analysis (FIA) with spectrophotometric detection showed good agreement.
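As a worked illustration of the reported 3σ detection limit, the sketch below fits a linear calibration and applies LOD = 3·s_blank / slope. The currents and blank noise are invented; only the 0.8-100 µmol L⁻¹ range mirrors the paper.

```python
import numpy as np

# Hypothetical calibration data for illustration (the paper reports a
# 0.8-100 umol/L linear range and an LOD of 0.34 umol/L at 3 sigma).
conc = np.array([0.8, 5.0, 10.0, 25.0, 50.0, 100.0])    # umol/L H2O2
peak = np.array([0.16, 1.0, 2.0, 5.1, 10.0, 19.9])      # uA, made-up currents

slope, intercept = np.polyfit(conc, peak, 1)            # linear calibration
sigma_blank = 0.02                                      # uA, assumed blank noise
lod = 3 * sigma_blank / slope                           # 3-sigma detection limit
print(f"LOD ~ {lod:.2f} umol/L")
```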

Relevance:

40.00%

Publisher:

Abstract:

In the present work, we report the use of bacterial colonies to optimize the macroarray technique. The devised system is significantly cheaper than other methods available to detect large-scale differential gene expression. Recombinant Escherichia coli clones containing plasmid-encoded copies of 4,608 individual expressed sequence tags (ESTs) were robotically spotted onto nylon membranes, which were incubated for 6 and 12 h to allow the bacteria to grow and, consequently, amplify the cloned ESTs. The membranes were then hybridized with a probe specific for the beta-lactamase gene of the recombinant plasmid and subsequently phosphorimaged to quantify the microbial cells. Variance analysis demonstrated that the spot hybridization signal intensity was similar for 3,954 ESTs (85.8%) after 6 h of bacterial growth. Membranes spotted with bacterial colonies grown for 12 h had 4,017 ESTs (87.2%) with comparable signal intensity, but the signal-to-noise ratio was fivefold higher. Taken together, the results of this study indicate that it is possible to investigate large-scale gene expression using macroarrays based on bacterial colonies grown on membranes for 6 h.
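The fivefold signal-to-noise comparison can be illustrated with one common SNR definition (mean spot signal over background standard deviation); all intensities below are invented:

```python
import numpy as np

def spot_snr(spot: np.ndarray, background: np.ndarray) -> float:
    """Signal-to-noise ratio of a hybridization spot: mean spot signal over
    the standard deviation of the membrane background. One common definition,
    used here only to illustrate the 6 h vs. 12 h comparison."""
    return float(spot.mean() / background.std())

rng = np.random.default_rng(1)
bg = rng.normal(50.0, 10.0, 1000)        # membrane background (made up)
spot_6h = rng.normal(400.0, 20.0, 25)    # colony grown 6 h (made up)
spot_12h = rng.normal(2000.0, 20.0, 25)  # colony grown 12 h (made up)
print(spot_snr(spot_6h, bg), spot_snr(spot_12h, bg))  # 12 h: ~5x higher
```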

Relevance:

40.00%

Publisher:

Abstract:

The thesis analyses the hydrodynamics induced by an array of Wave Energy Converters (WECs) from both an experimental and a numerical point of view. WECs can be considered an innovative solution able to contribute to the green energy supply and, at the same time, to protect the rear coastal area under marine spatial planning considerations. This research activity essentially arises from this combined concept. The WEC under examination is a floating device belonging to the Wave Activated Bodies (WAB) class. Experiments were performed at Aalborg University at different scales and layouts, and the performance of the models was analysed under a variety of irregular wave attacks. The numerical simulations were performed with the codes MIKE 21 BW and ANSYS-AQWA. Experimental results were also used to calibrate the numerical parameters and/or compared directly with numerical results, in order to extend the experimental database. Results of the research activity are summarized in terms of device performance and guidelines for a future wave farm installation. The device length should be "tuned" to the local climate conditions. The wave transmission behind the devices is rather high, suggesting that the tested layout should be considered as a module of a wave farm installation; indications on the minimum inter-distance among the devices are provided. Furthermore, a CALM mooring system leads to lower wave transmission and larger power production than a spread mooring. The two numerical codes have different potentialities: the hydrodynamics around single and multiple devices is obtained with MIKE 21 BW, while wave loads and motions for a single moored device are derived from ANSYS-AQWA. Combining the experimental and numerical results, it is suggested, for both coastal protection and energy production, to adopt a staggered layout, which maximises device density and minimises the marine space required for the installation.
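The rather high wave transmission finding is usually expressed through the transmission coefficient Kt = Ht/Hi. A minimal sketch with invented wave heights (the thesis reports its own measured values):

```python
def transmission_coefficient(h_incident: float, h_transmitted: float) -> float:
    """Wave transmission coefficient Kt = Ht/Hi behind a row of devices."""
    return h_transmitted / h_incident

# Invented significant wave heights in metres, not thesis data.
kt = transmission_coefficient(2.0, 1.7)
print(kt)        # 0.85: most of the wave passes the array
print(kt ** 2)   # ~0.72: since energy scales with H^2, the energy fraction transmitted
```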

Relevance:

40.00%

Publisher:

Abstract:

Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen et al., 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2)$, where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the "DNAcopy" package of the Bioconductor project (Gentleman et al., 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher, and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
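To make the computational bottleneck concrete, here is a plain (non-circular) Python simplification of the maximal $t$-statistic with a full permutation $p$-value. This is the expensive baseline that the paper's hybrid method and early-stopping rule accelerate, not the DNAcopy implementation:

```python
import numpy as np

def max_t_stat(x: np.ndarray) -> float:
    """Maximal two-sample t-statistic over all split points: a plain,
    non-circular simplification of the CBS change-point statistic."""
    best = 0.0
    for i in range(2, len(x) - 1):                 # split into x[:i] | x[i:]
        a, b = x[:i], x[i:]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        best = max(best, abs(a.mean() - b.mean()) / se)
    return best

def permutation_pvalue(x: np.ndarray, n_perm: int = 200, seed: int = 1) -> float:
    """Full permutation reference distribution: the costly baseline that the
    paper's hybrid approach avoids recomputing for every permutation."""
    rng = np.random.default_rng(seed)
    observed = max_t_stat(x)
    hits = sum(max_t_stat(rng.permutation(x)) >= observed for _ in range(n_perm))
    return (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(0)
x = np.r_[rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 40)]  # one gain
print(permutation_pvalue(x))                       # small p: change detected
```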

Relevance:

40.00%

Publisher:

Abstract:

DNA sequence copy number has been shown to be associated with cancer development and progression. Array-based Comparative Genomic Hybridization (aCGH) is a recent development that seeks to identify the copy number ratio at large numbers of markers across the genome. Due to experimental and biological variations across chromosomes and across hybridizations, current methods are limited to analyses of single chromosomes. We propose a more powerful approach that borrows strength across chromosomes and across hybridizations. We assume a Gaussian mixture model, with a hidden Markov dependence structure, and with random effects to allow for intertumoral variation, as well as intratumoral clonal variation. For ease of computation, we base estimation on a pseudolikelihood function. The method produces quantitative assessments of the likelihood of genetic alterations at each clone, along with a graphical display for simple visual interpretation. We assess the characteristics of the method through simulation studies and through analysis of a brain tumor aCGH data set. We show that the pseudolikelihood approach is superior to existing methods both in detecting small regions of copy number alteration and in accurately classifying regions of change when intratumoral clonal variation is present.
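A bare-bones caricature of the hidden-Markov machinery: Viterbi decoding of a three-state (loss/neutral/gain) Gaussian HMM over log-ratios. The state means, standard deviation and transition probabilities below are invented, and the paper's model is far richer (random effects for intertumoral and clonal variation, pseudolikelihood estimation):

```python
import numpy as np
from scipy.stats import norm

def viterbi_copy_number(logratios, means=(-0.6, 0.0, 0.45), p_stay=0.99):
    """Viterbi decoding of a 3-state (loss/neutral/gain) Gaussian HMM over
    log2 ratios. All parameters are invented fixed values for illustration."""
    k = len(means)
    log_trans = np.log(np.full((k, k), (1.0 - p_stay) / (k - 1)))
    np.fill_diagonal(log_trans, np.log(p_stay))
    log_emit = np.array([norm.logpdf(logratios, m, 0.15) for m in means])
    dp = log_emit[:, 0] - np.log(k)                # uniform initial state
    back = np.zeros((k, len(logratios)), dtype=int)
    for t in range(1, len(logratios)):
        scores = dp[:, None] + log_trans           # scores[i, j]: i -> j
        back[:, t] = scores.argmax(axis=0)
        dp = scores.max(axis=0) + log_emit[:, t]
    path = [int(dp.argmax())]
    for t in range(len(logratios) - 1, 0, -1):     # trace the best path back
        path.append(int(back[path[-1], t]))
    return path[::-1]                              # 0=loss, 1=neutral, 2=gain

obs = np.r_[np.zeros(20), np.full(10, 0.5)]        # synthetic gained segment
obs += np.random.default_rng(2).normal(0.0, 0.1, obs.size)
print(viterbi_copy_number(obs))                    # mostly 1s, then 2s
```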

Relevance:

40.00%

Publisher:

Abstract:

Resource Analysis (a.k.a. Cost Analysis) tries to approximate the cost of executing programs as functions of their input data sizes, without actually having to execute the programs. While a powerful resource analysis framework for object-oriented programs existed before this thesis, advanced aspects that improve the efficiency, the accuracy and the reliability of the analysis results still need to be investigated further. This thesis tackles that need from four different perspectives.

(1) Shared mutable data structures are the bane of formal reasoning and static analysis. Analyses which keep track of heap-allocated data are referred to as heap-sensitive. Recent work proposes locality conditions for soundly tracking field accesses by means of ghost non-heap-allocated variables. In this thesis we present two extensions to this approach: the first considers array accesses in addition to object fields, while the second handles cases in which the locality conditions cannot be proven unconditionally, by finding aliasing preconditions under which tracking such heap locations is feasible.

(2) The aim of incremental analysis is, given a program, its analysis results and a series of changes to the program, to obtain the new analysis results as efficiently as possible and, ideally, without having to re-analyze fragments of code that are not affected by the changes. During software development, programs are permanently modified, but most analyzers still read and analyze the entire program at once in a non-incremental way. This thesis presents an incremental resource usage analysis which, after a change to the program is made, is able to reconstruct the upper bounds of all affected methods incrementally. To this purpose, we propose (i) a multi-domain incremental fixed-point algorithm which can be used by all global analyses required to infer the cost, and (ii) a novel form of cost summaries that allows us to incrementally reconstruct only those components of cost functions affected by the change (a toy sketch of this idea follows at the end of this abstract).

(3) Resource guarantees that are automatically inferred by static analysis tools are generally not considered completely trustworthy unless the tool implementation or the results are formally verified. Performing full-blown verification of such tools is a daunting task, since they are large and complex. In this thesis we focus on developing a formal framework for verifying the resource guarantees obtained by the analyzers, instead of verifying the tools. We have implemented this idea using COSTA, a state-of-the-art cost analyzer for Java programs, and KeY, a state-of-the-art verification tool for Java source code: COSTA derives upper bounds for Java programs, while KeY proves the validity of these bounds and provides a certificate. The main contribution of our work is to show that this cooperation of tools can automatically produce verified resource guarantees.

(4) Distribution and concurrency are today mainstream. Concurrent objects form a well-established model for distributed concurrent systems. In this model, objects are the concurrency units and communicate via asynchronous method calls. Distribution suggests that the analysis must infer the cost of the diverse distributed components separately. In this thesis we propose a novel object-sensitive cost analysis which, by using the results gathered by a points-to analysis, keeps the cost of the diverse distributed components separate.
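A toy Python sketch of the incremental cost summaries of point (2): each method's summary refers to its callees symbolically, so an edit invalidates only the summaries of the changed method and its transitive callers. All names and the representation are invented for illustration; COSTA's real cost expressions are much richer.

```python
# Each summary is a function of the input size n that looks up callee costs
# through `cost`, keeping the dependency symbolic until evaluation.
summaries = {
    "sort":   lambda n, cost: n * n,                 # O(n^2) body
    "dedup":  lambda n, cost: n + cost("sort", n),   # calls sort once
    "report": lambda n, cost: 1 + cost("dedup", n),  # calls dedup once
}
callers = {"sort": {"dedup"}, "dedup": {"report"}, "report": set()}
cache: dict[tuple[str, int], int] = {}

def cost(method: str, n: int) -> int:
    """Evaluate a method's cost bound for input size n, memoizing results."""
    if (method, n) not in cache:
        cache[(method, n)] = summaries[method](n, cost)
    return cache[(method, n)]

def edit(method: str, new_summary) -> None:
    """Apply a program change: swap one summary, then invalidate only the
    cached costs of the changed method and its transitive callers."""
    summaries[method] = new_summary
    dirty, stack = {method}, [method]
    while stack:
        for c in callers[stack.pop()] - dirty:
            dirty.add(c)
            stack.append(c)
    for key in [k for k in cache if k[0] in dirty]:
        del cache[key]

print(cost("report", 10))        # 1 + (10 + 10*10) = 111
edit("sort", lambda n, cost: n)  # sort is improved to O(n)
print(cost("report", 10))        # 1 + (10 + 10) = 21, rebuilt incrementally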

Relevance:

40.00%

Publisher:

Abstract:

"Dec. 1983."

Relevance:

40.00%

Publisher:

Abstract:

The alpha-defensin antimicrobial peptide family is defined by a unique tridisulfide array. To test whether this invariant structural feature determines alpha-defensin bactericidal activity, mouse cryptdin-4 (Crp4) tertiary structure was disrupted by pairs of site-directed Ala for Cys substitutions. In a series of Crp4 disulfide variants whose cysteine connectivities were confirmed using NMR spectroscopy and mass spectrometry, mutagenesis did not induce loss of function. To the contrary, the in vitro bactericidal activities of several Crp4 disulfide variants were equivalent to or greater than those of native Crp4. Mouse Paneth cell alpha-defensins require the proteolytic activation of precursors by matrix metalloproteinase-7 (MMP-7), prompting an analysis of the relative sensitivities of native and mutant Crp4 and proCrp4 molecules to degradation by MMP-7. Although native Crp4 and the alpha-defensin moiety of proCrp4 resisted proteolysis completely, all disulfide variants were degraded extensively by MMP-7. Crp4 bactericidal activity was eliminated by MMP-7 cleavage. Thus, rather than determining alpha-defensin bactericidal activity, the Crp4 disulfide arrangement confers essential protection from degradation by this critical activating proteinase.

Relevance:

40.00%

Publisher:

Abstract:

This letter presents an analytical model for evaluating the Bit Error Rate (BER) of a Direct Sequence Code Division Multiple Access (DS-CDMA) system with M-ary orthogonal modulation and noncoherent detection, employing an array antenna and operating in a Nakagami fading environment. An expression for the Signal to Interference plus Noise Ratio (SINR) at the output of the receiver is derived, which allows the BER to be evaluated using a closed-form expression. The analytical model is validated by comparing the obtained results with simulation results.
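For flavour, here is a closed-form error expression of the kind such an SINR is plugged into: the standard symbol-error probability of noncoherent M-ary orthogonal signalling in AWGN. The letter's actual expression accounts for Nakagami fading and the array processing; this baseline is only illustrative.

```python
import math

def sep_noncoherent_orthogonal_awgn(M: int, es_n0: float) -> float:
    """Standard closed-form symbol-error probability of noncoherent M-ary
    orthogonal signalling in AWGN, with es_n0 the linear Es/N0. Shown only
    to illustrate the shape of such closed-form BER/SEP expressions."""
    return sum((-1) ** (k + 1) * math.comb(M - 1, k) / (k + 1)
               * math.exp(-k * es_n0 / (k + 1))
               for k in range(1, M))

print(sep_noncoherent_orthogonal_awgn(16, 10 ** (8 / 10)))  # M = 16 at 8 dB
```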