893 results for Flip-chip design systems
Abstract:
Dissertation presented to obtain the Ph.D. degree in Bioinformatics
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
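The merit function itself is not given in the abstract, but its named ingredients (a least-squares residual, an erf-based penalty, repulsion from previously located roots, and N-M local searches) suggest a structure along the lines of the minimal Python sketch below; the test system F, the penalty weight rho, and the radius delta are illustrative assumptions, not the paper's actual choices.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import erf

    def F(x):
        # Illustrative 2x2 nonlinear system with multiple roots: F(x) = 0.
        return np.array([x[0]**2 + x[1]**2 - 1.0,
                         x[0]**2 - x[1]])

    def merit(x, roots, rho=10.0, delta=0.1):
        # Least-squares residual of the system ...
        val = float(np.sum(F(x)**2))
        # ... plus an erf-shaped repulsion penalty around each known root:
        # near a found root (d << delta) the penalty is ~rho; far away it
        # vanishes, leaving the original merit landscape untouched.
        for r in roots:
            d = np.linalg.norm(x - r)
            val += rho * (1.0 - erf(d / delta))
        return val

    rng = np.random.default_rng(42)
    roots = []
    for _ in range(50):                      # multistart loop
        x0 = rng.uniform(-2.0, 2.0, size=2)
        res = minimize(merit, x0, args=(roots,), method='Nelder-Mead')
        if (np.sum(F(res.x)**2) < 1e-6 and
                all(np.linalg.norm(res.x - r) > 1e-3 for r in roots)):
            roots.append(res.x)              # record a genuinely new root

    print(np.round(np.array(roots), 4))

For this toy system the loop should recover both roots (x ≈ (±0.7862, 0.6180)); the factorial-design study in the paper would then compare such strategy variants statistically.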
Abstract:
In this paper, a module for homograph disambiguation in Portuguese Text-to-Speech (TTS) is proposed. This module works with a part-of-speech (POS) parser, used to disambiguate homographs that belong to different parts of speech, and a semantic analyzer, used to disambiguate homographs that belong to the same part of speech. The proposed algorithms are meant to solve a significant part of homograph ambiguity in European Portuguese (EP) (106 homograph pairs so far). The system is ready to be integrated into a Letter-to-Sound (LTS) converter. The algorithms were trained and tested with different corpora. The experimental results yielded an accuracy rate of 97.8%. This methodology is also valid for Brazilian Portuguese (BP), since 95 homograph pairs are exactly the same as in EP. A comparison with a probabilistic approach was also carried out and the results are discussed.
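As a rough illustration of the two-stage design described above (a POS tag resolves cross-category pairs; a semantic analyzer handles same-category pairs), consider the minimal sketch below; the mini-lexicon and the SAMPA-like transcriptions are hypothetical stand-ins, not the paper's 106-pair inventory.

    HOMOGRAPH_LEXICON = {
        # spelling -> {POS tag: transcription}; entries are illustrative only.
        "jogo": {"NOUN": "'Zogu",   # closed vowel: 'game'
                 "VERB": "'ZOgu"},  # open vowel: '(I) play'
    }

    def disambiguate(word, pos_tag, context=()):
        entry = HOMOGRAPH_LEXICON.get(word.lower())
        if entry is None:
            return None                 # not a homograph: regular LTS rules apply
        if pos_tag in entry:
            return entry[pos_tag]       # stage 1: the POS tag resolves the pair
        return semantic_analysis(word, context)   # stage 2: same-POS pairs

    def semantic_analysis(word, context):
        # Crude placeholder for the context-based second stage: fall back to
        # the first listed reading; the paper's semantic analyzer is richer.
        return next(iter(HOMOGRAPH_LEXICON[word.lower()].values()))

    print(disambiguate("jogo", "NOUN"))  # -> 'Zogu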
Abstract:
In a scientific research project it is important to define the underlying philosophical orientation of the project, because this will influence the choices made with respect to the scientific methods used, as well as the way they will be applied. It is crucial, therefore, that the philosophy and the research design strategy are consistent with each other. These questions become even more relevant in qualitative research. Historically, the interpretive research philosophy has been more associated with the social sciences and humanities, where the subjectivity inherent to human intervention is more explicitly acknowledged. The information systems field is rooted primarily in computer science, though it also integrates issues related to the field of management and organizations. This shift from a purely technological orientation toward consideration of the problems of management and organizations has fostered the rise of research projects following the interpretive philosophy and using qualitative methods. This paper explores the importance of alignment between the epistemological orientation and the research design strategy in qualitative research projects. As a result, two PhD projects with different research design strategies, being developed in the technology and information systems field in the light of the interpretive paradigm, are presented.
Abstract:
The paper reflects the work of COST Action TU1403 Workgroup 3/Task group 1. The aim is to identify research needs from a review of the state of the art of three aspects related to adaptive façade systems: (1) dynamic performance requirements; (2) façade design under stochastic boundary conditions and (3) experiences with adaptive façade systems and market needs.
Abstract:
Doctoral thesis in Environmental and Molecular Biology
Abstract:
"Series Title: IFIP - The International Federation for Information Processing, ISSN 1868-4238"
Abstract:
The exponential growth of data traffic is one of the greatest challenges currently facing communication systems, which must be able to support ever-higher data processing speeds. In particular, power consumption has become one of the most critical design parameters, creating the need to investigate new architectures and algorithms for digital information processing. Moreover, the analysis and evaluation of new processing techniques is difficult at the high speeds involved, and software-based simulation is often inefficient as a method. In this context, programmable electronics offers a low-cost opportunity not only to evaluate new high-speed design techniques but also to validate their implementation in technological developments. The main objective of this project is the study and development of new architectures and algorithms in programmable electronics for high-speed data processing. The method will be programming FPGA (Field-Programmable Gate Array) devices, which offer a good cost-benefit ratio and great flexibility for integration with other communication devices. For the design, simulation, and programming stages, CAD (Computer-Aided Design) tools oriented to digital electronic systems will be used. The project will benefit undergraduate and graduate students in programs related to informatics and telecommunications, contributing to the development of final-year projects and doctoral theses. The results of the project will be published in national and international conferences and/or journals and disseminated through outreach talks and/or meetings. The project falls within an area of great importance to the Province of Córdoba, namely informatics and telecommunications, and promises to generate high-value-added knowledge that can be transferred to technology companies of the Province of Córdoba through consulting or product development.
Abstract:
Residual lignocellulosic materials from agro-industrial activities can be exploited as a source of lignin, hemicellulose, and cellulose. Chemical treatment of lignocellulosic material must contend with the fact that the material is quite recalcitrant to such attack, mainly because of the presence of the polymer lignin. Degradation can also be achieved using white-rot fungi. These produce extracellular ligninolytic enzymes, chiefly laccase, which oxidizes lignin to CO2. Laccase also oxidizes a wide range of substrates (phenols, polyphenols, anilines, aryl diamines, methoxy-substituted phenols, and others), which is a good reason for its attractiveness for biotechnological applications. The enzyme has potential application in processes such as the delignification of lignocellulosic materials and the biobleaching of paper pulps, the treatment of industrial waste waters, fiber modification and dye decolorization in the textile and dye industries, the improvement of animal feed, the detoxification of pollutants, and the bioremediation of contaminated soils. It has also been used in organic chemistry for the oxidation of functional groups, the formation of carbon-nitrogen bonds, and the synthesis of complex natural products. HYPOTHESIS: White-rot fungi, under optimal culture conditions, produce different types of oxidase enzymes, laccases being the most suitable to explore as catalysts in the following processes: delignification of forest-industry residues so that such waste can be used in animal feed, and decontamination/remediation of soils and/or industrial effluents. Studies will be carried out on the design of bioreactors capable of addressing the two questions posed in the hypothesis. For the delignification of lignocellulosic material, two strategies are proposed: (1) treating the material with the fungal mycelium, adjusting the nutrient supply to sustain growth and favor release of the enzyme; and (2) using partially purified laccase coupled to a mediator system to oxidize the polyphenolic compounds. For the decontamination/remediation of soils and/or industrial effluents, work will likewise proceed on two fronts: (3) a positive correlation has been reported between the activity of certain soil enzymes and fertility; in this respect, an enzymatic system tentatively identified as a laccase of microbial origin is known to be responsible for the transformation of organic compounds in soil, protecting the soil from the accumulation of hazardous organic compounds by catalyzing reactions involving degradation, polymerization, and incorporation into humic acid complexes; soils spiked with different pollutants (e.g., polychlorophenols or chloroanilines) will be used. (4) Work will be carried out with polluting industrial effluents (olive-mill waste waters and/or the liquid effluent from the olive debittering process).
Abstract:
SUMMARY: Eukaryotic DNA interacts with nuclear proteins through non-covalent ionic interactions. Proteins can recognize specific nucleotide sequences through steric interactions with the DNA, and these specific protein-DNA interactions are the basis for many nuclear processes, e.g. gene transcription, chromosomal replication, and recombination. A new technology termed ChIP-Seq has recently been developed for the analysis of protein-DNA interactions on a whole-genome scale; it is based on immunoprecipitation of chromatin followed by high-throughput DNA sequencing. ChIP-Seq is a novel technique with great potential to replace older techniques for mapping protein-DNA interactions. In this thesis, we bring some new insights into ChIP-Seq data analysis. First, we point out some common and so far unrecognized artifacts of the method. Sequence tag distribution in the genome does not follow a uniform distribution, and we have found extreme hot-spots of tag accumulation over specific loci in the human and mouse genomes. These artifactual sequence tag accumulations will create false peaks in every ChIP-Seq dataset, and we propose different filtering methods to reduce the number of false positives. Next, we propose random sampling as a powerful analytical tool in ChIP-Seq data analysis that can be used to infer biological knowledge from massive ChIP-Seq datasets. We created an unbiased random sampling algorithm and used this methodology to reveal some important biological properties of Nuclear Factor I DNA-binding proteins. Finally, by analyzing the ChIP-Seq data in detail, we revealed that Nuclear Factor I transcription factors mainly act as activators of transcription and that they are associated with specific chromatin modifications that are markers of open chromatin. We speculate that NFI factors only interact with DNA wrapped around the nucleosome. We also found multiple loci that indicate possible chromatin barrier activity of NFI proteins, which could suggest the use of NFI binding sequences as chromatin insulators in biotechnology applications.
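As a toy illustration of the two analysis ideas (filtering of artifactual hot-spots and unbiased random subsampling), the following sketch assumes tags stored as (chromosome, position) tuples; the pile-up threshold and data layout are assumptions, not the thesis implementation.

    import random
    from collections import Counter

    def filter_hotspots(tags, max_pileup=8):
        # Discard tags mapping to positions whose pile-up exceeds a threshold,
        # a simple stand-in for filtering artifactual tag accumulations.
        counts = Counter(tags)
        return [t for t in tags if counts[t] <= max_pileup]

    def random_subsample(tags, n, seed=0):
        # Draw n tags uniformly without replacement, e.g. to compare datasets
        # of different sequencing depths on an equal footing.
        return random.Random(seed).sample(tags, n)

    tags = [("chr1", 1000)] * 20 + [("chr1", 5000), ("chr2", 777)]
    kept = filter_hotspots(tags)            # removes the 20x pile-up at chr1:1000
    sample = random_subsample(kept, 2)      # unbiased subsample of what remains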
Abstract:
The lanthanide binuclear helicate [Eu₂(L^(C2(CO₂H)))₃] is coupled to avidin to yield a luminescent bioconjugate EuB1 (Q = 9.3%, τ(⁵D₀) = 2.17 ms). MALDI/TOF mass spectrometry confirms the covalent binding of the Eu chelate, and UV-visible spectroscopy allows one to determine a luminophore/protein ratio equal to 3.2. Bio-affinity assays involving the recognition of a mucin-like protein expressed on human breast cancer MCF-7 cells by a biotinylated monoclonal antibody 5D10, to which EuB1 is attached via avidin-biotin coupling, demonstrate that (i) avidin activity is little affected by the coupling reaction and (ii) detection limits obtained by time-resolved (TR) luminescence with EuB1 and a commercial Eu-avidin conjugate are one order of magnitude lower than those of an organic conjugate (FITC-streptavidin). In the second part of the paper, conditions for growing MCF-7 cells in 100-200 µm wide microchannels engraved in PDMS are established; we demonstrate that EuB1 can be applied as effectively on this lab-on-a-chip device for the detection of tumour-associated antigens as on MCF-7 cells grown in normal culture vials. In order to exploit the versatility of the ligand used for self-assembling the [Ln₂(L^(C2(CO₂H)))₃] helicates, which sensitizes the luminescence of both Eu(III) and Tb(III) ions, a dual on-chip assay is proposed in which estrogen receptors (ERs) and human epidermal growth factor receptors (Her2/neu) can be simultaneously detected on human breast cancer tissue sections. The Ln helicates are coupled to two secondary antibodies: ERs are visualized by red-emitting EuB4 using goat anti-mouse IgG, and Her2/neu receptors by green-emitting TbB5 using goat anti-rabbit IgG. The fact that the assay is more than 6 times faster and requires 5 times less reactants than conventional immunohistochemical assays provides essential advantages over conventional immunohistochemistry for future clinical biomarker detection.
Abstract:
A practical activity designed to introduce wavefront coding techniques as a method to extend the depth of field in optical systems is presented. The activity is suitable for advanced undergraduate students, since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image processing. This paper provides the theoretical background and technical information for performing the experiment. The proposed activity requires students to develop a wide range of skills, since they are expected to work with optical components, including spatial light modulators, and to write scripts to perform the required calculations.
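The abstract does not name the phase mask, but the classic wavefront-coding element is a cubic phase profile; the numerical sketch below (grid size and phase strength alpha are arbitrary choices) shows how the coded pupil and its point spread function could be simulated before attempting the optical experiment.

    import numpy as np

    N = 256                                   # pupil-plane grid size
    alpha = 30.0                              # cubic phase strength (radians)
    x = np.linspace(-1.0, 1.0, N)
    X, Y = np.meshgrid(x, x)
    aperture = (X**2 + Y**2) <= 1.0           # circular pupil
    phase = alpha * (X**3 + Y**3)             # cubic phase profile
    P = aperture * np.exp(1j * phase)         # generalized pupil function

    # Incoherent PSF ~ |FT of pupil|^2 (zero-padded for sampling); with the
    # mask the PSF stays nearly invariant over a large defocus range, at the
    # price of a fixed blur that is removed digitally afterwards.
    psf = np.abs(np.fft.fftshift(np.fft.fft2(P, s=(4 * N, 4 * N))))**2
    psf /= psf.sum()

On a spatial light modulator, the same phase array would be displayed directly, which is what makes the activity well suited to programmable hardware.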
Abstract:
For years, specifications have focused on the water-to-cementitious materials ratio (w/cm) and strength of concrete, despite the majority of the volume of a concrete mixture consisting of aggregate. An aggregate distribution of roughly 60% coarse aggregate and 40% fine aggregate, regardless of gradation and availability of aggregates, has been used as the norm for concrete pavement mixtures. Efforts to reduce costs and improve the sustainability of concrete mixtures have pushed owners to pay closer attention to mixtures with a well-graded aggregate particle distribution. In general, workability depends on many variables that are independent of gradation, such as paste volume and viscosity and aggregate shape and texture. A better understanding of how aggregate properties affect the workability of concrete is needed. The effects of aggregate characteristics on concrete properties, such as response to vibration, strength, and resistivity, were investigated using mixtures in which the paste content and the w/cm were held constant. The results showed that different aggregate proportions, nominal maximum aggregate sizes, and combinations of different aggregates all had an impact on performance in the strength, slump, and box tests.
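As a back-of-the-envelope illustration of how the conventional 60/40 coarse/fine split translates into a combined gradation, the sketch below blends two hypothetical percent-passing curves; all sieve sizes and percentages are invented for the example.

    # Hypothetical percent-passing curves for one coarse and one fine aggregate.
    sieves_mm = [25.0, 19.0, 12.5, 9.5, 4.75, 2.36, 1.18, 0.60, 0.30, 0.15]
    coarse = [100, 97, 60, 35, 4, 2, 1, 1, 1, 1]         # % passing
    fine   = [100, 100, 100, 100, 98, 85, 70, 50, 24, 8]  # % passing

    # Conventional 60/40 blend: combined percent passing on each sieve.
    combined = [0.60 * c + 0.40 * f for c, f in zip(coarse, fine)]
    for sieve, p in zip(sieves_mm, combined):
        print(f"{sieve:>5.2f} mm sieve: {p:5.1f} % passing")

Plotting such a combined curve (e.g., on a Tarantula or power-45 chart) is the usual way to judge whether a blend is well graded.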
Abstract:
There is a concern that agriculture will no longer be able to meet, on a global scale, the growing demand for food. Facing such a challenge requires new patterns of thinking in the context of complexity and sustainability sciences. This paper, focused on the social dimension of the study and management of agricultural systems, suggests that rethinking the study of agricultural systems entails analyzing them as complex socio-ecological systems, as well as considering the differing thinking patterns of diverse stakeholders. The intersubjective nature of knowledge, as studied by different philosophical schools, needs to be integrated into the study and management of agricultural systems better than it has been so far, forcing us to accept that there are no simplistic solutions and to seek a better understanding of the social dimension of agriculture. Different agriculture-related problems require different policy and institutional approaches. Finally, the intersubjective nature of knowledge calls for making visible the different framings and power relations at play in the decision-making process. Rethinking the management of agricultural systems implies that policy making should be shaped by different principles: learning, flexibility, adaptation, scale-matching, participation, diversity enhancement, and precaution hold the promise of significantly improving current standard management procedures.