962 results for Computer algorithms


Relevance:

60.00%

Publisher:

Abstract:

Terrain modelling, visualization and analysis are of great importance in Geographic Information Systems (GIS). Software that allows terrains to be analyzed is currently of great interest to this community. The main objective of the project is to develop an application for solving several proximity problems on terrains. An important part is the ability to generate, visualize and modify a 3D model of a terrain from data entered by the user or read from a file. In order to build the desired application, it was necessary to design a graphical user interface that allows the interactive insertion, modification and deletion of the different sites (points, segments, polygons, polylines, ...) or terrain constraints, as well as their visualization.

Relevance:

60.00%

Publisher:

Abstract:

The major outcome of this research project was the development of a set of decentralized algorithms to index, locate, and synchronize replicated information in a networked environment. The study exploits the application-specific design constraints of networked systems to improve performance, instead of relying on data structures and algorithms best suited to centralized systems.

Relevance:

60.00%

Publisher:

Abstract:

This thesis made an outstanding contribution to automating the discovery of linear causal models. It introduced a highly efficient discovery algorithm, which implements new encoding, ensemble, and acceleration strategies. Theoretical research and experimental work showed that this new discovery algorithm outperforms the previous system in both accuracy and efficiency.

Relevance:

60.00%

Publisher:

Abstract:

A complete and highly robust 3D reconstruction algorithm based on stereo vision is presented. The developed system is capable of reconstructing dimensionally accurate 3D models of objects and, unlike existing systems, is very simple and cost-effective owing to its predominant reliance on software and minimal hardware involvement.
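
As background to the stereo approach summarized above, the standard pinhole-camera triangulation relations (a textbook result, not necessarily the exact formulation used in the thesis) show how metric coordinates are recovered from disparity:

```latex
Z = \frac{f\,B}{d}, \qquad
X = \frac{(x - c_x)\,Z}{f}, \qquad
Y = \frac{(y - c_y)\,Z}{f}
```

Here f is the focal length, B the stereo baseline, d the disparity between matched pixels in the left and right images, and (c_x, c_y) the principal point; dimensional accuracy therefore hinges on calibrating f and B.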

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents two novel algorithms for blind channel equalization (BCE) and blind source separation (BSS). Besides these, a general framework for global convergence analysis is proposed. Finally, the open problem of equalizing a non-irreducible system is answered by the algorithm proposed in this thesis.
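
The abstract does not detail the proposed BCE algorithm; purely as a classical point of reference, the sketch below implements the constant modulus algorithm (CMA), a standard baseline for blind channel equalization. It is not the method contributed by this thesis, and the channel, signal, and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: QPSK symbols sent through a short, unknown FIR channel.
N = 5000
s = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)
h = np.array([1.0, 0.4 + 0.3j, 0.2])                       # channel (simulation only)
x = np.convolve(s, h)[:N] + 0.01 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

# Constant modulus algorithm: adapt FIR equalizer w so that |y|^2 approaches R2.
L, mu = 11, 1e-3
w = np.zeros(L, dtype=complex)
w[L // 2] = 1.0                                             # center-spike initialization
R2 = np.mean(np.abs(s) ** 4) / np.mean(np.abs(s) ** 2)      # dispersion constant
for k in range(L, N):
    xk = x[k - L:k][::-1]                                   # regressor, newest sample first
    y = np.vdot(w, xk)                                      # equalizer output w^H x
    err = y * (np.abs(y) ** 2 - R2)                         # CMA error term
    w -= mu * np.conj(err) * xk                             # stochastic-gradient update

print("equalizer taps:", np.round(w, 3))
```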

Relevance:

60.00%

Publisher:

Abstract:

This thesis includes the development of an architectural framework for the proposed image-to-text translation system, containing four components. Appropriate algorithms were selected for the first three components, and three effective multi-label classification algorithms were developed for the fourth component, i.e. the translation component, for different problem settings.

Relevance:

60.00%

Publisher:

Abstract:

This thesis investigates various machine learning approaches to reducing data dimensionality and studies the impact of asymmetric data on learning in image retrieval. Efficient algorithms are proposed to reduce the data dimensionality, and integration strategies for one-class classification are designed to address the asymmetric-data issue and improve retrieval effectiveness.
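
The abstract does not name the specific dimensionality-reduction methods; as a generic illustration only, the sketch below projects placeholder image-feature vectors onto their top principal components via an SVD (PCA), a standard baseline for this kind of task rather than the thesis's algorithms.

```python
import numpy as np

def pca_reduce(X, k):
    """Project rows of X (n_samples x n_features) onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                        # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                           # scores in the reduced k-dim space

rng = np.random.default_rng(42)
X = rng.standard_normal((200, 64))                 # placeholder image descriptors
Z = pca_reduce(X, k=10)
print(Z.shape)                                     # (200, 10)
```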

Relevance:

60.00%

Publisher:

Abstract:

Postgraduate program in General and Applied Biology - IBB

Relevance:

60.00%

Publisher:

Abstract:

Postgraduate program in Agronomy (Energy in Agriculture) - FCA

Relevance:

60.00%

Publisher:

Abstract:

The wide territorial extension of Brazil hinders the installation and maintenance of instruments for measuring solar radiation, which makes it necessary to develop and apply models capable of estimating reliable and sufficient data for the many activities that use them. These data are, in most cases, estimated from the Ångström equation. Based on this model, this project aimed to estimate the global solar radiation at Presidente Prudente-SP, Brazil, using daily data from 1999 to 2007. The solar radiation data were extracted from the paper charts of a bimetallic actinograph (Robitsch) recorded daily at the meteorological station of the Faculty of Science and Technology, UNESP. These charts were scanned, resulting in digital images with x and y coordinate pairs (x = time; y = solar radiation, cal/min·cm²). The daily global solar radiation is the area under the curve of the image, and this value was calculated by computer algorithms. After acquiring and computing the values needed for the Ångström equation, the constants a and b were determined by linear regression, with the values of Rg/R0 (global solar radiation / extraterrestrial solar radiation on a horizontal surface at the top of the atmosphere) as ordinate and n/N (number of hours of sunshine / day length in hours) as abscissa. The slope of the line gives the constant b and the intercept gives the constant a. The estimated results were compared with the observed ones using the Kolmogorov-Smirnov test, showing that the model can be accepted. Thus, the equation for estimating global solar radiation is: Rg = R0 (0.2662 + 0.3592 n/N).
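
To illustrate the regression step described above, the sketch below fits the Ångström coefficients a and b by ordinary least squares from paired daily values of n/N and Rg/R0 and then applies the fitted equation; the arrays are placeholder values, not the Presidente Prudente records.

```python
import numpy as np

# Placeholder daily ratios (not the actual station data):
# x = n/N  (sunshine hours / day length), y = Rg/R0 (global / extraterrestrial radiation)
x = np.array([0.10, 0.35, 0.50, 0.65, 0.80, 0.92])
y = np.array([0.30, 0.39, 0.44, 0.50, 0.55, 0.60])

# Linear regression y = a + b*x: intercept a and slope b are the Ångström coefficients.
b, a = np.polyfit(x, y, 1)
print(f"a = {a:.4f}, b = {b:.4f}")

# Estimate global radiation for one day with known R0 and n/N.
R0, n_over_N = 35.0, 0.70          # illustrative extraterrestrial radiation and sunshine ratio
Rg = R0 * (a + b * n_over_N)
print(f"Estimated Rg = {Rg:.2f} (same units as R0)")
```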

Relevance:

60.00%

Publisher:

Abstract:

In radiotherapy, computational systems are used to determine the radiation dose in the treatment volume and to analyze the quality of the radiometric parameters of the equipment and of the irradiated field. Owing to increasing technological advancement, several studies have been carried out in brachytherapy on the development of different computational algorithms that may be incorporated into treatment planning systems, providing greater accuracy and confidence in the dose calculation. The fields of informatics and information technology undergo constant updating and refinement, allowing the Monte Carlo method to be used to simulate the dose distribution of brachytherapy sources. The dosimetric analysis methodology is based mainly on the studies of the American Association of Physicists in Medicine (AAPM) Task Group No. 43 (TG-43) and on protocols aimed at the dosimetry of these types of radiation source. This work aims to analyze the feasibility of using the MCNP-5C (Monte Carlo N-Particle) code to obtain radiometric parameters of brachytherapy sources and thereby to study the radiation dose variation in treatment planning. Simulations were performed of the radiation dose variation in the source plane, and the dosimetric parameters required by the TG-43 formalism were determined for the characterization of two high-dose-rate iridium-192 sources. The calculated values were compared with those presented in the literature, which were obtained with different Monte Carlo simulation codes. The results showed excellent agreement with the compared codes, confirming the capability and viability of the MCNP-5C code for the dosimetry of sources employed in HDR brachytherapy. The method employed may suggest a possible incorporation of this code into the treatment planning systems provided by manufacturers together with the equipment, since besides reducing acquisition costs it can also make the computational routines used more comprehensive, facilitating the brachytherapy ...
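
For reference, the general two-dimensional dose-rate equation of the AAPM TG-43 formalism cited above is reproduced below; this is the published formalism itself, not a result specific to this work.

```latex
\dot{D}(r,\theta) = S_K \,\Lambda\,
\frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta)
```

Here S_K is the air-kerma strength, Λ the dose-rate constant, G_L the geometry function, g_L(r) the radial dose function, F(r,θ) the 2D anisotropy function, and (r_0, θ_0) = (1 cm, 90°) the reference point; these are the dosimetric parameters the simulations determine for the iridium-192 sources.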

Relevance:

60.00%

Publisher:

Abstract:

The resolution of infections with Leishmania major is based on the secretion of IFN-γ by both CD4+ and CD8+ T cells. To date, only one epitope from the parasitic LACK protein has been described in the literature as driving an effective CD4+ T cell-mediated immune response. The aim of the present work was therefore to investigate possible MHC class I-dependent CD8+ T cell responses. For this approach, the effect of vaccination with LACK protein fused to the protein transduction domain of HIV-1 (TAT) was analyzed first. The effectiveness of TAT-LACK with respect to CD8+ T cells was demonstrated by in vivo protein vaccination of resistant C57BL/6 mice in depletion experiments. The processing of proteins prior to the presentation of immunogenic peptides to T cells is strictly required. Therefore, this work examined the role of the IFN-γ-inducible immunoproteasome in the processing of parasitic proteins and the presentation of peptides bound to MHC I molecules in in vivo and in vitro experiments. An immunoproteasome-independent processing could be demonstrated. Furthermore, parasite lysate (SLA) from both promastigotes and amastigotes was fractionated. In further experiments, these fractions can be examined for immunodominant proteins/peptides. Finally, epitope predictions for CD8+ T cells were performed for both parasitic life stages using computer-aided software. 300 of these epitopes were synthesized and will be used in further experiments to characterize their immunogenic properties. Taken together, the present work contributes substantially to the understanding of the complex mechanisms of processing and ultimately to the identification of potential CD8+ T cell epitopes. A detailed understanding of the processing of CD8+ T cell epitopes of Leishmania major via the MHC class I pathway is of utmost importance. The characterization and identification of these peptides will have a decisive influence on the further development of vaccines against this important human-pathogenic parasite.

Relevance:

60.00%

Publisher:

Abstract:

The new knowledge environments of the digital age are often described as places where we are all closely read, with our buying habits, location, and identities available to advertisers, online merchants, the government, and others through our use of the Internet. This is represented as a loss of privacy in which these entities learn about our activities and desires, using means that were unavailable in the pre-digital era. This article argues that the reciprocal nature of digital networks means 1) that the privacy issues we face online are not radically different from those of the pre-Internet era, and 2) that we need to reconceive of close reading as an activity of which both humans and computer algorithms are capable.

Relevance:

60.00%

Publisher:

Abstract:

The development of computational algorithms for obtaining particle size distributions in dispersions, using real-time, in-line spectroscopic data from sensors, will enable a variety of applications, such as monitoring the properties of industrial cutting fluids, following polymerization processes, treating effluents, and atmospheric sensing. The present study aims to implement and compare techniques for solving inversion problems, developing algorithms that provide particle size distributions in dispersions from UV-Vis-NIR (ultraviolet, visible, and near-infrared) spectroscopy data. Four techniques were implemented, one of them an alternative method with no inversion steps. The methods that relied on an inversion technique highlighted the difficulty of obtaining good-quality droplet size distributions (DSD), whereas the alternative method proved to be the most efficient and reliable. This study is part of a cooperative program between the University of São Paulo and the University of Bremen called BRAGECRIM (Brazilian German Cooperative Research Initiative in Manufacturing) and is funded by FAPESP, CAPES, FINEP, and CNPq (Brazil) and DFG (Germany).
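
As a generic illustration of the inversion problem class discussed above (not one of the four techniques implemented in the study), the sketch below recovers a size distribution from a synthetic extinction spectrum by Tikhonov-regularized non-negative least squares; the kernel matrix and data are placeholders rather than a real Mie-theory model.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical forward model: spectrum E = K @ f, where K[i, j] is the extinction
# contribution of size class j at wavelength i and f is the unknown size distribution.
rng = np.random.default_rng(0)
m, n = 60, 30                                              # wavelengths x size classes
K = rng.random((m, n))                                     # placeholder kernel (Mie theory in practice)
f_true = np.exp(-0.5 * ((np.arange(n) - 12) / 4.0) ** 2)   # synthetic "true" distribution
E = K @ f_true + 0.01 * rng.standard_normal(m)             # noisy measured spectrum

# Tikhonov-regularized, non-negative inversion:
#   minimize ||K f - E||^2 + lam * ||f||^2  subject to f >= 0
lam = 0.1
K_aug = np.vstack([K, np.sqrt(lam) * np.eye(n)])
E_aug = np.concatenate([E, np.zeros(n)])
f_est, _ = nnls(K_aug, E_aug)

print("recovered size classes (first 5):", np.round(f_est[:5], 3))
```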