935 results for Computer algorithms.


Relevance:

60.00%

Publisher:

Abstract:

The wide territorial extension of Brazil hinders the installation and maintenance of instruments for measuring solar radiation, which makes it necessary to develop and apply models able to estimate reliable and sufficient data for the many activities that use them. In most cases, these data are estimated with the Ångström equation. Based on this model, this project aimed to estimate the global solar radiation at Presidente Prudente-SP, Brazil, using daily data from 1999 to 2007. The solar radiation data were extracted from the paper tapes of the bimetallic actinograph (Robitsch) recorded daily at the meteorological station of the Faculty of Science and Technology, UNESP. These tapes were scanned, resulting in digital images with pairs of x and y coordinates (x = time; y = solar radiation, cal/min.cm²). The daily global solar radiation is the area under the curve of the image, and this value was calculated by computer algorithms. After acquiring and computing the values needed for the Ångström equation, the constants a and b were determined by linear regression between Rg/R0 (global solar radiation / solar radiation on a horizontal surface at the top of the atmosphere) as ordinate and n/N (number of hours of sunshine / day length in hours) as abscissa. The slope of the line is the constant b and the intercept is the constant a. The estimated values were compared with the observed ones using the Kolmogorov-Smirnov test, showing that the model can be accepted. Thus, the equation for estimating the global solar radiation is: Rg = R0 (0.2662 + 0.3592 n/N)
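As an illustration of the two computational steps described above (integrating the digitized actinograph curve and fitting the Ångström constants by linear regression), a minimal sketch is given below. The function names, array layout, and units are assumptions for illustration, not code from the original study.

```python
import numpy as np

def daily_radiation(time_min, radiation):
    """Daily global solar radiation as the area under the digitized
    actinograph curve, via the trapezoidal rule. `time_min` is in
    minutes and `radiation` in cal/(min.cm^2), so the result is in
    cal/cm^2 per day."""
    time_min = np.asarray(time_min, dtype=float)
    radiation = np.asarray(radiation, dtype=float)
    dt = np.diff(time_min)
    return float(np.sum(dt * (radiation[:-1] + radiation[1:]) / 2.0))

def fit_angstrom(rg_over_r0, n_over_N):
    """Fit Rg/R0 = a + b * (n/N) by linear regression; returns (a, b),
    i.e. the intercept a and the slope b."""
    b, a = np.polyfit(n_over_N, rg_over_r0, 1)
    return float(a), float(b)

# With the constants reported above, the estimate becomes:
#   Rg = R0 * (0.2662 + 0.3592 * n_over_N)
```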

Relevance:

60.00%

Publisher:

Abstract:

In radiotherapy, computational systems are used to determine the radiation dose in the treatment volume and to analyze the quality of the radiometric parameters of the equipment and of the irradiated field. Owing to continuing technological advances, a great deal of research has been carried out in brachytherapy on the development of computational algorithms that can be incorporated into treatment planning systems, providing greater accuracy and confidence in the dose calculation. The fields of informatics and information technology undergo constant updating and refinement, allowing the Monte Carlo method to be used to simulate the dose distribution of brachytherapy sources. The formalization of the methodology employed for the dosimetric analysis is based mainly on the studies of the American Association of Physicists in Medicine (AAPM) Task Group No. 43 (TG-43) and on protocols aimed at the dosimetry of these types of radiation source. This work aims to analyze the feasibility of using the MCNP-5C (Monte Carlo N-Particle) code to obtain radiometric parameters of brachytherapy sources and thus to study the variation of the radiation dose in treatment planning. Simulations were performed of the dose variation in the source plane, and the dosimetric parameters required by the TG-43 formalism were determined for the characterization of two high-dose-rate iridium-192 sources. The calculated values were compared with those found in the literature, which were obtained with different Monte Carlo simulation codes. The results showed excellent agreement with the compared codes, reinforcing the capacity and viability of the MCNP-5C code for the dosimetry of sources employed in HDR brachytherapy. The method employed may suggest a possible incorporation of this code into the treatment planning systems provided by manufacturers together with the equipment, since, besides reducing acquisition cost, it can also make the computational routines used more comprehensive, facilitating the brachytherapy ...
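For context, the TG-43 point-source approximation that underlies this kind of dosimetric characterization can be sketched as follows. The numerical values in the usage comment are placeholders for illustration only, not results from this work or from the MCNP-5C simulations.

```python
def tg43_point_dose_rate(r_cm, S_k, dose_rate_const, g_r, phi_an, r0_cm=1.0):
    """Dose rate at distance r from a brachytherapy source using the
    TG-43 point-source approximation:
        D(r) = S_k * Lambda * (r0 / r)**2 * g(r) * phi_an(r)
    S_k: air-kerma strength (U); dose_rate_const: dose-rate constant
    Lambda (cGy/(h*U)); g_r: radial dose function evaluated at r;
    phi_an: anisotropy factor evaluated at r."""
    return S_k * dose_rate_const * (r0_cm / r_cm) ** 2 * g_r * phi_an

# Illustrative (not measured) values for an HDR Ir-192 source:
# print(tg43_point_dose_rate(r_cm=2.0, S_k=10.0, dose_rate_const=1.11,
#                            g_r=0.99, phi_an=0.97))
```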

Relevance:

60.00%

Publisher:

Abstract:

The healing of infections with Leishmania major is based on the secretion of IFN-γ by both CD4+ and CD8+ T cells. To date, only one epitope from the parasitic LACK protein has been described in the literature for an effective CD4+ T cell-mediated immune response. The aim of the present work was therefore to investigate possible MHC class I-dependent CD8+ T cell responses. For this approach, the effect of vaccination with LACK protein fused to the protein transduction domain of HIV-1 (TAT) was analyzed first. The effectiveness of TAT-LACK towards CD8+ T cells was demonstrated by in vivo protein vaccination of resistant C57BL/6 mice in depletion experiments. The processing of proteins prior to the presentation of immunogenic peptides to T cells is absolutely required. Therefore, this work investigated, through in vivo and in vitro experiments, the role of the IFN-γ-inducible immunoproteasome in the processing of parasite proteins and the presentation of peptides bound to MHC class I molecules. An immunoproteasome-independent processing could be demonstrated. Furthermore, parasite lysate (SLA) from both promastigotes and amastigotes was fractionated; in follow-up experiments, these fractions can be screened for immunodominant proteins/peptides. Finally, epitope predictions for CD8+ T cells were performed with computer-based software for both parasite life stages. 300 of these epitopes were synthesized and will be used in further experiments to characterize their immunogenic properties. Taken as a whole, the present work contributes substantially to the understanding of the complex mechanisms of processing and ultimately to the identification of potential CD8+ T cell epitopes. A detailed understanding of the processing of CD8+ T cell epitopes of Leishmania major via the MHC class I pathway is of the utmost importance. The characterization and identification of these peptides will have a decisive influence on the further development of vaccines against this important human-pathogenic parasite.

Relevance:

60.00%

Publisher:

Abstract:

The new knowledge environments of the digital age are often described as places where we are all closely read, with our buying habits, location, and identities available to advertisers, online merchants, the government, and others through our use of the Internet. This is represented as a loss of privacy in which these entities learn about our activities and desires, using means that were unavailable in the pre-digital era. This article argues that the reciprocal nature of digital networks means 1) that the privacy issues that we face online are not radically different from those of the pre-Internet era, and 2) that we need to reconceive of close reading as an activity of which both humans and computer algorithms are capable.

Relevance:

60.00%

Publisher:

Abstract:

The development of computational algorithms for obtaining particle size distributions in dispersions from real-time, in-line spectroscopic sensor data will enable a variety of applications, such as monitoring the properties of industrial cutting fluids, following polymerization processes, effluent treatment, and atmospheric sensing. The present study aims to implement and compare techniques for solving inversion problems, developing algorithms that provide the particle size distribution in dispersions from UV-Vis-NIR (ultraviolet, visible and near-infrared) spectroscopy data. Four techniques were implemented, one of them an alternative method with no inversion step. The methods that used some inversion technique showed the difficulty of obtaining droplet size distributions (DSD) of good quality, while the alternative method proved to be the most efficient and reliable. This study is part of a cooperative program between the University of São Paulo and the University of Bremen called BRAGECRIM (Brazilian German Cooperative Research Initiative in Manufacturing) and is funded by FAPESP, CAPES, FINEP and CNPq (Brazil) and DFG (Germany).
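As one common way of posing this kind of inversion problem, the sketch below recovers a discretized size distribution from a measured spectrum by Tikhonov-regularized non-negative least squares. The kernel matrix, regularization weight, and variable names are assumptions for illustration; this is not one of the four techniques compared in the study.

```python
import numpy as np
from scipy.optimize import nnls

def invert_size_distribution(A, spectrum, reg=1e-2):
    """Recover a discretized particle size distribution f from a measured
    extinction spectrum, assuming a linear model spectrum = A @ f, by
    Tikhonov-regularized non-negative least squares:
        minimize ||A f - spectrum||^2 + reg * ||f||^2,  subject to f >= 0.
    A has one row per wavelength and one column per size class."""
    n_sizes = A.shape[1]
    A_aug = np.vstack([A, np.sqrt(reg) * np.eye(n_sizes)])
    b_aug = np.concatenate([spectrum, np.zeros(n_sizes)])
    f, _residual = nnls(A_aug, b_aug)
    return f
```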

Relevance:

60.00%

Publisher:

Abstract:

"September 7, 1971."

Relevance:

60.00%

Publisher:

Abstract:

The purpose of this work is to demonstrate and assess a simple algorithm for automatic estimation of the most salient region in an image, with possible applications in computer vision. The algorithm exploits the connection between color dissimilarities in the image and the image's most salient region, and it avoids using image priors. Pixel dissimilarity is an informal function of the distance between a specific pixel's color and the colors of other pixels in the image. We examine the relation between pixel color dissimilarity and salient region detection on the MSRA1K image dataset, and we propose a simple algorithm for salient region detection through random pixel color dissimilarity. We define dissimilarity by accumulating the distance, in the CIELAB color space, between each pixel and a sample of n other random pixels. An important result is that random dissimilarity between each pixel and just one other pixel (n = 1) is enough to create adequate saliency maps when combined with a median filter, with average performance competitive with other related methods in the saliency detection research field. The assessment was performed by means of precision-recall curves. This idea is inspired by the human attention mechanism, which is able to choose a few specific regions to focus on, a biological system that the computer vision community aims to emulate. We also review some of the history of this topic of selective attention.
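A minimal sketch of the procedure described above (accumulated CIELAB distance to n random pixels, followed by a median filter) is given below. The filter size, random sampling scheme, and function name are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from skimage.color import rgb2lab
from scipy.ndimage import median_filter

def random_dissimilarity_saliency(rgb_image, n=1, filter_size=9, seed=0):
    """Saliency map from random pixel color dissimilarity: for each pixel,
    accumulate the CIELAB distance to n randomly chosen pixels, then
    smooth the result with a median filter and normalize to [0, 1]."""
    rng = np.random.default_rng(seed)
    h, w = rgb_image.shape[:2]
    lab = rgb2lab(rgb_image).reshape(-1, 3)
    saliency = np.zeros(h * w)
    for _ in range(n):
        idx = rng.integers(0, h * w, size=h * w)  # one random partner per pixel
        saliency += np.linalg.norm(lab - lab[idx], axis=1)
    saliency = median_filter(saliency.reshape(h, w), size=filter_size)
    return (saliency - saliency.min()) / (np.ptp(saliency) + 1e-12)
```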

Relevance:

40.00%

Publisher:

Abstract:

The Printed Circuit Board (PCB) layout design is one of the most important and time-consuming phases of the equipment design process in all electronic industries. This paper is concerned with the development and implementation of a computer-aided PCB design package. A set of programs has been developed which operates on a description of the circuit supplied by the user in the form of a data file and subsequently designs the layout of a double-sided PCB. The algorithms used for the design of the PCB optimise the board area and the length of the copper tracks used for the interconnections. The output of the package is the layout drawing of the PCB, drawn on a CALCOMP hard-copy plotter and a Tektronix 4012 storage graphics display terminal. The routing density (the board area required for one component) achieved by this package is typically 0.8 sq. inch per IC. The package is implemented on a DEC 1090 system in Pascal and FORTRAN, and the SIGN(1) graphics package is used for display generation.
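The abstract does not detail the optimization criteria beyond board area and track length. As a generic illustration of the kind of objective such layout programs minimize, the sketch below computes the half-perimeter wirelength of a component placement; the function name and data layout are assumptions, not the package's actual algorithm.

```python
def half_perimeter_wirelength(placements, nets):
    """Estimate the total interconnect length of a placement.
    `placements` maps component name -> (x, y); `nets` is a list of
    component-name lists, one per electrical net. The bounding-box
    half-perimeter of each net is a standard lower-bound estimate of
    the copper track length needed to route it."""
    total = 0.0
    for net in nets:
        xs = [placements[c][0] for c in net]
        ys = [placements[c][1] for c in net]
        total += (max(xs) - min(xs)) + (max(ys) - min(ys))
    return total

# Hypothetical usage:
# hpwl = half_perimeter_wirelength({"U1": (0, 0), "U2": (3, 4)}, [["U1", "U2"]])
```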

Relevance:

40.00%

Publisher:

Abstract:

Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
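As an aside on the evaluation metrics named above (multi-class accuracy and AUC), a minimal sketch of how a three-class prediction could be scored is shown below. The arrays are placeholders and this is not the challenge's official evaluation code.

```python
import numpy as np
from sklearn.metrics import accuracy_score, roc_auc_score

# Placeholder data: true labels (0 = control, 1 = MCI, 2 = AD) and
# predicted class probabilities for each scan (rows sum to 1).
y_true = np.array([0, 1, 2, 2, 1, 0])
y_prob = np.array([[0.7, 0.2, 0.1],
                   [0.2, 0.5, 0.3],
                   [0.1, 0.3, 0.6],
                   [0.2, 0.2, 0.6],
                   [0.3, 0.4, 0.3],
                   [0.6, 0.3, 0.1]])

accuracy = accuracy_score(y_true, y_prob.argmax(axis=1))
# One-vs-rest AUC averaged over the three diagnostic classes.
auc = roc_auc_score(y_true, y_prob, multi_class="ovr")
print(f"accuracy = {accuracy:.3f}, AUC = {auc:.3f}")
```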

Relevance:

40.00%

Publisher:

Abstract:

This thesis aimed at addressing some of the issues that, at the current state of the art, prevent P300-based brain-computer interface (BCI) systems from moving from research laboratories to end users' homes. An innovative asynchronous classifier has been defined and validated. It relies on the introduction of a set of thresholds in the classifier, and these thresholds have been assessed by considering the distributions of score values relating to target stimuli, non-target stimuli and epochs of voluntary no-control. With the asynchronous classifier, a P300-based BCI system can adapt its speed to the current state of the user and can automatically suspend control when the user diverts attention from the stimulation interface. Since EEG signals are non-stationary and show inherent variability, in order to make long-term use of BCI possible it is important to track changes in the ongoing EEG activity and to adapt the BCI model parameters accordingly. To this aim, the asynchronous classifier has subsequently been improved by introducing a self-calibration algorithm for the continuous and unsupervised recalibration of the subjective control parameters. Finally, an index for online monitoring of EEG quality has been defined and validated in order to detect potential problems and system failures. The thesis ends with the description of a translational work involving end users (people with amyotrophic lateral sclerosis, ALS). Following the concepts of the user-centered design approach, the phases relating to the design, development and validation of an innovative assistive device are described. The proposed assistive technology (AT) has been specifically designed to meet the needs of people with ALS during the different phases of the disease (i.e. the degree of impairment of motor abilities). Indeed, the AT can be accessed with several input devices, either conventional (mouse, touchscreen) or alternative (switches, headtracker), up to a P300-based BCI.
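To make the thresholding idea concrete, the sketch below gates a generic P300 classifier's scores so that an epoch is rejected as "no-control" when no stimulus score exceeds a threshold. The score array, threshold value, and function name are assumptions for illustration, not the thesis' classifier.

```python
import numpy as np

def asynchronous_decision(scores, threshold):
    """Select the attended stimulus from per-stimulus classifier scores,
    or return None (voluntary no-control) when no score is confident
    enough. `scores` is a 1-D array with one score per stimulus."""
    best = int(np.argmax(scores))
    if scores[best] < threshold:
        return None          # suspend control: user is not attending
    return best              # index of the selected stimulus

# Hypothetical usage: the threshold would be chosen from the score
# distributions of target, non-target and no-control epochs recorded
# during calibration.
# print(asynchronous_decision(np.array([0.1, 0.8, 0.2]), threshold=0.6))
```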