994 results for Correlation algorithm


Relevance:

20.00%

Publisher:

Abstract:

Introduction: New evidence from randomized controlled trials and etiology-of-fever studies, the availability of reliable rapid diagnostic tests (RDT) for malaria, and novel technologies call for a revision of the IMCI strategy. We developed a new algorithm based on (i) a systematic review of published studies assessing the safety and appropriateness of RDT use and antibiotic prescription, (ii) results from a clinical and microbiological investigation of febrile children aged <5 years, and (iii) the opinions of international IMCI experts. The aim of this study was to assess the safety of the new algorithm among patients in urban and rural areas of Tanzania.

Materials and Methods: The design was a controlled non-inferiority study. Enrolled children aged 2-59 months with any illness were managed either by a study clinician using the new Almanach algorithm (two intervention health facilities) or by clinicians following standard practice, including RDT (two control health facilities). At day 7 and day 14, all patients were reassessed. Patients who became ill in between or were not cured at day 14 were followed until recovery or death. The primary outcome was the rate of complications; the secondary outcome was the rate of antibiotic prescriptions.

Results: 1062 children were recruited. The main diagnoses were URTI (26%), pneumonia (19%) and gastroenteritis (9.4%). 98% (531/541) were cured at day 14 in the Almanach arm and 99.6% (519/521) in controls. The rate of secondary hospitalization was 0.2% in each arm. One death occurred in the control arm. None of the complications was due to withdrawal of antibiotics or antimalarials at day 0. The rate of antibiotic use was 19% in the Almanach arm and 84% in controls.

Conclusion: Evidence suggests that the new algorithm, primarily aimed at the rational use of drugs, is as safe as standard practice and leads to a drastic reduction in antibiotic use. The Almanach is currently being tested for clinician adherence to the proposed procedures when used on paper or on a mobile phone.

Relevance:

20.00%

Publisher:

Abstract:

This work presents a Genetic Algorithm (GA) for the problem of sequencing units on a production line. It takes into account the possibility of changing the sequence of pieces through stations with access to an intermediate or centralized buffer. Access to the buffer is further restricted by the size of the pieces.

Abstract: This paper presents a Genetic Algorithm (GA) for the problem of sequencing in a mixed-model non-permutation flowshop. Resequencing is permitted where stations have access to intermittent or centralized resequencing buffers. Access to a buffer is restricted by the number of available buffer places and the physical size of the products.
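
As a rough illustration of the kind of approach described, the sketch below shows a minimal permutation-encoded genetic algorithm for job sequencing. The flowshop objective, the processing times and all parameter values are placeholder assumptions, and the buffer/resequencing constraints of the paper are not modelled here.

```python
import random

# Hypothetical processing times: 10 jobs x 3 stations (placeholder data, not from the paper).
PROC_TIMES = [[random.randint(1, 9) for _ in range(3)] for _ in range(10)]

def makespan(sequence):
    """Simple permutation-flowshop makespan; stands in for the paper's objective."""
    n_stations = len(PROC_TIMES[0])
    finish = [0] * n_stations  # completion time of the previously scheduled job at each station
    for job in sequence:
        prev_station_done = 0
        for s in range(n_stations):
            start = max(finish[s], prev_station_done)
            finish[s] = start + PROC_TIMES[job][s]
            prev_station_done = finish[s]
    return finish[-1]

def order_crossover(p1, p2):
    """Classic OX crossover for permutation chromosomes."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def ga(n_jobs=10, pop_size=30, generations=200, mut_rate=0.2):
    pop = [random.sample(range(n_jobs), n_jobs) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)
        next_pop = pop[:2]                       # elitism: keep the two best sequences
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)  # parents drawn from the better half
            child = order_crossover(p1, p2)
            if random.random() < mut_rate:       # swap mutation
                i, j = random.sample(range(n_jobs), 2)
                child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    best = min(pop, key=makespan)
    return best, makespan(best)

print(ga())
```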

Relevance:

20.00%

Publisher:

Abstract:

Abstract: One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins, called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental effort towards understanding the sequence specificity of transcription factors is laborious and expensive, but can be substantially accelerated with the use of computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX, which allows the generation of a large number (>1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of the binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the most frequently used methods to estimate probabilistic models representing the sequence specificity of transcription factors. We present computer simulations that estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmark results of the EM algorithm formed part of the foundation for the subsequent project, in which a statistical framework based on hidden Markov models was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX experiment data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We next tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences for which relative affinities had been determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities were in good correlation.
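
For illustration only, the following sketch implements a bare-bones EM loop for estimating a position weight matrix (PWM) from a set of sequences, assuming exactly one binding site of fixed width per sequence and a uniform background. The toy sequences, motif width and pseudocount are made-up placeholders; this is not the estimator benchmarked in the thesis.

```python
import numpy as np

BASES = "ACGT"

def em_pwm(seqs, width, n_iter=50, seed=0):
    """Estimate a PWM with a simple EM loop (one site per sequence, uniform background)."""
    rng = np.random.default_rng(seed)
    encoded = [np.array([BASES.index(b) for b in s]) for s in seqs]
    pwm = rng.dirichlet(np.ones(4), size=width)   # random initial motif model, shape (width, 4)
    for _ in range(n_iter):
        counts = np.full((width, 4), 1e-3)        # pseudocounts to avoid zero probabilities
        for seq in encoded:
            n_pos = len(seq) - width + 1
            # E-step: posterior probability of the site starting at each position
            logp = np.array([np.log(pwm[np.arange(width), seq[i:i + width]]).sum()
                             for i in range(n_pos)])
            post = np.exp(logp - logp.max())
            post /= post.sum()
            # M-step contribution: expected base counts at each motif position
            for i, w in enumerate(post):
                for j in range(width):
                    counts[j, seq[i + j]] += w
        pwm = counts / counts.sum(axis=1, keepdims=True)
    return pwm

# Toy sequences carrying a TTGCAC-like motif (placeholder data).
seqs = ["ACGTTTGCACGT", "GGTTGCACATAC", "TTGCACGGATCC"]
print(em_pwm(seqs, width=6).round(2))
```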

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the estimation of the code-phase (pseudorange) and the carrier-phase of the direct signal received from a direct-sequence spread-spectrum satellite transmitter. The signal is received by an antenna array in a scenario with interference and multipath propagation. These two effects are generally the limiting error sources in most high-precision positioning applications. A new estimator of the code- and carrier-phases is derived by using a simplified signal model and the maximum likelihood (ML) principle. The simplified model consists essentially of gathering all signals, except for the direct one, in a component with unknown spatial correlation. The estimator exploits knowledge of the direction-of-arrival of the direct signal and is much simpler than other estimators derived under more detailed signal models. Moreover, we present an iterative algorithm that is adequate for a practical implementation and explores an interesting link between the ML estimator and a hybrid beamformer. The mean squared error and bias of the new estimator are computed for a number of scenarios and compared with those of other methods. The presented estimator and the hybrid beamformer outperform the existing techniques of comparable complexity and attain, in many situations, the Cramér–Rao lower bound of the problem at hand.
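
As a much simpler point of reference (not the array-based ML estimator of the paper), the sketch below recovers the code phase of a direct-sequence spread-spectrum signal by circularly correlating the received samples against a local replica of the spreading code. The code, the delay and the noise level are arbitrary assumptions for a single-antenna toy case.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy setup: a +/-1 spreading code, an integer chip delay and additive noise.
code = rng.choice([-1.0, 1.0], size=1023)          # pseudo-random spreading code (placeholder)
true_delay = 217                                    # unknown code phase to recover, in chips
received = np.roll(code, true_delay) + 0.5 * rng.standard_normal(code.size)

# Circular cross-correlation via FFT: the peak location gives the code-phase estimate.
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code))).real
estimated_delay = int(np.argmax(corr))

print(estimated_delay)   # expected to match true_delay for reasonable noise levels
```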

Relevance:

20.00%

Publisher:

Abstract:

Localization, the ability of a mobile robot to estimate its position within its environment, is a key capability for autonomous operation of any mobile robot. This thesis presents a system for indoor coarse and global localization of a mobile robot based on visual information. The system is based on image matching and uses SIFT features as natural landmarks. Features extracted from training images are stored in a database for later use in localization. During localization, an image of the scene is captured using the robot's on-board camera, features are extracted from the image, and the best match is searched for in the database. Feature matching is done using the k-d tree algorithm. Experimental results showed that localization accuracy increases with the number of training features in the database, while, on the other hand, an increasing number of features tended to have a negative impact on computation time. For some parts of the environment the error rate was relatively high due to a strong correlation of features taken from those places across the environment.
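
A minimal sketch of SIFT extraction and k-d tree based matching with OpenCV, assuming the opencv-python (cv2) package and two placeholder image files; it illustrates only the matching step, not the thesis's full localization pipeline or database, and the ratio-test threshold is a common default rather than a value taken from the thesis.

```python
import cv2

# Placeholder image paths; in a real system the "train" image would come from the map database.
query_img = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
train_img = cv2.imread("landmark.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
q_kp, q_desc = sift.detectAndCompute(query_img, None)
t_kp, t_desc = sift.detectAndCompute(train_img, None)

# FLANN matcher with a k-d tree index, as commonly used for SIFT descriptors.
index_params = dict(algorithm=1, trees=5)        # 1 = FLANN_INDEX_KDTREE
matcher = cv2.FlannBasedMatcher(index_params, dict(checks=50))
matches = matcher.knnMatch(q_desc, t_desc, k=2)

# Lowe's ratio test to keep only distinctive matches.
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
print(f"{len(good)} good matches out of {len(matches)}")
```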

Relevance:

20.00%

Publisher:

Abstract:

To evaluate the impact of the noninvasive ventilation (NIV) algorithms available on intensive care unit ventilators on the incidence of patient-ventilator asynchrony in patients receiving NIV for acute respiratory failure. Prospective multicenter randomized cross-over study in the intensive care units of three university hospitals. Patients consecutively admitted to the ICU and treated by NIV with an ICU ventilator were included. Airway pressure, flow and surface diaphragmatic electromyography were recorded continuously during two 30-min periods, with the NIV algorithm activated (NIV+) or deactivated (NIV0). Asynchrony events, the asynchrony index (AI) and a specific asynchrony index influenced by leaks (AIleaks) were determined from tracing analysis. Sixty-five patients were included. With and without the NIV algorithm, respectively, auto-triggering was present in 14 (22%) and 10 (15%) patients, ineffective breaths in 15 (23%) and 5 (8%) (p = 0.004), late cycling in 11 (17%) and 5 (8%) (p = 0.003), premature cycling in 22 (34%) and 21 (32%), and double triggering in 3 (5%) and 6 (9%). The mean number of asynchronies influenced by leaks was significantly reduced by the NIV algorithm (p < 0.05). A significant correlation was found between the magnitude of leaks and AIleaks when the NIV algorithm was not activated (p = 0.03). The global AI remained unchanged, mainly because premature cycling occurs on some ventilators when the NIV algorithm is used. In acute respiratory failure, the NIV algorithms provided by ICU ventilators can reduce the incidence of asynchronies caused by leaks, thus confirming bench test results, but some of these algorithms can generate premature cycling.
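
As a simple numeric illustration of how such an index is commonly reported (the exact definitions used in this study are given in its methods, not reproduced here), the sketch below computes an asynchrony index as asynchrony events divided by total breaths (ventilator cycles plus ineffective efforts), expressed as a percentage; the event counts are invented.

```python
def asynchrony_index(asynchrony_events: int, ventilator_cycles: int, ineffective_efforts: int) -> float:
    """Asynchrony events as a percentage of total breaths (a commonly used definition)."""
    total_breaths = ventilator_cycles + ineffective_efforts
    return 100.0 * asynchrony_events / total_breaths

# Invented example counts for a 30-min recording period.
events = {"auto_triggering": 3, "ineffective": 5, "late_cycling": 2,
          "premature_cycling": 4, "double_triggering": 1}
ai = asynchrony_index(sum(events.values()),
                      ventilator_cycles=620,
                      ineffective_efforts=events["ineffective"])
print(f"AI = {ai:.1f}%")   # severe asynchrony is often defined as AI > 10%
```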

Relevance:

20.00%

Publisher:

Abstract:

Surface roughness is one of the quality criteria of paper. It is measured with devices that physically probe the paper surface and with optical devices. The measurements require laboratory conditions, but the paper industry would need faster, on-line measurements. The surface roughness of paper can be expressed as a single roughness value for the whole sample. In this work, the sample is divided into significant regions, and a separate roughness value is computed for each region. Several methods have been used to measure roughness; in this work, a generally accepted statistical method is used in addition to the distance transform. In measuring the surface roughness of paper there has been a need to divide the analysed sample into regions based on roughness. Region division makes it possible to delimit the areas of the sample that are clearly rougher. The distance transform produces regions, which are then analysed. These regions are merged into connected regions using different segmentation methods. Algorithms based on the PNN (Pairwise Nearest Neighbor) method and on merging neighbouring regions have been used. A split-and-merge approach has also been examined. Validation of segmented images is usually done by human inspection. The approach in this work is to compare the generally accepted statistical method with the segmentation results: a high correlation between the two indicates successful segmentation. The results of different experiments have been compared with each other using hypothesis testing. Two sample series, measured with OptiTopo and with a profilometer, were analysed. The starting parameters of the distance transform that were varied during the experiments were the number and the location of the starting points. The same parameter changes were applied to all algorithms used for merging regions. After the distance transform, the correlation was stronger for the samples measured with the profilometer than for those measured with OptiTopo. For the segmented OptiTopo samples, the correlation improved more strongly than for the profilometer samples. The correlation was best for the results produced by the PNN method.
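
A loose illustration of the comparison idea (not the thesis's actual pipeline): split a surface height map into blocks, compute an RMS-style roughness per block for two hypothetical measurement channels, and report the correlation between them. All data and the block-based "segmentation" are placeholder assumptions using NumPy and SciPy.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Placeholder height maps standing in for OptiTopo and profilometer measurements of the same sheet.
base = rng.normal(0.0, 1.0, size=(256, 256))
optitopo = base + 0.2 * rng.normal(size=base.shape)
profilometer = base + 0.1 * rng.normal(size=base.shape)

def blockwise_rms(height_map, block=32):
    """RMS roughness per non-overlapping block; a crude stand-in for region-wise roughness."""
    h, w = height_map.shape
    values = []
    for r in range(0, h, block):
        for c in range(0, w, block):
            patch = height_map[r:r + block, c:c + block]
            values.append(np.sqrt(np.mean((patch - patch.mean()) ** 2)))
    return np.array(values)

r, p = pearsonr(blockwise_rms(optitopo), blockwise_rms(profilometer))
print(f"correlation between the two roughness maps: r = {r:.2f} (p = {p:.3g})")
```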

Relevance:

20.00%

Publisher:

Abstract:

Dissolved organic matter (DOM) is a complex mixture of organic compounds, ubiquitous in marine and freshwater systems. Fluorescence spectroscopy, by means of Excitation-Emission Matrices (EEM), has become an indispensable tool to study DOM sources, transport and fate in aquatic ecosystems. However, the statistical treatment of large and heterogeneous EEM data sets still represents an important challenge for biogeochemists. Recently, Self-Organising Maps (SOM) have been proposed as a tool to explore patterns in large EEM data sets. SOM is a pattern recognition method which clusters input EEMs and reduces their dimensionality without relying on any assumption about the data structure. In this paper, we show how SOM, coupled with a correlation analysis of the component planes, can be used both to explore patterns among samples and to identify individual fluorescence components. We analysed a large and heterogeneous EEM data set, including samples from a river catchment collected under a range of hydrological conditions, along a 60-km downstream gradient, and under the influence of different degrees of anthropogenic impact. According to our results, chemical industry effluents appeared to have unique and distinctive spectral characteristics. On the other hand, river samples collected under flash flood conditions showed homogeneous EEM shapes. The correlation analysis of the component planes suggested the presence of four fluorescence components, consistent with DOM components previously described in the literature. A remarkable strength of this methodology was that outlier samples appeared naturally integrated in the analysis. We conclude that SOM coupled with a correlation analysis procedure is a promising tool for studying large and heterogeneous EEM data sets.
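
To make the clustering step concrete, here is a bare-bones self-organising map in NumPy, trained on random placeholder vectors standing in for unfolded EEMs. It is an illustrative toy, not the SOM configuration or the component-plane correlation analysis used in the paper.

```python
import numpy as np

def train_som(data, grid=(8, 8), n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal SOM: each grid node holds a prototype vector pulled toward winning samples."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    weights = rng.random((rows, cols, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit: the node whose prototype is closest to the sample.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), dists.shape)
        # Linearly decaying learning rate and neighbourhood radius.
        frac = t / n_iter
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 1e-3
        grid_dist2 = ((coords - np.array(bmu)) ** 2).sum(axis=-1)
        influence = np.exp(-grid_dist2 / (2 * sigma ** 2))[..., None]
        weights += lr * influence * (x - weights)
    return weights

# Placeholder "EEM" vectors: 300 samples, each an unfolded 40-value fluorescence signature.
data = np.random.default_rng(1).random((300, 40))
som_weights = train_som(data)
print(som_weights.shape)   # (8, 8, 40): one prototype spectrum per map node
```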

Relevance:

20.00%

Publisher:

Abstract:

Eight patients with colloid cysts of the third ventricle were examined with CT and MR. In six, surgical resection was performed and the material was subjected to histologic evaluation; the concentrations of trace elements were determined by particle-induced X-ray emission. Stereotaxic aspiration was performed in the other two. The investigation showed that colloid cysts are often iso- or hypodense relative to brain on CT (5/8), but sometimes have a center of increased density. Increased density did not correlate with an increased concentration of calcium or other metals, but did correlate with a high cholesterol content. Colloid cysts appear more heterogeneous on MR (6/8) than on CT (3/8), despite a homogeneous appearance at histology. High signal on short TR/TE sequences correlated with a high cholesterol content. A marked shortening of the T2 relaxation time is often noticed in the central part of the cyst. Analysis of trace elements showed that this phenomenon is not related to the presence of metals with paramagnetic effects. Our analysis of the contents of colloid cysts does not support the theory that differing metallic concentrations are responsible for differences in MR signal intensity or CT density. We did find that increased CT density and high MR signal correlated with high cholesterol content.

Relevance:

20.00%

Publisher:

Abstract:

An adaptation of Kumar's algorithm for solving systems of equations with Toeplitz matrices over the reals to finite fields, in O(n log n) time.

Relevance:

20.00%

Publisher:

Abstract:

The main motivation of this work has been to implement the Rijndael-AES algorithm in a Sage-math worksheet, a freely distributed mathematical software package under active development, taking advantage of its built-in tools and functionality.

Relevance:

20.00%

Publisher:

Abstract:

The parameter setting of a differential evolution algorithm must meet several requirements: efficiency, effectiveness, and reliability. Problems vary, and the solution of a particular problem can be represented in different ways. An algorithm most efficient in dealing with a particular representation may be less efficient in dealing with other representations. The development of differential evolution-based methods contributes substantially to research on evolutionary computing and global optimization in general. The objective of this study is to investigate the differential evolution algorithm, the intelligent adjustment of its control parameters, and its application. In the thesis, the differential evolution algorithm is first examined using different parameter settings and test functions. Fuzzy control is then employed to make the control parameters adaptive, based on the optimization process and expert knowledge. The developed algorithms are applied to training radial basis function networks for function approximation, with possible variables including the centers, widths, and weights of the basis functions, both with control parameters kept fixed and with them adjusted by the fuzzy controller. After the influence of the control variables on the performance of the differential evolution algorithm was explored, an adaptive version of the differential evolution algorithm was developed and differential evolution-based radial basis function network training approaches were proposed. Experimental results showed that the performance of the differential evolution algorithm is sensitive to parameter setting, and the best setting was found to be problem dependent. The fuzzy adaptive differential evolution algorithm relieves the user of the burden of parameter setting and performs better than versions using all fixed parameters. Differential evolution-based approaches are effective for training Gaussian radial basis function networks.
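
As a point of reference for the basic algorithm (not the fuzzy-adaptive variant developed in the thesis), the sketch below minimises a standard test function with SciPy's differential evolution, exposing the usual control parameters (population size, mutation factor F, crossover rate CR); the function and parameter values are arbitrary.

```python
import numpy as np
from scipy.optimize import differential_evolution

def rastrigin(x):
    """Classic multimodal test function; global minimum 0 at the origin."""
    x = np.asarray(x)
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 5
result = differential_evolution(
    rastrigin,
    bounds,
    popsize=20,          # population size multiplier
    mutation=0.7,        # differential weight F
    recombination=0.9,   # crossover probability CR
    maxiter=500,
    tol=1e-8,
    seed=42,
)
print(result.x, result.fun)
```

Sensitivity to these settings is exactly what motivates adaptive schemes: changing F and CR on a given problem can noticeably change convergence speed and the quality of the final solution.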

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVE: To evaluate the variability of bond strength test results of adhesive systems (AS) and to correlate the results with clinical parameters from clinical studies investigating cervical restorations. MATERIALS AND METHODS: Regarding the clinical studies, the internal database which had previously been used for a meta-analysis on cervical restorations was updated with clinical studies published between 2008 and 2012 by searching the PubMed and SCOPUS databases. PubMed and the International Association for Dental Research abstracts online were searched for laboratory studies on microtensile, macrotensile and macroshear bond strength tests. The inclusion criteria were (1) dentin, (2) testing of at least four adhesive systems, (3) the same composite diameter and (4) 24 h of water storage prior to testing. The clinical outcome variables were retention loss, marginal discoloration, detectable margins, and a clinical index combining the three parameters by weighting them. Linear mixed models which included a random study effect were calculated for both the laboratory and the clinical studies. The variability was assessed by calculating a ratio of variances, dividing the variance among the estimated bonding effects obtained in the linear mixed models by the sum of all variance components estimated in these models. RESULTS: Thirty-two laboratory studies comprising 183 experiments fulfilled the inclusion criteria. Of those, 86 used the microtensile test, evaluating 22 adhesive systems (AS); twenty-seven used the macrotensile test with 17 AS, and 70 used the macroshear test with 24 AS. For 28 AS, results from clinical studies were available. Microtensile and macrotensile results were moderately correlated (Spearman rho = 0.66, p = 0.007), as were microtensile and macroshear (Spearman rho = 0.51, p = 0.03), but not macroshear and macrotensile (Spearman rho = 0.34, p = 0.22). The effect of the adhesive system was significant for microtensile and macroshear (p < 0.001) but not for macrotensile. The effect of the adhesive system could explain 36% of the variability of the microtensile test, 27% of the macrotensile and 33% of the macroshear test. For the clinical trials, about 49% of the variability of retained restorations could be explained by the adhesive system. With respect to the correlation between bond strength tests and clinical parameters, only a moderate correlation between micro- and macrotensile test results and marginal discoloration was demonstrated; no correlation between these tests and retention loss or marginal integrity was shown. The correlation improved when more studies were included rather than assessing only one study. SIGNIFICANCE: The high variability of bond strength test results highlights the need to establish individual acceptance levels for a given test institute. The weak correlation of bond strength test results with clinical parameters leads to the conclusion that one should not rely solely on bond strength tests to predict the clinical performance of an adhesive system, but should also conduct other laboratory tests, such as tests of the marginal adaptation of fillings in extracted teeth and of the retention loss of restorations in non-retentive cavities after artificial aging.
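
For illustration of the kind of rank correlation reported here, the sketch below computes a Spearman correlation between two sets of per-adhesive laboratory bond strengths with scipy.stats; the numbers are invented and do not come from the review.

```python
from scipy.stats import spearmanr

# Invented mean bond strengths (MPa) for eight hypothetical adhesive systems.
microtensile = [38.2, 45.1, 29.7, 52.3, 41.0, 33.5, 47.8, 36.4]
macrotensile = [17.5, 21.2, 14.9, 24.8, 18.3, 16.1, 22.6, 19.0]

rho, p_value = spearmanr(microtensile, macrotensile)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```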