811 results for Array algorithm
Abstract:
In this paper we propose a Pyramidal Classification Algorithm which, together with an appropriate aggregation index, produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, the Minimum and UPGMA. The results shown in this paper may help to choose between the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
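The three aggregation indices named above correspond to what standard hierarchical clustering calls single, complete, and average (UPGMA) linkage. The sketch below contrasts them on synthetic data using SciPy's stock hierarchical clustering; the paper's pyramidal extension, which allows overlapping clusters, is not implemented here.

```python
# Minimal sketch: the three aggregation indices as standard linkages.
# Minimum = single linkage, Maximum = complete linkage, UPGMA = average.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two synthetic groups in the plane
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(3, 0.5, (20, 2))])

for method in ("single", "complete", "average"):
    Z = linkage(X, method=method)                 # indexed hierarchy
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(method, labels)
```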
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
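As a flavour of randomized individual-sequence prediction, here is a minimal exponentially weighted forecaster over a pool of experts; the two constant experts and the learning rate are illustrative choices, not the paper's construction.

```python
# Sketch of randomized prediction with exponentially weighted experts.
import numpy as np

rng = np.random.default_rng(1)
seq = rng.integers(0, 2, 1000)               # binary sequence to predict
experts = [lambda past: 0, lambda past: 1]   # two trivial constant experts
w = np.ones(len(experts))
eta = 0.5                                    # learning rate (assumption)
mistakes = 0

for t, y in enumerate(seq):
    preds = np.array([e(seq[:t]) for e in experts])
    guess = rng.choice(preds, p=w / w.sum()) # randomized prediction
    mistakes += int(guess != y)
    w *= np.exp(-eta * (preds != y))         # penalize wrong experts

print("average mistakes:", mistakes / len(seq))
```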
Abstract:
Essential hypertension is a multifactorial disorder and is the main risk factor for renal and cardiovascular complications. Research on the genetics of hypertension has been frustrated by the small predictive value of the discovered genetic variants. The HYPERGENES Project investigated associations between genetic variants and essential hypertension, pursuing a 2-stage study by recruiting cases and controls from extensively characterized cohorts recruited over many years in different European regions. The discovery phase consisted of 1865 cases and 1750 controls genotyped with a 1M Illumina array. Best hits were followed up in a validation panel of 1385 cases and 1246 controls that were genotyped with a custom array of 14 055 markers. We identified a new hypertension susceptibility locus (rs3918226) in the promoter region of the endothelial NO synthase gene (odds ratio: 1.54 [95% CI: 1.37-1.73]; combined P = 2.58 × 10⁻¹³). A meta-analysis, using other in silico/de novo genotyping data for a total of 21 714 subjects, resulted in an overall odds ratio of 1.34 (95% CI: 1.25-1.44; P = 1.032 × 10⁻¹⁴). The quantitative analysis on a population-based sample revealed an effect size of 1.91 (95% CI: 0.16-3.66) for systolic and 1.40 (95% CI: 0.25-2.55) for diastolic blood pressure. We identified in silico a potential binding site for ETS transcription factors directly next to rs3918226, suggesting a potential modulation of endothelial NO synthase expression. Biological evidence links endothelial NO synthase with hypertension, because it is a critical mediator of cardiovascular homeostasis and blood pressure control via vascular tone regulation. This finding supports the hypothesis that there may be a causal genetic variation at this locus.
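For readers unfamiliar with the statistics reported above, the following illustrates how an odds ratio and its Wald 95% confidence interval are computed from case/control carrier counts; the counts below are made up and do not reproduce the HYPERGENES data.

```python
# Odds ratio and Wald 95% CI from a hypothetical 2x2 table.
import math

a, b = 420, 1445   # cases: carriers / non-carriers (hypothetical)
c, d = 290, 1460   # controls: carriers / non-carriers (hypothetical)

or_ = (a * d) / (b * c)
se = math.sqrt(1/a + 1/b + 1/c + 1/d)         # SE of log(OR)
lo, hi = (math.exp(math.log(or_) + s * se) for s in (-1.96, 1.96))
print(f"OR = {or_:.2f} (95% CI: {lo:.2f}-{hi:.2f})")
```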
Abstract:
This paper compares two well-known scan matching algorithms: the MbICP and the pIC. As a result of the study, the MSISpIC, a probabilistic scan matching algorithm for the localization of an Autonomous Underwater Vehicle (AUV), is proposed. The technique uses range scans gathered with a Mechanical Scanning Imaging Sonar (MSIS) and the robot displacement estimated through dead-reckoning with the help of a Doppler Velocity Log (DVL) and a Motion Reference Unit (MRU). The proposed method is an extension of the pIC algorithm. Its major contributions are: 1) using an EKF to estimate the local path traveled by the robot while grabbing the scan, as well as its uncertainty, and 2) proposing a method to group all the data grabbed along the path described by the robot into a single scan with a convenient uncertainty model. The algorithm has been tested on an AUV guided along a 600 m path within a marina environment, with satisfactory results.
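A minimal Kalman-filter sketch of point 1 above, accumulating dead-reckoning uncertainty from velocity measurements, is shown below; the 2-D constant-velocity model and all noise covariances are placeholders, not the paper's AUV model (with a linear model the EKF reduces to the standard KF used here).

```python
# Sketch: KF/EKF dead-reckoning with a velocity sensor (DVL-like).
import numpy as np

dt = 0.1
F = np.eye(4); F[0, 2] = F[1, 3] = dt          # state [x, y, vx, vy]
Q = np.diag([1e-4, 1e-4, 1e-3, 1e-3])          # process noise (assumption)
H = np.array([[0, 0, 1, 0], [0, 0, 0, 1.0]])   # DVL observes velocity only
R = np.diag([1e-2, 1e-2])                      # DVL noise (assumption)

x = np.zeros(4); P = np.eye(4) * 1e-3
for _ in range(100):
    x, P = F @ x, F @ P @ F.T + Q              # predict
    z = np.array([1.0, 0.0])                   # fake DVL reading: 1 m/s east
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
    x = x + K @ (z - H @ x)                    # update
    P = (np.eye(4) - K @ H) @ P
print("position:", x[:2], "trace(P):", np.trace(P))
```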
Abstract:
Nominal unification is an extension of first-order unification where terms can contain binders and unification is performed modulo α-equivalence. Here we prove that the existence of nominal unifiers can be decided in quadratic time. First, we linearly reduce nominal unification problems to a sequence of freshness constraints and equalities between atoms, modulo a permutation, using ideas of Paterson and Wegman for first-order unification. Second, we prove that solvability of these reduced problems can be checked in quadratic time. Finally, we point out how, using ideas of Brown and Tarjan for unbalanced merging, we could solve these reduced problems more efficiently.
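For context, here is a minimal first-order unification sketch (with occurs check), the base case that nominal unification extends with binders and atom permutations; that extension is not reproduced here. Variables are strings starting with "?", compound terms are (functor, args) tuples.

```python
# Sketch: first-order unification with occurs check.
def walk(t, s):
    while isinstance(t, str) and t.startswith("?") and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, s) for a in t[1])

def unify(t1, t2, s=None):
    s = {} if s is None else s
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    for a, b in ((t1, t2), (t2, t1)):           # variable binding cases
        if isinstance(a, str) and a.startswith("?"):
            return None if occurs(a, b, s) else {**s, a: b}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1[1]) == len(t2[1])):
        for a, b in zip(t1[1], t2[1]):          # decompose compound terms
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None                                  # clash

# f(?x, g(?y)) =? f(g(a), ?x)  ->  {?x: g(a), ?y: a}
print(unify(("f", ("?x", ("g", ("?y",)))), ("f", (("g", ("a",)), "?x"))))
```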
Abstract:
Two-dimensional aperture synthesis radiometry is the technology selected for ESA's SMOS mission to provide high-resolution L-band radiometric imagery. The array topology is a Y-shaped structure. The position and number of redundant elements to minimise instrument degradation in case of element failure(s) are studied.
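A toy sketch of a Y-shaped layout of the kind described, with one element removed to mimic a failure, follows; the arm length, spacing, and the degradation measure (count of distinct baselines) are all illustrative, not SMOS parameters.

```python
# Sketch: Y-shaped array and baseline loss under single-element failure.
import numpy as np

n_per_arm, d = 7, 0.875                        # elements/arm, spacing (made up)
angles = np.deg2rad([90, 210, 330])            # three arms 120 degrees apart
arm = np.arange(1, n_per_arm + 1) * d
pos = np.vstack([np.zeros((1, 2))] +
                [np.column_stack((arm * np.cos(a), arm * np.sin(a)))
                 for a in angles])

def n_baselines(p):
    # count distinct pairwise separations (u, v points)
    return len({tuple(np.round(p[i] - p[j], 6))
                for i in range(len(p)) for j in range(len(p)) if i != j})

full = n_baselines(pos)
for k in range(len(pos)):                      # drop each element in turn
    degraded = n_baselines(np.delete(pos, k, axis=0))
    print(f"element {k} fails: {degraded}/{full} baselines remain")
```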
Abstract:
Background: Kabuki syndrome (KS) is a multiple congenital anomaly syndrome characterized by specific facial features, mild to moderate mental retardation, postnatal growth delay, skeletal abnormalities, and unusual dermatoglyphic patterns with prominent fingertip pads. A 3.5 Mb duplication at 8p23.1-p22 was once reported as a specific alteration in KS but has not been confirmed in other patients. The molecular basis of KS remains unknown. Methods: We studied 16 Spanish patients with a clinical diagnosis of KS or KS-like features to search for genomic imbalances using genome-wide array technologies. All putative rearrangements were confirmed by FISH, microsatellite markers and/or MLPA assays, which also determined whether each imbalance was de novo or inherited. Results: No duplication at 8p23.1-p22 was observed in our patients. We detected complex rearrangements involving 2q in two patients with Kabuki-like features: 1) a de novo inverted duplication of 11 Mb with a 4.5 Mb terminal deletion, and 2) a de novo 7.2 Mb terminal deletion in a patient with an additional de novo 0.5 Mb interstitial deletion in 16p. Additional copy number variations (CNVs), either inherited or reported in normal controls, were identified and interpreted as polymorphic variants. No specific CNV was significantly increased in the KS group. Conclusion: Our results further confirm that genomic duplications of the 8p23 region are not a common cause of KS, and we failed to detect any other recurrent rearrangement causing this disorder. The detection of two patients with 2q37 deletions suggests that there is a phenotypic overlap between the two conditions, and screening this region in Kabuki-like patients should be considered.
Abstract:
Summary Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients based on the algorithm between the validation sample and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and none died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm for PE that accurately identifies patients who are at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
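The rule itself is simple enough to transcribe directly: a patient is low-risk only if none of the 10 prognostic variables is present. The thresholds below are taken from the abstract; the field names are our own.

```python
# Transcription of the 10-variable low-risk rule stated in the abstract.
def is_low_risk(p):
    criteria = [
        p["age"] >= 70,
        p["cancer"],
        p["heart_failure"],
        p["chronic_lung_disease"],
        p["chronic_renal_disease"],
        p["cerebrovascular_disease"],
        p["pulse"] >= 110,                 # beats/min
        p["systolic_bp"] < 100,            # mm Hg
        p["oxygen_saturation"] < 90,       # percent
        p["altered_mental_status"],
    ]
    return not any(criteria)

patient = dict(age=58, cancer=False, heart_failure=False,
               chronic_lung_disease=False, chronic_renal_disease=False,
               cerebrovascular_disease=False, pulse=92, systolic_bp=124,
               oxygen_saturation=96, altered_mental_status=False)
print(is_low_risk(patient))  # True: candidate for outpatient treatment
```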
Abstract:
The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as a conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
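A minimal ML-EM iteration on a toy system matrix is sketched below; the entropy-prior FMAPE acceleration described above is not reproduced. Note the uniform initial image, in line with the abstract's conclusion.

```python
# Sketch: ML-EM reconstruction for Poisson emission data.
import numpy as np

rng = np.random.default_rng(2)
A = rng.random((50, 20))                  # toy system matrix (detector x voxel)
x_true = rng.random(20) * 10
y = rng.poisson(A @ x_true)               # Poisson projection data

x = np.ones(20)                           # uniform initial image
sens = A.sum(axis=0)                      # sensitivity: A^T 1
for _ in range(200):
    ratio = y / np.maximum(A @ x, 1e-12)  # data / forward projection
    x *= (A.T @ ratio) / sens             # multiplicative ML-EM update
print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```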
Abstract:
INTRODUCTION: Diverse microarray and sequencing technologies have been widely used to characterise the molecular changes in malignant epithelial cells in breast cancers. Such gene expression studies to identify markers and targets in tumour cells are, however, compromised by the cellular heterogeneity of solid breast tumours and by the lack of appropriate counterparts representing normal breast epithelial cells. METHODS: Malignant neoplastic epithelial cells from primary breast cancers and luminal and myoepithelial cells isolated from normal human breast tissue were isolated by immunomagnetic separation methods. Pools of RNA from highly enriched preparations of these cell types were subjected to expression profiling using massively parallel signature sequencing (MPSS) and four different genome wide microarray platforms. Functional related transcripts of the differential tumour epithelial transcriptome were used for gene set enrichment analysis to identify enrichment of luminal and myoepithelial type genes. Clinical pathological validation of a small number of genes was performed on tissue microarrays. RESULTS: MPSS identified 6,553 differentially expressed genes between the pool of normal luminal cells and that of primary tumours substantially enriched for epithelial cells, of which 98% were represented and 60% were confirmed by microarray profiling. Significant expression level changes between these two samples detected only by microarray technology were shown by 4,149 transcripts, resulting in a combined differential tumour epithelial transcriptome of 8,051 genes. Microarray gene signatures identified a comprehensive list of 907 and 955 transcripts whose expression differed between luminal epithelial cells and myoepithelial cells, respectively. Functional annotation and gene set enrichment analysis highlighted a group of genes related to skeletal development that were associated with the myoepithelial/basal cells and upregulated in the tumour sample. One of the most highly overexpressed genes in this category, that encoding periostin, was analysed immunohistochemically on breast cancer tissue microarrays and its expression in neoplastic cells correlated with poor outcome in a cohort of poor prognosis estrogen receptor-positive tumours. CONCLUSION: Using highly enriched cell populations in combination with multiplatform gene expression profiling studies, a comprehensive analysis of molecular changes between the normal and malignant breast tissue was established. This study provides a basis for the identification of novel and potentially important targets for diagnosis, prognosis and therapy in breast cancer.
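To illustrate the kind of gene set enrichment test used above, here is a hypergeometric over-representation test asking whether a gene signature is enriched for a functional set; the transcriptome and signature sizes are taken from the text, but the set size and overlap are made up.

```python
# Sketch: hypergeometric gene set over-representation test.
from scipy.stats import hypergeom

N = 8051    # differential tumour epithelial transcriptome (from the text)
K = 200     # genes in a functional set, e.g. skeletal development (made up)
n = 955     # myoepithelial signature size (from the text)
k = 60      # overlap between the signature and the set (made up)

# P(overlap >= k) under random sampling without replacement
p = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p:.3g}")
```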
Abstract:
A two-dimensional array based on single-photon avalanche diodes for triggered imaging systems is presented. The diodes are operated in the gated mode of acquisition to reduce the probability of detecting noise counts that interfere with photon arrival events. In addition, low reverse-bias overvoltages are used to lessen the dark count rate. Experimental results demonstrate that the prototype, fabricated with a standard HV-CMOS process, is free of afterpulses and offers a reduced dark count probability under the proposed modes of operation. The detector exhibits a dynamic range of 15 bits with short gated "on" periods of 10 ns and a reverse-bias overvoltage of 1.0 V.
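Why gating lowers noise can be seen with a one-line Poisson argument: dark counts arrive as a Poisson process, so shortening the "on" window cuts the chance of a dark count per frame. The dark count rate below is invented for illustration.

```python
# Sketch: probability of a dark count within a gate window.
import math

dcr = 5e3                                # dark counts per second (assumption)
for gate in (1e-6, 100e-9, 10e-9):       # long window vs the 10 ns gate above
    p_dark = 1 - math.exp(-dcr * gate)   # P(>=1 dark count in the window)
    print(f"gate {gate * 1e9:7.1f} ns -> P(dark count) = {p_dark:.2e}")
```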
Abstract:
Low-cost tin oxide gas sensors are inherently nonspecific. In addition, they have several undesirable characteristics such as slow response, nonlinearities, and long-term drift. This paper shows that the combination of a gas-sensor array with self-organizing maps (SOMs) permits success in gas classification problems. The system is able to determine the gas present in an atmosphere with error rates lower than 3%. Correction of the sensors' drift with an adaptive SOM has also been investigated.
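A from-scratch sketch of a small self-organizing map classifying sensor-array responses follows; the synthetic "gases" stand in for real tin oxide data, and the grid size, learning rate, and neighbourhood schedule are arbitrary choices.

```python
# Sketch: tiny self-organizing map for gas-sensor array responses.
import numpy as np

rng = np.random.default_rng(3)
centers = rng.random((3, 8))              # three gas classes, 8-sensor array
X = np.vstack([c + rng.normal(0, 0.05, (50, 8)) for c in centers])

grid = 6
W = rng.random((grid, grid, 8))           # SOM codebook
coords = np.stack(np.meshgrid(np.arange(grid), np.arange(grid),
                              indexing="ij"), axis=-1)

for t in range(2000):
    x = X[rng.integers(len(X))]
    lr = 0.5 * (1 - t / 2000)             # decaying learning rate
    sigma = 2.0 * (1 - t / 2000) + 0.5    # decaying neighbourhood radius
    bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(-1)), (grid, grid))
    dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
    h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]
    W += lr * h * (x - W)                 # pull neighbourhood towards x

for i, c in enumerate(centers):           # map each class to its winning unit
    print("gas", i, "-> unit", np.unravel_index(
        np.argmin(((W - c) ** 2).sum(-1)), (grid, grid)))
```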
Abstract:
A new drift compensation method based on Common Principal Component Analysis (CPCA) is proposed. The drift variance in the data is found as the principal components computed by CPCA. This method finds components that are common to all gases in feature space. The method is compared on a classification task with the other published approaches, in which the drift direction is estimated through a Principal Component Analysis (PCA) of a reference gas. The proposed method, which employs no specific reference gas but uses information from all gases, has shown the same performance as the traditional approach with the best-fitted reference gas. Results are shown with data spanning 7 months and including three gases at different concentrations, for an array of 17 polymeric sensors.
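For reference, the traditional baseline the CPCA method is compared against can be sketched in a few lines: estimate the drift direction as the first principal component of a reference gas and project it out of all measurements. The data below are synthetic; CPCA itself (components common to all gases) is not reproduced.

```python
# Sketch: reference-gas PCA drift correction (the traditional baseline).
import numpy as np

rng = np.random.default_rng(4)
drift = rng.random(17); drift /= np.linalg.norm(drift)   # 17-sensor array
t = np.linspace(0, 1, 200)[:, None]
ref = rng.normal(0, 0.05, (200, 17)) + 3 * t * drift     # drifting reference

Xc = ref - ref.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
p1 = Vt[0]                                               # estimated drift axis

def correct(X):
    return X - np.outer(X @ p1, p1)       # remove the drift component

corrected = correct(ref)
print("alignment |p1 . drift|:", abs(p1 @ drift))        # close to 1
print("residual along p1:", np.abs(corrected @ p1).max())  # ~0 after removal
```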
Abstract:
We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth of a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
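A minimal finite-difference Euler sketch for an SPDE with multiplicative noise follows, using a 1-D stochastic heat equation du = D u_xx dt + g(u) dW with g(u) = u; the equation, grid, and parameters are illustrative and do not reproduce the paper's conserved-order-parameter model.

```python
# Sketch: explicit Euler scheme for a 1-D SPDE with multiplicative noise.
import numpy as np

rng = np.random.default_rng(5)
N, D, dx, dt = 128, 1.0, 1.0, 0.1           # stability: D*dt/dx^2 <= 0.5
eps = 0.1                                    # noise strength (assumption)
u = rng.random(N)

for _ in range(5000):
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)   # periodic Laplacian
    dW = rng.normal(0, np.sqrt(dt), N)             # space-time white noise
    u = u + D * dt / dx**2 * lap + eps * u * dW    # multiplicative g(u) = u
print("mean:", u.mean(), "variance:", u.var())
```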