69 results for Connected sum of surfaces


Relevance:

100.00%

Publisher:

Abstract:

MELIBEA is a directory and validator of policies in favour of open access to scientific and academic output. As a directory, it describes existing institutional policies related to open access (OA) to scientific and academic output. As a validator, it subjects them to a qualitative and quantitative analysis based on compliance with a set of indicators that reflect the foundations on which an institutional policy rests. The validator reports a score and a compliance percentage for each of the policies analysed. This is computed from the values assigned to certain indicators and their weighting according to their relative importance. The sum of the weighted values of the indicators is rescaled to a percentage scale, yielding what we have called the "Validated open-access percentage", whose calculation is described in the Methodology section. The types of institutions analysed are universities, research centres, funding agencies and governmental organisations.
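The weighted-indicator calculation described above can be sketched as follows. This is a hypothetical illustration: the indicator names, values and weights below are placeholders, not MELIBEA's actual indicator set.

```python
# Hypothetical sketch of a weighted-indicator compliance score; the indicator
# names, values and weights are illustrative, not MELIBEA's actual ones.

def validated_oa_percentage(indicator_values, weights):
    """Weighted sum of indicator values (each in [0, 1]) rescaled to 0-100.

    weights express relative importance and need not sum to one."""
    total_weight = sum(weights.values())
    weighted_sum = sum(indicator_values[name] * weights[name]
                       for name in indicator_values)
    return 100.0 * weighted_sum / total_weight

policy = {"deposit_required": 1.0, "short_embargo": 0.5, "accepted_version": 1.0}
weights = {"deposit_required": 3.0, "short_embargo": 2.0, "accepted_version": 1.0}
score = validated_oa_percentage(policy, weights)
```

Rescaling by the total weight keeps the score on a 0–100 scale regardless of how many indicators a policy defines.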

Relevance:

100.00%

Publisher:

Abstract:

Emergent molecular measurement methods, such as DNA microarrays, qRT-PCR, and many others, offer tremendous promise for the personalized treatment of cancer. These technologies measure the amount of specific proteins, RNA, DNA or other molecular targets in tumor specimens with the goal of "fingerprinting" individual cancers. Tumor specimens are heterogeneous; an individual specimen typically contains unknown amounts of multiple tissue types. Thus, the measured molecular concentrations result from an unknown mixture of tissue types and must be normalized to account for the composition of the mixture.

For example, a breast tumor biopsy may contain normal, dysplastic and cancerous epithelial cells, as well as stromal components (fatty and connective tissue) and blood and lymphatic vessels. Our diagnostic interest focuses solely on the dysplastic and cancerous epithelial cells; the remaining tissue components serve to "contaminate" the signal of interest. The proportion of each tissue component changes as a function of patient characteristics (e.g., age) and varies spatially across the tumor region. Because each tissue component produces a different molecular signature, and the amount of each tissue type is specimen dependent, we must estimate the tissue composition of the specimen and adjust the molecular signal for this composition.

Using the idea of a chemical mass balance, we consider the total measured concentrations to be a weighted sum of the individual tissue signatures, where the weights are determined by the relative amounts of the different tissue types. We develop a compositional source apportionment model to estimate the relative amounts of tissue components in a tumor specimen. We then use these estimates to infer the tissue-specific concentrations of key molecular targets for sub-typing individual tumors. We anticipate these specific measurements will greatly improve our ability to discriminate between different classes of tumors, and allow more precise matching of each patient to the appropriate treatment.
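A minimal sketch of the mass-balance idea, assuming just two tissue signatures with weights constrained to w and 1 − w; the paper's compositional source apportionment model handles an arbitrary number of components.

```python
# Minimal mass-balance sketch for two tissue signatures only: the measured
# profile is modelled as w*sig_a + (1-w)*sig_b, and w has a closed-form
# least-squares solution. Illustrative, not the paper's full model.

def mixture_weight(measured, sig_a, sig_b):
    """Least-squares w such that measured ~= w*sig_a + (1-w)*sig_b,
    clipped to the physically meaningful interval [0, 1]."""
    diff = [a - b for a, b in zip(sig_a, sig_b)]
    num = sum((m - b) * d for m, b, d in zip(measured, sig_b, diff))
    den = sum(d * d for d in diff)
    return min(1.0, max(0.0, num / den))
```

With the estimated weight in hand, the tissue-specific signal can be recovered by subtracting the contaminating component's weighted contribution.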

Relevance:

100.00%

Publisher:

Abstract:

In most psychological tests and questionnaires, a test score is obtained by taking the sum of the item scores. In virtually all cases where the test or questionnaire contains multidimensional forced-choice items, this traditional scoring method is also applied. We argue that the summation of scores obtained with multidimensional forced-choice items produces uninterpretable test scores. Therefore, we propose three alternative scoring methods: a weak and a strict rank preserving scoring method, which both allow an ordinal interpretation of test scores; and a ratio preserving scoring method, which allows a proportional interpretation of test scores. Each proposed scoring method yields an index for each respondent indicating the degree to which the response pattern is inconsistent. Analysis of real data showed that, with respect to rank preservation, the weak and strict rank preserving methods resulted in lower inconsistency indices than the traditional scoring method; with respect to ratio preservation, the ratio preserving scoring method resulted in lower inconsistency indices than the traditional scoring method.

Relevance:

100.00%

Publisher:

Abstract:

It is well known that image processing requires a huge amount of computation, mainly at the low-level processing stage, where the algorithms deal with large numbers of pixels. One of the solutions for estimating motion involves detecting the correspondences between two images. For normalised correlation criteria, previous experiments showed that the result is not altered in the presence of non-uniform illumination. Usually, hardware for motion estimation has been limited to simple correlation criteria. The main goal of this paper is to propose a VLSI architecture for motion estimation using a matching criterion more complex than the Sum of Absolute Differences (SAD). Today's hardware devices provide many facilities for integrating increasingly complex designs, as well as the possibility of communicating easily with general-purpose processors.
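For reference, the baseline SAD criterion that the proposed architecture goes beyond can be sketched as an exhaustive block-matching search (plain illustrative Python, not the paper's VLSI design):

```python
# Exhaustive block matching with the Sum of Absolute Differences criterion.

def sad(block_a, block_b):
    """Sum of Absolute Differences between two equally sized blocks."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def best_match(ref_block, frame, top, left, radius):
    """Search every displacement (dy, dx) within +/-radius of (top, left)
    and return the one minimizing SAD against ref_block."""
    h, w = len(ref_block), len(ref_block[0])
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if 0 <= y and y + h <= len(frame) and 0 <= x and x + w <= len(frame[0]):
                candidate = [row[x:x + w] for row in frame[y:y + h]]
                cost = sad(ref_block, candidate)
                if best is None or cost < best[0]:
                    best = (cost, dy, dx)
    return best  # (cost, dy, dx)
```

SAD maps well to hardware precisely because it needs only subtraction, absolute value and accumulation, which is why richer criteria such as normalised correlation have historically been harder to integrate.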

Relevance:

100.00%

Publisher:

Abstract:

We study the zero set of random analytic functions generated by a sum of the cardinal sine functions which form an orthogonal basis for the Paley-Wiener space. As a model case, we consider real-valued Gaussian coefficients. It is shown that the asymptotic probability that there is no zero in a bounded interval decays exponentially as a function of the length.
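In the notation suggested by the abstract, the random functions in question are Gaussian combinations of cardinal sines (a sketch in the standard Paley-Wiener normalisation; the constant in the decay rate is not specified by the abstract):

```latex
F(t) = \sum_{n \in \mathbb{Z}} a_n \,\operatorname{sinc}(t - n),
\qquad
\operatorname{sinc}(t) = \frac{\sin \pi t}{\pi t},
\qquad
a_n \ \text{i.i.d. real } N(0,1),
```

and the stated result is that, for some constant $c > 0$,
$\mathbb{P}\{F \neq 0 \text{ on } [0, r]\} = e^{-c\, r\,(1 + o(1))}$ as $r \to \infty$.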

Relevance:

100.00%

Publisher:

Abstract:

The quantitative estimation of sea surface temperatures (SST) from fossil assemblages is a fundamental issue in palaeoclimatic and palaeoceanographic investigations. The Modern Analogue Technique, a widely adopted method based on direct comparison of fossil assemblages with modern core-top samples, was revised with the aim of conforming it to compositional data analysis. The new CODAMAT method was developed by adopting the Aitchison metric as the distance measure. Modern core-top datasets are characterised by a large number of zeros. Zero replacement was carried out by adopting a Bayesian approach, based on a posterior estimation of the parameter of the multinomial distribution. The number of modern analogues from which to reconstruct the SST was determined by means of a multiple approach, considering the proxies correlation matrix, the Standardized Residual Sum of Squares and the Mean Squared Distance. The new CODAMAT method was applied to the planktonic foraminiferal assemblages of a core recovered in the Tyrrhenian Sea.

Key words: Modern analogues, Aitchison distance, proxies correlation matrix, Standardized Residual Sum of Squares
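The Aitchison distance adopted as the CODAMAT distance measure is the Euclidean distance between centred log-ratio (clr) transforms of the compositions; a minimal sketch follows. Parts must be strictly positive, which is why the Bayesian zero replacement precedes it.

```python
import math

# Aitchison distance between two compositions: Euclidean distance between
# their centred log-ratio (clr) transforms. Requires strictly positive parts.

def clr(composition):
    """Centred log-ratio transform of a strictly positive composition."""
    log_parts = [math.log(p) for p in composition]
    mean_log = sum(log_parts) / len(log_parts)
    return [lp - mean_log for lp in log_parts]

def aitchison_distance(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(clr(x), clr(y))))
```

A useful property for assemblage data: the distance is invariant under rescaling of either composition, so raw counts and closed percentages give the same result.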

Relevance:

100.00%

Publisher:

Abstract:

Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as "functional data", and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra.

The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory; clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only.

A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
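The distinction between the two dissimilarity metrics described above can be sketched on already-smoothed trajectories sampled at common points; this is an illustration of the shape-versus-level idea, not the authors' exact definitions (in the compositional setting these would act on log-ratio-transformed trajectories).

```python
import math

# Two trajectory dissimilarities: one sensitive to shape and level,
# one sensitive to shape only (level removed by centring). Illustrative.

def dist_level_and_shape(a, b):
    """Euclidean distance between sampled trajectories: differences in
    either overall level or shape both contribute."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def dist_shape_only(a, b):
    """Distance after subtracting each trajectory's mean level, so that
    two parallel trajectories are considered identical."""
    ca = [x - sum(a) / len(a) for x in a]
    cb = [y - sum(b) / len(b) for y in b]
    return dist_level_and_shape(ca, cb)
```

Clustering with the second metric groups trajectories by their evolution pattern even when their baseline levels differ.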

Relevance:

100.00%

Publisher:

Abstract:

The total energy of a molecule, expressed in terms of "fuzzy atoms", is presented as a sum of one- and two-atom energy components. The divisions of three-dimensional physical space into atomic regions exhibit a continuous transition from one region to another. With proper definitions, the energy components are on a chemical energy scale. Becke's integration scheme and weight function determine a realization of the method that permits effective numerical integration.
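A sketch of the decomposition described, with $w_A$ the Becke-type weight function that defines each fuzzy atom:

```latex
E_{\mathrm{tot}} = \sum_{A} E_{A} + \sum_{A < B} E_{AB},
\qquad
w_A(\mathbf{r}) \ge 0,
\qquad
\sum_{A} w_A(\mathbf{r}) = 1 \quad \forall\, \mathbf{r},
```

where each $w_A$ passes continuously from values near one inside atom $A$'s region to values near zero inside the other atomic regions, realising the "fuzzy" division of space.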

Relevance:

100.00%

Publisher:

Abstract:

GeneID is a program to predict genes in anonymous genomic sequences, designed with a hierarchical structure. In the first step, splice sites and start and stop codons are predicted and scored along the sequence using position weight matrices (PWMs). In the second step, exons are built from the sites. Exons are scored as the sum of the scores of the defining sites, plus the log-likelihood ratio of a Markov model for coding DNA. In the last step, the gene structure is assembled from the set of predicted exons, maximizing the sum of the scores of the assembled exons. In this paper we describe the construction of the PWMs for sites and of the Markov model of coding DNA in Drosophila melanogaster. We also compare other models of coding DNA with the Markov model. Finally, we present and discuss the results obtained when GeneID is used to predict genes in the Adh region. These results show that the accuracy of GeneID predictions is currently comparable to that of other existing tools, but that GeneID is likely to be more efficient in terms of speed and memory usage.
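The final assembly step, maximizing the sum of scores of compatible exons, is in essence a weighted interval scheduling problem; a minimal dynamic-programming sketch follows (not GeneID's actual implementation, which additionally enforces reading frame and splice-site compatibility between consecutive exons).

```python
import bisect

# Weighted interval scheduling: choose a set of non-overlapping exons
# maximizing the total score. Sketch of the assembly idea only.

def best_assembly_score(exons):
    """exons: list of (start, end, score) with start < end.
    Returns the maximum total score of pairwise non-overlapping exons."""
    exons = sorted(exons, key=lambda e: e[1])
    ends = [e[1] for e in exons]
    best = [0.0] * (len(exons) + 1)   # best[i] = optimum over the first i exons
    for i, (start, end, score) in enumerate(exons, 1):
        # Rightmost earlier exon ending at or before this exon's start.
        j = bisect.bisect_right(ends, start, 0, i - 1)
        best[i] = max(best[i - 1], best[j] + score)
    return best[-1]
```

Sorting by end position and binary-searching for the last compatible exon gives an O(n log n) assembly over n candidate exons.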

Relevance:

100.00%

Publisher:

Abstract:

This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome, as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
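The forward Eulerian integration used to recover displacements from the non-stationary velocity field can be sketched in one dimension as follows (illustrative only; the paper integrates a 4D B-spline velocity field over image space and time):

```python
# Forward Euler integration of a non-stationary (time-varying) velocity
# field v(x, t): x_{k+1} = x_k + dt * v(x_k, t_k). One-dimensional sketch.

def integrate_position(velocity, x0, t0, t1, n_steps):
    """Integrate x'(t) = velocity(x, t) from t0 to t1 starting at x0.
    Returns the final position; displacement is the difference from x0."""
    dt = (t1 - t0) / n_steps
    x, t = x0, t0
    for _ in range(n_steps):
        x = x + dt * velocity(x, t)
        t += dt
    return x

# A constant field transports a point by exactly (t1 - t0) * v.
final = integrate_position(lambda x, t: 2.0, 0.0, 0.0, 1.0, 10)
```

Representing the velocity (rather than the displacement) with smooth kernels and integrating it forward is what enforces temporal consistency of the recovered motion.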

Relevance:

100.00%

Publisher:

Abstract:

We see that the price of a European call option in a stochastic volatility framework can be decomposed into the sum of four terms, which identify the main features of the market that affect option prices: the expected future volatility, the correlation between the volatility and the noise driving the stock prices, the market price of volatility risk, and the difference of the expected future volatility at different times. We also study some applications of this decomposition.

Relevance:

100.00%

Publisher:

Abstract:

By means of classical Itô calculus, we decompose option prices as the sum of the classical Black-Scholes formula, with volatility parameter equal to the root-mean-square future average volatility, plus a term due to correlation and a term due to the volatility of the volatility. This decomposition allows us to develop first- and second-order approximation formulas for option prices and implied volatilities in the Heston volatility framework, as well as to study their accuracy. Numerical examples are given.
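The leading term of this decomposition, the classical Black-Scholes price evaluated at the root-mean-square future average volatility, can be sketched as follows (standard formulas only; the correlation and vol-of-vol correction terms are not reproduced here):

```python
import math

# Classical Black-Scholes call price, plus the root-mean-square of a
# sampled volatility path used as its volatility parameter. Sketch of the
# decomposition's leading term only.

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, rate, sigma, maturity):
    """Black-Scholes European call price."""
    sq = sigma * math.sqrt(maturity)
    d1 = (math.log(spot / strike) + (rate + 0.5 * sigma ** 2) * maturity) / sq
    d2 = d1 - sq
    return spot * norm_cdf(d1) - strike * math.exp(-rate * maturity) * norm_cdf(d2)

def rms_volatility(vol_path):
    """Root-mean-square of a sampled future volatility path."""
    return math.sqrt(sum(v * v for v in vol_path) / len(vol_path))
```

In the decomposition, `bs_call(..., sigma=rms_volatility(path), ...)` plays the role of the zeroth-order price, with the remaining two terms correcting for correlation and for the volatility of volatility.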

Relevance:

100.00%

Publisher:

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity-penalized empirical risk.

The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover.

Finite-sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
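The final selection rule, minimizing the sum of empirical risk and empirically assessed class complexity over the candidate list, can be sketched as follows; the class names and numbers are hypothetical, and in the paper the complexity term is derived from the size of each class's empirical cover.

```python
# Complexity-penalized selection: one candidate rule per model class, each
# carrying an empirical risk and a complexity penalty; pick the candidate
# minimizing their sum. Hypothetical numbers for illustration.

def penalized_selection(candidates):
    """candidates: mapping class_name -> (empirical_risk, complexity_penalty).
    Returns the name of the class whose candidate minimizes risk + penalty."""
    return min(candidates, key=lambda k: candidates[k][0] + candidates[k][1])

# A rich class fits the data well but pays a large complexity penalty;
# the selected class balances the two terms.
candidates = {"F1": (0.30, 0.01), "F2": (0.10, 0.05), "F3": (0.05, 0.30)}
chosen = penalized_selection(candidates)
```

Here "F3" has the lowest empirical risk but the penalty reveals it as the hardest estimation problem, so the balanced "F2" is chosen.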

Relevance:

100.00%

Publisher:

Abstract:

Dependent persons need help with mobilisation in bed, which involves an effort for the caregiver. The perceived difficulty of carrying out these tasks can negatively affect the quality of life of caregivers and also of the dependent persons themselves. The aim of this study was to investigate whether the use of friction-reducing surfaces (slide-sheet type) reduces the difficulty perceived by caregivers and dependent persons during mobilisation in bed. Methods: a quasi-experimental study was carried out at home on a convenience sample of 12 dependent persons and their caregivers in several localities. The difficulty of mobilisation perceived by caregivers and dependent persons was assessed before and after a training intervention accompanying the introduction of slide sheets. Results: the typical caregiver was a middle-aged woman, daughter of the dependent person, with no specific training in methods for mobilising dependent persons. The typical dependent person was a woman over 80 years of age with severe dependence, needing the caregiver's help with repositioning and transfer manoeuvres in bed. On a perceived-difficulty scale from 0 to 10, the mean caregiver scores before the intervention were 6.9 (SD: 3.1) for moving the person up the bed and 7.1 (SD: 3.1) for lateral transfer; after the intervention they were 1.25 (SD: 1.8) and 1.45 (SD: 1.6), respectively. On the same scale, the mean scores of the dependent persons before the intervention were 8.6 (SD: 2.3) for moving up the bed and 8.6 (SD: 2.3) for lateral transfer; after the intervention they were 2 (SD: 2.6) for both manoeuvres. Comparing the scores before and after the intervention, the difficulty perceived by the caregivers of dependent persons decreased significantly (p < 0.001); in the dependent persons it also decreased, but the decrease did not reach significance (p = 0.057). Conclusions: the observed results show that the use of slide sheets improves the quality of life of caregivers, as measured by the perceived difficulty of mobilisation manoeuvres in bed. Their use may also contribute to better active ageing. Further studies are needed to quantify the physical effort required and the cost-benefit of using these devices.

Relevance:

100.00%

Publisher:

Abstract:

Within a drift-diffusion model we investigate the role of the self-consistent electric field in determining the impedance field of a macroscopic Ohmic (linear) resistor made of a compensated semi-insulating semiconductor at arbitrary values of the applied voltage. The presence of long-range Coulomb correlations is found to be responsible for a reshaping of the spatial profile of the impedance field. This reshaping gives a null contribution to the macroscopic impedance but essentially modifies the transition from thermal to shot noise of a macroscopic linear resistor. The theoretical calculations explain a set of noise experiments carried out on semi-insulating CdZnTe detectors.