17 results for STATISTICAL METHODOLOGY

in Archivo Digital para la Docencia y la Investigación, the institutional repository of the Universidad del País Vasco


Relevance: 20.00%

Abstract:

Printed by the Diputación Foral de Álava, legal deposit D.L. VI-430/99.

Relevance: 20.00%

Abstract:

In this article we describe the methodology developed for the semiautomatic annotation of EPEC-RolSem, a Basque corpus labeled at the predicate level following the PropBank-VerbNet model. The methodology presented is the product of a detailed theoretical study of the semantic nature of verbs in Basque and of their similarities to and differences from verbs in other languages. As part of the proposed methodology, we are creating a Basque lexicon on the PropBank-VerbNet model that we have named the Basque Verb Index (BVI). Our work thus dovetails with the general trend toward building lexicons from tagged corpora that is evident in work conducted for other languages. EPEC-RolSem and BVI are two important resources for the computational semantic processing of Basque; as far as the authors are aware, they are also the first resources of their kind developed for Basque. In addition, each entry in BVI is linked to the corresponding verb entry in well-known resources such as PropBank, VerbNet, WordNet, Levin's classification, and FrameNet. We have also implemented several automatic processes to aid in creating and annotating the BVI, including processes designed to facilitate the task of manual annotation.
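The cross-linking of lexicon entries described above can be pictured as a small data structure. This is a minimal sketch only: the field names, the sample verb, and the external identifiers are illustrative assumptions, not the actual BVI schema.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a BVI-style lexicon entry: PropBank-VerbNet-style
# role labels plus cross-links to external lexical resources. All names and
# identifiers below are illustrative, not taken from the real BVI.

@dataclass
class BVIEntry:
    lemma: str                                       # Basque verb lemma
    roleset: str                                     # PropBank-style roleset id
    arg_roles: dict = field(default_factory=dict)    # e.g. {"Arg0": "Agent"}
    links: dict = field(default_factory=dict)        # links to other lexicons

entry = BVIEntry(
    lemma="ekarri",                                  # "to bring" (illustrative)
    roleset="ekarri.01",
    arg_roles={"Arg0": "Agent", "Arg1": "Theme", "Arg2": "Recipient"},
    links={
        "PropBank": "bring.01",
        "VerbNet": "bring-11.3",
        "FrameNet": "Bringing",
    },
)

print(entry.lemma, "->", entry.links["PropBank"])
```

Keeping the links as a plain mapping makes it easy to add or drop a target resource without changing the entry type.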

Relevance: 20.00%

Abstract:

Hyper-spectral data allow the construction of more robust statistical models of material properties than the standard tri-chromatic color representation. However, because of the large dimensionality and complexity of hyper-spectral data, the extraction of robust features (image descriptors) is not a trivial issue. Thus, to facilitate efficient feature extraction, decorrelation techniques are commonly applied to reduce the dimensionality of the hyper-spectral data, with the aim of generating compact and highly discriminative image descriptors. Current methodologies for data decorrelation, such as principal component analysis (PCA), linear discriminant analysis (LDA), wavelet decomposition (WD), and band-selection methods, require complex and subjective training procedures; in addition, the compressed spectral information is not directly related to the physical (spectral) characteristics of the analyzed materials. The major objective of this article is to introduce and evaluate a new data-decorrelation methodology using an approach that closely emulates human vision. The proposed data-decorrelation scheme has been employed to optimally minimize the amount of redundant information contained in the highly correlated hyper-spectral bands and has been comprehensively evaluated in the context of non-ferrous material classification.
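To make the baseline concrete, here is a minimal sketch of PCA-based decorrelation of a hyper-spectral cube, the kind of method the article compares against. The cube shape, band count, and number of retained components are illustrative assumptions.

```python
import numpy as np

# PCA decorrelation sketch: project each pixel spectrum onto the leading
# eigenvectors of the band covariance matrix. Sizes are illustrative.

rng = np.random.default_rng(0)
cube = rng.random((32, 32, 16))            # height x width x spectral bands

pixels = cube.reshape(-1, cube.shape[-1])  # one row per pixel spectrum
pixels = pixels - pixels.mean(axis=0)      # centre each band

cov = np.cov(pixels, rowvar=False)         # 16 x 16 band covariance
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]          # strongest components first

k = 3                                      # keep 3 decorrelated components
descriptors = pixels @ eigvecs[:, order[:k]]
print(descriptors.shape)                   # → (1024, 3)
```

The resulting descriptors are compact and mutually decorrelated, but, as the abstract notes, they no longer correspond directly to physical spectral bands.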

Relevance: 20.00%

Abstract:

Background: Patients with chronic obstructive pulmonary disease (COPD) often experience exacerbations of the disease that require hospitalization. Current guidelines offer little guidance for identifying patients whose clinical situation is appropriate for admission to the hospital, and properly developed and validated severity scores for COPD exacerbations are lacking. To address these important gaps in clinical care, we created the IRYSS-COPD Appropriateness Study. Methods/Design: The RAND/UCLA Appropriateness Methodology was used to identify appropriate and inappropriate scenarios for hospital admission for patients experiencing COPD exacerbations. These scenarios were then applied to a prospective cohort of patients attending the emergency departments (ED) of 16 participating hospitals. Information was recorded during the time the patient was evaluated in the ED, at the time a decision was made to admit the patient to the hospital or discharge home, and during follow-up after admission or discharge home. While complete data were generally available at the time of ED admission, data were often missing at the time of decision making. Predefined assumptions were used to impute much of the missing data. Discussion: The IRYSS-COPD Appropriateness Study will validate the appropriateness criteria developed by the RAND/UCLA Appropriateness Methodology and thus better delineate the requirements for admission or discharge of patients experiencing exacerbations of COPD. The study will also provide a better understanding of the determinants of outcomes of COPD exacerbations, and evaluate the equity and variability in access and outcomes in these patients.

Relevance: 20.00%

Abstract:

The evaluation and comparison of internal cluster validity indices is a critical problem in the clustering area. The methodology used in most evaluations assumes that the clustering algorithms work correctly. We propose an alternative methodology that does not make this often-false assumption. We compared seven internal cluster validity indices under both methodologies and concluded that the results obtained with the proposed methodology are more representative of the actual capabilities of the compared indices.
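An internal validity index scores a partition using only the data and the labels, so it can rank competing partitions without ground truth. Below is a minimal hand-rolled silhouette index applied to two candidate partitions of a toy two-cluster dataset; the data, labels, and the two-cluster simplification are assumptions for illustration, not the seven indices compared in the paper.

```python
import numpy as np

def silhouette(X, labels):
    """Mean silhouette width; simplified for exactly two clusters,
    so the nearest other cluster is simply the other cluster."""
    scores = []
    for i, x in enumerate(X):
        same = X[labels == labels[i]]
        other = X[labels != labels[i]]
        a = np.linalg.norm(same - x, axis=1).sum() / max(len(same) - 1, 1)
        b = np.linalg.norm(other - x, axis=1).mean()
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))

X = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
good = np.array([0, 0, 1, 1])   # matches the two visible groups
bad = np.array([0, 1, 0, 1])    # mixes the groups

print(silhouette(X, good) > silhouette(X, bad))   # → True
```

The index correctly prefers the partition that matches the data's structure, which is exactly the behavior one tries to verify when evaluating such indices.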

Relevance: 20.00%

Abstract:

[ES] This project presents a study of the estimation of the effective rain length derived from the elevation scans obtained by the Kapildui weather radar in Álava. Rain height and rain length are studied for different events: stratiform rain and convective rain. Spatial and temporal variability is analyzed for different radar elevation angles. Finally, a version of the algorithm implemented for computing effective rain lengths is presented, and a statistical study of its variability for different directions and different rain events is carried out.
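One simple way to turn a radar scan into an effective rain length is to look along a single ray and measure the longest contiguous stretch of range bins whose reflectivity exceeds a rain threshold. The sketch below follows that idea; the threshold, bin size, and reflectivity profile are illustrative assumptions, not the actual Kapildui algorithm.

```python
# Assumed parameters (illustrative, not from the project):
BIN_KM = 0.5          # range-bin length in km
RAIN_DBZ = 20.0       # reflectivity threshold for rain, in dBZ

def effective_rain_length(profile_dbz):
    """Longest contiguous run of rainy bins along one ray, in km."""
    best = run = 0
    for z in profile_dbz:
        run = run + 1 if z >= RAIN_DBZ else 0
        best = max(best, run)
    return best * BIN_KM

ray = [5, 12, 25, 30, 28, 22, 8, 15, 21, 24, 10]   # dBZ per range bin
print(effective_rain_length(ray))   # → 2.0 (four rainy bins x 0.5 km)
```

Repeating this over many rays and elevation angles yields the directional statistics of effective rain length that the project studies.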

Relevance: 20.00%

Abstract:

This doctoral thesis defines and develops a new methodology for feeder reconfiguration in distribution networks with Distributed Energy Resources (DER). The proposed methodology is based on metaheuristic Ant Colony Optimization (ACO) algorithms. The methodology is called Item Oriented Ant System (IOAS), and the thesis also defines three variations of the original methodology: Item Oriented Ant Colony System (IOACS), Item Oriented Max-min Ant System (IOMMAS), and Item Oriented Max-min Ant Colony System (IOMMACS). All methodologies pursue a twofold objective: to minimize power losses and to maximize DER penetration in distribution networks. The aim of the variations is to find the algorithm that best adapts to the optimization problem at hand, solving it most efficiently. The main feature of the methodology is that the heuristic information and the exploitation information (pheromone) are attached to the item, not to the path. In addition, the thesis proposes using feeder reconfiguration to increase the distribution network's capacity to accept a greater degree of DER. The proposed methodology and its three variations have been tested and verified on two distribution networks that are well documented in the existing literature. These networks were modeled and used to test all proposed methodologies under different scenarios with various DER penetration degrees.
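The item-oriented idea — pheromone stored per item rather than per path edge — can be sketched on a toy selection problem. Everything below (the cost vector standing in for loss contributions, the parameters, the objective) is an illustrative assumption, not the thesis' actual network model.

```python
import random

# Toy item-oriented ant system: each ant selects K items biased by per-item
# pheromone and a heuristic (1/cost); pheromone is deposited on the chosen
# items, not on the order in which they were chosen.

random.seed(1)
costs = [9.0, 2.0, 7.0, 1.0, 5.0, 3.0]   # hypothetical loss per item
K = 3                                     # items each ant must select
pheromone = [1.0] * len(costs)

def build_solution():
    """One ant picks K distinct items, biased by pheromone and 1/cost."""
    chosen, candidates = [], list(range(len(costs)))
    for _ in range(K):
        weights = [pheromone[i] / costs[i] for i in candidates]
        pick = random.choices(candidates, weights=weights)[0]
        chosen.append(pick)
        candidates.remove(pick)
    return chosen

best, best_loss = None, float("inf")
for _ in range(50):                       # 50 ants
    sol = build_solution()
    loss = sum(costs[i] for i in sol)
    if loss < best_loss:
        best, best_loss = sol, loss
    pheromone = [p * 0.9 for p in pheromone]   # evaporation
    for i in sol:
        pheromone[i] += 1.0 / loss             # deposit on items, not edges

print(sorted(best), best_loss)            # best item set found and its loss
```

Because pheromone is indexed by item, two ants that pick the same items in different orders reinforce exactly the same entries, which is the point of the item-oriented variant.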

Relevance: 20.00%

Abstract:

Familial hypercholesterolemia (FH) is a common autosomal codominant disease with a frequency of 1:500 individuals in its heterozygous form. The genetic basis of FH is most commonly a mutation within the LDLR gene. Assessing the pathogenicity of LDLR variants is particularly important for giving a patient a definitive diagnosis of FH. Current ex vivo studies of LDLR activity are based on the analysis of I-125-labeled lipoproteins (the reference method) or fluorescently labelled LDL. The main purpose of this study was to compare the effectiveness of these two methods for assessing LDLR functionality, in order to validate a functional assay for analysing LDLR mutations. The LDLR activity of different variants was studied by flow cytometry using FITC-labelled LDL and compared with studies performed previously with I-125-labeled lipoproteins. The flow cytometry results are in full agreement with the data obtained by the I-125 methodology. Additionally, confocal microscopy allowed the assignment of a mutation class to each of the variants assayed. The use of fluorescence yielded results similar to those of I-125-labeled lipoproteins for LDLR activity determination, and it also allows mutation-class classification. FITC-labelled LDL is easier to handle and dispose of, is cheaper than radioactivity, and can be used routinely by any group performing LDLR functional validations.
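A flow-cytometry readout of LDLR activity is typically reduced to a single number per variant: fluorescence normalized to a wild-type control. The sketch below shows one plausible form of that computation; the numbers and the percent-of-control convention are illustrative assumptions, not the study's data.

```python
import statistics

# Hypothetical reduction of FITC-LDL flow-cytometry readings to relative
# LDLR activity: variant median fluorescence as a percentage of wild-type.

def relative_activity(variant_fluor, wildtype_fluor):
    """Percent LDLR activity relative to the wild-type control."""
    return 100.0 * statistics.median(variant_fluor) / statistics.median(wildtype_fluor)

wildtype = [980, 1010, 1005, 995]     # hypothetical fluorescence units
variant = [240, 260, 255, 250]        # hypothetical low-activity variant

print(relative_activity(variant, wildtype))   # → 25.25
```

Medians rather than means are used here so that a few outlier cells do not dominate the estimate.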

Relevance: 20.00%

Abstract:

This paper deals with the convergence of a remote iterative learning control system subject to data dropouts. The system is composed of a set of discrete-time multiple-input multiple-output linear models, each with its corresponding actuator device and sensor. Each actuator applies the input signal vector to its corresponding model at the sampling instants, and the sensor measures the output signal vector. The iterative learning law is processed in a controller located far away from the models, so the control signal vector has to be transmitted from the controller to the actuators through transmission channels. The law uses the measurements of each model to generate the input vector to be applied to the subsequent model, so the measurements of the models have to be transmitted from the sensors to the controller. All transmissions are subject to failures, which are described by a binary sequence taking the value 1 or 0. A dropout-compensation technique is used to replace the data lost in the transmission processes. Convergence to zero of the errors between the output signal vector and a reference is achieved as the number of models tends to infinity.
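The combination of a learning law, Bernoulli dropouts, and compensation by held data can be sketched on a scalar toy system. The plant, the P-type learning law, the gains, and the 20% dropout rate below are illustrative assumptions, not the paper's exact MIMO setup.

```python
import numpy as np

# Toy iterative learning control with dropout compensation: when an error
# sample is lost in transmission, the controller reuses the last value it
# successfully received, then applies a P-type learning update.

rng = np.random.default_rng(3)
T = 20                                  # samples per trial
A, B, C, D = 0.5, 1.0, 0.3, 1.0         # x+ = A x + B u,  y = C x + D u
ref = np.ones(T)                        # reference output trajectory
gain = 0.5                              # learning gain

u = np.zeros(T)
held_error = np.zeros(T)                # last successfully received error
errors = []
for trial in range(30):
    x, y = 0.0, np.zeros(T)
    for k in range(T):                  # simulate one trial of the plant
        y[k] = C * x + D * u[k]
        x = A * x + B * u[k]
    error = ref - y
    received = rng.random(T) > 0.2      # True = arrived, False = dropout
    held_error = np.where(received, error, held_error)   # compensation
    u = u + gain * held_error           # P-type learning law
    errors.append(np.abs(error).max())

print(errors[0], ">", errors[-1])       # tracking error shrinks across trials
```

Even with 20% of the error samples lost, the held-value compensation keeps the learning update well defined at every trial, and the tracking error still decays over iterations.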