996 results for Combining schemes


Relevance:

20.00%

Publisher:

Abstract:

In biomedical studies, the two standard data structures are the matched (paired) and unmatched designs. Recently, many researchers have turned to meta-analysis to obtain a better understanding of a medical treatment from several sets of clinical data. A hybrid design, which combines the two data structures, raises fundamental questions for statistical methods and challenges for statistical inference. The appropriate methods depend on the underlying distribution: if the outcomes are normally distributed, the classic paired t-test and two-independent-sample t-test apply to the matched and unmatched cases, respectively; if not, the Wilcoxon signed-rank and rank-sum tests can be applied instead. To assess an overall treatment effect in a hybrid design, the inverse-variance weighting method used in meta-analysis can be applied. In the nonparametric case, a test statistic can be built by combining the two Wilcoxon test statistics; however, these two statistics are not on the same scale. We propose a hybrid test statistic based on the Hodges-Lehmann estimates of the treatment effects, which are medians on the same scale. For comparison, we use the classic meta-analysis t-test statistic, which combines the treatment-effect estimates from the two t-tests. Theoretically, the relative efficiency of two unbiased estimators of a parameter is the ratio of their variances. Using the concept of Asymptotic Relative Efficiency (ARE) developed by Pitman, we derive the ARE of the hybrid test statistic relative to the classic meta-analysis t-test statistic using the Hodges-Lehmann estimators associated with the two test statistics. From several simulation studies, we calculate the empirical type I error rate and power of the test statistics. The proposed statistic provides an effective tool for evaluating and understanding treatment effects in public health studies as well as clinical trials.
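
The abstract names the ingredients of the proposed statistic without giving its exact formula: Hodges-Lehmann estimates of the treatment effect from each sub-design, placed on a common (median) scale and pooled with inverse-variance weights. The sketch below illustrates those pieces in Python; the bootstrap variance estimator is an assumption for illustration, not necessarily the paper's choice.

```python
import numpy as np

def hl_paired(x, y):
    """Hodges-Lehmann estimate for paired data:
    the median of the Walsh averages of the within-pair differences."""
    d = np.asarray(x, float) - np.asarray(y, float)
    walsh = [(d[i] + d[j]) / 2 for i in range(len(d)) for j in range(i, len(d))]
    return np.median(walsh)

def hl_two_sample(x, y):
    """Hodges-Lehmann estimate for two independent samples:
    the median of all pairwise differences."""
    return np.median(np.subtract.outer(np.asarray(x, float), np.asarray(y, float)))

def bootstrap_var(estimate, x, y, paired, n_boot=2000, seed=0):
    """Bootstrap variance of an effect estimate (an illustrative choice;
    the abstract does not specify the variance estimator)."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    stats = []
    for _ in range(n_boot):
        if paired:
            idx = rng.integers(0, len(x), len(x))   # resample pairs together
            stats.append(estimate(x[idx], y[idx]))
        else:
            stats.append(estimate(rng.choice(x, len(x)), rng.choice(y, len(y))))
    return np.var(stats, ddof=1)

def combine_inverse_variance(estimates, variances):
    """Inverse-variance weighted overall effect and its variance,
    as in standard fixed-effect meta-analysis."""
    w = 1.0 / np.asarray(variances, float)
    return np.sum(w * np.asarray(estimates, float)) / np.sum(w), 1.0 / np.sum(w)
```

An overall z-type statistic then follows as the combined estimate divided by the square root of its combined variance.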

Relevance:

20.00%

Publisher:

Abstract:

Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on coastal inundation mapping. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation of elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. One thousand error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated as the proportion of the 1,000 simulations in which a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, results showed that at a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent in spatial data and analysis.
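
The probabilistic mapping step reduces to counting, per cell, how often the perturbed DEM falls below the water level. A minimal numpy sketch of that counting loop follows; the spatially correlated error fields (sequential Gaussian simulation) and the hydrologically connected bathtub test are replaced by simple stand-ins here, so this only illustrates the Monte Carlo bookkeeping.

```python
import numpy as np

def inundation_probability(dem, error_sims, water_level):
    """Per-cell probability of inundation: the fraction of error
    realisations in which the cell's perturbed elevation lies below
    the water level. A plain threshold stands in for the paper's
    hydrologically correct bathtub reclassification."""
    counts = np.zeros(dem.shape)
    for err in error_sims:          # e.g. 1,000 simulated error grids
        counts += (dem + err) < water_level
    return counts / len(error_sims)

# Toy example with spatially uncorrelated noise (real error fields
# would come from sequential Gaussian simulation).
rng = np.random.default_rng(42)
dem = rng.uniform(0.0, 3.0, size=(50, 50))                  # metres
sims = [rng.normal(0.0, 0.15, dem.shape) for _ in range(100)]
prob = inundation_probability(dem, sims, water_level=1.8)   # SLR + surge
```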

Relevance:

20.00%

Publisher:

Abstract:

This article forms part of a broader research project carried out in the locality of Ministro Rivadavia, in the Partido de Alte. Brown. The research aims to assess processes of social mobility, stagnation and marginalization over the period 1994-2008 for a population with high poverty rates and precarious, informal labour-market insertion. In this study we reflect on the challenges involved in articulating a structured questionnaire with a life-history calendar for the analysis of labour trajectories. The paper presents the characteristics of the methodology employed, discusses its theoretical and methodological antecedents, and specifies where the proposed methodological articulation fits within the map of possible sociological research strategies. Throughout the article, we revisit the debates and proposals around handling the temporal dimension and present different analytical schemes centred on the consideration of time in its multiple micro- and macro-social manifestations. The article contributes to the debate on integrating qualitative and quantitative research strategies, and demonstrates the potential of the proposed instrument for capturing temporality in its plural and multidimensional character.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, we examine the role of firm size in the use of FTA schemes in exporting and importing. We also investigate whether FTA users in importing (exporting) are more likely to use FTA schemes in exporting (importing). To do so, we employ a unique survey in which detailed information on FTA use is available for Japanese affiliates in ASEAN. Our findings are summarized as follows. First, firm size matters in the use of FTA schemes only in exporting, not in importing. Second, past experience of FTA use in exporting (importing) does not help firms use FTA schemes in importing (exporting). Thus, it is necessary to assist firms in using FTA schemes in exporting even if they already use FTA schemes in importing.
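
The abstract does not state the estimation method; for a binary outcome such as "uses an FTA scheme when exporting", a common specification is a logit on firm size and past FTA use in the other direction. The sketch below uses synthetic data and invented variable names purely to make the shape of such a test concrete.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic stand-in data; the underlying affiliate survey is not reproduced here.
rng = np.random.default_rng(1)
n = 500
log_employment = rng.normal(4.0, 1.0, n)      # hypothetical firm-size proxy
uses_fta_import = rng.integers(0, 2, n)       # 1 if the firm uses an FTA when importing
# Generate export-side FTA use that depends on size, echoing the first finding.
p = 1.0 / (1.0 + np.exp(-(-3.0 + 0.6 * log_employment)))
uses_fta_export = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([log_employment, uses_fta_import]))
fit = sm.Logit(uses_fta_export, X).fit(disp=False)
print(fit.summary())   # a significant size coefficient would mirror finding 1
```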

Relevance:

20.00%

Publisher:

Abstract:

Although 3DTV has led the evolution of the television market, its delivery over broadcast networks is still limited. Currently, 3DTV transmissions are usually done by combining both views into one common frame (side by side) so that standard HDTV transmission equipment can be used. Today, orthogonal subsampling is the most common choice, but other alternatives will appear soon. Here, different subsampling schemes for both progressive and interlaced 3DTV are considered. For each possible scheme, its preserved frequency content is analyzed and a simple interpolation filter is designed. The analysis is carried out for progressive and interlaced video, and the designed filters are applied to different sequences, showing the advantages and disadvantages of the different options.
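
For concreteness, here is a minimal numpy sketch of the side-by-side frame packing the paper starts from, with plain orthogonal (column) subsampling and naive column repetition standing in for the designed interpolation filters; real chains would low-pass filter before decimation to limit aliasing.

```python
import numpy as np

def pack_side_by_side(left, right):
    """Pack two views into one HD frame: keep every second column
    of each view (orthogonal subsampling) and place them side by side."""
    return np.hstack([left[:, 0::2], right[:, 0::2]])

def unpack_side_by_side(frame):
    """Split the packed frame and upsample each half by column
    repetition (a trivial stand-in for a designed interpolation filter)."""
    w = frame.shape[1] // 2
    left, right = frame[:, :w], frame[:, w:]
    return np.repeat(left, 2, axis=1), np.repeat(right, 2, axis=1)

# Toy 1080p views: the packed frame keeps the standard 1920-pixel width.
L = np.random.rand(1080, 1920)
R = np.random.rand(1080, 1920)
packed = pack_side_by_side(L, R)        # shape (1080, 1920)
L2, R2 = unpack_side_by_side(packed)    # reconstructed full-width views
```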

Relevance:

20.00%

Publisher:

Abstract:

This article proposes a MAS architecture for network diagnosis under uncertainty. Network diagnosis is divided into two inference processes: hypothesis generation and hypothesis confirmation. The first process is distributed among several agents based on an MSBN, while the second is carried out by agents using semantic reasoning. A diagnosis ontology has been defined in order to combine both inference processes. To drive the deliberation process, dynamic data about the influence of observations are gathered during the diagnosis process. In order to achieve quick and reliable diagnoses, this influence is used to choose the best action to perform. The approach has been evaluated in a P2P video streaming scenario, and the conclusions highlight improvements in both computation and diagnosis time.
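
The abstract does not give the scoring function that turns observation influence into an action choice; a minimal sketch of such a deliberation step, assuming a greedy influence-to-cost ratio, might look like this (all names and numbers are illustrative).

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    influence: float   # expected impact of the observation on current hypotheses
    cost: float        # e.g. probing time in the P2P streaming scenario

def next_action(actions):
    """Greedy deliberation step: pick the observation with the best
    influence-to-cost ratio (an assumed criterion, for illustration)."""
    return max(actions, key=lambda a: a.influence / a.cost)

candidates = [
    Action("probe_peer_latency", influence=0.9, cost=2.0),
    Action("check_codec_errors", influence=0.4, cost=0.5),
    Action("query_tracker_state", influence=0.7, cost=1.0),
]
print(next_action(candidates).name)   # -> 'check_codec_errors'
```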

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses a novel hybrid approach to text categorization that combines a machine learning algorithm, which provides a base model trained on a labeled corpus, with a rule-based expert system, which improves the results of the base classifier by filtering false positives and dealing with false negatives. The main advantage is that the system can be easily fine-tuned by adding specific rules for noisy or conflicting categories that have not been successfully trained. We also describe an implementation based on k-Nearest Neighbors and a simple rule language for expressing lists of positive, negative and relevant (multiword) terms appearing in the input text. The system is evaluated in several scenarios, including the popular Reuters-21578 news corpus, for comparison with other approaches, and categorization using IPTC metadata, the EUROVOC thesaurus and others. Results show that this approach achieves precision comparable to top-ranked methods, with the added value that it does not require a demanding human expert workload to train.
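
The two-layer design is easy to make concrete: a kNN base classifier (as in the paper's implementation) followed by a rule layer that can veto predictions. The term lists and rule format below are a deliberately tiny stand-in for the paper's richer rule language.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Base model: 1-NN over TF-IDF features of a toy labeled corpus.
train_texts = ["oil prices rise", "wheat harvest up", "crude output cut"]
train_labels = ["energy", "agriculture", "energy"]
vec = TfidfVectorizer()
knn = KNeighborsClassifier(n_neighbors=1).fit(vec.fit_transform(train_texts),
                                              train_labels)

# Rule layer: hypothetical per-category negative-term lists.
RULES = {"energy": {"negative": {"harvest"}}}

def classify(text):
    label = knn.predict(vec.transform([text]))[0]
    words = set(text.lower().split())
    if words & RULES.get(label, {}).get("negative", set()):
        return None            # rule vetoes a likely false positive
    return label

print(classify("crude oil exports grow"))   # -> 'energy'
print(classify("crude output harvest"))     # kNN says 'energy'; vetoed -> None
```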

Relevance:

20.00%

Publisher:

Abstract:

Ontologies and taxonomies are widely used to organize concepts, providing the basis for activities such as indexing and serving as background knowledge for NLP tasks. As such, translating these resources would prove useful for adapting such systems to new languages. However, we show that the nature of these resources is significantly different from the "free-text" paradigm used to train most statistical machine translation systems. In particular, we see significant differences in the linguistic nature of these resources, and such resources carry rich additional semantics. We demonstrate that, as a result of these linguistic differences, standard SMT methods, and in particular evaluation metrics, can perform poorly. We then turn to the task of leveraging these semantics for translation, which we approach in three ways: by adapting the translation system to the domain of the resource; by examining whether semantics can help predict the syntactic structure used in translation; and by evaluating whether existing translated taxonomies can be used to disambiguate translations. We present some early results from these experiments, which shed light on the degree of success we may expect with each approach.

Relevance:

20.00%

Publisher:

Abstract:

This article considers static analysis of logic programs over combined domains, based on abstract interpretation. It is known that analyses over combined domains potentially provide more information than the independent analyses taken separately. However, constructing a combined analysis often requires redefining the basic operations for the combined domain. A practical approach to maintaining precision in combined analyses of logic programs is illustrated which reuses the individual analyses and does not redefine the basic operations. The advantages of the approach are that proofs of correctness for the new domains are not required and that implementations can be reused. The approach is demonstrated by showing that a combined sharing analysis, constructed from "old" proposals, compares well with "new" proposals from the recent literature in terms of both efficiency and accuracy.
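
The reuse idea can be pictured as a direct product of domains in which each component keeps its own, unmodified operations. The toy domains below (sign and parity) are only illustrative; they are not the sharing analyses discussed in the article.

```python
def sign_join(a, b):      # existing join of the sign domain, reused as-is
    return a if a == b else "top"

def parity_join(a, b):    # existing join of the parity domain, reused as-is
    return a if a == b else "top"

class Product:
    """Direct product of two domains: operations are applied
    componentwise, so neither domain needs to be redefined."""
    def __init__(self, sign, parity):
        self.sign, self.parity = sign, parity

    def join(self, other):
        return Product(sign_join(self.sign, other.sign),
                       parity_join(self.parity, other.parity))

a = Product("pos", "even")       # e.g. abstracting the value 4
b = Product("pos", "odd")        # e.g. abstracting the value 3
c = a.join(b)
print(c.sign, c.parity)          # -> pos top
```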

Relevance:

20.00%

Publisher:

Abstract:

All-terrain robot locomotion is an active topic of research. Search-and-rescue maneuvers and exploratory missions could benefit from robots with the abilities of real animals. However, technological barriers still stand in the way of an actuation system able to meet the demanding requirements of these robots. This paper describes the locomotion control of a leg prototype, designed and developed to make a quadruped walk dynamically while exhibiting compliant interaction with the environment. The actuation system of the leg is based on the hybrid use of series elasticity and magneto-rheological dampers, which provide variable compliance for natural-looking motion and improved interaction with the ground. The locomotion control architecture is designed to exploit natural leg dynamics in order to improve energy efficiency. Results show that the controller achieves a significant reduction in energy consumption during the leg swing phase thanks to the exploitation of inherent leg dynamics. In addition, experiments with the real leg prototype show that the combined use of series elasticity and magneto-rheological damping at the knee provides a 20% reduction in the energy wasted in braking the knee during its extension in the leg stance phase.

Relevance:

20.00%

Publisher:

Abstract:

Effective static analyses have been proposed which infer bounds on the number of resolutions. These have the advantage of being independent of the platform on which the programs are executed and have been shown to be useful in a number of applications, such as granularity control in parallel execution. On the other hand, in distributed computation scenarios where platforms with different capabilities come into play, it is necessary to express costs in metrics that include the characteristics of the platform. In particular, it is especially interesting to be able to infer upper and lower bounds on actual execution times. With this objective in mind, we propose an approach which combines compile-time analysis for cost bounds with a one-time profiling of a given platform in order to determine the values of certain parameters for that platform. These parameters calibrate a cost model which, from then on, is able to compute time-bound functions for procedures statically and to predict with a significant degree of accuracy the execution times of such procedures on that concrete platform. The approach has been implemented and integrated in the CiaoPP system.
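
A hedged sketch of the calibration idea: if the compile-time analysis yields, for each procedure, counts of low-level operations, then one-time profiling lets us fit a per-platform cost for each operation and predict run times from counts alone. The operation categories and the least-squares fit are assumptions for illustration; the abstract does not spell out the model's exact form.

```python
import numpy as np

# Rows: profiled benchmark procedures; columns: operation counts that
# the static analysis derives (e.g. unifications, calls, arithmetic).
counts = np.array([
    [120.0,  30.0,  5.0],
    [400.0, 110.0, 20.0],
    [ 60.0,  10.0,  2.0],
    [250.0,  80.0, 15.0],
])
measured_us = np.array([95.0, 330.0, 45.0, 215.0])   # profiled times (µs)

# One-time calibration for this platform: least-squares per-operation costs.
op_cost, *_ = np.linalg.lstsq(counts, measured_us, rcond=None)

# From now on, predict execution time statically from the counts alone.
new_counts = np.array([200.0, 50.0, 10.0])
print(new_counts @ op_cost)   # estimated microseconds on this platform
```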

Relevance:

20.00%

Publisher:

Abstract:

Nondeterminism and partially instantiated data structures give logic programming expressive power beyond that of functional programming. However, functional programming often provides convenient syntactic features, such as having a designated implicit output argument, which allow function call nesting and sometimes result in more compact code. Functional programming also sometimes allows a more direct encoding of lazy evaluation, with its ability to deal with infinite data structures. We present a syntactic functional extension, used in the Ciao system, which can be implemented in ISO-standard Prolog systems and covers function application, predefined evaluable functors, functional definitions, quoting, and lazy evaluation. The extension is also composable with higher-order features and can be combined with other extensions to ISO-Prolog such as constraints. We also highlight the features of the Ciao system which aid the implementation, and present some data on the overhead of using lazy evaluation with respect to eager evaluation.
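
Ciao's lazy evaluation is, of course, written in Prolog syntax; as a language-neutral illustration of the underlying idea the abstract appeals to (demand-driven computation over an infinite data structure), here is the classic example using Python generators. This is an analogy, not Ciao code.

```python
from itertools import islice

def naturals(n=0):
    """An 'infinite list' of naturals: each element is produced only
    when demanded, which is the essence of lazy evaluation."""
    while True:
        yield n
        n += 1

def lazy_map(f, xs):
    for x in xs:           # applies f on demand, never forcing the
        yield f(x)         # whole (infinite) structure

squares = lazy_map(lambda x: x * x, naturals())
print(list(islice(squares, 5)))   # -> [0, 1, 4, 9, 16]
```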

Relevance:

20.00%

Publisher:

Abstract:

OGOLOD is a Linked Open Data dataset derived from different biomedical resources by an automated pipeline, using a tailored ontology as a scaffold. The key contribution of OGOLOD is that it links, in new RDF triples, human genetic diseases and orthologous genes, paving the way for more efficient translational biomedical research that exploits the Linked Open Data cloud.
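
The kind of link OGOLOD contributes can be sketched with rdflib; the namespace and property names below are invented for illustration and are not the actual OGOLOD vocabulary.

```python
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/ogolod-like/")

g = Graph()
# New RDF triples tying a human genetic disease to an orthologous gene.
g.add((EX["disease/CysticFibrosis"], EX.associatedGene, EX["gene/human/CFTR"]))
g.add((EX["gene/human/CFTR"], EX.orthologousTo, EX["gene/mouse/Cftr"]))

# Traversal from disease to model-organism orthologs via the new links.
for _, _, gene in g.triples((EX["disease/CysticFibrosis"], EX.associatedGene, None)):
    for _, _, ortho in g.triples((gene, EX.orthologousTo, None)):
        print(ortho)   # -> http://example.org/ogolod-like/gene/mouse/Cftr
```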