948 results for Re-ranking methods


Relevance:

80.00%

Publisher:

Abstract:

Background: Re-use of unused medicines returned by patients is currently considered unethical in the UK, and returned medicines are usually destroyed by incineration. Previous studies suggest that many of these medicines may be in a condition suitable for re-use. Methods: All medicines returned over two months to participating community pharmacies and GP surgeries in Eastern Birmingham PCT were assessed for type, quantity and value. A registered pharmacist assessed packs against set criteria to determine their suitability for possible re-use. Results: Nine hundred and thirty-four return events were made by 910 patients, comprising 3765 items worth £33 608. Cardiovascular drugs (1003, 27%) and those acting on the CNS (884, 24%) were most prevalent. Returned packs had a median of 17 months remaining before expiry, and one-quarter of packs (1248 out of 4291) were suitable for possible re-use. One-third of those suitable for re-use (476 out of 1248) contained drugs in the latest WHO Essential Drugs List. Conclusion: Unused medicines are returned in substantial quantities and have considerable financial value, and many are in a condition suitable for re-use. We consider it appropriate to reopen the debate on the potential for re-using these medicines in developing countries where medicines are not widely available, and also within the UK.

Relevance:

80.00%

Publisher:

Abstract:

The quantitative diatom analysis of 218 surface sediment samples recovered in the Atlantic and western Indian sectors of the Southern Ocean is used to define a reference data base for paleotemperature estimation from diatom assemblages using the Imbrie and Kipp transfer function method. The criteria that justify the exclusion of samples and species from the raw data set in order to define a reference database are outlined and discussed. Sensitivity tests with eight data sets were carried out to evaluate the effects of overall dominance of single species, different methods of species abundance ranking, and no-analog conditions (e.g., Eucampia antarctica) on the estimated paleotemperatures. The defined transfer functions were applied to a sediment core from the northern Antarctic zone. Overall dominance of Fragilariopsis kerguelensis in the diatom assemblages resulted in a close affinity between the paleotemperature curve and the relative abundance pattern of this species downcore. Logarithmic conversion of the counting data, applied with other ranking methods in order to compensate for the dominance of F. kerguelensis, yielded the best statistical results. A reliable diatom transfer function for future paleotemperature estimations is presented.
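To make the workflow concrete, here is a minimal sketch of a transfer-function pipeline in the spirit described above: relative abundances are log-converted to damp a dominant taxon, reduced to a few factors, and temperature is regressed on the factor scores. The Imbrie and Kipp method proper uses Q-mode factor analysis with varimax rotation; PCA and ordinary least squares stand in for it here, and all names and constants are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

def fit_transfer_function(counts, sst, n_factors=4):
    """counts: (n_samples, n_species) diatom counts from surface sediments;
    sst: (n_samples,) modern sea-surface temperatures at the same sites."""
    counts = np.asarray(counts, dtype=float)
    rel = counts / counts.sum(axis=1, keepdims=True)   # relative abundances
    logged = np.log(rel + 1e-6)        # log conversion damps dominant taxa
    pca = PCA(n_components=n_factors).fit(logged)
    reg = LinearRegression().fit(pca.transform(logged), sst)

    def estimate(new_counts):          # paleotemperature for downcore samples
        c = np.asarray(new_counts, dtype=float)
        r = c / c.sum(axis=1, keepdims=True)
        return reg.predict(pca.transform(np.log(r + 1e-6)))

    return estimate
```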

Relevance:

80.00%

Publisher:

Abstract:

Users seeking information may not find relevant material in the language of their query; the information may be available in a different language that the users do not know, which makes it difficult for them to access. Since the retrieval process depends on the translation of the user query, obtaining the right translation raises many issues. For a given pair of languages, the available resources, such as an incomplete dictionary or an inaccurate machine translation system, may be insufficient to map the query terms in one language to their equivalents in the other. Moreover, a given query may have multiple correct translations, and the underlying corpus evidence may suggest which subset of translations will eventually yield better retrieval. In this paper, we present a cross-language information retrieval approach that retrieves information in a language other than that of the user query using a corpus-driven query suggestion approach. The idea is to use corpus-based evidence in one language to improve the retrieval and re-ranking of news documents in the other. We use the FIRE corpora, Tamil and English news collections, in our experiments and illustrate the effectiveness of the proposed approach.
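As one concrete reading of corpus-driven translation selection, the sketch below scores each combination of candidate translations by how strongly its terms co-occur in the target-language corpus and keeps the most coherent combination. The abstract does not specify the paper's exact scoring, so the co-occurrence function and all names here are assumptions.

```python
from itertools import product

def choose_translations(term_candidates, cooccur):
    """term_candidates: one list of candidate translations per query term.
    cooccur(a, b): corpus-derived association score between two
    target-language terms (e.g. PMI estimated over a news collection).
    Returns the translation combination whose terms cohere best."""
    def coherence(combo):
        return sum(cooccur(a, b)
                   for i, a in enumerate(combo) for b in combo[i + 1:])
    return max(product(*term_candidates), key=coherence)
```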

Relevance:

80.00%

Publisher:

Abstract:

OBJECTIVE: The objective of this European multicentre study was to report the surgical outcomes of Fontan takedown, Fontan conversion and heart transplantation (HTX) for failing Fontan patients in terms of all-cause mortality and (re-)HTX. METHODS: A retrospective international study was conducted by the European Congenital Heart Surgeons Association among 22 member centres. Outcomes of surgery to address a failing Fontan were collected for 225 patients, who underwent Fontan takedown (n=38; 17%), Fontan conversion (n=137; 61%) or HTX (n=50; 22%). RESULTS: The most prevalent indication for failing Fontan surgery was arrhythmia (43.6%), but indications differed across the surgical groups (p<0.001). Fontan takedown was mostly performed in the early postoperative phase after Fontan completion, while Fontan conversion and HTX were mainly treatment options for late failure. Early (30-day) mortality was high for Fontan takedown (26%). Median follow-up was 5.9 years (range 0-23.7 years). The combined end point of mortality/HTX was reached in 44.7% of the Fontan takedown patients, 26.3% of the Fontan conversion patients and 34.0% of the HTX patients (log rank p=0.08). Survival analysis showed no difference between Fontan conversion and HTX (p=0.13), but ventricular function differed significantly between these groups. In patients who underwent Fontan conversion or HTX, ventricular systolic dysfunction appeared to be the strongest predictor of mortality or (re-)HTX. Patients with a valveless atriopulmonary connection (APC) benefited more from Fontan conversion than patients with a valve-containing APC (p=0.04). CONCLUSIONS: Takedown surgery for a failing Fontan is mostly performed in the early postoperative phase, with a high risk of mortality. There is no difference in survival after Fontan conversion or HTX.

Relevance:

80.00%

Publisher:

Abstract:

Thanks to advanced technologies and social networks that allow data to be widely shared across the Internet, there is an explosion of pervasive multimedia data, generating high demand for multimedia services and applications that let people easily access and manage multimedia data. Toward such demands, multimedia big data analysis has become an emerging hot topic in both industry and academia, ranging from basic infrastructure, management, search, and mining to security, privacy, and applications. Within the scope of this dissertation, a multimedia big data analysis framework is proposed for semantic information management and retrieval, with a focus on rare event detection in videos. The proposed framework is able to explore hidden semantic feature groups in multimedia data and to incorporate temporal semantics, especially for video event detection. First, a hierarchical semantic data representation is presented to alleviate the semantic gap issue, and the Hidden Coherent Feature Group (HCFG) analysis method is proposed to capture the correlation between features and separate the original feature set into semantic groups, seamlessly integrating multimedia data in multiple modalities. Next, an Importance Factor based Temporal Multiple Correspondence Analysis (IF-TMCA) approach is presented for effective event detection. Specifically, the HCFG algorithm is integrated with the Hierarchical Information Gain Analysis (HIGA) method to generate the Importance Factor (IF) for producing the initial detection results. The TMCA algorithm then incorporates temporal semantics to re-rank and improve the final performance. Finally, a sampling-based ensemble learning mechanism is applied to further accommodate imbalanced datasets. In addition to the multimedia semantic representation and class imbalance problems, lack of organization is another critical issue for multimedia big data analysis. In this framework, an affinity propagation-based summarization method is also proposed to transform unorganized data into a better structure with clean and well-organized information. The whole framework has been thoroughly evaluated across multiple domains, such as soccer goal event detection and disaster information management.
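The abstract does not give TMCA's details, but the role temporal semantics plays in re-ranking can be illustrated with a much simpler stand-in: blend each shot's detection score with its temporal neighbours' scores, so isolated spikes are damped and consistent runs (as around a soccer goal) rise in the ranking. This is a generic smoothing sketch, not the authors' TMCA; all parameters are illustrative.

```python
import numpy as np

def temporal_rerank(scores, window=2, alpha=0.5):
    """Blend each shot's detector score with the mean score of its
    temporal neighbours before ranking: events that span adjacent shots
    are boosted, isolated spikes are damped. Returns indices, best first."""
    scores = np.asarray(scores, dtype=float)
    smoothed = np.empty_like(scores)
    for i in range(len(scores)):
        lo, hi = max(0, i - window), min(len(scores), i + window + 1)
        smoothed[i] = (1 - alpha) * scores[i] + alpha * scores[lo:hi].mean()
    return np.argsort(-smoothed)
```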

Relevance:

80.00%

Publisher:

Abstract:

The objective of this study was to evaluate the effects of the inclusion or non-inclusion of short lactations and of cow (CGG) and/or dam (DGG) genetic group on the genetic evaluation of 305-day milk yield (MY305), age at first calving (AFC), and first calving interval (FCI) of Girolando cows. Covariance components were estimated by the restricted maximum likelihood method in single-trait animal model analyses. The heritability estimates for MY305, AFC, and FCI ranged from 0.23 to 0.29, 0.40 to 0.44, and 0.13 to 0.14, respectively, when short lactations were not included, and from 0.23 to 0.28, 0.39 to 0.43, and 0.13 to 0.14, respectively, when short lactations were included. The inclusion of short lactations caused little variation in the variance components and heritability estimates of the traits, but their non-inclusion resulted in the re-ranking of animals. Models with CGG or DGG fixed effects had higher heritability estimates for all traits compared with models that considered these two effects simultaneously. We recommend using the model with fixed effects of CGG and the inclusion of short lactations for the genetic evaluation of Girolando cattle.
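Re-ranking between two models can be quantified by the rank correlation of the breeding values they assign to the same animals; the sketch below uses Kendall's tau for this. The breeding values here are hypothetical, purely for illustration.

```python
from scipy.stats import kendalltau

# Hypothetical MY305 breeding values (kg) for six sires, estimated with
# short lactations included vs. excluded from the data.
ebv_included = [412, 390, 355, 340, 310, 295]
ebv_excluded = [405, 360, 392, 300, 338, 290]

tau, _ = kendalltau(ebv_included, ebv_excluded)
print(f"Kendall tau = {tau:.2f}")  # values well below 1 signal re-ranking
```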

Relevance:

40.00%

Publisher:

Abstract:

This paper deals with the analysis of the liquid limit of soils, an inferential parameter of universal acceptance. It has been undertaken primarily to re-examine one-point methods of determining liquid limit water contents. It is shown, from the basic characteristics of soils and the associated physico-chemical factors, that the critical shear strengths at liquid limit water contents arise out of force field equilibrium and are independent of soil type. This provides a scientific basis for liquid limit determination by one-point methods, which hitherto had been formulated purely on statistical analysis of data. Available one-point methods of liquid limit determination (Norman, 1959; Karlsson, 1961; Clayton & Jukes, 1978) are critically re-examined. A simple one-point cone penetrometer method of computing the liquid limit is suggested and compared with other methods. The experimental data of Sherwood & Ryley (1970) are employed to compare the different cone penetration methods. The results indicate that, beyond mere statistical considerations, one-point methods have a strong scientific basis in the uniqueness of the modified flow line irrespective of soil type. The normalized flow line is obtained by normalizing water contents by liquid limit values, thereby nullifying the effects of surface areas and the associated physico-chemical factors that are otherwise reflected in different responses at the macrolevel.
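As a worked illustration of a one-point cone method built on a normalized flow line, the sketch below assumes the widely cited relation w/wL = 0.77 log10(d), with cone penetration d in mm, whose right-hand side equals 1 at the standard 20 mm liquid-limit penetration. The exact constant proposed in the paper may differ; the 0.77 form and the numbers are assumptions for illustration.

```python
import math

def liquid_limit_one_point(water_content, penetration_mm):
    """One-point liquid limit estimate from a single cone test, assuming
    the normalized flow line w / wL = 0.77 * log10(d)."""
    return water_content / (0.77 * math.log10(penetration_mm))

# e.g. a measured water content of 48% at 17 mm penetration
print(round(liquid_limit_one_point(48.0, 17.0), 1))  # -> 50.7
```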

Relevance:

40.00%

Publisher:

Abstract:

Using metal nitrates and oxides as the starting materials, Y3Al5O12 (YAG) and YAG:RE3+ (RE = Eu, Dy) powder phosphors were prepared by solid state (SS), coprecipitation (CP) and citrate-gel (CG) methods, respectively. The resulting YAG-based phosphors were characterized by XRD and by photoluminescent excitation and emission spectra as well as lifetimes. Purified crystalline phases of YAG were obtained at 800 °C (CG) and 900 °C (CP and SS), respectively. Great differences were observed between the excitation and emission spectra of Eu3+ and Dy3+ in the crystalline and amorphous states of YAG, and the emission intensities increased with increasing annealing temperature. At an identical annealing temperature and doping concentration, Eu3+ and Dy3+ showed the strongest emission intensity in CP-derived and the weakest in CG-derived YAG phosphors. The poor emission intensity of the CG-derived phosphors is mainly caused by contamination with organic impurities from the citric acid in the starting materials. Furthermore, the lifetimes of the samples derived from the CG and CP routes are shorter than those derived from the SS route.

Relevance:

40.00%

Publisher:

Abstract:

Over the past ten years, a variety of microRNA target prediction methods has been developed, and many of the methods are constantly improved and adapted to recent insights into miRNA-mRNA interactions. In a typical scenario, different methods return different rankings of putative targets, even if the ranking is reduced to selected mRNAs that are related to a specific disease or cell type. For experimental validation it is then difficult to decide in which order to process the predicted miRNA-mRNA bindings, since each validation is a laborious task and therefore only a limited number of mRNAs can be analysed. We propose a new ranking scheme that combines ranked predictions from several methods and, unlike standard thresholding methods, utilises the concept of Pareto fronts as defined in multi-objective optimisation. In the present study, we attempt a proof of concept by applying the new ranking scheme to hsa-miR-21, hsa-miR-125b, and hsa-miR-373 and prediction scores supplied by PITA and RNAhybrid. The scores are interpreted as a two-objective optimisation problem, and the elements of the Pareto front are ranked by the STarMir score, with a subsequent re-calculation of the Pareto front after removal of the top-ranked mRNA from the basic set of prediction scores. The method is evaluated on validated targets of the three miRNAs, and the ranking is compared to scores from DIANA-microT and TargetScan. We observed that the new ranking method performs well and consistently, and the first validated targets are elements of Pareto fronts at a relatively early stage of the recurrent procedure, which encourages further research towards a higher-dimensional analysis of Pareto fronts.
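The recurrent procedure lends itself to a compact sketch: compute the Pareto front of the two prediction scores, pick the front element with the best tie-break score, remove it, and repeat. The sketch assumes higher scores are better for both objectives (flip signs for energy-like scores), and the example scores are hypothetical; only the gene names, which are known hsa-miR-21 targets, are real.

```python
def pareto_front(scored):
    """scored: dict mRNA -> (score1, score2), higher assumed better.
    Returns the mRNAs not dominated in both objectives."""
    front = []
    for m, (a, b) in scored.items():
        dominated = any(x >= a and y >= b and (x, y) != (a, b)
                        for n, (x, y) in scored.items() if n != m)
        if not dominated:
            front.append(m)
    return front

def pareto_rank(scored, tiebreak):
    """Repeatedly take the best front element by the tie-break score
    (STarMir in the paper), remove it, and recompute the front."""
    scored = dict(scored)
    ranking = []
    while scored:
        best = max(pareto_front(scored), key=tiebreak)
        ranking.append(best)
        del scored[best]
    return ranking

# Hypothetical scores for three validated hsa-miR-21 targets.
scores = {"PTEN": (0.9, 0.7), "PDCD4": (0.8, 0.9), "TPM1": (0.5, 0.4)}
starmir = {"PTEN": 0.6, "PDCD4": 0.8, "TPM1": 0.3}
print(pareto_rank(scores, tiebreak=lambda m: starmir[m]))
# -> ['PDCD4', 'PTEN', 'TPM1']
```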

Relevance:

40.00%

Publisher:

Abstract:

The troubling concept of class: reflecting on our 'failure' to encourage sociology students to re-cognise their classed locations using autobiographical methods

This paper provides a narrative of the four authors' commitment to auto/biographical methods as teachers and researchers in 'new' universities. As they went about their work, they observed that, whereas students engage with gendered, sexualised and racialised processes when negotiating their identities, they are reluctant or unable to conceptualise 'class-ifying' processes as key determinants of their life chances. This general inability puzzled the authors, given the students' predominantly working-class backgrounds. Through the application of their own stories, the authors explore the sociological significance of this pedagogical 'failure' to account for the troubling concept of class, not only in the classroom but also in contemporary society.

Relevance:

40.00%

Publisher:

Abstract:

Reengineering and integrated development platforms typically do not list search results in a particularly useful order. PageRank is the algorithm prominently used by the Google internet search engine to rank the relative importance of elements in a set of hyperlinked documents. To determine the relevance of objects, classes, attributes, and methods, we propose to apply PageRank to software artifacts and their relationships (reference, inheritance, access, and invocation). This paper presents various experiments that demonstrate the usefulness of the ranking algorithm in software (re)engineering.
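A minimal power-iteration sketch of the idea: build a directed graph over software artifacts whose edges are the reference, inheritance, access, and invocation relationships, and let rank mass flow along them. This is plain simplified PageRank (dangling nodes are not redistributed), not necessarily the paper's exact setup; the micro-example graph is hypothetical.

```python
import numpy as np

def pagerank(adj, damping=0.85, iters=100):
    """adj[i][j] = 1 if artifact i references/inherits/accesses/invokes j.
    Returns a relevance score per artifact: artifacts that many others
    depend on accumulate rank mass."""
    A = np.asarray(adj, dtype=float)
    n = len(A)
    out = A.sum(axis=1)
    out[out == 0] = 1.0                # dangling nodes: avoid division by zero
    M = A / out[:, None]               # row-stochastic transition matrix
    r = np.full(n, 1.0 / n)
    for _ in range(iters):
        r = (1 - damping) / n + damping * (M.T @ r)
    return r

# Hypothetical micro-example: artifact 2 is invoked by both 0 and 1,
# so it ends up with the highest rank.
print(pagerank([[0, 1, 1], [0, 0, 1], [0, 0, 0]]).round(3))
```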

Relevance:

40.00%

Publisher:

Abstract:

Decision makers increasingly face complex decision-making problems in which they have to simultaneously consider many, often conflicting, criteria. In most decision-making problems it is necessary to consider economic, social and environmental criteria. Decision theory provides an adequate framework for helping decision makers to solve such complex problems, since it allows the uncertainty about the performance of each alternative on each attribute and the imprecision of the decision makers' preferences to be considered jointly. In this PhD thesis we focus on the imprecision of the decision makers' preferences when they can be represented by an additive multiattribute utility function. We therefore consider imprecision in the weights as well as in the component utility functions for each attribute. We consider the case in which the imprecision is represented by intervals of values or by ordinal information rather than precise values. In this respect, we propose methods for ranking alternatives based on the notions of dominance intensity, also known as preference intensity, which attempt to measure how strongly each alternative is preferred to the others. The performance of the proposed methods has been analysed and compared against the most relevant existing methods in the scientific literature that are applicable to this type of problem. For this purpose, we conducted a simulation study using two efficiency measures (hit ratio and Kendall correlation coefficient) to compare the different methods.
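One simple reading of ranking by preference intensity under interval weights can be sketched by Monte Carlo sampling: draw admissible weight vectors from the given intervals, compute additive utilities, and score each alternative by its average positive margin over the others. This is an illustrative stand-in, not the exact dominance intensity measures proposed in the thesis; all data in the example are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank_by_preference_intensity(utilities, w_lo, w_hi, n_draws=10_000):
    """utilities: (n_alternatives, n_attributes) component utilities in [0, 1].
    w_lo, w_hi: elementwise bounds on the imprecise attribute weights.
    Samples weight vectors from the box, renormalizes them, and ranks
    alternatives by their mean positive utility margin over the others."""
    U = np.asarray(utilities, dtype=float)
    W = rng.uniform(w_lo, w_hi, size=(n_draws, len(w_lo)))
    W /= W.sum(axis=1, keepdims=True)       # weights sum to 1 in each draw
    V = W @ U.T                             # (n_draws, n_alternatives)
    margins = (V[:, :, None] - V[:, None, :]).clip(min=0)
    intensity = margins.mean(axis=(0, 2))   # average margin per alternative
    return np.argsort(-intensity)           # best alternative first

# Hypothetical example: three alternatives, two attributes, weights in [0.3, 0.7]
print(rank_by_preference_intensity([[0.9, 0.2], [0.5, 0.6], [0.1, 0.9]],
                                   [0.3, 0.3], [0.7, 0.7]))
```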