944 results for selection methods


Relevance: 60.00%

Abstract:

BACKGROUND: Multislice computed tomography (MSCT) is a promising noninvasive method of detecting coronary artery disease (CAD). However, most data have been obtained in selected series of patients. The purpose of the present study was to investigate the accuracy of 64-slice MSCT (64-MSCT) in daily practice, without any patient selection. METHODS AND RESULTS: Using 64-slice MSCT coronary angiography (CTA), 69 consecutive patients, 39 (57%) of whom had previously undergone stent implantation, were evaluated. The mean heart rate during the scan was 72 beats/min, the scan time 13.6 s and the amount of contrast media 72 mL. The mean time span between invasive coronary angiography (ICAG) and CTA was 6 days. Significant stenosis was defined as a diameter reduction of >50%. Of 966 segments, 884 (92%) were assessable. Compared with ICAG, the sensitivity of CTA for diagnosing significant stenosis was 90%, specificity 94%, positive predictive value (PPV) 89% and negative predictive value (NPV) 95%. For the 58 stented lesions, the sensitivity, specificity, PPV and NPV were 93%, 96%, 87% and 98%, respectively. On the patient-based analysis, the sensitivity, specificity, PPV and NPV of CTA for detecting CAD were 98%, 86%, 98% and 86%, respectively. Eighty-two (8%) segments were not assessable because of irregular rhythm, calcification or tachycardia. CONCLUSION: 64-MSCT has high accuracy for the detection of significant CAD in an unselected patient population and can therefore be considered a valuable noninvasive technique.
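The four accuracy measures quoted above follow directly from a 2x2 confusion matrix. A minimal sketch (the counts in the example are illustrative, not taken from the study):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # true positives among all diseased
    specificity = tn / (tn + fp)  # true negatives among all healthy
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv
```

For example, 90 true positives, 6 false positives, 94 true negatives and 10 false negatives give a sensitivity of 0.90 and a specificity of 0.94; the PPV and NPV then follow from the same counts.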

Relevance: 60.00%

Abstract:

Three-dimensional flow visualization plays an essential role in many areas of science and engineering, such as the aero- and hydro-dynamical systems that govern various physical and natural phenomena. For popular methods such as streamline visualization to be effective, they should capture the underlying flow features while helping the user observe and understand the flow field clearly. My research focuses on the analysis and visualization of flow fields using various techniques, e.g. information-theoretic techniques and graph-based representations. Since streamline visualization is a popular technique in flow field visualization, how to select good streamlines that capture flow patterns and how to pick good viewpoints for observing flow fields become critical. We treat streamline selection and viewpoint selection as symmetric problems and solve them simultaneously using a dual information channel [81]. To the best of my knowledge, this is the first attempt in flow visualization to combine these two selection problems in a unified approach. This work selects streamlines in a view-independent manner, so the selected streamlines do not change across viewpoints. Another work of mine [56] uses an information-theoretic approach to evaluate the importance of each streamline under various sample viewpoints and presents a solution for view-dependent streamline selection that guarantees coherent streamline updates as the view changes gradually. When 3D streamlines are projected to 2D images for viewing, occlusion and clutter become inevitable. To address this challenge, we designed FlowGraph [57, 58], a novel compound graph representation that organizes field line clusters and spatiotemporal regions hierarchically for occlusion-free and controllable visual exploration. It enables observation and exploration of the relationships among field line clusters, spatiotemporal regions and their interconnections in the transformed space.
Most viewpoint selection methods consider only external viewpoints outside the flow field, which do not convey a clear observation when the flow field is cluttered near the boundary. Therefore, we propose a new way to explore flow fields by selecting several internal viewpoints around the flow features inside the flow field and then generating a B-spline curve path traversing these viewpoints, providing users with close-up views for detailed observation of hidden or occluded internal flow features [54]. This work has also been extended to unsteady flow fields. Besides flow field visualization, other topics relevant to visualization also attract my attention. In iGraph [31], we leverage a distributed system along with a tiled display wall to provide users with high-resolution visual analytics of big image and text collections in real time. Developing pedagogical visualization tools forms my other research focus. Since most cryptography algorithms use sophisticated mathematics, it is difficult for beginners to understand both what an algorithm does and how it does it. Therefore, we developed a set of visualization tools to give users an intuitive way to learn and understand these algorithms.

Relevance: 60.00%

Abstract:

In this thesis, I study skin lesion detection and its applications to skin cancer diagnosis. A skin lesion detection algorithm based on color information and thresholding is proposed. Several color spaces are studied for the proposed algorithm and the detection results are compared; experimental results show that the YUV color space achieves the best performance. In addition, I develop a distance-histogram-based threshold selection method, which proves better than other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I also investigate GPU speed-up techniques for skin lesion extraction, and the results show that GPUs have potential for accelerating skin lesion extraction. Based on the proposed skin lesion detection algorithms, I developed a mobile skin cancer diagnosis application, with which a user can employ an iPhone as a diagnostic tool to find potential skin lesions on a person's skin and compare the detected lesions with those stored in a database on a remote server.
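The color-space step described above can be sketched as follows. This is a hypothetical illustration, not the thesis's code: the BT.601 YUV conversion matrix and the fixed V-channel threshold are assumptions, and the thesis's distance-histogram threshold selection is not reproduced here.

```python
import numpy as np

def rgb_to_yuv(rgb):
    # BT.601 RGB -> YUV conversion matrix (an assumption; the
    # thesis does not specify which YUV variant it uses)
    m = np.array([[0.299, 0.587, 0.114],
                  [-0.14713, -0.28886, 0.436],
                  [0.615, -0.51499, -0.10001]])
    return rgb @ m.T

def lesion_mask(rgb_image, v_threshold):
    """Hypothetical sketch: flag pixels whose V (red-difference)
    channel exceeds a fixed threshold."""
    yuv = rgb_to_yuv(rgb_image.astype(float) / 255.0)
    return yuv[..., 2] > v_threshold
```

A saturated red pixel has V = 0.615 and is flagged; a grey or white pixel has V near 0 and is not, which is the intuition behind detecting reddish lesions in YUV space.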

Relevance: 60.00%

Abstract:

Modified nucleoside triphosphates (dA(Hs)TP, dU(POH)TP, and dC(Val)TP) bearing imidazole, hydroxyl, and carboxylic acid residues connected to the purine and pyrimidine bases through alkyne linkers were prepared. These modified dN*TPs were excellent substrates for various DNA polymerases in primer extension reactions. Moreover, the combined use of terminal deoxynucleotidyl transferase (TdT) and the modified dNTPs led to efficient tailing reactions that rival those of natural counterparts. Finally, the triphosphates were tolerated by polymerases under PCR conditions, and the ensuing modified oligonucleotides served as templates for the regeneration of unmodified DNA. Thus, these modified dN*TPs are fully compatible with in vitro selection methods and can be used to develop artificial peptidases based on DNA.

Relevance: 60.00%

Abstract:

Deoxyribozymes or DNAzymes are single-stranded catalytic DNA molecules that are obtained by combinatorial in vitro selection methods. Initially conceived to function as gene silencing agents, the scope of DNAzymes has rapidly expanded into diverse fields, including biosensing, diagnostics, logic gate operations, and the development of novel synthetic and biological tools. In this review, an overview of all the different chemical reactions catalyzed by DNAzymes is given with an emphasis on RNA cleavage and the use of non-nucleosidic substrates. The use of modified nucleoside triphosphates (dN*TPs) to expand the chemical space to be explored in selection experiments and ultimately to generate DNAzymes with an expanded chemical repertoire is also highlighted.

Relevance: 60.00%

Abstract:

Electricity markets in the United States presently employ an auction mechanism to determine the dispatch of power generation units. In this market design, generators submit bid prices to a regulatory agency for review, and the regulator conducts an auction selection in a way that satisfies electricity demand. Most regulators currently use an auction selection method that minimizes total offer costs ["bid cost minimization" (BCM)] to determine electric dispatch. However, recent literature has shown that this method may not minimize consumer payments, and an alternative selection method that directly minimizes total consumer payments ["payment cost minimization" (PCM)] may benefit social welfare in the long term. The objective of this project is to further investigate the long-term benefit of PCM implementation and determine whether it can provide lower costs to consumers. The two auction selection methods are expressed as linear constraint programs and implemented in an optimization software package. A methodology for game-theoretic bidding simulation is developed using EMCAS, a real-time market simulator. Results of a 30-day simulation showed that PCM reduced energy costs for consumers by 12%. However, this result will be cross-checked in the future against two other methods of bid simulation proposed in this paper.
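In its simplest continuous form, the BCM selection reduces to merit-order dispatch, and comparing the resulting offer cost with a uniform-price consumer payment shows why minimizing offer costs need not minimize payments. A minimal sketch with made-up bids and capacities (uniform pricing is assumed here; the project's actual LP formulations and EMCAS simulations are not reproduced):

```python
def bcm_dispatch(bids, caps, demand):
    """Merit-order solution of the continuous bid-cost-minimization
    problem: fill demand with the cheapest offers first."""
    dispatch = [0.0] * len(bids)
    for i in sorted(range(len(bids)), key=lambda i: bids[i]):
        take = min(caps[i], demand)
        dispatch[i] = take
        demand -= take
        if demand <= 0:
            break
    offer_cost = sum(b * g for b, g in zip(bids, dispatch))
    return dispatch, offer_cost

def consumer_payment(bids, dispatch, demand):
    """Payment under uniform pricing: all demand is paid
    at the highest accepted bid (the clearing price)."""
    clearing_price = max(b for b, g in zip(bids, dispatch) if g > 0)
    return clearing_price * demand
```

With bids of $20, $35 and $50/MWh and 150 MW of demand, BCM dispatches 100 MW and 50 MW from the two cheapest units (offer cost $3750), yet the uniform-price payment is $35 x 150 = $5250; PCM would instead search over selections to minimize that payment directly.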

Relevance: 60.00%

Abstract:

In population studies, most current methods focus on identifying one outcome-related SNP at a time by testing for differences in genotype frequencies between disease and healthy groups or among different population groups. However, testing a large number of SNPs simultaneously raises a multiple-testing problem and can produce false-positive results. Although this problem can be dealt with effectively through approaches such as Bonferroni correction, permutation testing and false discovery rates, the joint effects of several genes, each with a weak individual effect, may remain undetected. With the availability of high-throughput genotyping technology, searching for multiple scattered SNPs over the whole genome and modeling their joint effect on the target variable has become possible. Exhaustive search of all SNP subsets is computationally infeasible for the millions of SNPs in a genome-wide study. Several effective feature selection methods combined with classification functions have been proposed to search for an optimal SNP subset in large data sets where the number of feature SNPs far exceeds the number of observations. In this study, we take two steps to achieve this goal. First, we selected 1000 SNPs through an effective filter method; then we performed feature selection wrapped around a classifier to identify an optimal SNP subset for predicting disease. We also developed a novel classification method, the sequential information bottleneck, wrapped inside different search algorithms to identify an optimal subset of SNPs for classifying the outcome variable. This new method was compared with classical linear discriminant analysis in terms of classification performance. Finally, we performed a chi-square test to examine the relationship between each SNP and disease from another point of view.
In general, our results show that filtering features using the harmonic mean of sensitivity and specificity (HMSS) through linear discriminant analysis (LDA) is better than using LDA training accuracy or mutual information in our study. Our results also demonstrate that exhaustive search of small subsets (one SNP, two SNPs, or three-SNP subsets built from the best 100 composite 2-SNP pairs) can find an optimal subset, and that further inclusion of more SNPs through a heuristic algorithm does not always increase the performance of SNP subsets. Although sequential forward floating selection can be applied to avoid the nesting effect of forward selection, it does not always outperform the latter, owing to overfitting from observing more complex subset states. Our results also indicate that HMSS, as a criterion for evaluating the classification ability of a function, can be used on imbalanced data without modifying the original dataset, unlike classification accuracy. Our four studies suggest that the sequential information bottleneck (sIB), a new unsupervised technique, can be adopted to predict the outcome, and its ability to detect the target status is superior to that of traditional LDA in this study. From our results, the best test probability-HMSS for predicting CVD, stroke, CAD and psoriasis through sIB is 0.59406, 0.641815, 0.645315 and 0.678658, respectively. In terms of group prediction accuracy, the highest test accuracy of sIB for diagnosing a normal status among controls can reach 0.708999, 0.863216, 0.639918 and 0.850275, respectively, in the four studies if the test accuracy among cases is required to be no less than 0.4. Conversely, the highest test accuracy of sIB for diagnosing disease among cases can reach 0.748644, 0.789916, 0.705701 and 0.749436, respectively, in the four studies if the test accuracy among controls is required to be at least 0.4.
A further genome-wide association study using the chi-square test shows that no significant SNPs are detected at the cut-off level 9.09451E-08 in the Framingham heart study of CVD. Study results in WTCCC detect only two significant SNPs associated with CAD. In the genome-wide study of psoriasis, most of the top 20 SNP markers with impressive classification accuracy are also significantly associated with the disease through the chi-square test at the cut-off value 1.11E-07. Although our classification methods can achieve high accuracy in the study, complete descriptions of those classification results (95% confidence intervals or statistical tests of differences) require more cost-effective methods or an efficient computing system, neither of which can currently be accomplished in our genome-wide study. We should also note that the purpose of this study is to identify subsets of SNPs with high prediction ability; SNPs with good discriminant power are not necessarily causal markers for the disease.
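The HMSS filtering criterion used above is straightforward to state. A minimal sketch (the wrapper search and the sIB classifier are not reproduced):

```python
def hmss(sensitivity, specificity):
    """Harmonic mean of sensitivity and specificity (HMSS),
    used as a filter criterion for candidate SNP features."""
    if sensitivity + specificity == 0:
        return 0.0
    return 2 * sensitivity * specificity / (sensitivity + specificity)
```

Because a classifier that labels everything as the majority class scores zero on one of the two components, its HMSS collapses to zero, which is why this criterion suits imbalanced data better than raw classification accuracy.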

Relevance: 60.00%

Abstract:

The development of targeted therapy involves many challenges. Our study addresses some of the key issues in biomarker identification and clinical trial design. We propose two biomarker selection methods and then apply them in two different clinical trial designs for targeted therapy development. In particular, we propose a Bayesian two-step lasso procedure for biomarker selection in the proportional hazards model in Chapter 2. In the first step of this strategy, we use the Bayesian group lasso to identify the important marker groups, wherein each group contains the main effect of a single marker and its interactions with treatments. In the second step, we zoom in to select each individual marker and the interactions between markers and treatments, in order to identify prognostic or predictive markers using the Bayesian adaptive lasso. In Chapter 3, we propose a Bayesian two-stage adaptive design for targeted therapy development that implements the variable selection method given in Chapter 2. In Chapter 4, we propose an alternative frequentist adaptive randomization strategy for situations where a large number of biomarkers must be incorporated in the study design. We also propose a new adaptive randomization rule, which takes into account the variation associated with the point estimates of survival times. In all of our designs, we seek to identify the key markers that are either prognostic or predictive with respect to treatment, and we use extensive simulation to evaluate the operating characteristics of our methods.

Relevance: 60.00%

Abstract:

This doctoral thesis aims to show that real estate financing and investment decisions in Spain decisively shape the urban development and architectural process. This initial premise requires addressing the following questions: - As a first stage, the situation of the real estate sector in Spain and its relationship with the financial sector in the context of the current financial crisis, which began in 2008. - Investment analysis and selection methods and their application to investment projects according to the nature of the real estate assets. - Real estate appraisal for certain financial institutions. - Characteristics of corporate financial management and of real estate companies. - The sourcing of funds to finance the urban development and architectural process through real estate collective investment institutions and the real estate market. - The current state-level regulation of land in Spain as the raw material of the real estate sector. - Residential building and the real estate market in the current economic context. - The possibility of creating in Spain a derivatives market based on real estate assets. - How economic activity, through investment and financing, affects real estate assets, urban development and architecture. These questions are addressed systematically and methodically in this doctoral thesis, which is structured in three major parts (investment, financing, and impact on urban development and architecture), yielding a series of answers developed throughout this work and summarized in the following lines: - The current financial crisis, which began in 2008, has caused the collapse of the Spanish real estate sector and a new conception of the nature of real estate assets.
The real estate sector stems from the financial sector, especially bank credit. - The Spanish real estate sector depends on and is bound to European monetary policy: Spain's incorporation into the single currency completely transformed the Spanish real estate sector. - Investment analysis and selection methods are relevant instruments that allow us to rank our projects; however, they present a series of limitations and practical difficulties, so they should not be regarded as tools that provide a single irrefutable solution. - The appraisal of real estate assets is a basic pillar underpinning the correct application of funds. - Investment in real estate assets can be made directly or indirectly, the latter with a notable influence from the financial innovations that have emerged in recent years. - Collective investment institutions and the mortgage market are fundamental institutions capable of raising substantial funds that drive and finance the urban development and architectural process. - The complex and changing Spanish legal system on land hinders the implementation of urban development and architectural processes. - Since the 2008 financial crisis, real estate assets have behaved much like other assets in terms of rising and falling prices; in the current economic system, speculation is inherent in the nature of real estate assets. - From a theoretical perspective, it is possible to create a derivatives market whose underlying assets are real estate. - Without economic activity, the urban development and architectural process ultimately loses its meaning and tends to disappear.
Nevertheless, technological innovations, in both products and processes, are the main drivers of economic activity. - Despite what international planning documents state, the transformation of urban development and architecture depends mainly on economic activity and technology. At a second level, investment and financing condition and define urban development and architecture, even at the project level if a project is to be realized. On this basis, the fundamental objective of this doctoral thesis has been to show that financing and investment decisions are of capital importance and determine the configuration of real estate assets, urban development and architecture, and must therefore be taken into account not only in their realization but even in the very conception of the creative process. ABSTRACT The present dissertation aims to show that real estate financing and investment decisions in Spain play a predominant role in structuring urban development and architectural solutions. The issues addressed to support that contention include: - As a preliminary study, the situation of the real estate industry in Spain and its relationship to the financial sector in the context of the 2008 financial crisis. - The methods used to analyse and select investments and their application to investment projects, by type of real estate asset. - Appraisal of certain financial institutions' real estate asset holdings. - Characteristics of financial institution and real estate company corporate management. - Sourcing funds for financing urban development and architecture through real estate investment trusts and the real estate market. - Present nation-wide regulations on landed property in Spain as a raw material for the real estate industry. - Residential building construction and the real estate market in the present economic context.
- The possibility of creating a real estate asset-based derivatives market in Spain. - The impact of economic activity, through investment and financing, on real estate assets, urban development and architecture. The aforementioned issues are addressed systematically and methodically in this dissertation, which is divided into three major units: investment, financing, and impact on urban development and architecture. The conclusions drawn are summarised below. - The financial crisis that began in 2008 has induced the collapse of the Spanish real estate industry and spawned a new perception of the nature of real estate assets. The real estate industry is dependent upon the financial sector, in particular on bank loans. - The Spanish real estate industry also depends on and is related to European monetary policy: Spain's adherence to the single currency ushered in a thorough overhaul of its real estate industry. - Investment analysis and selection methods constitute highly suitable tools for project evaluation and ranking. Nonetheless, inasmuch as their practical implementation is subject to a series of limitations and difficulties, they should not be thought able to deliver a single irrefutable solution. - Real estate asset appraisal is a mainstay to the rightful application of funds. - Real estate asset investments can be made directly or indirectly. The latter approach is heavily influenced by the financial innovations of recent years. - Investment trusts and the mortgage market are key institutions able to raise substantial funding, thereby driving and financing urban development and architecture. - Spain's complex and changing legal provisions on land management are an obstacle to urban development and architecture. - Since the 2008 crisis, real estate assets have behaved much like other assets in terms of rising and falling prices. In the present economic context, speculation is indivisible from real estate assets.
- Theoretically speaking, a derivatives market with real estate holdings as the underlying assets lies within the realm of possibility. - In the absence of economic activity, urban development and architecture are senseless pursuits and tend to disappear. Technological innovations in products and processes are the main drivers of economic activity. - Despite the opinion expressed in international papers on the subject, the transformation of urban development and architecture depends primarily on economic activity and technology. In a second dimension, investment and financing condition and define urban development and architecture, even at the design level for projects aspiring to actual construction. Pursuant to the foregoing, the primary aim of this dissertation is to show that financial and investment decisions are of cardinal importance and determine the structure of real estate assets, urban development and architecture. They must consequently be borne in mind not only in connection with implementation, but also with conceptual design and the creative process itself.
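Among the classical investment analysis and selection methods the thesis discusses, net present value and internal rate of return are the standard examples. A minimal sketch with illustrative cash flows (the thesis's own appraisal models are not reproduced, and the bisection solver assumes a conventional cash-flow profile with a single sign change):

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, n=200):
    """Internal rate of return by bisection: the rate at which NPV = 0.
    Assumes NPV is decreasing in the rate (one initial outflow, later inflows)."""
    for _ in range(n):
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For an outlay of 100 followed by two inflows of 60, the NPV at a zero discount rate is 20, and the IRR is the rate that drives the NPV to zero (about 13% here). Ranking projects by NPV and by IRR can disagree, which is one of the practical limitations of these selection methods that the thesis notes.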

Relevance: 60.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-04

Relevance: 60.00%

Abstract:

The H I Parkes All Sky Survey (HIPASS) is a blind extragalactic H I 21-cm emission-line survey covering the whole southern sky from declination -90° to +25°. The HIPASS catalogue (HICAT), containing 4315 H I-selected galaxies from the region south of declination +2°, is presented in Meyer et al. (Paper I). This paper describes in detail the completeness and reliability of HICAT, which are calculated from the recovery rate of synthetic sources and follow-up observations, respectively. HICAT is found to be 99 per cent complete at a peak flux of 84 mJy and an integrated flux of 9.4 Jy km s^-1. The overall reliability is 95 per cent, but rises to 99 per cent for sources with peak fluxes >58 mJy or integrated fluxes >8.2 Jy km s^-1. Expressions are derived for the uncertainties on the most important HICAT parameters: peak flux, integrated flux, velocity width and recessional velocity. The errors on HICAT parameters are dominated by the noise in the HIPASS data, rather than by the parametrization procedure.
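Completeness estimated from synthetic-source injection, as described above, is a simple recovery fraction. A hypothetical sketch (the fluxes and flags are illustrative; reliability would be the analogous fraction of catalogued sources confirmed by follow-up observations):

```python
def completeness(injected_fluxes, recovered_flags, flux_limit):
    """Fraction of injected synthetic sources at or above flux_limit
    that the source-finding pipeline recovered."""
    above = [rec for flux, rec in zip(injected_fluxes, recovered_flags)
             if flux >= flux_limit]
    return sum(above) / len(above) if above else 0.0
```

In practice this is evaluated in flux bins to produce a completeness curve, from which statements like "99 per cent complete at 84 mJy" are read off.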

Relevance: 60.00%

Abstract:

Although the aim of conservation planning is the persistence of biodiversity, current methods trade off ecological realism at the species level in favour of including multiple species and landscape features. For conservation planning to be relevant, the impact of landscape configuration on population processes and the viability of species needs to be considered. We present a novel method for selecting reserve systems that maximize persistence across multiple species, subject to a conservation budget. We use a spatially explicit metapopulation model to estimate extinction risk, a function of the ecology of the species and the amount, quality and configuration of habitat. We compare our new method with more traditional, area-based reserve selection methods using a ten-species case study and find that the expected loss of species is reduced 20-fold. Unlike previous methods, we avoid designating arbitrary weightings between reserve size and configuration; rather, our method is based on population processes and grounded in ecological theory.
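The paper's selection is driven by a metapopulation model; as a rough illustration of budget-constrained reserve selection, a greedy benefit-per-cost heuristic can be sketched. All site names, costs and risk reductions below are hypothetical, and this greedy rule is a simplified stand-in, not the authors' algorithm:

```python
def select_reserves(sites, budget):
    """Greedy sketch: repeatedly add the affordable site with the best
    summed extinction-risk reduction per unit cost until the budget runs out.
    `sites` maps name -> (cost, {species: risk_reduction})."""
    chosen, remaining = [], dict(sites)
    while remaining:
        best = max(remaining,
                   key=lambda s: sum(remaining[s][1].values()) / remaining[s][0])
        cost = remaining[best][0]
        if cost > budget:
            remaining.pop(best)  # unaffordable; discard and keep looking
            continue
        chosen.append(best)
        budget -= cost
        remaining.pop(best)
    return chosen
```

A greedy heuristic like this ignores the spatial-configuration effects that the paper's metapopulation model captures (e.g. two adjacent patches supporting recolonization), which is precisely the gap the proposed method addresses.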

Relevance: 60.00%

Abstract:

We present new measurements of the luminosity function (LF) of luminous red galaxies (LRGs) from the Sloan Digital Sky Survey (SDSS) and the 2dF SDSS LRG and Quasar (2SLAQ) survey. We have carefully quantified, and corrected for, uncertainties in the K and evolutionary corrections, differences in the colour selection methods, and the effects of photometric errors, thus ensuring we are studying the same galaxy population in both surveys. Using a limited subset of 6326 SDSS LRGs (with 0.17 < z < 0.24) and 1725 2SLAQ LRGs (with 0.5 < z < 0.6), for which the matching colour selection is most reliable, we find no evidence for any additional evolution in the LRG LF over this redshift range beyond that expected from a simple passive evolution model. This lack of additional evolution is quantified using the comoving luminosity density of SDSS and 2SLAQ LRGs brighter than M_{0.2r} - 5 log h_{0.7} = -22.5, which are 2.51 ± 0.03 × 10^-7 L_⊙ Mpc^-3 and 2.44 ± 0.15 × 10^-7 L_⊙ Mpc^-3, respectively (<10 per cent uncertainty). We compare our LFs to the COMBO-17 data and find excellent agreement over the same redshift range. Together, these surveys show no evidence for additional evolution (beyond passive) in the LF of LRGs brighter than M_{0.2r} - 5 log h_{0.7} = -21 (or brighter than ~L*). We test our SDSS and 2SLAQ LFs against a simple 'dry merger' model for the evolution of massive red galaxies and find that at least half of the LRGs at z ≃ 0.2 must already have been well assembled (with more than half their stellar mass) by z ≃ 0.6. This limit is barely consistent with recent results from semi-analytical models of galaxy evolution.

Relevance: 60.00%

Abstract:

Analyzing geographical patterns by collocating events, objects or their attributes has a long history in surveillance and monitoring, and is particularly applied in environmental contexts such as ecology or epidemiology. The identification of patterns or structures at some scales can be addressed using spatial statistics, particularly marked point process methodologies. Classification and regression trees are also related to this goal of finding "patterns" by deducing the hierarchy of influence of variables on a dependent outcome. Such variable selection methods have been applied to spatial data, but often without explicitly acknowledging the spatial dependence. Many methods routinely used in exploratory point pattern analysis are 2nd-order statistics, used in a univariate context, though there is also a wide literature on modelling methods for multivariate point pattern processes. This paper proposes an exploratory approach for multivariate spatial data using higher-order statistics built from co-occurrences of events or marks given by the point processes. A spatial entropy measure, derived from these multinomial distributions of co-occurrences at a given order, constitutes the basis of the proposed exploratory methods. © 2010 Elsevier Ltd.
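At its core, the spatial entropy described above is the Shannon entropy of an empirical multinomial distribution over co-occurring marks. A minimal sketch (the order-dependent construction of co-occurrences from the point process is assumed already done; each input item is one observed combination of marks):

```python
from collections import Counter
from math import log

def spatial_entropy(co_occurrences):
    """Shannon entropy (nats) of the empirical multinomial distribution of
    co-occurring mark combinations, e.g. [("a","b"), ("a","c"), ...]."""
    counts = Counter(co_occurrences)
    n = sum(counts.values())
    return -sum((c / n) * log(c / n) for c in counts.values())
```

If every neighbourhood exhibits the same mark combination the entropy is zero; it is maximal when all combinations are equally frequent, so the measure summarizes how structured the co-occurrence pattern is at the chosen order.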
