6 results for User experience based approaches

at Duke University


Relevance:

100.00%

Publisher:

Abstract:

The ability of the citizens of a nation to determine their own representation has long been regarded as one of the most critical objectives of any electoral system. Without the assurance of equality in representation, the fundamental nature and operation of the political system is severely undermined. Given the centuries of institutional reforms and population changes in the American system, Congressional Redistricting stands as an institution whereby this promise of effective representation can either be fulfilled or denied. The broad set of processes that encapsulate Congressional Redistricting have been discussed, experimented with, and modified to achieve clear objectives, and their importance has long been understood. Questions remain about how the dynamics linking all of these processes operate and what impact the realities of Congressional Redistricting hold for representation in the American system. This dissertation examines three aspects of how Congressional Redistricting in the United States operates in accordance with the principle of "One Person, One Vote." By utilizing data and data analysis techniques of Geographic Information Systems (GIS), this dissertation seeks to address how Congressional Redistricting impacts the principle of one person, one vote from the standpoint of legislator accountability, redistricting institutions, and the promise of effective minority representation.

Relevance:

100.00%

Publisher:

Abstract:

This article describes advances in statistical computation for large-scale data analysis in structured Bayesian mixture models via graphics processing unit (GPU) programming. The developments are partly motivated by computational challenges arising in fitting models of increasing heterogeneity to increasingly large datasets. An example context concerns common biological studies using high-throughput technologies generating many, very large datasets and requiring increasingly high-dimensional mixture models with large numbers of mixture components. We outline important strategies and processes for GPU computation in Bayesian simulation and optimization approaches, give examples of the benefits of GPU implementations in terms of processing speed and scale-up in ability to analyze large datasets, and provide a detailed, tutorial-style exposition that will benefit readers interested in developing GPU-based approaches in other statistical models. Novel, GPU-oriented approaches to modifying existing algorithms and software designs can lead to vast speed-ups and, critically, enable statistical analyses that presently will not be performed due to compute time limitations in traditional computational environments. Supplemental materials are provided with all source code, example data, and details that will enable readers to implement and explore the GPU approach in this mixture modeling context. © 2010 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.
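The data-parallel structure the abstract alludes to comes from the fact that, in a mixture model, each (datum, component) density evaluation is independent of all the others. A minimal sketch of that idea, using NumPy broadcasting as a CPU stand-in for the per-element parallelism a GPU kernel would exploit (the function name and 1-D Gaussian setting are illustrative assumptions, not the article's implementation):

```python
import numpy as np

def mixture_responsibilities(x, weights, means, variances):
    """Posterior component responsibilities for a 1-D Gaussian mixture.
    Every (datum, component) pair is computed independently, which is
    exactly the structure a GPU kernel parallelizes over; NumPy
    broadcasting stands in for that parallelism here."""
    x = np.asarray(x, dtype=float)[:, None]        # shape (n, 1)
    w = np.asarray(weights, dtype=float)[None, :]  # shape (1, k)
    mu = np.asarray(means, dtype=float)[None, :]
    var = np.asarray(variances, dtype=float)[None, :]
    # Log densities for all n*k pairs at once.
    log_dens = -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    log_post = np.log(w) + log_dens
    # Normalize per datum, using the log-sum-exp trick for stability.
    log_post -= log_post.max(axis=1, keepdims=True)
    post = np.exp(log_post)
    return post / post.sum(axis=1, keepdims=True)

# Two well-separated components: each datum is assigned almost entirely
# to the component it was drawn near.
r = mixture_responsibilities([0.0, 5.0], [0.5, 0.5], [0.0, 5.0], [1.0, 1.0])
```

In a real GPU implementation each thread would own one (datum, component) cell of the `log_post` matrix, followed by a per-row parallel reduction for the normalization.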

Relevance:

100.00%

Publisher:

Abstract:

Although many feature selection methods for classification have been developed, there is a need to identify genes in high-dimensional data with censored survival outcomes. Traditional methods for gene selection in classification problems have several drawbacks. First, the majority of the gene selection approaches for classification are single-gene based. Second, many of the gene selection procedures are not embedded within the algorithm itself. The technique of random forests has been found to perform well in high-dimensional data settings with survival outcomes. It also has an embedded feature to identify variables of importance. Therefore, it is an ideal candidate for gene selection in high-dimensional data with survival outcomes. In this paper, we develop a novel method based on random forests to identify a set of prognostic genes. We compare our method with several machine learning methods and various node split criteria using several real data sets. Our method performed well in both simulations and real data analysis. Additionally, we have shown the advantages of our approach over single-gene-based approaches. Our method incorporates multivariate correlations in microarray data for survival outcomes. The described method allows us to better utilize the information available from microarray data with survival outcomes.
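The "embedded feature to identify variables of importance" refers to the variable-importance measures that come with a fitted forest. One standard such measure is permutation importance: permute one feature's column and record how much predictive performance drops. A minimal, self-contained sketch of that idea (the function names and toy predictor are illustrative assumptions; the paper's actual method is a survival-outcome random forest, not shown here):

```python
import numpy as np

def permutation_importance(predict, X, y, score, n_repeats=10, seed=0):
    """Rank features by the mean drop in a score when each feature's
    column is randomly permuted. `predict` is any fitted model's
    prediction function; `score` is higher-is-better (e.g. accuracy)."""
    rng = np.random.default_rng(seed)
    base = score(y, predict(X))
    importances = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])  # break feature j's link to y
            drops.append(base - score(y, predict(Xp)))
        importances[j] = np.mean(drops)
    return importances

# Toy example: the outcome depends only on "gene" 0, so only feature 0
# should receive a large importance.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 0).astype(int)
predict = lambda X: (X[:, 0] > 0).astype(int)   # stand-in fitted model
score = lambda y, p: np.mean(y == p)            # accuracy
imp = permutation_importance(predict, X, y, score)
```

Because the importance is computed from the fitted model itself rather than one gene at a time, it naturally accounts for multivariate structure, which is the advantage over single-gene screening the abstract emphasizes.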

Relevance:

100.00%

Publisher:

Abstract:

© 2015 IEEE. In virtual reality applications, the aim is to provide real-time graphics running at high refresh rates. However, there are many situations in which this is not possible due to simulation or rendering issues. When running at low frame rates, several aspects of the user experience are affected. For example, each frame is displayed for an extended period of time, causing a high persistence image artifact. The effect of this artifact is that movement may lose continuity, and the image jumps from one frame to another. In this paper, we discuss our initial exploration of the effects of high persistence frames caused by low refresh rates and compare it to high frame rates and to a technique we developed to mitigate the effects of low frame rates. In this technique, the low frame rate simulation images are displayed with low persistence by blanking out the display during the extra time such an image would otherwise be displayed. In order to isolate the visual effects, we constructed a simulator for low and high persistence displays that does not affect input latency. A controlled user study comparing the three conditions for the tasks of 3D selection and navigation was conducted. Results indicate that the low persistence display technique may not negatively impact user experience or performance as compared to the high persistence case. Directions for future work on the use of low persistence displays for low frame rate situations are discussed.
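The blanking technique described above has a simple per-refresh structure: when the simulation frame rate is below the display refresh rate, each rendered image spans several display refreshes; low persistence shows the image on the first refresh only and blanks the remainder. A minimal sketch of that schedule (function name is an illustrative assumption, and it assumes the refresh rate is an integer multiple of the simulation frame rate):

```python
def persistence_schedule(sim_fps, display_hz):
    """For each display refresh within one simulation frame, decide
    whether to show the rendered image or blank the display.
    Low persistence: show the frame for a single refresh, then blank
    until the next simulation frame is ready."""
    if display_hz % sim_fps != 0:
        raise ValueError("assumes display_hz is a multiple of sim_fps")
    refreshes_per_frame = display_hz // sim_fps  # e.g. 90 Hz / 30 fps = 3
    return ["show" if r == 0 else "blank" for r in range(refreshes_per_frame)]

schedule = persistence_schedule(30, 90)
```

A high persistence display would instead return `"show"` for every refresh, holding the same image on screen for the full simulation frame, which produces the image-jump artifact the study investigates.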

Relevance:

100.00%

Publisher:

Abstract:

© 2016, Springer-Verlag Berlin Heidelberg. Nanoparticles are being explored in many different applications due to the unique properties offered by quantum effects. To broaden the scope of these applications, the deposition of nanoparticles onto substrates in a simple and controlled way is highly desired. In this study, we use resonant infrared matrix-assisted pulsed laser evaporation (RIR-MAPLE) for the deposition of metallic silver nanoparticles for plasmonic applications. We find that RIR-MAPLE, a simple and versatile approach, is able to deposit silver nanoparticles as large as 80 nm onto different substrates with good adhesion, regardless of substrate properties. In addition, the nanoparticle surface coverage of the substrates, which results from the random distribution of nanoparticles across the substrate per laser pulse, can be simply and precisely controlled by RIR-MAPLE. Polymer films of poly(3-hexylthiophene-2,5-diyl) (P3HT) are also deposited by RIR-MAPLE on top of the deposited silver nanoparticles in order to demonstrate enhanced absorption due to the localized surface plasmon resonance effect. The reported features of RIR-MAPLE nanoparticle deposition indicate that this tool can enable efficient processing of nanoparticle thin films for applications that require specific substrates or configurations that are not easily achieved using solution-based approaches.