837 results for Best-Worst Scaling


Relevance:

100.00%

Publisher:

Abstract:

For optimal solutions in health care, decision makers inevitably must evaluate trade-offs, which call for multi-attribute valuation methods. Researchers have proposed best-worst scaling (BWS) methods, which seek to extract information from respondents by asking them to identify the best and worst items in each choice set. While a companion paper describes the different types of BWS, their applications, and their advantages and downsides, this contribution expounds their relationship with microeconomic theory, which also has implications for statistical inference. The article is devoted to the microeconomic foundations of preference measurement, also addressing issues such as scale invariance and scale heterogeneity. Furthermore, the paper discusses the basics of preference measurement using rating, ranking and stated choice data in the light of the findings of the preceding section. Moreover, the paper gives an introduction to the use of stated choice data and juxtaposes BWS with these microeconomic foundations.
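As a hedged aside (not drawn from the paper itself), one common micro-econometric specification behind BWS is the max-diff logit, in which the probability of choosing item i as best and item j as worst from a choice set C depends on the difference of their latent utilities, scaled by a parameter λ whose behaviour underlies the scale-invariance and scale-heterogeneity issues mentioned above:

$$P(\text{best}=i,\ \text{worst}=j \mid C) \;=\; \frac{\exp\big(\lambda\,(u_i - u_j)\big)}{\sum_{k \in C}\ \sum_{l \in C,\, l \neq k} \exp\big(\lambda\,(u_k - u_l)\big)}, \qquad i \neq j.$$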

Relevance:

100.00%

Publisher:

Abstract:

In many environmental valuation applications, standard sample sizes for choice modelling surveys are impractical to achieve. Data quality can be improved by using more in-depth surveys administered to fewer respondents. We report on a study using high-quality rank-ordered data elicited with the best-worst approach. The resulting "exploded logit" choice model, estimated on 64 responses per person, was used to study visitors' willingness to pay for the external benefits of policies that maintain the cultural heritage of alpine grazing commons. We find evidence supporting this approach and obtain reasonable estimates of mean WTP, which appear theoretically valid and policy-informative. © The Author (2011).
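As a sketch of the general technique rather than the authors' exact specification, the "exploded logit" factors a full ranking of J alternatives into a product of standard logit choices from progressively smaller sets:

$$P(r_1 \succ r_2 \succ \dots \succ r_J) \;=\; \prod_{j=1}^{J-1} \frac{\exp(V_{r_j})}{\sum_{k=j}^{J} \exp(V_{r_k})},$$

where $V_{r_j}$ is the deterministic utility of the alternative ranked in position $j$; a best-worst elicitation supplies the top and bottom of each ranking, and mean WTP for an attribute then follows from the ratio of its coefficient to the cost coefficient.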

Relevance:

80.00%

Publisher:

Abstract:

Accurate estimation of mass transport parameters is necessary for the overall design and evaluation of waste disposal facilities. The mass transport parameters, such as the effective diffusion coefficient, retardation factor and diffusion-accessible porosity, are estimated from observed diffusion data by inverse analysis. Recently, the particle swarm optimization (PSO) algorithm has been used to develop inverse models for estimating these parameters, alleviating existing limitations of the inverse analysis. However, the PSO solver yields different solutions in successive runs because of the stochastic nature of the algorithm and because of the presence of multiple optimum solutions; thus the mean solution estimated from independent runs differs significantly from the best solution. In this paper, two variants of the PSO algorithm are proposed to improve the performance of the inverse analysis. The proposed algorithms use a perturbation equation for the gbest particle to gain information around the gbest region of the search space, and introduce catfish particles in alternate iterations to improve exploration. Performance comparison of the developed solvers on synthetic test data for two different diffusion problems reveals that one of the proposed solvers, CPPSO, significantly improves overall performance, with improved best, worst and mean fitness values. The developed solver is further used to estimate transport parameters from 12 sets of experimentally observed diffusion data obtained from three diffusion problems, and the estimates are compared with published values from the literature. The proposed solver is quick, simple and robust on different diffusion problems. (C) 2012 Elsevier Ltd. All rights reserved.
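The abstract does not give the CPPSO update rules, so the following is only a minimal, hypothetical Python sketch of a gbest PSO in which a Gaussian perturbation of the global best stands in for the paper's perturbation equation and catfish particles; the objective would be a misfit between simulated and observed diffusion data.

import numpy as np

def pso_with_gbest_perturbation(objective, bounds, n_particles=30, n_iters=200,
                                w=0.7, c1=1.5, c2=1.5, sigma=0.05, seed=0):
    # Minimise `objective` over the box `bounds` (array of shape (dim, 2)).
    # Illustrative sketch only, not the CPPSO solver from the paper.
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    g_i = int(pbest_f.argmin())
    g, g_f = pbest[g_i].copy(), float(pbest_f[g_i])

    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        if pbest_f.min() < g_f:
            g_i = int(pbest_f.argmin())
            g, g_f = pbest[g_i].copy(), float(pbest_f[g_i])
        # Probe the neighbourhood of the current global best (exploitation aid).
        trial = np.clip(g + sigma * (hi - lo) * rng.standard_normal(dim), lo, hi)
        t_f = objective(trial)
        if t_f < g_f:
            g, g_f = trial, float(t_f)
    return g, g_f

Under these assumptions, a call such as pso_with_gbest_perturbation(misfit, np.array([[1e-11, 1e-9], [1.0, 50.0], [0.1, 0.6]])) would search a box over a hypothetical effective diffusion coefficient, retardation factor and accessible porosity.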

Relevance:

80.00%

Publisher:

Abstract:

Detailed investigations of the effectiveness of three widely adopted optical orthogonal frequency division multiplexing (OOFDM) adaptive loading algorithms, namely power loading (PL), bit loading (BL) and bit-and-power loading (BPL), are undertaken over a <100 km single-mode fibre (SMF) system without inline optical amplification or chromatic dispersion (CD) compensation. It is shown that the BPL (PL) algorithm always offers the best (worst) transmission performance. The absolute transmission capacity differences between these algorithms are independent of transmission distance and launched optical power. Moreover, it is shown that, in comparison with the most sophisticated BPL algorithm, the simplest PL algorithm is effective in escalating the OOFDM SMF link's performance to its maximum potential. On the other hand, when a large number of subcarriers and a high digital-to-analogue (DAC)/analogue-to-digital (ADC) sampling rate are employed, the sophisticated BPL algorithm has to be adopted. © 2011 IEEE.
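The loading algorithms themselves are not detailed in the abstract; as a generic, hypothetical illustration of adaptive bit-and-power loading, a greedy Hughes-Hartogs-style allocator adds one bit at a time to whichever subcarrier needs the least incremental power, until the power budget is exhausted:

import numpy as np

def greedy_bit_power_loading(snr_gain, total_power, max_bits=6, gamma=1.0):
    # snr_gain[k] : channel gain-to-noise ratio of subcarrier k (linear)
    # total_power : power budget shared by all subcarriers
    # gamma       : SNR gap to capacity for the target BER (linear)
    # For QAM, the incremental power to go from b to b+1 bits on subcarrier k
    # is gamma * (2**(b+1) - 2**b) / snr_gain[k]. This is a textbook allocator,
    # not the BPL algorithm evaluated in the paper.
    snr_gain = np.asarray(snr_gain, dtype=float)
    n = len(snr_gain)
    bits = np.zeros(n, dtype=int)
    power = np.zeros(n)
    used = 0.0
    while True:
        inc = gamma * (2.0 ** (bits + 1) - 2.0 ** bits) / snr_gain
        inc[bits >= max_bits] = np.inf
        k = int(np.argmin(inc))
        if not np.isfinite(inc[k]) or used + inc[k] > total_power:
            break
        bits[k] += 1
        power[k] += inc[k]
        used += inc[k]
    return bits, power

Power-only loading would instead keep the bit allocation fixed and redistribute power across subcarriers, which is why it is the simplest, and per the results above the worst-performing, of the three schemes.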

Relevance:

80.00%

Publisher:

Abstract:

This paper estimates the marginal willingness to pay for attributes of a hypothetical HIV vaccine using discrete choice modeling. We use primary data collected in 2008–2009 from 326 respondents in Bangkok and Chiang Mai, Thailand, selected using purposive, venue-based sampling across two strata. Participants completed a structured questionnaire and a full-rank discrete choice modeling task administered using computer-assisted personal interviewing. The choice experiment was used to rank eight hypothetical HIV vaccine scenarios, with each scenario comprising seven attributes (including cost), each of which had two levels. The data were analyzed in two alternative specifications: (1) best-worst; and (2) full-rank, using logit likelihood functions estimated with custom routines in the Gauss matrix programming language. In the full-rank specification, all vaccine attributes are significant predictors of the probability of vaccine choice. The biomedical attributes of the hypothetical HIV vaccine (efficacy, absence of vaccine-induced seropositivity (VISP), absence of side effects, and duration of effect) are the most important attributes for HIV vaccine choice. On average, respondents are more than twice as likely to accept a vaccine with 99% efficacy than a vaccine with 50% efficacy. This translates to a willingness to pay US$383 more for the high-efficacy vaccine than for the low-efficacy vaccine. Knowledge of the relative importance of the determinants of HIV vaccine acceptability is important to ensure the success of future vaccination programs. Future acceptability studies of hypothetical HIV vaccines should use more finely grained biomedical attributes, and could improve the external validity of results by including more levels of the cost attribute.
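As a hedged note on the calculation behind figures such as the US$383 estimate (the generic formula, not the authors' exact routines): with a linear-in-attributes utility that includes cost, the marginal willingness to pay for attribute k is the negative ratio of its coefficient to the cost coefficient:

$$V_{j} \;=\; \beta_{\text{cost}}\, c_{j} + \sum_{k} \beta_{k} x_{jk}, \qquad \text{MWTP}_{k} \;=\; -\,\frac{\beta_{k}}{\beta_{\text{cost}}}.$$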

Relevance:

80.00%

Publisher:

Abstract:

This paper investigates whether the momentum effect exists in the NYSE energy sector. Momentum is defined as the strategy that buys (sells) the stocks that have been the best (worst) performers over a pre-specified past period of time (the 'look-back' period), by constructing equally weighted portfolios. Different momentum strategies are obtained by changing the number of stocks included in these portfolios, as well as the look-back period. Their performance is then compared against two benchmarks: an equally weighted portfolio consisting of most stocks in the NYSE energy index, and the market portfolio represented by the S&P 500 index. The results indicate that the momentum effect is strongly present in the energy sector and leads to highly profitable portfolios, improving the risk-reward measures and easily outperforming both benchmarks.
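As a hypothetical sketch of this kind of look-back/holding-period construction (the data, column names and parameters below are illustrative, not the paper's):

import pandas as pd

def momentum_portfolios(prices: pd.DataFrame, lookback: int = 126,
                        hold: int = 21, n_stocks: int = 10):
    # prices: daily close prices, one column per energy stock (hypothetical data).
    # Buys the n_stocks best and shorts the n_stocks worst performers over the
    # lookback window, rebalancing every `hold` days with equal weights.
    daily_ret = prices.pct_change()
    pnl = []
    for t in range(lookback, len(prices) - hold, hold):
        past = prices.iloc[t] / prices.iloc[t - lookback] - 1.0
        winners = past.nlargest(n_stocks).index
        losers = past.nsmallest(n_stocks).index
        window = daily_ret.iloc[t + 1:t + 1 + hold]
        pnl.append(window[winners].mean(axis=1) - window[losers].mean(axis=1))
    return pd.concat(pnl)  # daily long-short strategy returns

Varying n_stocks and lookback yields the kind of family of momentum strategies the paper compares against its benchmarks.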

Relevance:

80.00%

Publisher:

Abstract:

The final minute of a close basketball game is a critical moment whose development is influenced by many factors; the free-throw percentage achieved during that period often determines the final result. The drop in free-throw performance under pressure may be related to several contextual variables, such as the seconds of possession remaining, the score situation (winning, drawing or losing), game location (home or away), the competition phase (regular season or knockout rounds) and team level (best/worst teams). Characteristics of the shooter also matter, notably age, years of professional experience and playing position. The combination of contextual and player factors thus shapes the shooter's free-throw performance in the final moments of a game.
The aim of this doctoral thesis is to identify the variables most closely related to the drop in the shooter's free-throw performance during the last minute of play and the last set of free throws in close basketball games. To this end, 124 close games (final margins of 2 points or fewer) from all competitions of the Spanish ACB league (regular season, playoffs and Copa del Rey) across the 2011-2012 to 2014-2015 seasons were analysed. For each shooter, free-throw percentages were recorded for the regular season, the whole game, the last minute and the last set of free throws, in order to establish which player and contextual variables explain free-throw performance in the last minute and in the last set. The association between the performance drop in the final moments and the player variables (playing position, age and years of professional experience) and the situational variables (competition phase, location, classification, remaining time and score difference) was also examined. Two statistical models were fitted: (1) a multiple linear regression model to estimate the effect of the independent variables on the shooter's free-throw percentage in the last minute and in the last set of free throws; and (2) a binomial logistic regression to relate the probability of a drop to the shooter's characteristics and the situational variables.
The linear regression results showed a significant negative effect on last-minute free-throw percentage when the shooters were centres (-19.45%); for the last set, significant negative effects were found for the centre position (-19.30%) and for the score difference (-3.33% per point of difference in the score). Age, professional experience, regular-season classification, competition phase, location and remaining time showed no significant effects in the linear models. The binomial logistic regression revealed that professional experience of 13 to 18 years (OR = 4.63), playing as a forward (OR = 23.01) and playing as a guard (OR = 10.68) were associated with a lower probability of a performance drop during the last minute, whereas being ahead in the score increased that probability (OR = 0.06). For the last set, a smaller performance drop was associated with 13 to 18 years of experience (OR = 4.28) and with playing as a forward (OR = 8.06) or a guard (OR = 6.34), while the situational variables related to the drop were the knockout phases (OR = 0.22) and being ahead in the score (OR = 0.04).
Overall, the results show a drop in the shooter's free-throw percentage during the last minute and the last set of free throws of close games, significantly related to age, professional experience, playing position and score difference, and, for the last set, also to the competition phase. This information is valuable for coaches and for application in real competitive settings: the simulations developed in the practical-applications section make it possible to predict a player's free-throw percentage during the highest-pressure moments of a game from his performance profile, supporting better decision-making aimed at achieving the best possible result. It also guides the training process for the players most prone to the drop, with the goal of minimising the effect of pressure on their free-throw execution and achieving a more consistent performance across all players of the team in this facet of the game at the critical end of close matches.
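As a reading aid (my note, not part of the thesis): in a binomial logistic regression the reported odds ratios are exponentiated coefficients, so each OR is the multiplicative change in the odds of a performance drop per unit change (or category switch) in the corresponding predictor; whether an OR above one corresponds to a higher or lower drop probability depends on how the outcome and reference categories are coded.

$$\log\frac{P(\text{drop})}{1 - P(\text{drop})} \;=\; \beta_0 + \sum_{k} \beta_k x_k, \qquad \text{OR}_k = e^{\beta_k}.$$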

Relevance:

40.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

A Software-as-a-Service (SaaS) application can be delivered in a composite form, consisting of a set of application and data components that work together to deliver higher-level functional software. Components in a composite SaaS may need to be scaled (replicated or deleted) to accommodate the users' load. It may not be necessary to replicate all components of the SaaS, as some components can be shared by other instances. On the other hand, when the load is low, some of the instances may need to be deleted to avoid resource underutilisation. Thus, it is important to determine which components are to be scaled such that the performance of the SaaS is still maintained. Extensive research on SaaS resource management in the Cloud has not yet addressed the challenges of the scaling process for composite SaaS. Therefore, a hybrid genetic algorithm is proposed that utilises knowledge of the problem and searches for the best scaling plan for the components. Experimental results demonstrate that the proposed algorithm outperforms existing heuristic-based solutions.
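The paper's hybrid GA and its problem-specific operators are not described in the abstract; the sketch below is a plain, hypothetical GA over a per-component action vector (remove a replica, keep, or add one), with a toy fitness that trades off unserved load against resource cost, just to make the encoding concrete.

import random

ACTIONS = (-1, 0, 1)  # per component: remove a replica, keep, add a replica

def fitness(plan, replicas, load, capacity, cost):
    # Illustrative fitness only: penalise load not covered by the scaled
    # replicas, plus total resource cost. Higher is better.
    new_replicas = [max(1, r + a) for r, a in zip(replicas, plan)]
    shortfall = sum(max(0.0, l - n * capacity[i])
                    for i, (l, n) in enumerate(zip(load, new_replicas)))
    total_cost = sum(n * cost[i] for i, n in enumerate(new_replicas))
    return -(10.0 * shortfall + total_cost)

def genetic_scaling(replicas, load, capacity, cost,
                    pop_size=40, gens=100, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    n = len(replicas)
    pop = [[rng.choice(ACTIONS) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda p: fitness(p, replicas, load, capacity, cost),
                 reverse=True)
        elite = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n) if n > 1 else 0
            child = a[:cut] + b[cut:]        # one-point crossover
            if rng.random() < p_mut:
                child[rng.randrange(n)] = rng.choice(ACTIONS)
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda p: fitness(p, replicas, load, capacity, cost))

# Hypothetical usage: three components with current replica counts, loads,
# per-replica capacities and per-replica costs.
# genetic_scaling(replicas=[2, 1, 1], load=[300, 120, 80],
#                 capacity=[100, 150, 60], cost=[4, 6, 2])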

Relevance:

30.00%

Publisher:

Abstract:

Multiple Clock Domain (MCD) processors provide an attractive solution to the increasingly challenging problems of clock distribution and power dissipation. They allow the chip to be partitioned into different clock domains and each domain's frequency (voltage) to be configured independently. This flexibility adds new dimensions to the Dynamic Voltage and Frequency Scaling (DVFS) problem, while providing better scope for saving energy and meeting performance demands. In this paper, we propose a compiler-directed approach for MCD-DVFS. We build a formal Petri-net-based program performance model, parameterized by the settings of microarchitectural components and resource configurations, and integrate it with our compiler passes for frequency selection. Our model estimates the performance impact of a frequency setting, unlike the best existing techniques, which rely on weaker indicators of domain performance such as queue occupancies (used by online methods) and slack manifestation for a particular frequency setting (software-based methods). We evaluate our method with subsets of the SPECFP2000, Mediabench and Mibench benchmarks. Our mean energy savings is 60.39% (versus 33.91% for the best software technique) in a memory-constrained system for cache-miss-dominated benchmarks, and we meet the performance demands. Our ED2 improves by 22.11% (versus 18.34%) for the other benchmarks. For a CPU with restricted frequency settings, our energy consumption is within 4.69% of the optimal.
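The Petri-net performance model itself is not reproduced here; the toy below only illustrates the decision the frequency-selection pass ultimately makes (choosing one frequency per domain to minimise an energy proxy subject to an estimated slowdown budget), with a deliberately naive analytic model and hypothetical numbers in place of the paper's.

from itertools import product

def pick_frequencies(domains, freqs, busy_cycles, slowdown_budget=1.05):
    # Toy stand-in for a performance model: pick one frequency per clock domain
    # to minimise an energy proxy (V roughly proportional to f, so energy
    # ~ f^2 * busy time) subject to an estimated slowdown bound relative to
    # running every domain at f_max. No inter-domain overlap effects modelled.
    f_max = max(freqs)
    base_time = max(busy_cycles[d] / f_max for d in domains)
    best = None
    for assignment in product(freqs, repeat=len(domains)):
        time = max(busy_cycles[d] / f for d, f in zip(domains, assignment))
        if time > slowdown_budget * base_time:
            continue
        energy = sum((f / f_max) ** 2 * busy_cycles[d] / f
                     for d, f in zip(domains, assignment))
        if best is None or energy < best[0]:
            best = (energy, dict(zip(domains, assignment)))
    return best

# Hypothetical example: three domains, four frequency levels (Hz), busy cycles.
print(pick_frequencies(["fetch", "int", "mem"],
                       [1.0e9, 1.5e9, 2.0e9, 2.5e9],
                       {"fetch": 4e9, "int": 6e9, "mem": 2e9}))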

Relevance:

30.00%

Publisher:

Abstract:

The distributed, low-feedback timer scheme is used in several wireless systems to select the best node from the available nodes. In it, each node sets a timer as a function of a local preference number called a metric, and transmits a packet when its timer expires. The scheme ensures that the timer of the best node, which has the highest metric, expires first. However, it fails to select the best node if another node transmits a packet within Δ s of the best node's transmission. We derive the optimal metric-to-timer mappings for the practical scenario in which the number of nodes is unknown. We consider two cases, in which the probability distribution of the number of nodes is either known a priori or unknown. In the first case, the optimal mapping maximizes the success probability averaged over the probability distribution. In the second case, a robust mapping maximizes the worst-case average success probability over all possible probability distributions on the number of nodes. Results reveal that the proposed mappings deliver significant gains compared with the mappings considered in the literature.
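The optimal mappings derived in the paper are not given in the abstract; the simulation sketch below uses a simple inverse-linear metric-to-timer mapping as a stand-in, just to make the success condition (no other timer expiring within Δ of the best node's) concrete.

import random

def timer(metric, t_max=10e-3):
    # Monotone decreasing mapping: higher metric -> earlier expiry.
    # Illustrative only; not one of the optimal mappings derived in the paper.
    return t_max * (1.0 - metric)

def selection_succeeds(metrics, delta=0.5e-3, t_max=10e-3):
    # The best node is selected iff no other node's timer expires within
    # `delta` seconds of the best node's timer.
    times = sorted(timer(m, t_max) for m in metrics)
    return len(times) == 1 or times[1] - times[0] >= delta

def success_probability(n_nodes, trials=100_000, seed=0):
    rng = random.Random(seed)
    ok = sum(selection_succeeds([rng.random() for _ in range(n_nodes)])
             for _ in range(trials))
    return ok / trials

print(success_probability(10))  # hypothetical: 10 nodes, i.i.d. uniform metrics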

Relevance:

30.00%

Publisher:

Abstract:

The main idea of the Load-Unload Response Ratio (LURR) is that when a system is stable, its response to loading corresponds to its response to unloading, whereas when the system is approaching an unstable state, the responses to loading and unloading become quite different. High LURR values and observations of Accelerating Moment/Energy Release (AMR/AER) prior to large earthquakes have led different research groups to suggest that intermediate-term earthquake prediction is possible, and imply that the LURR and AMR/AER observations may have a similar physical origin. To study this possibility, we conducted a retrospective examination of several Australian and Chinese earthquakes with magnitudes ranging from 5.0 to 7.9, including Australia's deadly Newcastle earthquake and the devastating Tangshan earthquake. Both LURR values and best-fit power-law time-to-failure functions were computed using data within a range of distances from the epicenter. Like the best-fit power-law fits in AMR/AER, the LURR value was optimal using data within a certain epicentral distance, implying a critical region for LURR. Furthermore, the LURR critical-region size scales with mainshock magnitude and is similar to the AMR/AER critical-region size. These results suggest a common physical origin for both the AMR/AER and LURR observations. Further research may provide clues that yield an understanding of this mechanism and help lay a solid foundation for intermediate-term earthquake prediction.
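For reference, the standard forms of the two quantities discussed (as generally defined in the LURR and AMR literature, not quoted from this abstract) are

$$Y \;=\; \frac{X^{+}}{X^{-}}, \qquad \varepsilon(t) \;=\; A + B\,(t_f - t)^{m},$$

where $X^{+}$ and $X^{-}$ are the rates of the chosen response quantity (e.g. Benioff strain release) during loading and unloading half-cycles, $\varepsilon(t)$ is the cumulative Benioff strain in the power-law time-to-failure fit, $t_f$ is the fitted failure (mainshock) time, and accelerating release corresponds to $B < 0$ with $0 < m < 1$.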