977 results for Binary choice models
Abstract:
Several methods have been suggested to estimate non-linear models with interaction terms in the presence of measurement error. Structural equation models eliminate measurement error bias, but require large samples. Ordinary least squares regression on summated scales, regression on factor scores, and partial least squares are appropriate for small samples but do not correct measurement error bias. Two-stage least squares regression does correct measurement error bias, but the results strongly depend on the choice of instrumental variable. This article discusses the old disattenuated regression method as an alternative for correcting measurement error in small samples. The method is extended to the case of interaction terms and is illustrated on a model that examines the interaction effect of innovation and style of use of budgets on business performance. Alternative reliability estimates that can be used to disattenuate the estimates are discussed. A comparison is made with the alternative methods. Methods that do not correct for measurement error bias perform very similarly, and considerably worse than disattenuated regression.
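The core correction the abstract builds on can be sketched in a few lines: the classical Spearman disattenuation formula divides an observed correlation by the square root of the product of the two scales' reliabilities. This is a minimal illustration of the basic idea, not the article's extended interaction-term method:

```python
import math

def disattenuate(r_xy, r_xx, r_yy):
    # Classical Spearman correction: divide the observed correlation by
    # the square root of the product of the two scales' reliabilities.
    return r_xy / math.sqrt(r_xx * r_yy)

# An observed correlation of 0.40 between two scales with reliability
# 0.80 each (e.g. Cronbach's alpha) disattenuates to about 0.50.
print(round(disattenuate(0.40, 0.80, 0.80), 6))
```

Because measurement error always attenuates, the corrected estimate is at least as large in magnitude as the observed one.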
Abstract:
Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using a) real absences, b) pseudo-absences selected randomly from the background, and c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and perform model selection based on an information theoretic approach. However, the resulting models can be expected to have limited fit.
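Strategy (b), drawing pseudo-absences at random from the background while excluding known presence cells, can be sketched as follows; the grid, presence cells, and sample size are all hypothetical:

```python
import random

def sample_pseudo_absences(background, presences, n, seed=0):
    # Strategy (b): draw n cells at random from the background,
    # excluding cells where the species is known to be present.
    presence_set = set(presences)
    candidates = [cell for cell in background if cell not in presence_set]
    return random.Random(seed).sample(candidates, n)

# Hypothetical 10x10 grid of cells identified by (row, col).
background = [(r, c) for r in range(10) for c in range(10)]
presences = [(0, 0), (1, 1), (2, 2)]
pseudo_absences = sample_pseudo_absences(background, presences, 20)
print(len(pseudo_absences))  # → 20
```

The two-step strategies differ only in replacing the uniform draw with sampling restricted to cells that a first-stage model (ENFA or BIOCLIM) scores as low suitability.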
Abstract:
Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is being produced, distributed, consumed, and priced. Software vendors have started considering cloud platforms as a strategic option but are battling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners to develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models to establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis aims to address this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to find out what determines the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, which is grounded on an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. It also closes gaps in the research related to PaaS business model design and more generally related to platform business models.
Abstract:
Female mate choice influences the maintenance of genetic variation by altering the mating success of males with different genotypes. The evolution of preferences themselves, on the other hand, depends on genetic variation present in the population. Few models have tracked this feedback between a choice gene and its effects on genetic variation, in particular when genes that determine offspring viability and attractiveness have dominance effects. Here we build a population genetic model that allows comparing the evolution of various choice rules in a single framework. We first consider preferences for good genes and show that focused preferences for homozygotes evolve more easily than broad preferences, which allow heterozygous males high mating success too. This occurs despite better maintenance of genetic diversity in the latter scenario, and we discuss why empirical findings of superior mating success of heterozygous males consequently do not immediately lead to a better understanding of the lek paradox. Our results thus suggest that the mechanisms that help maintain genetic diversity also have a flipside of making female choice an inaccurate means of producing the desired kind of offspring. We then consider preferences for heterozygosity per se, and show that these evolve only under very special conditions. Choice for compatible genotypes can evolve but its selective advantage diminishes quickly due to frequency-dependent selection. Finally, we show that our model reproduces earlier results on selfing, when the female choice strategy produces assortative mating. Overall, our model indicates that various forms of heterozygote-favouring (or variable) female choice pose a problem for the theory of sexual ornamentation based on indirect benefits, rather than a solution.
Abstract:
OBJECTIVE To assess Spanish and Portuguese patients' and physicians' preferences regarding type 2 diabetes mellitus (T2DM) treatments and the monthly willingness to pay (WTP) to gain benefits or avoid side effects. METHODS An observational, multicenter, exploratory study focused on routine clinical practice in Spain and Portugal. Physicians were recruited from multiple hospitals and outpatient clinics, while patients were recruited from eleven centers operating in the public health care system in different autonomous communities in Spain and Portugal. Preferences were measured via a discrete choice experiment by rating multiple T2DM medication attributes. Data were analyzed using the conditional logit model. RESULTS Three-hundred and thirty (n=330) patients (49.7% female; mean age 62.4 [SD: 10.3] years, mean T2DM duration 13.9 [8.2] years, mean body mass index 32.5 [6.8] kg/m(2), 41.8% received oral + injected medication, 40.3% received oral, and 17.6% injected treatments) and 221 physicians from Spain and Portugal (62% female; mean age 41.9 [SD: 10.5] years, 33.5% endocrinologists, 66.5% primary-care doctors) participated. Patients valued avoiding a gain in bodyweight of 3 kg/6 months (WTP: €68.14 [95% confidence interval: 54.55-85.08]) the most, followed by avoiding one hypoglycemic event/month (WTP: €54.80 [23.29-82.26]). Physicians valued avoiding one hypoglycemia/week (WTP: €287.18 [95% confidence interval: 160.31-1,387.21]) the most, followed by avoiding a 3 kg/6 months gain in bodyweight and decreasing cardiovascular risk (WTP: €166.87 [88.63-843.09] and €154.30 [98.13-434.19], respectively). Physicians and patients were willing to pay €125.92 (73.30-622.75) and €24.28 (18.41-30.31), respectively, to avoid a 1% increase in glycated hemoglobin, and €143.30 (73.39-543.62) and €42.74 (23.89-61.77) to avoid nausea. 
CONCLUSION Both patients and physicians in Spain and Portugal are willing to pay for the health benefits associated with improved diabetes treatment, the most important being avoiding hypoglycemia and weight gain. Decreased cardiovascular risk and weight reduction were the third most valued attributes for physicians and patients, respectively.
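In a conditional logit, WTP figures of the kind reported above are marginal rates of substitution between an attribute and the cost attribute. A minimal sketch with made-up coefficients (these are not the study's estimates):

```python
def willingness_to_pay(beta_attribute, beta_cost):
    # Marginal rate of substitution in a conditional logit: the euros a
    # respondent would pay per unit of the attribute.
    return -beta_attribute / beta_cost

# Hypothetical coefficients (not the study's estimates):
beta_avoid_hypo = 0.55   # utility of avoiding one hypoglycemic event/month
beta_cost = -0.01        # utility per euro of extra monthly cost
print(round(willingness_to_pay(beta_avoid_hypo, beta_cost), 2))  # → 55.0
```

Confidence intervals for such ratios (as reported in the abstract) are typically obtained by the delta method or bootstrapping, since a ratio of estimated coefficients has no simple closed-form variance.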
Information overload, choice deferral, and moderating role of need for cognition: Empirical evidence
Abstract:
ABSTRACT Choice deferral due to information overload is an undesirable result of competitive environments. The neoclassical maximization models predict that choice avoidance will not increase as more information is offered to consumers. The theories developed in the consumer behavior field predict that some properties of the environment may lead to behavioral effects and an increase in choice avoidance due to information overload. Based on stimuli generated experimentally and tested among 1,000 consumers, this empirical research provides evidence for the presence of behavioral effects due to information overload and reveals the different effects of increasing the number of options or the number of attributes. This study also finds that the need for cognition moderates these behavioral effects, and it proposes psychological processes that may trigger the effects observed.
Abstract:
Customer satisfaction and retention are key issues for organizations in today’s competitive market place. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the instigation of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because it does not rely on strict assumptions about the data. However, this choice was based upon some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, both SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office Products and Customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.
Abstract:
Models incorporating more realistic models of customer behavior, as customers choosing from an offer set, have recently become popular in assortment optimization and revenue management. The dynamic program for these models is intractable and approximated by a deterministic linear program called the CDLP, which has an exponential number of columns. However, when the segment consideration sets overlap, the CDLP is difficult to solve. Column generation has been proposed, but finding an entering column has been shown to be NP-hard. In this paper we propose a new approach called SDCP to solving CDLP based on segments and their consideration sets. SDCP is a relaxation of CDLP and hence forms a looser upper bound on the dynamic program, but coincides with CDLP for the case of non-overlapping segments. If the number of elements in a consideration set for a segment is not very large, SDCP can be applied to any discrete-choice model of consumer behavior. We tighten the SDCP bound by (i) simulations, called the randomized concave programming (RCP) method, and (ii) adding cuts to a recent compact formulation of the problem for a latent multinomial-choice model of demand (SBLP+). This latter approach turns out to be very effective, essentially obtaining the CDLP value, and yields excellent revenue performance in simulations, even for overlapping segments. By formulating the problem as a separation problem, we give insight into why CDLP is easy for the MNL with non-overlapping consideration sets and why generalizations of MNL pose difficulties. We perform numerical simulations to determine the revenue performance of all the methods on reference data sets in the literature.
Abstract:
Customer choice behavior, such as 'buy-up' and 'buy-down', is an important phenomenon in a wide range of industries. Yet there are few models or methodologies available to exploit this phenomenon within yield management systems. We make some progress on filling this void. Specifically, we develop a model of yield management in which the buyers' behavior is modeled explicitly using a multinomial logit model of demand. The control problem is to decide which subset of fare classes to offer at each point in time. The set of open fare classes then affects the purchase probabilities for each class. We formulate a dynamic program to determine the optimal control policy and show that it reduces to a dynamic nested allocation policy. Thus, the optimal choice-based policy can easily be implemented in reservation systems that use nested allocation controls. We also develop an estimation procedure for our model based on the expectation-maximization (EM) method that jointly estimates arrival rates and choice model parameters when no-purchase outcomes are unobservable. Numerical results show that this combined optimization-estimation approach may significantly improve revenue performance relative to traditional leg-based models that do not account for choice behavior.
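The key object in such a model is the MNL purchase probability as a function of the set of open fare classes; closing a class redistributes part of its demand to the remaining classes and to the no-purchase option. A minimal sketch (class names and utilities are hypothetical):

```python
import math

def mnl_choice_probs(offer_set, utilities, u_nopurchase=0.0):
    # P(j | S) = exp(u_j) / (exp(u_0) + sum over k in S of exp(u_k));
    # the no-purchase alternative absorbs the remaining probability.
    denom = math.exp(u_nopurchase) + sum(math.exp(utilities[j]) for j in offer_set)
    return {j: math.exp(utilities[j]) / denom for j in offer_set}

# Hypothetical utilities for three fare classes.
u = {"Y": 0.5, "M": 1.0, "K": 1.5}
probs_all = mnl_choice_probs({"Y", "M", "K"}, u)
probs_restricted = mnl_choice_probs({"Y", "M"}, u)  # class K closed
# Closing K shifts some of its demand onto Y and M ('buy-up'/'buy-down').
print(probs_restricted["M"] > probs_all["M"])  # → True
```

This offer-set dependence is exactly what the dynamic program controls when deciding which subset of classes to keep open at each point in time.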
Abstract:
In a world with two countries which differ in size, we study the impact of (the speed of) trade liberalization on firms' profits and total welfare of the countries involved. Firms correctly anticipate the pace of trade liberalization and take it into account when deciding on their product choices, which are endogenously determined at the beginning of the game. Competition in the marketplace then occurs either on quantities or on prices. As long as the autarkic phase continues, local firms are national monopolists. When trade liberalization occurs, firms compete in an international duopoly. We analyze trade effects by using two different models of product differentiation. Across all the specifications adopted (and independently of the price v. quantity competition hypothesis), total welfare always unambiguously rises with the speed of trade liberalization: possible losses by firms are always outweighed by consumers' gains, which come in the form of lower prices, enlarged variety, and higher average qualities available. The effect on profits depends on the type of industry analyzed. Two results in particular seem worthy of mention. With vertical product differentiation and fixed costs of quality improvements, the expected size of the market faced by the firms determines the incentive to invest in quality. The longer the period of autarky, the lower the possibility that the firm from the small country would be producing the high quality and be the leader in the international market when it opens. On the contrary, when trade opens immediately, national markets do not play any role and firms from different countries have the same opportunity to become the leader. Hence, immediate trade liberalization might be in the interest of producers in the small country. In general, the smaller the small country, the more likely its firm will gain from trade liberalization.
Losses for the small country's firm can arise when it is relegated to low-quality good production and the domestic market size is not very small. With horizontal product differentiation (the homogeneous good case being a limit case of it when costs of differentiation tend to infinity), investments in differentiation benefit both firms in equal manner. Firms from the small country do not run the risk of being relegated to a lower competitive position under trade. As a result, they would never lose from it. Instead, firms from the large country may still incur losses from the opening of trade when the market expansion effect is low (i.e. when the country is very large relative to the other).
Abstract:
BACKGROUND: The criteria for choosing relevant cell lines among a vast panel of available intestinal-derived lines exhibiting a wide range of functional properties are still ill-defined. The objective of this study was, therefore, to establish objective criteria for choosing relevant cell lines to assess their appropriateness as tumor models as well as for drug absorption studies. RESULTS: We made use of publicly available expression signatures and cell-based functional assays to delineate differences between various intestinal colon carcinoma cell lines and normal intestinal epithelium. We have compared a panel of intestinal cell lines with patient-derived normal and tumor epithelium and classified them according to traits relating to oncogenic pathway activity, epithelial-mesenchymal transition (EMT) and stemness, migratory properties, proliferative activity, transporter expression profiles and chemosensitivity. For example, SW480 represents an EMT-high, migratory phenotype and scored highest in terms of signatures associated with worse overall survival and higher risk of recurrence based on patient-derived databases. On the other hand, differentiated HT29 and T84 cells showed gene expression patterns closest to tumor bulk derived cells. Regarding drug absorption, we confirmed that differentiated Caco-2 cells are the model of choice for active uptake studies in the small intestine. Regarding chemosensitivity we were unable to confirm a recently proposed association of chemo-resistance with EMT traits. However, a novel signature was identified through mining of NCI60 GI50 values that allowed the panel of intestinal cell lines to be ranked according to their drug responsiveness to commonly used chemotherapeutics. CONCLUSIONS: This study presents a straightforward strategy to exploit publicly available gene expression data to guide the choice of cell-based models.
While this approach does not overcome the major limitations of such models, introducing a rank order of selected features may allow selecting model cell lines that are more adapted and pertinent to the addressed biological question.
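The ranking idea can be illustrated by scoring each cell line on the mean expression of a signature's genes; the lines, genes, and values below are invented purely for illustration and are not the study's signature or data:

```python
from statistics import mean

def rank_by_signature(expression, signature_genes):
    # Score each line by the mean expression of the signature genes;
    # return line names ordered from highest to lowest score.
    scores = {line: mean(profile[g] for g in signature_genes)
              for line, profile in expression.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Invented expression values over an illustrative two-gene EMT signature.
expression = {
    "SW480": {"VIM": 8.2, "ZEB1": 7.5},
    "HT29":  {"VIM": 2.1, "ZEB1": 1.8},
    "Caco2": {"VIM": 3.0, "ZEB1": 2.5},
}
print(rank_by_signature(expression, ["VIM", "ZEB1"]))  # SW480 ranks first
```

Real signature scoring would typically z-score each gene across lines before averaging, so that highly expressed genes do not dominate the rank.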
Abstract:
Using Monte Carlo simulations we study the dynamics of three-dimensional Ising models with nearest-, next-nearest-, and four-spin (plaquette) interactions. During coarsening, such models develop growing energy barriers, which leads to very slow dynamics at low temperature. As already reported, the model with only the plaquette interaction exhibits some of the features characteristic of ordinary glasses: strong metastability of the supercooled liquid, a weak increase of the characteristic length under cooling, stretched-exponential relaxation, and aging. The addition of two-spin interactions, in general, destroys such behavior: the liquid phase loses metastability and the slow-dynamics regime terminates well below the melting transition, which is presumably related to a certain corner-rounding transition. However, for a particular choice of interaction constants, when the ground state is strongly degenerate, our simulations suggest that the slow-dynamics regime extends up to the melting transition. The analysis of these models leads us to the conjecture that in the four-spin Ising model domain walls lose their tension at the glassy transition and that they are basically tensionless in the glassy phase.
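The plaquette (four-spin) energy and the Metropolis rule that drives such simulations can be sketched as follows; the paper works in three dimensions, but a 2D periodic lattice keeps the illustration short:

```python
import math
import random

def plaquette_energy(spins, J=1.0):
    # E = -J * sum over elementary plaquettes of the four-spin product,
    # with periodic boundary conditions (one plaquette per lattice site).
    L = len(spins)
    E = 0.0
    for i in range(L):
        for j in range(L):
            E -= J * (spins[i][j] * spins[i][(j + 1) % L]
                      * spins[(i + 1) % L][j] * spins[(i + 1) % L][(j + 1) % L])
    return E

def metropolis_accept(dE, T, rng):
    # Standard Metropolis rule: always accept downhill moves, accept
    # uphill moves with probability exp(-dE / T).
    return dE <= 0 or rng.random() < math.exp(-dE / T)

L = 4
all_up = [[1] * L for _ in range(L)]
print(plaquette_energy(all_up))  # → -16.0
```

A single spin flip in this 2D sketch changes the sign of four plaquette products, so isolated flips out of the ordered state cost a fixed energy, which hints at how such interactions build the growing barriers the abstract describes.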
Abstract:
The instrumental variable method (referred to as Mendelian randomization when the instrument is a genetic variant) was initially developed to infer a causal effect of a risk factor on some outcome of interest in a linear model. Adapting this method to nonlinear models, however, is known to be problematic. In this paper, we consider the simple case when the genetic instrument, the risk factor, and the outcome are all binary. We compare via simulations the usual two-stage estimate of a causal odds-ratio and its adjusted version with a recently proposed estimate in the context of a clinical trial with noncompliance. In contrast to the former two, we confirm that the latter is (under some conditions) a valid estimate of a causal odds-ratio defined in the subpopulation of compliers, and we propose its use in the context of Mendelian randomization. By analogy with a clinical trial with noncompliance, compliers are those individuals for whom the presence/absence of the risk factor X is determined by the presence/absence of the genetic variant Z (i.e., for whom we would observe X = Z whatever the alleles randomly received at conception). We also recall and illustrate the huge variability of instrumental variable estimates when the instrument is weak (i.e., with a low percentage of compliers, as is typically the case with genetic instruments, for which this proportion is frequently smaller than 10%): the inter-quartile range of our simulated estimates was up to 18 times higher compared to a conventional (e.g., intention-to-treat) approach. We thus conclude that the need to find stronger instruments is probably as important as the need to develop a methodology allowing consistent estimation of a causal odds-ratio.
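With a binary instrument, the usual two-stage estimate reduces to a Wald-type ratio of the instrument's effect on the outcome to its effect on the exposure. A sketch on the risk-difference scale (the paper's focus is the causal odds-ratio; all frequencies below are hypothetical):

```python
def wald_iv_estimate(p_y_z1, p_y_z0, p_x_z1, p_x_z0):
    # Wald ratio: (P(Y=1|Z=1) - P(Y=1|Z=0)) / (P(X=1|Z=1) - P(X=1|Z=0)).
    # The denominator is the compliance rate; a weak instrument makes it
    # small and the estimate unstable, as the abstract emphasizes.
    return (p_y_z1 - p_y_z0) / (p_x_z1 - p_x_z0)

# Hypothetical frequencies: the genetic variant shifts exposure by only
# 8 percentage points, i.e. roughly 8% compliers.
print(round(wald_iv_estimate(0.23, 0.21, 0.58, 0.50), 3))  # → 0.25
```

Dividing a small outcome difference by a small compliance rate is what inflates the variance of weak-instrument estimates relative to an intention-to-treat comparison.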
Abstract:
This paper proposes new methodologies for evaluating out-of-sample forecasting performance that are robust to the choice of the estimation window size. The methodologies involve evaluating the predictive ability of forecasting models over a wide range of window sizes. We show that the tests proposed in the literature may lack the power to detect predictive ability and might be subject to data snooping across different window sizes if used repeatedly. An empirical application shows the usefulness of the methodologies for evaluating exchange rate models' forecasting ability.
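A toy version of the underlying idea, computing out-of-sample forecast error over a whole range of estimation window sizes rather than committing to a single one, might look like this (rolling-mean forecasts on a made-up series):

```python
def msfe_over_window_sizes(series, window_sizes):
    # For each window size w, forecast each point from the mean of the
    # previous w observations and average the squared forecast errors.
    results = {}
    for w in window_sizes:
        sq_errors = [(series[t] - sum(series[t - w:t]) / w) ** 2
                     for t in range(w, len(series))]
        results[w] = sum(sq_errors) / len(sq_errors)
    return results

# Made-up series; the point is to inspect MSFE across window sizes
# rather than cherry-picking a single arbitrary choice.
series = [1.0, 2.0, 1.5, 2.5, 2.0, 3.0, 2.5, 3.5]
print(msfe_over_window_sizes(series, [2, 3, 4]))
```

Reporting the whole profile of errors across window sizes is precisely what guards against the data snooping the abstract warns about: a model that only beats a benchmark at one hand-picked window size is not robust evidence of predictive ability.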
Abstract:
The choice network revenue management (RM) model incorporates customer purchase behavior as customers purchasing products with certain probabilities that are a function of the offered assortment of products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The underlying stochastic dynamic program is intractable, and even its certainty-equivalence approximation, in the form of a linear program called the Choice Deterministic Linear Program (CDLP), is difficult to solve in most cases. The separation problem for CDLP is NP-complete for MNL with just two segments when their consideration sets overlap; the affine approximation of the dynamic program is NP-complete for even a single-segment MNL. This is in contrast to the independent-class (perfect-segmentation) case, where even the piecewise-linear approximation has been shown to be tractable. In this paper we investigate the piecewise-linear approximation for network RM under a general discrete-choice model of demand. We show that the gap between the CDLP and the piecewise-linear bounds is within a factor of at most 2. We then show that the piecewise-linear approximation is polynomial-time solvable for a fixed consideration set size, bringing it into the realm of tractability for small consideration sets; small consideration sets are a reasonable modeling tradeoff in many practical applications. Our solution relies on showing that for any discrete-choice model the separation problem for the linear program of the piecewise-linear approximation can be solved exactly by a Lagrangian relaxation. We give modeling extensions and show by numerical experiments the improvements from using piecewise-linear approximation functions.