10 results for statistical methodology

in Deakin Research Online - Australia


Relevance:

70.00%

Publisher:

Abstract:

The study examines the research methodology of more than 200 empirical investigations of ethics in personal selling and sales management between 1980 and 2010. The review discusses the sources and authorship of the sales ethics research. To better understand the drivers of empirical sales ethics research, the foundations used in business, marketing, and sales ethics are compared. The use of hypotheses, operationalization, measurement, population and sampling decisions, research design, and statistical analysis techniques was examined as part of theory development and testing. The review establishes a benchmark, assesses the status and direction of sales ethics research methodology, and helps inform researchers who need to deal with increasing amounts of empirical research. The investigation identified changing sources of publication, with the Journal of Business Ethics and the Journal of Personal Selling & Sales Management maintaining their positions as the main conduits of high-quality empirical sales ethics research. The results suggest that, despite the use of theoretical models for empirical testing, a greater variety of moral frameworks and wider use of marketing exchange theory are needed. The review highlights many sound aspects of the statistical methodology in empirical sales ethics research but also raises concerns about several areas. Ways in which these concerns might be addressed and recommendations for researchers are provided. © 2013 Springer Science+Business Media Dordrecht.

Relevance:

30.00%

Publisher:

Abstract:

Up until 1979, Multiple Discriminant Analysis (MDA) was the primary multivariate methodological approach to ratio-based modelling of corporate collapse. However, as new statistical tools became available, researchers started testing them with the primary objective of deriving models that would do at least as good a job as MDA while relying on fewer assumptions. Regardless of which methodological approach was chosen, most were compared to MDA. This paper analyses 84 studies on ratio-based modelling of corporate collapse over the period 1968 to 2004. The results indicate that when MDA was not the primary methodology it was the benchmark of choice for comparison, thereby demonstrating its importance as a foundational multivariate methodological approach to signalling corporate collapse.
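As a rough illustration of the technique this abstract benchmarks against (not the paper's own model, and with entirely invented financial ratios), a two-group linear discriminant can be fitted directly with NumPy:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical financial ratios (e.g. working capital / total assets,
# retained earnings / total assets) for surviving and collapsed firms.
healthy = rng.normal(loc=[0.4, 0.3], scale=0.1, size=(50, 2))
failed = rng.normal(loc=[0.1, 0.0], scale=0.1, size=(50, 2))

# Fisher's two-group discriminant direction: w = Sw^{-1} (m1 - m2)
m1, m2 = healthy.mean(axis=0), failed.mean(axis=0)
Sw = np.cov(healthy, rowvar=False) + np.cov(failed, rowvar=False)
w = np.linalg.solve(Sw, m1 - m2)

# Cut-off midway between the projected group means (equal priors assumed)
cutoff = 0.5 * (healthy @ w).mean() + 0.5 * (failed @ w).mean()

# Classify every firm: a score above the cut-off predicts "healthy"
scores = np.concatenate([healthy @ w, failed @ w])
labels = np.concatenate([np.ones(50), np.zeros(50)])
accuracy = ((scores > cutoff).astype(float) == labels).mean()
```

The later methods the abstract alludes to (logit, probit, neural networks) replace the discriminant score with other classifiers, but the benchmark comparison works the same way.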

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a conveyor-based methodology to model complex vehicle flows common to factory and distribution warehouse facilities. The AGV and human path modelling techniques available in many commercial discrete event simulation packages require extensive knowledge and time to implement even the simplest flow control rules for multiple-vehicle interaction. Although discrete event simulation is accepted as an effective tool to model vehicle delivery movements, human paths and delivery schedules for modern assembly lines, the time needed to generate accurate models is a significant limitation of existing simulation-based optimisation methodologies. The flow control method has been successfully implemented using two commercial simulation packages. It provides a realistic visual representation as well as accurate statistical results, and reduces the cost of the model development process.
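The core idea of treating a vehicle path as a conveyor can be sketched in a few lines (this is a toy illustration, not the paper's implementation; TRANSIT and HEADWAY are invented parameters): vehicles pass through a segment first-in-first-out, so flow control reduces to queue order.

```python
TRANSIT = 5.0  # hypothetical seconds to traverse the conveyor segment
HEADWAY = 2.0  # hypothetical minimum gap between consecutive exits

def simulate(arrival_times):
    """Return (vehicle_index, exit_time) pairs for a FIFO conveyor segment."""
    exits, last_exit = [], float("-inf")
    for i, t in enumerate(sorted(arrival_times)):
        # FIFO rule: a vehicle can never exit before the one ahead of it,
        # and must stay at least HEADWAY behind its exit time.
        exit_t = max(t + TRANSIT, last_exit + HEADWAY)
        exits.append((i, exit_t))
        last_exit = exit_t
    return exits

print(simulate([0.0, 1.0, 8.0]))  # → [(0, 5.0), (1, 7.0), (2, 13.0)]
```

Vehicle 1 arrives only one second behind vehicle 0, so the headway rule delays its exit to 7.0; vehicle 2 arrives late enough that only its own transit time matters.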

Relevance:

30.00%

Publisher:

Abstract:

Originally developed as a statistical tool for empirical research in accounting and finance, event studies have since migrated to other disciplines as well, including economics, history, law, management, marketing, and political science. Despite the elegant simplicity of a standard event study, variations in methodology and their relative merits continue to attract attention in the literature. This paper reviews some of the fundamental topics in short-term event study methodology, with an attempt to add new perspectives to some pressing topics.
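The "elegant simplicity" of a standard short-term event study can be shown in miniature (simulated returns, not any real firm; the 3% event-day jump is injected by hand): estimate a market model over an estimation window, then cumulate the abnormal returns over the event window.

```python
import numpy as np

rng = np.random.default_rng(1)

# 120-day estimation window followed by an 11-day event window (-5..+5).
market = rng.normal(0.0005, 0.01, 131)
firm = 0.0002 + 1.2 * market + rng.normal(0.0, 0.002, 131)
firm[125] += 0.03  # inject a hypothetical 3% abnormal return on the event day

# Market model estimated by OLS over the estimation window only
beta, alpha = np.polyfit(market[:120], firm[:120], 1)

# Abnormal return = actual minus market-model prediction; CAR is their sum
ar = firm[120:] - (alpha + beta * market[120:])
car = ar.sum()
```

The methodological variations the abstract refers to mostly concern the choice of expected-return model and the test statistic applied to the CAR, not this basic structure.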

Relevance:

30.00%

Publisher:

Abstract:

Previous experience and research indicate that the Pareto Principle (the 80/20 Principle) has been widely used in many industries to achieve more with less. The study described in this paper concurs that this principle can be applied to improve estimating accuracy and efficiency, especially in the design development stage of projects. Establishing an effective cost estimating model to improve accuracy and efficiency in the design development stage has attracted much research attention over several decades. For almost 40 years, research studies have indicated that using the 80/20 Principle is one such approach. However, most of these studies were built on assumptions, theoretical analysis or questionnaire surveys. The objective of this research is to explore a logical and systematic method for establishing a cost estimating model based on the Pareto Principle. This paper includes an extensive literature review on cost estimating accuracy and efficiency in the construction industry, which points out the current gap in knowledge and understanding of the topic. The review assists in developing the direction of the research and explores a potential methodology for using the Pareto Principle in the new cost estimating model. The findings of this paper suggest that combining the Pareto Principle with statistical analysis could improve the accuracy and efficiency of current estimating methods in the design development stage.
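A minimal Pareto screen over a hypothetical elemental cost breakdown (all figures invented, not from the paper) shows the mechanics: rank the cost items and find the "vital few" that account for 80% of the total, which are then the items worth estimating in detail.

```python
# Hypothetical elemental costs for a building project (arbitrary units)
items = {
    "substructure": 400, "frame": 1900, "roof": 150, "external walls": 950,
    "windows": 120, "internal walls": 100, "finishes": 300, "services": 2600,
    "fittings": 60, "external works": 80,
}
total = sum(items.values())
ranked = sorted(items.items(), key=lambda kv: kv[1], reverse=True)

# Accumulate costs from the largest down until 80% of the total is covered
running, vital_few = 0, []
for name, cost in ranked:
    vital_few.append(name)
    running += cost
    if running >= 0.8 * total:
        break

share = len(vital_few) / len(items)  # fraction of items driving 80% of cost
```

Here three of ten items (30%) carry 80% of the cost, so estimating effort concentrated on them yields most of the accuracy at a fraction of the effort.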

Relevance:

30.00%

Publisher:

Abstract:

In March 2011, the United Kingdom's (UK's) Government launched five Public Health Responsibility Deal Networks to address public health priorities. The Networks used voluntary partnerships to influence people's choice architecture to move them toward healthier behaviors. The purpose of this research was to conduct an exploratory study of diverse stakeholders' perspectives on perceived responsibility and accountability expectations for improving food environments in England through the Food Network partnerships. A purposive sample of policy elites (n=31) from government, academia, the food industry and non-government organizations sorted 48 statements related to improving food environments in England. Statements were grounded in three theoretical perspectives (i.e., legitimacy, nudge and public health law). The PQMethod 2.33 statistical software program was used to run a factor analysis that identified viewpoints based on intra-individual differences in how participants sorted the statements. The results revealed three distinct viewpoints, which explained 64% of the variance for the 31 participants and emphasized different expectations about responsibility. The food environment protectors (n=17) underscored government responsibility to address unhealthy food environments if voluntary partnerships are ineffective; the partnership pioneers (n=12) recognized government-industry partnerships as legitimate and necessary to address unhealthy food environments; and the commercial market defenders (n=1) emphasized individual responsibility for food choices and rejected government intervention to improve food environments. Consensus issues included: protecting children's right to health; food industry practices that can and should be changed; government working with industry on product reformulation; and building consumer support for economically viable healthy products. Contentious issues were the inadequacy of accountability structures and government inaction to regulate food marketing practices targeting children. We conclude that understanding different viewpoints is a step toward building mutual trust to strengthen accountability structures that may help stakeholders navigate ideologically contentious issues to promote healthy food environments in England.
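The distinctive move in Q-methodology, which tools like PQMethod automate, is that factor analysis runs over the correlations *between people* rather than between statements. A simulated sketch (invented sorts, not the study's data; 12 participants share one viewpoint, 8 another) shows why a few factors then capture most of the variance:

```python
import numpy as np

rng = np.random.default_rng(4)
n_stmt = 48

# Two simulated viewpoints; 12 of 20 participants sort like A, 8 like B,
# each with individual noise added to their sort values.
view_a = rng.normal(0.0, 1.0, n_stmt)
view_b = rng.normal(0.0, 1.0, n_stmt)
sorts = np.column_stack(
    [(view_a if i < 12 else view_b) + rng.normal(0.0, 0.5, n_stmt)
     for i in range(20)]
)

# Person-by-person correlation matrix: factors are shared viewpoints
R = np.corrcoef(sorts, rowvar=False)      # 20 x 20
eigvals = np.linalg.eigvalsh(R)[::-1]     # largest first
explained = eigvals[:2].sum() / eigvals.sum()
```

With two underlying viewpoints, the first two factors dominate; participants loading on the same factor share a perspective, which is how groupings like the study's "food environment protectors" emerge.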

Relevance:

30.00%

Publisher:

Abstract:

Retrieval systems with non-deterministic output are widely used in information retrieval. Common examples include sampling, approximation algorithms, and interactive user input. The effectiveness of such systems differs not just across topics, but also across instances of the same system. The inherent variance presents a dilemma: what is the best way to measure the effectiveness of a non-deterministic IR system? Existing approaches to IR evaluation do not consider this problem, or its potential impact on statistical significance. In this paper, we explore how such variance can affect system comparisons, and propose an evaluation framework and methodologies capable of making these comparisons. Using distributed information retrieval as a case study, we show that the approaches provide a consistent and reliable methodology for comparing the effectiveness of a non-deterministic system with a deterministic or another non-deterministic system. In addition, we present a statistical best practice that can be used to safely show that a non-deterministic IR system has effectiveness equivalent to another IR system, and how to avoid the common pitfall of misusing a lack of significance as proof that two systems have equivalent effectiveness.
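One standard way to demonstrate equivalence, rather than merely failing to find a difference, is the two one-sided tests (TOST) procedure. This sketch (simulated effectiveness scores and an invented equivalence margin, not the paper's actual procedure) contrasts it with a plain significance test over repeated runs of two non-deterministic systems:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated MAP scores from repeated runs of two hypothetical
# non-deterministic systems, drawn here from the same distribution.
runs_a = rng.normal(0.300, 0.010, 100)
runs_b = rng.normal(0.300, 0.010, 100)

# A non-significant difference test alone never proves equivalence...
_, p_diff = stats.ttest_ind(runs_a, runs_b)

# ...so run two one-sided tests (TOST) against an equivalence margin delta
delta = 0.01  # hypothetical margin: MAP differences below 0.01 are negligible
diff = runs_a.mean() - runs_b.mean()
se = np.sqrt(runs_a.var(ddof=1) / 100 + runs_b.var(ddof=1) / 100)
df = 198
p_lower = 1 - stats.t.cdf((diff + delta) / se, df)  # H0: diff <= -delta
p_upper = stats.t.cdf((diff - delta) / se, df)      # H0: diff >= +delta
equivalent = max(p_lower, p_upper) < 0.05
```

Rejecting both one-sided hypotheses bounds the true difference inside (-delta, +delta), which is a positive claim of equivalence; a large p-value from the plain t-test makes no such claim.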

Relevance:

30.00%

Publisher:

Abstract:

Photoyellowing of wool is one of the most important problems affecting the fibre, and its negative impact on various properties of wool has prompted scientists to seek a solution over the past decades. In this research, the protective features of nano-titanium dioxide particles against UV radiation on wool fabric were examined, and the colour variations of wool samples after UV irradiation were measured and reported. It was shown that nano-TiO2 is a suitable UV absorber and that its effect depends on concentration. Butane tetracarboxylic acid was assumed to play a prominent role both as a cross-linking agent that stabilizes the nano-titanium dioxide and as a polyanion that maintains negative charges on the wool surface for higher nanoparticle absorption. The process variables were optimized using response surface methodology (RSM).
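In its simplest one-factor form, RSM fits a second-order polynomial to the response and optimizes the fitted surface. The sketch below uses invented concentration/yellowing numbers purely to show the mechanics, not the study's measurements:

```python
import numpy as np

# Hypothetical data: nano-TiO2 concentration (% owf) vs. a yellowing
# index measured after UV exposure (lower is better).
conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
yellow = np.array([9.1, 7.2, 5.9, 5.2, 5.0, 5.3, 6.1])

# Second-order response surface in one factor: y = b0 + b1*x + b2*x^2
b2, b1, b0 = np.polyfit(conc, yellow, 2)

# Stationary point of the fitted surface: dy/dx = 0  ->  x* = -b1 / (2*b2)
x_opt = -b1 / (2 * b2)
```

With several factors (concentration, cross-linker level, curing time), the same idea extends to a full quadratic model with interaction terms, fitted over a designed set of experimental runs.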

Relevance:

30.00%

Publisher:

Abstract:

There is long-standing interest in behavioural ecology in exploring the causes and correlates of consistent individual differences in mean behavioural traits ('personality') and in the response to the environment ('plasticity'). Recently, it has been observed that individuals also consistently differ in their residual intraindividual variability (rIIV). This variation probably has broad biological and methodological implications for the study of variation in labile traits, such as behaviour and physiology, though studies are needed that quantify variation in rIIV using more standardized and powerful methodology. Focusing on activity rates in guppies (Poecilia reticulata), we provide a model example, from sampling design to data analysis, of how to quantify rIIV in labile traits. Building on the doubly hierarchical generalized linear model recently used to quantify individual differences in rIIV, we extend the model to evaluate the covariance between individual mean values and their rIIV. After accounting for time-related change in behaviour, our guppies substantially differed in rIIV, and it was the more active individuals that tended to be more consistent (lower rIIV). We provide annotated data analysis code to implement these complex models, and discuss how to further generalize the model to evaluate covariances with other aspects of phenotypic variation.
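A naive version of the quantity being modelled can be computed directly (simulated activity scores, not the guppy data, and a per-individual sample standard deviation rather than the paper's doubly hierarchical GLM): estimate each individual's residual spread around its own mean, then ask whether it covaries with the mean.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ind, n_obs = 40, 20

# Simulated activity scores: each individual has its own mean and its own
# residual sd, with more active individuals made more consistent by design.
means = rng.normal(50.0, 10.0, n_ind)
sds = 10.0 - 0.15 * (means - 50.0) + rng.normal(0.0, 0.5, n_ind)
obs = means[:, None] + sds[:, None] * rng.normal(0.0, 1.0, (n_ind, n_obs))

# Naive rIIV estimate: each individual's residual sd around its own mean
ind_means = obs.mean(axis=1)
riiv = obs.std(axis=1, ddof=1)

# Sign of the mean-rIIV association (negative: more active = more consistent)
r = np.corrcoef(ind_means, riiv)[0, 1]
```

The hierarchical model in the paper does this jointly and with proper uncertainty propagation, which matters because each individual's sd is estimated from few observations; the two-step version above is only a conceptual illustration.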