955 results for Generalization
Abstract:
The problem of recognition on a finite set of events is considered. The generalization ability of classifiers for this problem is studied within the Bayesian approach. A method for specifying a non-uniform prior distribution on recognition tasks is suggested; it takes into account the assumed degree of intersection between classes. The results of the analysis are applied to the pruning of classification trees.
Abstract:
2000 Mathematics Subject Classification: 33C10, 33-02, 60K25
Abstract:
Снежана Христова, Кремена Стефанова, Лиляна Ванкова - In this paper, several new types of linear discrete inequalities containing the maximum of the unknown function over a past time interval are solved. Some of these inequalities are applied to study the continuous dependence on perturbations of discrete equations with maxima.
Abstract:
2000 Mathematics Subject Classification: 51E14, 51E30.
Abstract:
MSC 2010: 54A25, 54A35.
Abstract:
MSC 2010: 33C20
Abstract:
2000 Mathematics Subject Classification: Primary 46B20. Secondary 47A99, 46B42.
Abstract:
A new correlation scheme (leading to a special equilibrium called “soft” correlated equilibrium) is introduced for finite games. After randomization over the outcome space, players may either follow the recommendation of an umpire blindly or freely choose any other action except the one suggested. This scheme can lead to Pareto-better outcomes than the simple extension introduced by [Moulin, H., Vial, J.-P., 1978. Strategically zero-sum games: the class of games whose completely mixed equilibria cannot be improved upon. International Journal of Game Theory 7, 201–221]. The informational and interpretational aspects of soft correlated equilibria are also discussed in detail. The power of the generalization is illustrated in the prisoner’s dilemma and a congestion game.
Abstract:
Network analysis has emerged as a key technique in communication studies, economics, geography, history and sociology, among others. A fundamental issue is how to identify key nodes in a network, and a number of centrality measures have been developed for this purpose. This paper proposes a new parametric family of centrality measures called generalized degree. It is based on the idea that a relationship to a more interconnected node contributes to centrality to a greater extent than a connection to a less central one. Generalized degree improves on degree by redistributing its sum over the network with consideration of the global structure. Application of the measure is supported by a set of basic properties. A sufficient condition is given for generalized degree to be rank monotonic, excluding counter-intuitive changes in the centrality ranking after certain modifications of the network. The measure has a graph interpretation and can be calculated iteratively. It is recommended to apply generalized degree alongside degree, since it preserves the most favorable attributes of degree but better reflects the role of the nodes in the network and has an increased ability to distinguish their importance.
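The abstract does not reproduce the paper's exact formula, but the idea of starting from degree and iteratively redistributing it according to neighbours' scores can be sketched as follows. The update rule, the mixing weight `alpha` and the iteration count are illustrative assumptions, not the authors' definition.

```python
# Illustrative iterative centrality in the spirit of "generalized degree":
# start from plain degree, repeatedly blend in neighbours' current scores,
# and rescale so the total over the network always equals the degree sum
# ("redistributing its sum over the network").

def generalized_degree(adj, alpha=0.5, iters=50):
    n = len(adj)
    degree = [sum(row) for row in adj]
    total = sum(degree)
    score = degree[:]  # initialise with plain degree
    for _ in range(iters):
        new = []
        for i in range(n):
            nbr = sum(adj[i][j] * score[j] for j in range(n))
            new.append((1 - alpha) * degree[i] + alpha * nbr)
        s = sum(new)
        score = [x * total / s for x in new]  # preserve the degree sum
    return score

# Path graph 0-1-2: the leaves, attached to the well-connected centre,
# gain relative to their plain degree of 1, while the centre gives up
# part of its score -- a connection to a central node counts for more.
path = [[0, 1, 0],
        [1, 0, 1],
        [0, 1, 0]]
print(generalized_degree(path))
```

Under this toy scheme the leaf scores converge above 1 and the centre's below 2, while the three scores still sum to the total degree of 4.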
Abstract:
This study employs a BP neural network to simulate the development of Chinese private passenger cars. Considering the uncertain and complex environment for the development of private passenger cars, indicators from economics, population, price, infrastructure, income, energy and other fields that have major impacts on it are selected first. After modeling, training and generalization testing, the network is shown to be capable of simulating the growth of Chinese private passenger cars. Based on the BP neural network model, a sensitivity analysis of each indicator is carried out; it shows that the sensitivity coefficient of fuel price changes suddenly. This phenomenon suggests that the development of Chinese private passenger cars may be seriously affected by recent high fuel prices. This finding is also consistent with observed facts and figures.
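A common way to obtain per-indicator sensitivity coefficients from a trained network (the abstract does not state the paper's exact procedure, so this is an assumed one-at-a-time perturbation scheme) is to nudge one input by a small relative amount and measure the relative response of the output. The `toy_model` below is a stand-in for the fitted BP network, and the 1% perturbation size is an assumption.

```python
# One-at-a-time sensitivity analysis: relative output change per
# relative change in input i, evaluated around operating point x.

def sensitivity(model, x, i, delta=0.01):
    """Elasticity-style sensitivity coefficient of input i at point x."""
    base = model(x)
    perturbed = x[:]
    perturbed[i] *= (1 + delta)
    return ((model(perturbed) - base) / base) / delta

# Toy stand-in model: car ownership as a nonlinear mix of GDP and fuel price.
def toy_model(x):
    gdp, fuel_price = x
    return gdp ** 1.2 / fuel_price ** 0.5

x0 = [100.0, 8.0]
print(sensitivity(toy_model, x0, 0))  # approx +1.2 (elasticity w.r.t. GDP)
print(sensitivity(toy_model, x0, 1))  # approx -0.5 (elasticity w.r.t. fuel price)
```

A sudden change in such a coefficient across operating points, as the abstract reports for fuel price, signals that the modeled relationship is strongly nonlinear in that indicator.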
Abstract:
Since the 1980s, industries and researchers have sought to better understand the quality of services due to their rising importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and raise the question of whether it is possible, at some higher level, to define SQ broadly such that it spans all service types and industries. This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate the SQ of each service encounter based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, fails to address what needs to be reliable, assured, tangible, empathetic and responsive.
This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to succeed SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL did not. The B&C (2001) model conceives SQ as multidimensional and multi-level, a hierarchical approach to SQ measurement that better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ, and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such an understanding of SQ seeks to transcend industries and service types, with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ. The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision. Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?
Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension as ‘environment quality’, so as to better encompass both physical and virtual settings (e.g. websites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work must be done to better define them. The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively; that is, customers evaluate each primary dimension (each higher level of SQ classification) based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study. Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and that the choice of approach depends on the objective(s) of the study.
Should the objective be an overall evaluation of SQ, the perceptions-only approach is more appropriate, as it is more straightforward and reduces administrative overheads. However, should the objective be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate, as it has the ability to identify areas that need improvement.
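The contrast between the two scoring approaches can be made concrete with a small worked example. The ratings below are made up for illustration; the gap computation P − E per dimension follows the standard disconfirmation logic of SERVQUAL-style instruments, while the perceptions-only score simply averages the perception ratings.

```python
# Perceptions-only score vs. disconfirmation (gap) scores, using
# invented ratings on a 1-7 scale for the five SERVQUAL dimensions.

dimensions = ["reliability", "assurance", "tangibles", "empathy", "responsiveness"]
perceptions = {"reliability": 5.2, "assurance": 6.1, "tangibles": 4.0,
               "empathy": 5.5, "responsiveness": 4.4}
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 4.5,
                "empathy": 5.0, "responsiveness": 6.2}

# Perceptions-only: a single overall evaluation of SQ.
perceptions_only = sum(perceptions.values()) / len(dimensions)

# Disconfirmation: per-dimension gaps P - E; negative gaps flag shortfalls.
gaps = {d: perceptions[d] - expectations[d] for d in dimensions}
shortfalls = [d for d, g in sorted(gaps.items(), key=lambda kv: kv[1]) if g < 0]

print(round(perceptions_only, 2))  # overall SQ level: 5.04
print(shortfalls)                  # worst shortfall first
```

The example shows why the choice depends on the objective: the perceptions-only number summarizes overall SQ in one administration, while the gap scores additionally point to the specific dimensions needing improvement, at the cost of collecting expectation ratings as well.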
Abstract:
There is little evidence that workshops alone have a lasting impact on the day-to-day practice of participants. The current paper examined a strategy to increase generalization and maintenance of skills in the natural environment, using pseudo-patients and immediate performance feedback to reinforce skills acquisition. A random half of pharmacies (N=30) took part in workshop training aimed at optimizing consumers' use of nonprescription analgesic products. Pharmacies in the training group also received performance feedback on their adherence to the recommended protocol. Feedback occurred immediately after a pseudo-patient visit in which confederates posed as purchasers of analgesics, and combined positive and corrective elements. Trained pharmacists were significantly more accurate at identifying people who misused the medication (P<0.001). The trained pharmacists were more likely than controls to use open-ended questions (P<0.001), to assess readiness to change problematic use (P<0.001), and to deliver a brief intervention tailored to the person's commitment to alter his/her usage (P<0.001). Participants responded to the feedback positively. Results were consistent with the hypothesis that combining workshop training with on-site performance feedback enhances practitioners' adherence to protocols in the natural setting.
Abstract:
The selection criteria for contractor pre-qualification are characterized by the co-existence of both quantitative and qualitative data. The qualitative data is non-linear, uncertain and imprecise. An ideal decision support system for contractor pre-qualification should have the ability to handle both quantitative and qualitative data, and to map the complicated nonlinear relationships among the selection criteria, such that rational and consistent decisions can be made. In this research paper, an artificial neural network model was developed to assist public clients in identifying suitable contractors for tendering. The pre-qualification criteria (variables) were identified for the model. One hundred and twelve real pre-qualification cases were collected from civil engineering projects in Hong Kong, and eighty-eight hypothetical pre-qualification cases were also generated according to the “If-then” rules used by professionals in the pre-qualification process; these cases fully accord with current practice among public developers in Hong Kong. Each pre-qualification case consisted of input ratings for candidate contractors’ attributes and their corresponding pre-qualification decisions. The training of the neural network model was accomplished using the developed program, in which a conjugate gradient descent algorithm was incorporated to improve the learning performance of the network. Cross-validation was applied to estimate the generalization errors based on “re-sampling” of the training pairs. The case studies show that the artificial neural network model is suitable for mapping the complicated nonlinear relationship between contractors’ attributes and their corresponding pre-qualification (disqualification) decisions. The artificial neural network model can therefore be regarded as a sound alternative for performing the contractor pre-qualification task.
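The cross-validation step described above, estimating generalization error by re-sampling the labelled cases into training and held-out folds, can be sketched generically. The classifier below is a trivial majority-class stand-in rather than the paper's neural network, and the fold count and synthetic data are assumptions for illustration.

```python
import random

# k-fold cross-validation: shuffle case indices, hold out each fold in
# turn, fit on the rest, and average the held-out misclassification rates
# as an estimate of generalization error.

def k_fold_error(cases, labels, fit, k=5, seed=0):
    idx = list(range(len(cases)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    errors = []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        predict = fit([cases[i] for i in train], [labels[i] for i in train])
        wrong = sum(predict(cases[i]) != labels[i] for i in fold)
        errors.append(wrong / len(fold))
    return sum(errors) / k

# Stand-in "model": always predicts the majority training label.
def majority_fit(X, y):
    majority = max(set(y), key=y.count)
    return lambda x: majority

# 200 synthetic cases (cf. the paper's 112 real + 88 hypothetical ones),
# with a 150/50 class split, so the majority-class error is 0.25.
X = [[i] for i in range(200)]
y = [1 if i < 150 else 0 for i in range(200)]
print(k_fold_error(X, y, majority_fit))  # 0.25
```

Swapping `majority_fit` for a real training routine (e.g. a backpropagation network with conjugate gradient updates, as in the paper) leaves the estimation procedure unchanged, which is the point of the sketch.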
Abstract:
We propose that a general analytic framework for cultural science can be constructed as a generalization of the generic micro meso macro framework proposed by Dopfer and Potts (2008). This paper outlines this argument along with some implications for the creative industries research agenda.
Abstract:
This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah [1989. The dynamic effects of aggregate demand and supply disturbances. The American Economic Review 79, 655–673], and shows that structural equations with known permanent shocks cannot contain error correction terms, thereby freeing up the latter to be used as instruments in estimating their parameters. The approach is illustrated by a re-examination of the identification schemes used by Wickens and Motto [2001. Estimating shocks and impulse response functions. Journal of Applied Econometrics 16, 371–387], Shapiro and Watson [1988. Sources of business cycle fluctuations. NBER Macroeconomics Annual 3, 111–148], King et al. [1991. Stochastic trends and economic fluctuations. American Economic Review 81, 819–840], Gali [1992. How well does the IS-LM model fit postwar US data? Quarterly Journal of Economics 107, 709–735; 1999. Technology, employment, and the business cycle: Do technology shocks explain aggregate fluctuations? American Economic Review 89, 249–271] and Fisher [2006. The dynamic effects of neutral and investment-specific technology shocks. Journal of Political Economy 114, 413–451].