894 results for A-Convergence
Abstract:
Focusing on the conditions that an optimization problem may comply with, so-called convergence conditions are proposed, and a stochastic optimization algorithm, named the DSZ algorithm, is then presented to deal with both unconstrained and constrained optimization problems. The principle is discussed in terms of the theoretical model of the DSZ algorithm, from which the practical model is derived. The efficiency of the practical model is demonstrated by comparison with similar algorithms, such as Enhanced simulated annealing (ESA), Monte Carlo simulated annealing (MCS), Sniffer Global Optimization (SGO), Directed Tabu Search (DTS), and the Genetic Algorithm (GA), on a set of well-known unconstrained and constrained optimization test cases. Further attention is given to strategies for optimizing high-dimensional unconstrained problems with the DSZ algorithm.
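The abstract does not reproduce the DSZ update rule itself. As a hedged illustration of the class of stochastic optimizers it is compared against, here is a minimal simulated-annealing-style search on a standard unconstrained test case (the sphere function); all names and parameters are illustrative stand-ins, not the DSZ method.

```python
import math
import random

def sphere(x):
    """Standard unconstrained test function; global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def anneal(f, x0, n_iter=20000, t0=1.0, cooling=0.999, step=0.5, seed=0):
    """Generic simulated-annealing search (illustrative stand-in, not DSZ)."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(n_iter):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

best, fbest = anneal(sphere, [3.0, -2.0, 1.5])
```

The same scaffold underlies the ESA and MCS baselines named above: only the proposal distribution and the acceptance/cooling rules differ between such methods.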
Supply chain sustainability: a relationship management approach moderated by culture and commitment
Abstract:
This research explores the nature of relationship management on construction projects in Australia and examines the effects of culture, measured by means of Schwartz's value survey, on relationships under different contract strategies. The research was based on the view that the development of a sustainable supply chain depends on the transfer of knowledge and capabilities from the larger players in the supply chain through collaboration brought about by relationship management. The research adopted a triangulated approach in which quantitative data were collected by questionnaire, interviews were conducted to explore and enrich the quantitative data, and case studies were undertaken to illustrate and validate the findings. The aim was to investigate how values and attitudes enhance or reduce the incorporation of the supply chain into the project. The research found that the degree of match or mismatch between values and contract strategy impacts commitment and the engagement and empowerment of the supply chain.
Abstract:
The last two decades have seen a significant restructuring of work across Australia and other industrialised economies, a critical part of which has been the appearance of competency-based education and assessment. The competency movement is about creating a more flexible and mobile labour force to increase productivity, and it does so by redefining work as a set of transferable or 'soft' generic skills that are transportable and are the possession of the individual. This article sought to develop an analysis of competency-based clinical assessment of nursing students across a bachelor of nursing degree course. This involved an examination of a total of 406 clinical assessment tools covering the years 1992–2009 and the three years of a bachelor degree. Data analysis generated three analytical findings: the existence of a hierarchy of competencies that prioritises soft skills over intellectual and technical skills; the appearance of skills as personal qualities or individual attributes; and the absence of context in assessment. The article argues that the convergence in nursing of soft skills and the professionalisation project has seen the former lend legitimacy to the enduring invisibility and devaluation of nursing work.
Abstract:
In this paper, we consider the variable-order Galilei invariant advection diffusion equation with a nonlinear source term. A numerical scheme with first-order temporal accuracy and second-order spatial accuracy is developed to simulate the equation. The stability and convergence of the numerical scheme are analyzed. In addition, another numerical scheme with improved temporal accuracy is also developed. Finally, some numerical examples are given, and the results demonstrate the effectiveness of the theoretical analysis.
Keywords: variable-order Galilei invariant advection diffusion equation with a nonlinear source term; variable-order Riemann–Liouville fractional partial derivative; stability; convergence; numerical scheme improving temporal accuracy
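The variable-order fractional scheme itself is not reproduced in the abstract. As a hedged sketch of the accuracy structure it describes (first-order in time, second-order in space), the fragment below implements one explicit Euler step with second-order central differences for the classical integer-order advection-diffusion equation u_t + v u_x = D u_xx + s(u) on a periodic grid; it is an illustrative stand-in, not the paper's variable-order Riemann-Liouville scheme, and all parameter values are assumptions.

```python
import numpy as np

def step(u, dt, dx, v=1.0, D=0.1, source=lambda u: 0.0 * u):
    """One explicit Euler step (first-order in time) with second-order
    central differences in space for u_t + v u_x = D u_xx + s(u),
    periodic boundary conditions. Integer-order stand-in, not the
    variable-order fractional scheme of the paper."""
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)        # central, O(dx^2)
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2  # central, O(dx^2)
    return u + dt * (-v * ux + D * uxx + source(u))

# Smooth initial profile on a periodic grid.
n = 200
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
u = np.sin(2.0 * np.pi * x)
dt = 0.2 * dx**2 / 0.1  # respects the explicit diffusion stability limit
for _ in range(100):
    u = step(u, dt, dx)
```

The stability analysis mentioned in the abstract plays the role that the explicit time-step restriction dt = O(dx^2 / D) plays in this classical sketch: it bounds the step size under which the scheme's errors stay controlled.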
Abstract:
We examined properties of culture-level personality traits in ratings of targets (N=5,109) ages 12 to 17 in 24 cultures. Aggregate scores were generalizable across gender, age, and relationship groups and showed convergence with culture-level scores from previous studies of self-reports and observer ratings of adults, but they were unrelated to national character stereotypes. Trait profiles also showed cross-study agreement within most cultures, 8 of which had not previously been studied. Multidimensional scaling showed that Western and non-Western cultures clustered along a dimension related to Extraversion. A culture-level factor analysis replicated earlier findings of a broad Extraversion factor but generally resembled the factor structure found in individuals. Continued analysis of aggregate personality scores is warranted.
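The multidimensional scaling used to place cultures along an Extraversion-related dimension is a standard technique; the hedged sketch below implements classical (Torgerson) MDS from a pairwise-distance matrix via double-centering and eigendecomposition, applied to toy data rather than the study's culture-level scores.

```python
import numpy as np

def classical_mds(dist, k=2):
    """Classical (Torgerson) MDS: embed n points in k dimensions from an
    n x n matrix of pairwise distances via double-centering of the squared
    distances followed by an eigendecomposition."""
    n = dist.shape[0]
    d2 = dist ** 2
    j = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    b = -0.5 * j @ d2 @ j                 # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)        # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]    # take the k largest
    vals_k = np.clip(vals[order], 0.0, None)
    return vecs[:, order] * np.sqrt(vals_k)

# Toy example: four points on a line; a 1-D embedding recovers the distances.
pts = np.array([0.0, 1.0, 2.0, 5.0])
dist = np.abs(pts[:, None] - pts[None, :])
emb = classical_mds(dist, k=1)
```

When the distances are exactly Euclidean, as in this toy case, the embedding reproduces the original configuration up to translation and reflection.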
Abstract:
In this paper I identify specific historical trajectories that are directly contingent upon the deployment and use of new media but are hidden by a focus on the purely technological. They are: the increasingly abstract and alienated nature of economic value; the subsumption of all labour, material and intellectual, under systemic capital; and the convergence of formerly distinct spheres of analysis, namely the spheres of production, circulation, and consumption. The paper examines the implications of the knowledge economy from an historical materialist perspective. I synthesise the systemic views of Marx (1846/1972, 1875/1972, 1970, 1973, 1976, 1978, 1981), Adorno (1951/1974, 1964/1973, 1991; Horkheimer and Adorno 1944/1998; Jarvis 1998), and Bourdieu (1991, 1998) to argue for a language-focused approach to new media research and to suggest aspects of Marxist thought which might be useful in researching emergent socio-technical domains. I also identify specific categories in the Marxist tradition which may no longer be analytically useful for researching the effects of new media.
Abstract:
Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
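The surrogate-loss setup described here can be made concrete numerically. The sketch below evaluates the 0-1 loss alongside common convex surrogates (hinge for the support vector machine, exponential for boosting, logistic for logistic regression) as functions of the margin m = y f(x); the specific margin values are illustrative, and the logistic loss would need rescaling by 1/ln 2 to give a pointwise upper bound.

```python
import math

def zero_one(margin):
    """0-1 loss as a function of the margin m = y * f(x)."""
    return 0.0 if margin > 0 else 1.0

def hinge(margin):
    """Hinge loss (support vector machine surrogate)."""
    return max(0.0, 1.0 - margin)

def exponential(margin):
    """Exponential loss (boosting surrogate)."""
    return math.exp(-margin)

def logistic(margin):
    """Logistic loss; divide by ln 2 to upper-bound the 0-1 loss pointwise."""
    return math.log1p(math.exp(-margin))

# The hinge and exponential surrogates upper-bound the 0-1 loss pointwise,
# one ingredient in relating their excess risks to the 0-1 excess risk.
margins = [-2.0, -0.5, 0.0, 0.5, 2.0]
for m in margins:
    assert hinge(m) >= zero_one(m)
    assert exponential(m) >= zero_one(m)
```

All three surrogates are convex and decreasing through the point m = 0, which is what the pointwise Fisher-consistency condition in the abstract requires of a classification-calibrated loss.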