900 results for INTERNATIONAL CLASSIFICATION
Abstract:
Immigrant Entrepreneurs (IE) are often portrayed as pushed into self-employment by employment barriers in their adopted countries. Yet IE possess human resources, such as international experience, that can help them form international new ventures (INV). We examine the role of IE in INV using randomly selected data on 561 young firms from the Comprehensive Australian Study of Entrepreneurial Emergence (CAUSEE) project. We find that IE are over-represented in INV and have many characteristics known to facilitate INV success, including larger founding teams, university degrees, international connections and technical capability. These findings are relevant to policy makers and nascent IE.
Abstract:
In pre-Fitzgerald Queensland, the existence of corruption was widely known, but its extent and modes of operation were not fully evident. The Fitzgerald Report identified the need for reform of the structure, procedures and efficiency of public administration in Queensland. What was most striking in the Queensland reform process was that a new model for combating corruption had been developed. Rather than relying upon a single law and a single institution, existing institutions were strengthened and new institutions were established, creating a set of mutually supporting and mutually checking institutions, agencies and laws that jointly sought to improve governmental standards and combat corruption. Some of the reforms were either unique to Queensland or very rare. One of the strengths of this approach was that it avoided creating a single overarching institution to fight corruption. There are many powerful opponents of reform: influential institutions and individuals resist any interference with their privileges. To cause a mass exodus from an entrenched corruption system, a seminal event or defining process is needed that alters expectations and incentives sufficiently to encourage significant numbers of individuals to desert the corruption system and assist the integrity system in exposing and destroying it. The Fitzgerald Inquiry was such an event. The article also briefly addresses methods for dismantling national corruption systems where they emerge and exist.
Abstract:
The emergence of strong sovereign states after the Treaty of Westphalia turned two of the most cosmopolitan professions (law and arms) into two of the least cosmopolitan. Sovereign states determined the content of the law within their borders – including which, if any, ecclesiastical law was to be applied; what form of economic regulation was adopted; and what, if any, international law applied. Similarly, states sought to ensure that all military force was at their disposal in national armies. The erosion of sovereignty in a post-Westphalian world may significantly reverse these processes, and is likely to have profound consequences for the legal profession and the ethics of how, and for what ends, it is practised. Lawyers have played a major role in the civilisation of sovereign states through the articulation and institutionalisation of key governance values – starting with the rule of law. An increasingly global profession must take on similar tasks. The same could be said of the military. This essay reviews the concept of an international rule of law and its relationship to domestic conceptions, and outlines the task of building the international rule of law and the role that lawyers can and should play in it.
Abstract:
Given the serious nature of computer crime and its global implications, there is a crucial need for a common international understanding of such criminal activity in order to deal with it effectively. Research into the extent to which legislation, international initiatives, and policies and procedures to combat and investigate computer crime are consistent globally is therefore of enormous importance. The challenge is to study, analyse and compare the policies and practices for combating computer crime under different jurisdictions, in order to identify the extent to which they are consistent with each other and with international guidelines, and the extent of their successes and limitations. The ultimate purpose is to identify areas where improvements are needed and what those improvements should be. This thesis examines approaches used for combating computer crime, including money laundering, in Australia, the UAE, the UK and the USA, four countries which represent a spectrum of economic development and culture. It does so in the context of the guidelines of international organisations such as the Council of Europe (CoE) and the Financial Action Task Force (FATF). In the case of the UAE, we also examine the cultural influences which differentiate it from the other three countries and which have necessarily been a factor in shaping its approaches for countering money laundering in particular. The thesis concludes that, because of the transnational nature of computer crime, there is a need internationally for further harmonisation of approaches for combating it. The specific contributions of the thesis are as follows:
• Developing a new unified, comprehensive taxonomy of computer crime based upon the dual characteristics of the role of the computer and the contextual nature of the crime
• Revealing differences in computer crime legislation in Australia, the UAE, the UK and the USA, showing how they correspond to the CoE Convention on Cybercrime, and identifying a new framework for developing harmonised computer crime or cybercrime legislation globally
• Identifying important issues that continue to create problems for law enforcement agencies, such as insufficient resources, coping internationally with computer crime legislation that differs between countries, maintaining comprehensive documented procedures and guidelines for combating computer crime, and reporting and recording computer crime offences as distinct from other forms of crime
• Completing the most comprehensive study currently available regarding the extent of money laundered in four such developed or fast-developing countries
• Identifying that, of the four countries, the UK and the USA are the most advanced with regard to anti-money laundering and combating the financing of terrorism (AML/CFT) systems, based on compliance with the FATF recommendations.
In addition, the thesis identifies that local factors have affected how the UAE has implemented its financial and AML/CFT systems, and reveals that such local and cultural factors should be taken into account when implementing or evaluating any country's AML/CFT system.
Abstract:
Purpose – The purpose of this paper is to present a selection of responses to the report Fashion Victims, published by War on Want in December 2006. It offers a range of viewpoints presented by members of the Editorial Advisory Board of CPOIB. These are presented in chronological order of submission. There is some cross-reference by contributors to the work of others, but no attempt is made to present a unified argument.
Design/methodology/approach – Presents the full contributions of involved participants, without mediation or editorial change.
Findings – A number of different perspectives are presented on the central issue that is summarised by the opening heading in War on Want's report – "How cheap is too cheap?" It is seen that the answer to this question is very much dependent upon the standpoint of the respondent.
Originality/value – In presenting this form of commentary, members of the CPOIB Editorial Board seek to stimulate debate about an issue of concern to contemporary society, without resort to the time delay and mediating processes of peer review normally attached to academic writing. It is hoped that this discussion will provoke further contributions and a widening of the debate.
Keywords: Corporate social responsibility, Multinational companies, Conditions of employment, Trade unions
Abstract:
Sample complexity results from computational learning theory, when applied to neural network learning for pattern classification problems, suggest that for good generalization performance the number of training examples should grow at least linearly with the number of adjustable parameters in the network. Results in this paper show that if a large neural network is used for a pattern classification problem and the learning algorithm finds a network with small weights that has small squared error on the training patterns, then the generalization performance depends on the size of the weights rather than the number of weights. For example, consider a two-layer feedforward network of sigmoid units, in which the sum of the magnitudes of the weights associated with each unit is bounded by A and the input dimension is n. We show that the misclassification probability is no more than a certain error estimate (that is related to squared error on the training set) plus A³√((log n)/m) (ignoring log A and log m factors), where m is the number of training patterns. This may explain the generalization performance of neural networks, particularly when the number of training examples is considerably smaller than the number of weights. It also supports heuristics (such as weight decay and early stopping) that attempt to keep the weights small during training. The proof techniques appear to be useful for the analysis of other pattern classifiers: when the input domain is a totally bounded metric space, we use the same approach to give upper bounds on misclassification probability for classifiers with decision boundaries that are far from the training examples.
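The weight-size bound lends itself to a quick numeric illustration. The Python sketch below is ours, with arbitrary illustrative values (the abstract gives no code): it evaluates the weight-dependent term A³√((log n)/m) to show how keeping A small controls the bound even when m is far below the number of weights.

```python
import math

def weight_term(A: float, n: int, m: int) -> float:
    """Size of the weight-dependent term A^3 * sqrt(log(n) / m),
    ignoring the log A and log m factors mentioned in the abstract."""
    return A**3 * math.sqrt(math.log(n) / m)

# Shrinking the per-unit weight bound A (as weight decay or early stopping
# tends to do) shrinks the term, independently of the number of weights.
for A in (1.0, 2.0, 4.0):
    print(A, round(weight_term(A, n=100, m=10_000), 4))
```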
Abstract:
Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
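As a concrete instance of the surrogate idea, the hinge loss used by support vector machines is convex and dominates the 0–1 loss pointwise, which is one reason bounds on excess surrogate risk can translate into bounds on excess 0–1 risk. A minimal Python sketch (ours, not from the paper):

```python
import numpy as np

def zero_one_loss(margin):
    """0-1 loss as a function of the margin y * f(x)."""
    return (margin <= 0).astype(float)

def hinge_loss(margin):
    """Convex surrogate used by support vector machines."""
    return np.maximum(0.0, 1.0 - margin)

margins = np.linspace(-2.0, 2.0, 9)
# The surrogate upper-bounds the 0-1 loss pointwise, so bounding the
# surrogate risk bounds the 0-1 risk from above.
print(np.all(hinge_loss(margins) >= zero_one_loss(margins)))  # True
```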
Abstract:
Binary classification methods can be generalized in many ways to handle multiple classes. It turns out that not all generalizations preserve the nice property of Bayes consistency. We provide a necessary and sufficient condition for consistency which applies to a large class of multiclass classification methods. The approach is illustrated by applying it to some multiclass methods proposed in the literature.
Abstract:
We consider the problem of binary classification where the classifier can, for a particular cost, choose not to classify an observation. Just as in the conventional classification problem, minimization of the sample average of the cost is a difficult optimization problem. As an alternative, we propose the optimization of a certain convex loss function φ, analogous to the hinge loss used in support vector machines (SVMs). Its convexity ensures that the sample average of this surrogate loss can be efficiently minimized. We study its statistical properties. We show that minimizing the expected surrogate loss—the φ-risk—also minimizes the risk. We also study the rate at which the φ-risk approaches its minimum value. We show that fast rates are possible when the conditional probability P(Y=1|X) is unlikely to be close to certain critical values.
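The decision rule being analysed can be sketched directly: classify by the sign of f(x), abstain when |f(x)| falls below a threshold, and pay a fixed cost d per abstention. The Python below is a toy illustration with made-up values; the threshold name tau and cost d are our notation, not the paper's.

```python
import numpy as np

def empirical_cost(f_values, labels, tau=0.5, d=0.3):
    """Average cost: 1 per misclassification, d per rejected observation."""
    reject = np.abs(f_values) <= tau          # abstain on low-confidence scores
    wrong = (np.sign(f_values) != labels) & ~reject
    return np.mean(wrong.astype(float) + reject * d)

f = np.array([1.2, -0.1, 0.4, -2.0])          # classifier scores
y = np.array([1, 1, -1, -1])                  # true labels
print(empirical_cost(f, y))                   # 0.15: two rejections, no errors
```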
Abstract:
Binary classification is a well-studied special case of the classification problem. Statistical properties of binary classifiers, such as consistency, have been investigated in a variety of settings. Binary classification methods can be generalized in many ways to handle multiple classes. It turns out that one can lose consistency in generalizing a binary classification method to deal with multiple classes. We study a rich family of multiclass methods and provide a necessary and sufficient condition for their consistency. We illustrate our approach by applying it to some multiclass methods proposed in the literature.
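A small construction (ours, not taken from the paper) makes the consistency failure concrete: generalising the hinge loss via one-vs-all is inconsistent when no class has conditional probability above 1/2, because the surrogate-optimal scores all saturate at -1 and the argmax no longer recovers the Bayes class.

```python
import itertools
import numpy as np

p = np.array([0.4, 0.3, 0.3])                 # conditional class probabilities
hinge = lambda z: max(0.0, 1.0 - z)

def ova_surrogate_risk(f):
    """Conditional one-vs-all hinge risk for a score vector f."""
    return sum(p[k] * hinge(f[k]) + (1 - p[k]) * hinge(-f[k]) for k in range(3))

grid = np.linspace(-2.0, 2.0, 41)
best = min(itertools.product(grid, repeat=3), key=ova_surrogate_risk)
# All three optimal scores are -1: the minimiser carries no information
# about the Bayes-optimal class (class 0 here).
print(tuple(round(v, 2) for v in best))       # (-1.0, -1.0, -1.0)
```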
Abstract:
The purpose of this conceptual paper is to address the lack of consistent means through which strategies are identified and discussed across theoretical perspectives in the field of business strategy. A standardised referencing system is offered to codify the means by which strategies can be identified, and from which new business services and information systems may be derived. The taxonomy was developed through a qualitative content analysis of government agencies' strategic plans, and is useful for identifying strategy formation and determining gaps and opportunities. Managers will benefit from a more transparent strategic design process that reduces ambiguity, aids in identifying and correcting gaps in strategy formulation, and fosters enhanced strategic analysis. Key benefits for academics are improved dialogue in the strategic management field and the suggestion that progress in the field requires the fundamentals of strategy formulation and classification to be considered more carefully. Finally, the formalisation of strategy can lead to the clear identification of new business services, which inform ICT investment decisions and shared-service prioritisation.
Abstract:
We consider the problem of structured classification, where the task is to predict a label y from an input x, and y has meaningful internal structure. Our framework includes supervised training of Markov random fields and weighted context-free grammars as special cases. We describe an algorithm that solves the large-margin optimization problem defined in [12], using an exponential-family (Gibbs distribution) representation of structured objects. The algorithm is efficient—even in cases where the number of labels y is exponential in size—provided that certain expectations under Gibbs distributions can be calculated efficiently. The method for structured labels relies on a more general result, specifically the application of exponentiated gradient updates [7, 8] to quadratic programs.
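The core primitive is the exponentiated gradient update on a probability vector: multiply by the exponentiated negative gradient, then renormalise onto the simplex. The Python sketch below applies it to a toy quadratic objective of our own, standing in for the paper's large-margin dual.

```python
import numpy as np

def eg_step(u, grad, eta=0.1):
    """One exponentiated-gradient step, keeping u on the probability simplex."""
    w = u * np.exp(-eta * grad)               # multiplicative update
    return w / w.sum()                        # renormalise

target = np.array([0.7, 0.1, 0.1, 0.1])
u = np.full(4, 0.25)                          # uniform start over 4 labels
for _ in range(200):
    u = eg_step(u, u - target)                # gradient of 0.5 * ||u - target||^2
print(np.round(u, 3))                         # converges towards the target
```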
Abstract:
This study determines whether the inclusion of low-cost airlines in a dataset of international and domestic airlines has an impact on the efficiency scores of so-called 'prestigious', purportedly 'efficient' airlines. While many airline studies concern efficiency, none has included a combination of international, domestic and budget airlines. The present study employs the nonparametric technique of data envelopment analysis (DEA) to investigate the technical efficiency of 53 airlines in 2006. The findings reveal that the majority of budget airlines are efficient relative to their more prestigious counterparts. Moreover, most airlines identified as inefficient are so largely because of the overutilization of non-flight assets.
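For readers unfamiliar with the technique, input-oriented, constant-returns-to-scale DEA solves one small linear program per airline (decision-making unit, DMU). The Python sketch below uses SciPy and made-up single-input, single-output data; it is a generic textbook CCR formulation, not the study's actual model or data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o. X: inputs x DMUs, Y: outputs x DMUs."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(n + 1); c[0] = 1.0           # minimise theta
    A_ub = np.zeros((m + s, n + 1))
    b_ub = np.zeros(m + s)
    A_ub[:m, 0] = -X[:, o]                    # sum_j lam_j * x_j <= theta * x_o
    A_ub[:m, 1:] = X
    A_ub[m:, 1:] = -Y                         # sum_j lam_j * y_j >= y_o
    b_ub[m:] = -Y[:, o]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

X = np.array([[4.0, 2.0, 3.0]])               # one input for three toy airlines
Y = np.array([[2.0, 2.0, 1.0]])               # one output
print([round(dea_efficiency(X, Y, o), 2) for o in range(3)])  # [0.5, 1.0, 0.33]
```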
Abstract:
Background: Strategies for cancer reduction and management are targeted at both individual and area levels. Area-level strategies require careful understanding of geographic differences in cancer incidence, in particular the association with factors such as socioeconomic status, ethnicity and accessibility. This study aimed to identify the complex interplay of area-level factors associated with high area-specific incidence of Australian priority cancers using a classification and regression tree (CART) approach.
Methods: Area-specific smoothed standardised incidence ratios were estimated for priority-area cancers across 478 statistical local areas in Queensland, Australia (1998-2007, n=186,075). For those cancers with significant spatial variation, CART models were used to identify whether area-level accessibility, socioeconomic status and ethnicity were associated with high area-specific incidence.
Results: The accessibility of a person's residence had the most consistent association with the risk of cancer diagnosis across the specific cancers. Many cancers were likely to have high incidence in more urban areas, although male lung cancer and cervical cancer tended to have high incidence in more remote areas. The impact of socioeconomic status and ethnicity on these associations differed by type of cancer.
Conclusions: These results highlight the complex interactions between accessibility, socioeconomic status and ethnicity in determining cancer incidence risk.
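The CART step can be sketched in a few lines. The following Python uses scikit-learn on synthetic data; the Queensland cancer data is not reproduced here, so the feature names and the toy "more accessible areas are high-incidence" rule are our placeholders.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# One row per statistical local area: accessibility, SES, ethnicity scores.
X = rng.uniform(size=(478, 3))
high_sir = (X[:, 0] > 0.6).astype(int)        # toy rule standing in for a smoothed SIR

tree = DecisionTreeClassifier(max_depth=2).fit(X, high_sir)
print(export_text(tree, feature_names=["accessibility", "SES", "ethnicity"]))
```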
Abstract:
Facial expression recognition (FER) algorithms mainly focus on classification into a small discrete set of emotions or representation of emotions using facial action units (AUs). Dimensional representation of emotions as continuous values in an arousal-valence space is relatively less investigated. It is not fully known whether fusion of geometric and texture features will result in better dimensional representation of spontaneous emotions. Moreover, the performance of many previously proposed approaches to dimensional representation has not been evaluated thoroughly on publicly available databases. To address these limitations, this paper presents an evaluation framework for dimensional representation of spontaneous facial expressions using texture and geometric features. SIFT, Gabor and LBP features are extracted around facial fiducial points and fused with FAP distance features. The CFS (correlation-based feature selection) algorithm is adopted for discriminative texture feature selection. Experimental results evaluated on the publicly accessible NVIE database demonstrate that fusion of texture and geometry does not lead to much better performance than texture alone, but does result in a significant performance improvement over geometry alone. LBP features perform best when fused with geometric features. Distributions of arousal and valence for different emotions obtained via the feature extraction process are compared with those obtained from subjective ground truth values assigned by viewers. Predicted valence is found to have a distribution more similar to ground truth than arousal in terms of covariance or Bhattacharyya distance, but it shows a greater distance between the means.
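The fusion step itself is feature-level concatenation followed by regression onto the arousal-valence plane. Below is a Python sketch with random placeholder features; the dimensions and the ridge regressor are our choices, not the paper's pipeline, which uses SIFT/Gabor/LBP descriptors with CFS selection.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
lbp = rng.normal(size=(200, 59))              # texture features around fiducial points
fap = rng.normal(size=(200, 20))              # geometric FAP distance features
fused = np.hstack([lbp, fap])                 # feature-level fusion

labels = rng.uniform(-1.0, 1.0, size=(200, 2))  # [arousal, valence] placeholders
model = Ridge(alpha=1.0).fit(fused, labels)
print(model.predict(fused[:3]).shape)         # (3, 2): arousal and valence per image
```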