969 results for Sophisticated voting


Relevance: 10.00%

Abstract:

This research compares Chinese HRM with Western HRM, particularly in the development of HR information systems (HRIS) and HR measurement systems and their relation to HR's involvement as a strategic partner in firms. The research uses a 3-stage model of HRIS (workforce profiling, business insight, and strategic driver) based on the studies of Irmer and Ellerby (2005) and Boudreau and Ramstad (2003) to compare the relative stages of development of Chinese and Western HRM. The quantitative aspect of the study comprises a survey of senior HR practitioners from 171 Chinese firms, whose data is compared with data from Irmer and Ellerby's study of Australian and U.S. HRM (2005) and Lawler et al.'s series of studies of U.S. firms (1995, 1998, 2001, 2004). The main result of the comparison is that Chinese HRM generally lags behind Western HRM. In particular, Chinese HR professionals allocate less time to strategic activities, and their roles are less strategic than those of Western HR professionals. The HR measurement systems of Chinese firms are more limited in function, and the HR information systems of Chinese companies are less automated and integrated. However, there is also evidence of a "two-speed" HR system in China: a small proportion of firms have highly sophisticated HR systems, while a much larger proportion than in the West have only the most basic HR information systems. This "two-speed" system is in part attributable to a split between the relatively advanced HR systems of large State Owned Enterprises and the basic systems that predominate in smaller, growing Local Private firms. The survey study is complemented by a series of interviews with senior Chinese HR practitioners, who provide richer insights into their experiences and the challenges they face in contemporary Chinese firms.

Relevance: 10.00%

Abstract:

Major construction sites in Australia have an above-average presence of ethnic minorities. These groups and the interfaces between them require effective management in order to meet the social imperatives of sustainable design and construction. A survey of 1,155 workers and 204 managers on Sydney construction sites found a significant level of normalisation of negative forms of cross-cultural interaction. Yet it was also found that anti-racism programs are not currently a management priority and that they generally lack sophisticated community relations aspects. This paper presents the results of a desktop study of leading global companies, within and outside the construction sector, which have won international awards and recognition for their cultural diversity strategies. A key insight is that the companies profiled see diversity as a key resource and an opportunity rather than a risk, one best harnessed through the long-term, ongoing commitment of senior management. These leading companies also recognise that cultural diversity strategies operate at three levels: a company's relationship with its own workforce; its relationship with its clients; and its relationships with the communities in which it operates. Properly managed, diversity can be a source of competitive advantage.

Relevance: 10.00%

Abstract:

Small animal fracture models have gained increasing interest in fracture healing studies. To achieve standardized and defined study conditions, various variables must be carefully controlled when designing fracture healing experiments in mice or rats. The strain, age and sex of the animals may influence the process of fracture healing. Furthermore, the choice of the fracture fixation technique depends on the questions addressed, whereby intra- and extramedullary implants as well as open and closed surgical approaches may be considered. During the last few years, a variety of different, highly sophisticated implants for fracture fixation in small animals have been developed. Rigid fixation with locking plates or external fixators results in predominantly intramembranous healing in both mice and rats. Locking plates, external fixators, intramedullary screws, the locking nail and the pin-clip device allow different degrees of stability, resulting in various amounts of endochondral and intramembranous healing. The use of common pins that do not provide rotational and axial stability during fracture stabilization should be discouraged in the future. Analyses should include at least biomechanical and histological evaluations, even if the focus of the study is the elucidation of the molecular mechanisms of fracture healing using the large available spectrum of antibodies and gene-targeted animals. This review discusses distinct requirements for the experimental setups as well as the advantages and pitfalls of the different fixation techniques in rats and mice.

Relevance: 10.00%

Abstract:

How do humans respond to their social context? This question is becoming increasingly urgent in a society where democracy requires that the citizens of a country help to decide upon its policy directions, and yet those citizens frequently have very little knowledge of the complex issues that these policies seek to address. Frequently, we find that humans make their decisions more with reference to their social setting than to the arguments of scientists, academics, and policy makers. It is broadly anticipated that the agent-based modelling (ABM) of human behaviour will make it possible to treat such social effects, but we take the position here that a more sophisticated treatment of context will be required in many such models. While notions such as historical context (where the past history of an agent might affect its later actions) and situational context (where the agent will choose a different action in a different situation) abound in ABM scenarios, we will discuss a case of a potentially changing context, where social effects can have a strong influence upon the perceptions of a group of subjects. In particular, we shall discuss a recently reported case where a biased worm (the on-screen graph of audience response) in a televised election debate led to significant distortions in the reports given by participants as to who won the debate (Davis et al. 2011). Thus, participants in a different social context drew different conclusions about the perceived winner of the same debate, with significant associated differences between the two groups as to whom they would vote for in the coming election. We extend this example to the problem of modelling the likely electoral responses of agents in the context of the climate change debate, and discuss the notion of interference between related questions that might be asked of an agent in a social simulation intended to simulate their likely responses. A modelling technology which could account for such strong social contextual effects would benefit regulatory bodies which need to navigate between multiple interests and concerns, and we shall present one viable avenue for constructing such a technology. A geometric approach will be presented, where the internal state of an agent is represented in a vector space, and their social context is naturally modelled as a set of basis states chosen with reference to the problem space.
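As a rough illustration of this geometric approach, the sketch below represents an agent's internal state as a unit vector and each social context as an orthonormal basis; response probabilities are squared projections onto the basis vectors, so the same state can yield different answer statistics in different contexts. All vectors, angles and numbers here are invented for illustration and are not taken from the paper.

```python
# Minimal sketch (illustrative values only): an agent's internal state is a
# unit vector; each social context is an orthonormal basis. The probability
# of a given response is the squared projection of the state onto that
# context's basis vector.
import numpy as np

def response_probabilities(state, context_basis):
    """Squared projections of a unit state vector onto a context's basis."""
    state = state / np.linalg.norm(state)
    return np.array([np.dot(b, state) ** 2 for b in context_basis])

# Internal state of an agent leaning slightly towards candidate A (invented).
state = np.array([0.8, 0.6])

# Context 1: a neutral basis ("A won" vs "B won").
neutral = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]

# Context 2: a basis rotated by a biased social setting (e.g. a biased worm).
theta = np.pi / 8
biased = [np.array([np.cos(theta), np.sin(theta)]),
          np.array([-np.sin(theta), np.cos(theta)])]

print(response_probabilities(state, neutral))  # [0.64 0.36]
print(response_probabilities(state, biased))   # probabilities shift with context
```

Because the two bases are rotated relative to each other, the framing of socially loaded questions changes the measured response distribution, which is the kind of interference effect between related questions that the abstract alludes to.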

Relevance: 10.00%

Abstract:

To sustain the ongoing rapid growth of video information, there is an emerging demand for sophisticated content-based video indexing systems. However, current video indexing solutions are still immature and lack any standard. This doctoral research is based on an integrated multi-modal approach to sports video indexing and retrieval. By combining specific features extractable from multiple audio-visual modalities, generic structure and specific events can be detected and classified. During browsing and retrieval, users benefit from the integration of high-level semantics and some descriptive mid-level features, such as the whistle and close-up views of players.
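As a rough sketch of the multi-modal idea, assuming hypothetical per-window classifier scores (the thesis's actual feature extractors, weights and thresholds are not reproduced here), a linear fusion of an audio whistle score and a visual close-up score can flag candidate event windows:

```python
# Simplified sketch of multi-modal fusion (all scores and thresholds are
# placeholders): audio evidence (a whistle) and visual evidence (a close-up
# view) are combined per time window to flag candidate events for indexing.
from dataclasses import dataclass

@dataclass
class Window:
    start: float           # seconds into the video
    whistle_score: float   # 0..1, from an assumed audio classifier
    closeup_score: float   # 0..1, from an assumed visual classifier

def detect_events(windows, w_audio=0.6, w_visual=0.4, threshold=0.7):
    """Fuse modality scores linearly and return windows likely to be events."""
    return [w for w in windows
            if w_audio * w.whistle_score + w_visual * w.closeup_score >= threshold]

windows = [Window(0.0, 0.1, 0.2), Window(5.0, 0.9, 0.8), Window(10.0, 0.4, 0.9)]
for event in detect_events(windows):
    print(f"candidate event at {event.start:.1f}s")
```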

Relevance: 10.00%

Abstract:

Language Modeling (LM) has been successfully applied to Information Retrieval (IR). However, most of the existing LM approaches rely only on term occurrences in documents, queries and document collections. In traditional unigram-based models, terms (or words) are usually considered to be independent. In some recent studies, dependence models have been proposed to incorporate term relationships into LM, so that links can be created between words in the same sentence, and term relationships (e.g. synonymy) can be used to expand the document model. In this study, we further extend this family of dependence models in two ways: (1) term relationships are used to expand the query model instead of the document model, so that the query expansion process can be implemented naturally; (2) we exploit more sophisticated inferential relationships extracted with Information Flow (IF). Information flow relationships are not simply pairwise term relationships, as used in previous studies, but hold between a set of terms and another term. They allow for context-dependent query expansion. Our experiments conducted on TREC collections show that we can obtain large and significant improvements with our approach. This study shows that LM is an appropriate framework in which to implement effective query expansion.
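The query-model expansion step might look like the following minimal sketch, where an invented information-flow table maps a set of query terms to inferred terms (in the study, IF relationships are extracted from corpora, not hand-written), and the expanded model interpolates the maximum-likelihood query model with the inferred terms:

```python
# Minimal sketch of query-model expansion: P'(w|Q) interpolates the
# maximum-likelihood query model with terms inferred from the whole query.
# The IF table and the mixing weight lam are illustrative assumptions.
from collections import Counter

# Hypothetical IF relationships: a set of query terms -> (inferred term, strength).
INFORMATION_FLOW = {
    frozenset(["climate", "policy"]): [("emissions", 0.6), ("carbon", 0.4)],
}

def expanded_query_model(query_terms, lam=0.7):
    """P'(w|Q) = lam * P_ml(w|Q) + (1 - lam) * P_if(w|Q)."""
    ml = Counter(query_terms)
    total = sum(ml.values())
    model = {w: lam * c / total for w, c in ml.items()}
    inferred = INFORMATION_FLOW.get(frozenset(query_terms), [])
    z = sum(s for _, s in inferred) or 1.0
    for term, strength in inferred:
        model[term] = model.get(term, 0.0) + (1 - lam) * strength / z
    return model

print(expanded_query_model(["climate", "policy"]))
# {'climate': 0.35, 'policy': 0.35, 'emissions': 0.18, 'carbon': 0.12}
```

Because the IF relationship is keyed on the whole term set rather than on individual words, the expansion is context-dependent: "policy" in a climate query can pull in different terms than "policy" in, say, a tax query.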

Relevance: 10.00%

Abstract:

Intuitively, any ‘bag of words’ approach in IR should benefit from taking term dependencies into account. Unfortunately, for years the results of exploiting such dependencies have been mixed or inconclusive. To improve the situation, this paper shows how the natural language properties of the target documents can be used to transform and enrich the term dependencies into more useful statistics. This is done in three steps. The term co-occurrence statistics of queries and documents are each represented by a Markov chain. The paper proves that such a chain is ergodic, and therefore its asymptotic behavior is unique, stationary, and independent of the initial state. Next, the stationary distribution is taken to model queries and documents, rather than their initial distributions. Finally, ranking is achieved following the customary language modeling paradigm. The main contribution of this paper is to argue why the asymptotic behavior of the document model is a better representation than just the document’s initial distribution. A secondary contribution is to investigate the practical application of this representation in case the queries become increasingly verbose. In the experiments (based on Lemur’s search engine substrate) the default query model was replaced by the stable distribution of the query. Just modeling the query this way already resulted in significant improvements over a standard language model baseline. The results were on a par with, or better than, more sophisticated algorithms that use fine-tuned parameters or extensive training. Moreover, the more verbose the query, the more effective the approach seems to become.
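The core computation can be sketched as follows, with toy co-occurrence counts standing in for real corpus statistics: build a row-stochastic term-transition matrix, then take its stationary distribution, computed here by power iteration, as the query model in place of the initial maximum-likelihood distribution. For an ergodic chain this distribution is unique and independent of the starting state, which is exactly the property the paper relies on.

```python
# Sketch: stationary distribution of a term-transition Markov chain used as
# the query model. Co-occurrence counts below are toy values, not real data.
import numpy as np

def stationary_distribution(P, tol=1e-10, max_iter=10_000):
    """Power iteration: iterate pi <- pi P until convergence."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    for _ in range(max_iter):
        nxt = pi @ P
        if np.abs(nxt - pi).sum() < tol:
            break
        pi = nxt
    return pi

# Toy co-occurrence counts among three query terms; rows are normalised
# into transition probabilities (all entries positive, so the chain is ergodic).
C = np.array([[4.0, 2.0, 1.0],
              [2.0, 3.0, 2.0],
              [1.0, 2.0, 5.0]])
P = C / C.sum(axis=1, keepdims=True)

print(stationary_distribution(P))  # stationary query model over the terms
```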

Relevance: 10.00%

Abstract:

With the growth and development of communication technology there is an increasing need for the use of communication interception technologies (CIT) in modern policing. Law enforcement agencies are faced with increasingly sophisticated and complex criminal networks that utilise modern communication technology as a basis for their criminal success. In particular, transnational organised crime (TOC) is a diverse and complicated arena, costing global society in excess of $3 trillion annually, a figure that continues to grow (Borger, 2007) as crime groups take advantage of disappearing borders and greater profit markets. However, whilst communication can be a critical success factor for criminal enterprise, it is also a key vulnerability. It is this vulnerability that the use of CIT, such as phone taps or email interception, can exploit. As such, law enforcement agencies now need a method and framework that allow them to utilise CIT to combat these crimes efficiently and successfully. This paper provides a review of current literature with the specific purpose of considering the effectiveness of CIT in the fight against TOC and the groundwork that must be laid in order for it to be fully exploited. In doing so, it fills an important gap in current research, focusing on the practical implementation of CIT as opposed to the traditional area of privacy concerns that arise with intrusive methods of investigation. The findings support the notion that CIT is an essential intelligence-gathering tool that has a strong place within the modern policing arena. The paper identifies that the most effective use of CIT is grounded within a proactive, intelligence-led framework and concludes that, in order for this to happen, Australian authorities and law enforcement agencies must re-evaluate and address the current legislative and operational constraints placed on the use of CIT and the culture that surrounds intelligence in policing.

Relevance: 10.00%

Abstract:

The Web has become a worldwide repository of information which individuals, companies, and organizations utilize to solve or address various information problems. Many of these Web users utilize automated agents to gather this information for them. Some assume that this approach represents a more sophisticated method of searching. However, there is little research investigating how Web agents search for online information. In this research, we first provide a classification for information agents based on stages of information gathering, gathering approaches, and agent architecture. We then examine an implementation of one of the resulting classifications in detail, investigating how agents search for information on Web search engines, including the session, query, term, duration and frequency of interactions. For this temporal study, we analyzed three data sets of queries and page views from agents interacting with the Excite and AltaVista search engines from 1997 to 2002, examining approximately 900,000 queries submitted by over 3,000 agents. Findings include: (1) agent sessions are extremely interactive, sometimes with hundreds of interactions per second; (2) agent queries are comparable to those of human searchers, with little use of query operators; (3) Web agents search for a relatively limited variety of information, with only 18% of the terms used being unique; and (4) the duration of agent-Web search engine interaction typically spans several hours. We discuss the implications for Web information agents and search engines.
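The kind of log analysis described might be sketched as below; the log format, field names and values are assumptions for illustration (the study worked from raw Excite and AltaVista transaction logs): per-agent interaction rates and the share of unique terms across queries.

```python
# Sketch of query-log analysis (log format and values are invented):
# per-agent interaction rate and the proportion of unique terms.
from collections import defaultdict

# (agent_id, unix_timestamp, query) tuples, as if parsed from a raw log.
log = [
    ("agent-1", 1000.00, "voting systems"),
    ("agent-1", 1000.01, "voting records"),
    ("agent-2", 1005.00, "election results"),
]

sessions = defaultdict(list)
for agent, ts, query in log:
    sessions[agent].append((ts, query))

all_terms, unique_terms = 0, set()
for agent, events in sessions.items():
    times = [ts for ts, _ in events]
    span = max(times) - min(times)
    rate = len(events) / span if span > 0 else float("inf")
    print(f"{agent}: {len(events)} interactions, {rate:.0f}/s")
    for _, query in events:
        for term in query.split():
            all_terms += 1
            unique_terms.add(term)

print(f"unique terms: {100 * len(unique_terms) / all_terms:.0f}% of {all_terms}")
```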

Relevance: 10.00%

Abstract:

Software forms an important part of the interface between citizens and their government. An increasing number of government functions are being performed, controlled, or delivered electronically. This software, like all language, is never value-neutral but must, to some extent, reflect the values of the coder and proprietor. The move that many governments are making towards e-governance, and the increasing reliance being placed upon software in government, necessitate a rethinking of the relationships of power and control that are embodied in software.

Relevance: 10.00%

Abstract:

Open pit mine operations are complex businesses that demand a constant assessment of risk. This is because the value of a mine project is typically influenced by many underlying economic and physical uncertainties, such as metal prices, metal grades, costs, schedules, quantities, and environmental issues, among others, which are not known with much certainty at the beginning of the project. Hence, mining projects present a considerable challenge to those involved in the associated investment decisions, such as the owners of the mine and other stakeholders. In general terms, when an option exists to acquire a new or operating mining project, the owners and stockholders of the mine project need to know the value of the mining project, which is the fundamental criterion for making the final decision about going ahead with the venture. However, obtaining the mine project's value is not an easy task. The reason is that sophisticated valuation and mine optimisation techniques, which combine advanced theories in geostatistics, statistics, engineering, economics and finance, among others, need to be used by the mine analyst or mine planner in order to assess and quantify the existing uncertainty and, consequently, the risk involved in the project investment. Furthermore, current valuation and mine optimisation techniques do not complement each other. That is, valuation techniques based on real options (RO) analysis assume an expected (constant) metal grade and ore tonnage during a specified period, while mine optimisation (MO) techniques assume expected (constant) metal prices and mining costs. These assumptions are not totally correct, since both sources of uncertainty, that of the orebody (metal grade and mineral reserves) and that of the future behaviour of metal prices and mining costs, have a great impact on the value of any mining project. Consequently, the key objective of this thesis is twofold. The first objective consists of analysing and understanding the main sources of uncertainty in an open pit mining project, such as orebody (in situ metal grade), mining cost and metal price uncertainties, and their effect on the final project value. The second objective consists of breaking down the wall of isolation between economic valuation and mine optimisation techniques in order to generate a novel open pit mine evaluation framework called the "Integrated Valuation/Optimisation Framework" (IVOF). One important characteristic of this new framework is that it incorporates the RO and MO valuation techniques into a single integrated process that quantifies and describes uncertainty and risk in a mine project evaluation process, giving a more realistic estimate of the project's value. To achieve this, novel and advanced engineering and econometric methods are used to integrate financial and geological uncertainty into dynamic risk forecasting measures. The proposed mine valuation/optimisation technique is then applied to a real gold disseminated open pit mine deposit to estimate its value in the face of orebody, mining cost and metal price uncertainties.
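As a heavily simplified illustration of one ingredient of such a framework, the sketch below simulates metal-price paths as geometric Brownian motion and values a project as the discounted cash flow averaged over paths, so price uncertainty appears as a distribution of project values rather than a single deterministic NPV. All parameters (price, drift, volatility, tonnage, grade, costs, discount rate) are invented, and the thesis's actual econometric and geostatistical models are far richer.

```python
# Sketch: Monte Carlo valuation under metal-price uncertainty.
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def simulate_price_paths(p0, mu, sigma, years, n_paths):
    """Annual GBM price paths: p_{t+1} = p_t * exp((mu - sigma^2/2) + sigma*Z)."""
    z = rng.standard_normal((n_paths, years))
    log_steps = (mu - 0.5 * sigma**2) + sigma * z
    return p0 * np.exp(np.cumsum(log_steps, axis=1))

def project_values(prices, ore_tonnes, grade, cost_per_tonne, rate=0.08):
    """Discounted cash flow per path; grade converts tonnes to metal units."""
    years = prices.shape[1]
    cash = prices * ore_tonnes * grade - ore_tonnes * cost_per_tonne
    discount = (1 + rate) ** -np.arange(1, years + 1)
    return cash @ discount

paths = simulate_price_paths(p0=1500.0, mu=0.03, sigma=0.25, years=10, n_paths=5000)
values = project_values(paths, ore_tonnes=2e5, grade=0.002, cost_per_tonne=2.0)
print(f"mean value: {values.mean():,.0f}, 5th pct: {np.percentile(values, 5):,.0f}")
```

A full integrated framework would additionally draw the grade and tonnage per path from a geostatistical orebody model instead of holding them constant, which is the gap between RO and MO techniques that the thesis targets.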

Relevance: 10.00%

Abstract:

This research deals with an innovative methodology for optimising the coal train scheduling problem. Based on our previously published work, generic solution techniques are developed by utilising a "toolbox" of well-solved standard scheduling problems. According to our analysis, the coal train scheduling problem can basically be modelled as a Blocking Parallel-Machine Job-Shop Scheduling (BPMJSS) problem with some minor constraints. To construct feasible train schedules, an innovative constructive algorithm called the SLEK algorithm is proposed. To optimise the train schedule, a three-stage hybrid algorithm called the SLEK-BIH-TS algorithm is developed, based on the definition of a sophisticated neighbourhood structure under the mechanism of the Best-Insertion-Heuristic (BIH) algorithm and the Tabu Search (TS) metaheuristic. A case study is performed for optimising a complex real-world coal rail system in Australia. A method to calculate a lower bound on the makespan is proposed to evaluate the results. The results indicate that the proposed methodology is promising for finding optimal or near-optimal feasible train timetables for a coal rail system under network and terminal capacity constraints.
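As a generic sketch of the best-insertion idea only (the actual SLEK-BIH-TS algorithm handles blocking constraints, parallel machines and a tabu search phase, none of which appear here, and the trains and times are invented), each train is inserted at the position in the current sequence that minimises the resulting makespan on a single shared track segment:

```python
# Sketch of a best-insertion heuristic on one shared track segment.
# Trains, release times and durations are illustrative.
def makespan(sequence, release, duration):
    """Completion time of the last train on one shared track segment."""
    t = 0.0
    for train in sequence:
        t = max(t, release[train]) + duration[train]
    return t

def best_insertion(trains, release, duration):
    """Insert each train at the position that minimises the makespan so far."""
    sequence = []
    for train in trains:
        candidates = [sequence[:i] + [train] + sequence[i:]
                      for i in range(len(sequence) + 1)]
        sequence = min(candidates,
                       key=lambda s: makespan(s, release, duration))
    return sequence

release = {"T1": 0.0, "T2": 1.0, "T3": 0.5}
duration = {"T1": 3.0, "T2": 2.0, "T3": 1.0}
order = best_insertion(["T1", "T2", "T3"], release, duration)
print(order, makespan(order, release, duration))
```

In the full algorithm, a tabu search phase would then perturb this constructed sequence through the sophisticated neighbourhood structure to escape local optima.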

Relevance: 10.00%

Abstract:

The convergence of Internet marketplaces and service-oriented architectures has spurred the growth of Web service ecosystems. This paper articulates a vision for Web service ecosystems, discusses early manifestations of this vision, and presents a unifying architecture to support the emergence of larger and more sophisticated ecosystems.

Relevance: 10.00%

Abstract:

The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient utilises 3D models of long bones with accurate geometric representation. 3D data is usually available from computed tomography (CT) scans of human cadavers, which generally represent the over-60 age group. Thus, despite the fact that half of the seriously injured population comes from the age group of 30 years and below, virtually no data exist from these younger age groups to inform the design of implants that optimally fit patients from these groups. Hence, relevant bone data from these age groups is required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted using small bones (tarsal bones) and parts of long bones. However, in order to use MRI effectively for the 3D reconstruction of human long bones, further validations using long bones and appropriate reference standards are required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that researchers are not trained to perform. Therefore, an accurate but relatively simple segmentation method is required for the segmentation of CT and MRI data. Furthermore, some of the limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher-field 3T MRI. However, a quantification of the signal-to-noise ratio (SNR) gain at the bone-soft tissue interface should be performed; this is not reported in the literature. As MRI scanning of long bones has very long scanning times, the acquired images are prone to motion artefacts caused by random movements of the subject's limbs. One of the artefacts observed is the step artefact, believed to result from random movements of the volunteer during a scan. This needs to be corrected before the models can be used for implant design. As the first aim, this study investigated two segmentation methods, intensity thresholding and Canny edge detection, as accurate but simple methods for segmenting MRI and CT data. The second aim was to investigate the usability of MRI as a radiation-free imaging alternative to CT for the reconstruction of 3D models of long bones. The third aim was to use 3T MRI to improve on the poor articular contrast and long scanning times of current MRI. The fourth and final aim was to minimise the step artefact using 3D modelling techniques. The segmentation methods were investigated using CT scans of five ovine femora. Single-level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was used by delineating the outer and inner contours of 2D images and then combining them to generate the 3D model. Models generated from these methods were compared to the reference standard generated using mechanical contact scans of the denuded bone. The second aim was achieved using CT and MRI scans of five ovine femora, segmented using the multilevel threshold method.
A surface geometric comparison was conducted between the CT-based, MRI-based and reference models. To quantitatively compare 1.5T images with 3T MRI images, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer. The images obtained using identical protocols were compared by means of the SNR and contrast-to-noise ratio (CNR) of muscle, bone marrow and bone. In order to correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner. The step was corrected using an aligning method based on the iterative closest point (ICP) algorithm. The present study demonstrated that the multi-threshold approach, in combination with the threshold selection method, can generate 3D models of long bones with an average deviation of 0.18 mm; the corresponding figure for the single-threshold method was 0.24 mm. There was a statistically significant difference between the accuracy of the models generated by the two methods. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI-based models exhibited an average deviation of 0.23 mm compared with the 0.18 mm average deviation of CT-based models; the difference was not statistically significant. 3T MRI improved the contrast at the bone-muscle interfaces of most anatomical regions of the femora and tibiae, potentially reducing the inaccuracies conferred by the poor contrast of the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, generating errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel thresholding segmentation, is able to produce 3D models of long bones with accurate geometric representations. The method is, therefore, a potential alternative to the current gold standard, CT imaging.
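A minimal sketch of the two segmentation approaches compared is given below, using scikit-image as a stand-in for whatever tooling was actually used and an Otsu threshold in place of the visually selected level; a synthetic slice substitutes for real CT/MRI data. Each method yields a binary bone mask for one slice, which would be stacked across slices to build the 3D model.

```python
# Sketch: intensity thresholding vs Canny edge detection on one slice.
# Parameters and the synthetic image are illustrative assumptions.
import numpy as np
from scipy import ndimage
from skimage import feature, filters, morphology

def threshold_segmentation(slice_2d):
    """Single-level Otsu threshold as a stand-in for a visually chosen level."""
    level = filters.threshold_otsu(slice_2d)
    return slice_2d > level

def canny_segmentation(slice_2d, sigma=2.0):
    """Canny edges closed into a contour and filled (outer boundary only here)."""
    edges = feature.canny(slice_2d, sigma=sigma)
    closed = morphology.binary_closing(edges, morphology.disk(3))
    return ndimage.binary_fill_holes(closed)

# Synthetic slice: a bright "bone" disc on a darker background, plus noise.
yy, xx = np.mgrid[:128, :128]
slice_2d = ((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2) * 0.8 + 0.1
slice_2d += 0.02 * np.random.default_rng(0).standard_normal(slice_2d.shape)

mask_a = threshold_segmentation(slice_2d)
mask_b = canny_segmentation(slice_2d)
print(mask_a.sum(), mask_b.sum())  # bone pixel counts from each method
```

For the multilevel variant described in the thesis, the same thresholding step would simply be run with separate levels for the proximal, diaphyseal and distal regions before the masks are combined.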

Relevance: 10.00%

Abstract:

Each financial year, concessions, benefits and incentives are delivered to taxpayers via the tax system. These concessions, benefits and incentives, referred to as tax expenditure, differ from direct expenditure because of their recurring fiscal impact without regular scrutiny through the federal budget process. There are approximately 270 different tax expenditures within the current tax regime, with total measured tax expenditure in the 2005-06 financial year estimated to be around $42.1 billion, increasing to $52.7 billion by 2009-10. Each year, new tax expenditures are introduced, while existing tax expenditures are modified and deleted. In recognition of some of the problems associated with tax expenditure, a Tax Expenditure Statement, as required by the Charter of Budget Honesty Act 1998, is produced annually by the Australian Federal Treasury. The Statement details the various expenditures and measures in the form of concessions, benefits and incentives provided to taxpayers by the Australian Government and calculates the tax expenditure in terms of revenue forgone. A similar approach to reporting tax expenditure, with such a report being a legal requirement, is followed by most OECD countries. The current Tax Expenditure Statement lists 270 tax expenditures and, where it is able to, reports on the estimated pecuniary value of those expenditures. Apart from the annual Tax Expenditure Statement, there is very little other scrutiny of Australia's federal tax expenditure program. While there have been various academic analyses of tax expenditure in Australia, when compared to the North American literature the Australian literature is still in its infancy. In fact, one academic author who has contributed to tax expenditure analysis recently noted that there is ‘remarkably little secondary literature which deals at any length with tax expenditures in the Australian context.’ Given this perceived gap in the secondary literature, this paper examines the fundamental concept of tax expenditure and considers the role it plays in the current tax regime as a whole, along with the effects of the introduction of new tax expenditures. In doing so, tax expenditure is contrasted with direct expenditure. A sophisticated and comprehensive body of work analysing tax expenditure versus direct expenditure has already stemmed from the US over the last three decades; in that sense, the title of this paper is rather misleading. However, given the lack of analysis in Australia, it is appropriate that this paper undertake a consideration of tax expenditure versus direct expenditure in an Australian context. Given this proposition, rather than purporting to undertake a comprehensive analysis of tax expenditure, which has already been done, this paper discusses the substantive considerations of any such analysis to enable further investigation of the tax expenditure regime, both as a whole and as individual tax expenditure initiatives. While none of the propositions in this paper are new in a ‘tax expenditure analysis’ sense, the debate is a relatively new contribution to the Australian literature on tax policy. Before the issues relating to tax expenditure can be determined, it is necessary to consider what is meant by ‘tax expenditure’. As such, part two of this paper defines ‘tax expenditure’. Part three determines the framework in which tax expenditure can be analysed.
It is suggested that tax expenditure must be evaluated within the framework of the design criteria of an income tax system, with the key features of equity, efficiency, and simplicity. Tax expenditure analysis can then be applied to deviations from the ideal tax base. Once it is established what is meant by tax expenditure, and the framework for evaluation is determined, it is possible to establish the substantive issues to be evaluated. This paper suggests that there are four broad areas worthy of investigation: economic efficiency; administrative efficiency; whether tax expenditure initiatives achieve their policy intent; and the impact on stakeholders. Given these areas of investigation, part four of this paper considers the issues relating to the economic efficiency of the tax expenditure regime, in particular the effect on resource allocation, the incentives for taxpayer behaviour, and the distortions created by tax expenditures. Part five examines the notion of administrative efficiency in light of the fact that most tax expenditures could simply be delivered as direct expenditures. Part six explores the notion of policy intent and considers the two questions that need to be asked: whether a tax expenditure initiative reaches its target group, and whether the financial incentives are appropriate. Part seven examines the impact on stakeholders. Finally, part eight considers the future of tax expenditure analysis in Australia.