902 results for reachable sets
Abstract:
The complex transition from convict to free labour influenced state intervention in the employment relationship, and initiated the first minimum labour standards in Australia in 1828. Since then, two principal sets of tensions have affected the enforcement of such standards: tensions between government and employers, and tensions between the major political parties over industrial and economic issues. This article argues that these tensions have resulted in a sustained legacy affecting minimum labour standards’ enforcement in Australia. The article outlines broad historical developments and contexts of minimum labour standards’ enforcement in Australia since 1828, with more contemporary exploration focusing specifically on enforcement practices and policies in the Australian federal industrial relations jurisdiction. Current enforcement practices are an outcome of this volatile history, and past influences remain strong.
Abstract:
The briefly resurrected Marxism Today (1998), edited by Martin Jacques, sets out to deal with perceived failures of the 'Blair project' (Jacques, 1998: 2). Jacques opens the issue by reaffirming that Blair, which is to say New Labour, is the successful creation of the 'New Left' projects, the first of which began in the late fifties and early sixties in both Britain and the US, and which were vigorously revived in the late 1980s. However, the most comprehensive debate is largely contained in the first three articles, written by Hobsbawm, Hall, and Mulgan, insofar as the broadest defining parameters of Third Way 'values' are addressed by these writers.
Abstract:
Log-linear and maximum-margin models are two commonly-used methods in supervised machine learning, and are frequently used in structured prediction problems. Efficient learning of parameters in these models is therefore an important problem, and becomes a key factor when learning from very large data sets. This paper describes exponentiated gradient (EG) algorithms for training such models, where EG updates are applied to the convex dual of either the log-linear or max-margin objective function; the dual in both the log-linear and max-margin cases corresponds to minimizing a convex function with simplex constraints. We study both batch and online variants of the algorithm, and provide rates of convergence for both cases. In the max-margin case, O(1/ε) EG updates are required to reach a given accuracy ε in the dual; in contrast, for log-linear models only O(log(1/ε)) updates are required. For both the max-margin and log-linear cases, our bounds suggest that the online EG algorithm requires a factor of n less computation to reach a desired accuracy than the batch EG algorithm, where n is the number of training examples. Our experiments confirm that the online algorithms are much faster than the batch algorithms in practice. We describe how the EG updates factor in a convenient way for structured prediction problems, allowing the algorithms to be efficiently applied to problems such as sequence learning or natural language parsing. We perform extensive evaluation of the algorithms, comparing them to L-BFGS and stochastic gradient descent for log-linear models, and to SVM-Struct for max-margin models. The algorithms are applied to a multi-class problem as well as to a more complex large-scale parsing task. In all these settings, the EG algorithms presented here outperform the other methods.
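The core step shared by both duals described above is a multiplicative update followed by renormalization, which keeps the dual variables on the probability simplex. A minimal sketch of that generic exponentiated-gradient step (not the paper's structured, dual-specific implementation; the step size eta is an illustrative assumption):

```python
import numpy as np

def eg_update(w, grad, eta=0.5):
    """One exponentiated-gradient step: scale each coordinate by
    exp(-eta * gradient), then renormalize so w stays a probability
    distribution (the simplex constraint)."""
    v = w * np.exp(-eta * grad)
    return v / v.sum()
```

Because the update is multiplicative, coordinates never leave the simplex, which is why EG is a natural fit for the simplex-constrained duals of both the log-linear and max-margin objectives.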
Abstract:
Online learning algorithms have recently risen to prominence due to their strong theoretical guarantees and an increasing number of practical applications for large-scale data analysis problems. In this paper, we analyze a class of online learning algorithms based on fixed potentials and nonlinearized losses, which yields algorithms with implicit update rules. We show how to efficiently compute these updates, and we prove regret bounds for the algorithms. We apply our formulation to several special cases where our approach has benefits over existing online learning methods. In particular, we provide improved algorithms and bounds for the online metric learning problem, and show improved robustness for online linear prediction problems. Results over a variety of data sets demonstrate the advantages of our framework.
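To make "nonlinearized losses with implicit update rules" concrete: instead of taking a gradient step on a linearized loss, an implicit update solves the proximal subproblem exactly. A minimal sketch for online least squares, where the closed form below is an assumption derived from the first-order condition of that subproblem (not the paper's general fixed-potential formulation):

```python
import numpy as np

def implicit_step(w, x, y, eta):
    """Implicit (proximal) update for squared loss:
    w_next = argmin_v 0.5*||v - w||^2 + (eta/2)*(v @ x - y)**2.
    Setting the gradient in v to zero and solving exactly yields this
    closed form, rather than the usual linearized gradient step."""
    residual = w @ x - y
    return w - (eta * residual / (1.0 + eta * (x @ x))) * x
```

The denominator 1 + eta * ||x||^2 automatically damps large steps, which is one source of the robustness advantage such implicit updates enjoy over explicit gradient steps.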
Abstract:
A letter in response to an article by David Rojas-Rueda, Audrey de Nazelle, Marko Tainio, Mark J Nieuwenhuijsen, The health risks and benefits of cycling in urban environments compared with car use: health impact assessment study. BMJ 2011;343:doi:10.1136/bmj.d4521 (Published 4 August 2011). This paper sets out to compare the health benefits of the Bicing scheme (Barcelona's public bicycle share scheme) with possible risks associated with increased bicycle riding. The key variables used by the researchers include physical activity, exposure to air pollution and road traffic injury. The authors rightly identify that although traffic congestion is often a major motivator behind the establishment of public bicycle share schemes (PBSS), the health benefits may well be the largest single benefit of such schemes. Certainly, PBSS appear to be one of the most effective methods of increasing the number of bicycle trips across a population, providing additional transport options and improving awareness of the possibilities bicycles offer urban transport systems. Overall, the paper is a useful addition to the literature, in that it has attempted to assess the health benefits of a large-scale PBSS and weighed these against potential risks related to cyclists' exposure to air pollution and road traffic injuries. Unfortunately, a fundamentally flawed assumption about the proportion of Bicing trips replacing car journeys invalidates the results of this paper. A future paper with up-to-date data would make a significant contribution to this emerging area within the field of sustainable transport.
Abstract:
The adoption of IT Governance (ITG) continues to be an important topic for research. Many researchers have focused their attention on how these practices are currently being implemented across diverse areas and industries. The literature shows that the majority of these studies have been based on industries and organizations in developed countries. Very few studies look specifically at the context of a developing country, and there seems to be a particular lack of research identifying the barriers or inhibitors to IT Governance adoption in an emerging yet still developing Asian country. This research sets out to justify, substantiate and improve an a priori model developed to study the barriers to the adoption of ITG practice, using qualitative data obtained through a series of semi-structured interviews conducted with organizations in Malaysia.
Abstract:
Uncontrolled fibroblast growth factor (FGF) signaling can lead to human diseases, necessitating multiple layers of self-regulatory control mechanisms to keep its activity in check. Herein, we demonstrate that FGF9 and FGF20 ligands undergo a reversible homodimerization, occluding their key receptor binding sites. To test the role of dimerization in ligand autoinhibition, we introduced structure-based mutations into the dimer interfaces of FGF9 and FGF20. The mutations weakened the ability of the ligands to dimerize, effectively increasing the concentrations of monomeric ligands capable of binding and activating their cognate FGF receptor in vitro and in living cells. Interestingly, the monomeric ligands exhibit reduced heparin binding, resulting in their increased radii of heparan sulfate-dependent diffusion and biologic action, as evidenced by the wider dilation area of ex vivo lung cultures in response to implanted mutant FGF9-loaded beads. Hence, our data demonstrate that homodimerization autoregulates FGF9 and FGF20's receptor binding and concentration gradients in the extracellular matrix. Our study is the first to implicate ligand dimerization as an autoregulatory mechanism for growth factor bioactivity and sets the stage for engineering modified FGF9 subfamily ligands, with desired activity for use in both basic and translational research.
Abstract:
This paper presents a method of spatial sampling based on stratification by Local Moran's I_i calculated using auxiliary information. The sampling technique is compared to other design-based approaches, including simple random sampling, systematic sampling on a regular grid, conditional Latin Hypercube sampling and stratified sampling based on auxiliary information, and is illustrated using two different spatial data sets. Each of the samples for the two data sets is interpolated using regression kriging to form a geostatistical map for its respective area. The proposed technique is shown to be competitive in reproducing specific areas of interest with high accuracy.
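For reference, the stratification statistic has a compact standard form: with z the standardized attribute, Local Moran's I_i = z_i * sum_j W_ij * z_j. A minimal sketch (the spatial weight matrix W and the lack of row standardization are simplifying assumptions for illustration):

```python
import numpy as np

def local_morans_i(x, W):
    """Local Moran's I_i = z_i * sum_j W_ij * z_j, where z is the
    standardized attribute vector and W a spatial weight matrix.
    Positive I_i flags locations whose values resemble their
    neighbours' (spatial clusters), negative I_i flags outliers."""
    z = (x - x.mean()) / x.std()
    return z * (W @ z)
```

Stratifying by I_i thus concentrates sampling effort according to local spatial association rather than the raw attribute values alone.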
Abstract:
Project management in the construction industry involves coordination of many tasks and individuals, affected by complexity and uncertainty, which increases the need for efficient cooperation. Procurement is crucial since it sets the basis for cooperation between clients and contractors. This is true whether the project is local, regional or global in scope. Traditionally, procurement procedures are competitive, resulting in conflicts, adversarial relationships and less desirable project results. The purpose of this paper is to propose and empirically test an alternative procurement model based on cooperative procurement procedures that facilitates cooperation between clients and contractors in construction projects. The model is based on four multi-item constructs – incentive-based compensation, limited bidding options, partner selection and cooperation. Based on a sample of 87 client organisations, the model was empirically tested and exhibited strong support, including content, nomological, convergent and discriminant validity, as well as reliability. Our findings indicate that partner selection based on task related attributes mediates the relationship between two important pre-selection processes (incentive-based compensation and limited bid invitation) and preferred outcome of cooperation. The contribution of the paper is identifying valid and reliable measurement constructs and confirming a unique sequential order for achieving cooperation. Moreover, the findings are applicable for many types of construction projects because of the similarities in the construction industry worldwide.
Abstract:
Skeletal muscle from strength- and endurance-trained individuals represents diverse adaptive states. In this regard, AMPK-PGC-1α signaling mediates several adaptations to endurance training, while up-regulation of the Akt-TSC2-mTOR pathway may underlie increased protein synthesis after resistance exercise. We determined the effect of prior training history on signaling responses in seven strength-trained and six endurance-trained males who undertook 1 h cycling at 70% VO2peak or eight sets of five maximal repetitions of isokinetic leg extensions. Muscle biopsies were taken at rest, immediately and 3 h postexercise. AMPK phosphorylation increased after cycling in strength-trained (54%; P<0.05) but not endurance-trained subjects. Conversely, AMPK was elevated after resistance exercise in endurance- (114%; P<0.05), but not strength-trained subjects. Akt phosphorylation increased in endurance- (50%; P<0.05), but not strength-trained subjects after cycling but was unchanged in either group after resistance exercise. TSC2 phosphorylation was decreased (47%; P<0.05) in endurance-trained subjects following resistance exercise, but cycling had little effect on the phosphorylation state of this protein in either group. p70S6K phosphorylation increased in endurance- (118%; P<0.05), but not strength-trained subjects after resistance exercise, but was similar to rest in both groups after cycling. Similarly, phosphorylation of S6 protein, a substrate for p70S6K, was increased immediately following resistance exercise in endurance- (129%; P<0.05), but not strength-trained subjects. In conclusion, a degree of “response plasticity” is conserved at opposite ends of the endurance-hypertrophic adaptation continuum. Moreover, prior training attenuates the exercise-specific signaling responses involved in single-mode adaptations to training.
Abstract:
Based on molecular dynamics simulations, the plastic deformation mechanisms associated with the zigzag stress curves of perfect and surface-defected copper nanowires under uniaxial tension are studied. Our previous study found that a surface defect exerts a larger influence than a centro-plane defect, and that the 45° surface defect is the most influential surface defect. Hence, in this paper, a nanowire with a 45° surface defect is chosen to investigate the defect's effect on the plastic deformation mechanism of nanowires. We find that during the plastic deformation of both perfect and defected nanowires, decreasing regions of the stress curve are accompanied by stacking fault generation and migration activities, whereas during stress increases the structure of the nanowire appears almost unchanged. We also observe that surface defects have an obvious influence on the nanowire's plastic deformation mechanisms. In particular, only two sets of slip planes are found to be active, and twins are also observed in the defected nanowire.
Abstract:
In this paper, we propose a search-based approach to joining two tables in the absence of clean join attributes. Non-structured documents from the web are used to express the correlations between a given query and a reference list. A major challenge in implementing this approach is how to efficiently determine the number of times, and the locations at which, each clean reference from the reference list is approximately mentioned in the retrieved documents. We formalize this as the Approximate Membership Localization (AML) problem and propose an efficient partial pruning algorithm to solve it. A study using real-world data sets demonstrates the effectiveness of our search-based approach and the efficiency of our AML algorithm.
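The AML primitive the abstract formalizes — locating where a reference string is approximately mentioned in a document — can be illustrated with a brute-force sliding-window matcher. This is a naive baseline for clarity only, not the paper's partial pruning algorithm; the 0.8 similarity threshold is an arbitrary assumption:

```python
import difflib

def approximate_mentions(reference, document, threshold=0.8):
    """Slide a window the length of the reference (in tokens) over the
    document and report (token index, similarity) for each window whose
    character-level similarity ratio clears the threshold."""
    ref = reference.lower()
    doc_tokens = document.lower().split()
    n = len(ref.split())
    hits = []
    for i in range(len(doc_tokens) - n + 1):
        window = " ".join(doc_tokens[i:i + n])
        score = difflib.SequenceMatcher(None, ref, window).ratio()
        if score >= threshold:
            hits.append((i, score))
    return hits
```

Scoring every window for every reference is exactly the quadratic cost that a pruning algorithm such as the one proposed in the paper is designed to avoid.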
Abstract:
While in many travel situations there is an almost limitless range of available destinations, travellers will usually only actively consider two to six in their decision set. One of the greatest challenges facing destination marketers is positioning their destination, against the myriad of competing places that offer similar features, into consumer decision sets. Since positioning requires a narrow focus, marketing communications must present a succinct and meaningful proposition, the selection of which is often problematic for destination marketing organisations (DMO), which deal with a diverse and often eclectic range of attributes in addition to self-interested and demanding stakeholders who have interests in different market segments. This paper reports the application of two qualitative techniques used to explore the range of cognitive attributes, consequences and personal values that represent potential positioning opportunities in the context of short break holidays. The Repertory Test is an effective technique for understanding the salient attributes used by a traveller to differentiate destinations, and Laddering Analysis enables the researcher to explore the smaller set of consequences and personal values guiding such decision making. A key finding of the research was that while individuals might vary in their repertoire of salient attributes, there was a commonality of shared consequences and values. This has important implications for DMOs, since a brand positioning theme that is based on a value will subsume multiple and diverse attributes. It is posited that such a theme will appeal to a broader range of travellers, as well as appease a greater number of destination stakeholders, than would an attribute based theme.
Abstract:
Computational journalism involves the application of software and technologies to the activities of journalism, and it draws from the fields of computer science, the social sciences, and media and communications. New technologies may enhance the traditional aims of journalism, or may initiate greater interaction between journalists and information and communication technology (ICT) specialists. The enhanced use of computing in news production is related in particular to three factors: larger government data sets becoming more widely available; the increasingly sophisticated and ubiquitous nature of software; and the developing digital economy. Drawing upon international examples, this paper argues that computational journalism techniques may provide new foundations for original investigative journalism and increase the scope for new forms of interaction with readers. Computational journalism provides a major opportunity to enhance the delivery of original investigative journalism, and to attract and retain readers online.