861 results for "support vector machine"


Relevance:

20.00%

Publisher:

Abstract:

Non-state insurgent actors are too weak to compel powerful adversaries to their will, so they use violence to coerce. A principal objective is to grow and sustain violent resistance to the point that it either militarily challenges the state or, more commonly, generates unacceptable political costs. To survive, insurgents must shift popular support away from the state; to grow, they must secure it. State policies and actions perceived as illegitimate and oppressive by the insurgent constituency can generate these shifts. A promising insurgent strategy is therefore to attack states in ways that lead angry publics and leaders to discount the historically established risks and take flawed but popular decisions to use repressive measures. Such decisions may be enabled by a visceral belief in the power of coercion and by selective use of examples where robust measures have indeed suppressed resistance. To avoid such counterproductive behaviours, the cases of apparent 'successful repression' must be understood. This thesis tests whether robust state action is correlated with reduced support for insurgents, analyses the causal mechanisms of such shifts, and examines whether any such reduction results from compulsion or from coercion. The approach builds on prior research by the RAND Corporation, which analysed the 30 most recently resolved insurgencies worldwide to determine factors of counterinsurgent success. This new study first re-analyses their data at a finer resolution with new queries that investigate the relationship between repression and active support for insurgents. Having determined that, in general, repression does not correlate with decreased insurgent support, the study then analyses two cases in which the data suggest repression is likely to be reducing insurgent support: the PKK in Turkey, and the insurgency against the Vietnamese-sponsored regime after its ousting of the Khmer Rouge.
It applies 'structured, focused' case analysis with questions partly built from the insurgency model of Leites and Wolf, who are associated with the advocacy of robust US means in Vietnam. This is thus a test of 'most difficult' cases using a 'least likely' test model. Nevertheless, the findings refute the deterrence argument of 'iron fist' advocates. Robust approaches may physically prevent effective support of insurgents, but they do not coercively deter people from being willing to actively support the insurgency.

Relevance:

20.00%

Publisher:

Abstract:

Facebook is approaching ubiquity in the social habits and practice of many students. However, its use in higher education has been criticised (Maranto & Barton, 2010) because it can remove or blur academic boundaries. Despite these concerns, there is strong potential to use Facebook to support new students in communicating and interacting with each other (Cheung, Chiu, & Lee, 2010). This paper shows how Facebook can be used by teaching staff to communicate more effectively with students. Further, it shows how Facebook can provide a way to represent and include beginning students’ thoughts, opinions and feedback as an element of the learning design and as responsive feed-forward into lectures and tutorial activities. We demonstrate how an embedded social media strategy can be used to complement and enhance the first year curriculum experience by functioning as a transition device for student support and by activating Kift’s (2009) organising principles for first year curriculum design.

Relevance:

20.00%

Publisher:

Abstract:

Material for this paper comes from a report commissioned by the Department of Family Services, Aboriginal and Islander Affairs. The report is the result of a multi-strategy research project designed to assess the impact of gaming machines on the fundraising capacity of charitable and community organisations in Queensland. The study was conducted during the 1993 calendar year. The first Queensland gaming machine was commissioned at 11.30 am on 11 February 1992 at the Kedron Wavell Services Club in Brisbane. Eighteen more clubs followed that week. Six months later there were gaming machines in 335 clubs and 250 hotels and taverns, representing a state-wide total of 7,974 machines in operation. The 10,000th gaming machine was commissioned on 18 March 1993, and the 1,000th operational gaming machine site was opened on 18 February 1994.

Relevance:

20.00%

Publisher:

Abstract:

Server consolidation using virtualization technology has become an important technique for improving the energy efficiency of data centers, and virtual machine placement is the key step in server consolidation. In the past few years, many approaches to virtual machine placement have been proposed. However, existing approaches consider only the energy consumed by the physical machines in a data center, not the energy consumed by its communication network. The network's energy consumption is not trivial, and should therefore also be considered in virtual machine placement in order to make the data center more energy-efficient. In this paper, we propose a genetic algorithm for a new virtual machine placement problem that considers the energy consumption of both the servers and the communication network in the data center. Experimental results show that the genetic algorithm performs well on test problems of different kinds, and scales well as the problem size increases.
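The kind of genetic algorithm described, whose fitness combines server energy with network energy, can be sketched roughly as follows. The energy models, chromosome encoding (one PM index per VM) and GA parameters below are illustrative assumptions, not the paper's actual formulation:

```python
import random

def placement_energy(assignment, vm_cpu, pm_idle, pm_per_cpu, traffic, hops):
    """Total energy of a placement: server part plus network part (both simplified)."""
    # Server energy: idle cost for every PM in use plus a load-proportional cost.
    pms_used = set(assignment)
    load = {p: 0.0 for p in pms_used}
    for vm, pm in enumerate(assignment):
        load[pm] += vm_cpu[vm]
    server = sum(pm_idle + pm_per_cpu * load[p] for p in pms_used)
    # Network energy: traffic between VM pairs weighted by hop distance of their hosts.
    network = 0.0
    for (a, b), t in traffic.items():
        network += t * hops[assignment[a]][assignment[b]]
    return server + network

def evolve(n_vms, n_pms, fitness, pop_size=30, gens=100, pmut=0.1, seed=1):
    """Minimise `fitness` over placements with a simple elitist GA."""
    rng = random.Random(seed)
    pop = [[rng.randrange(n_pms) for _ in range(n_vms)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitism: keep the better half
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_vms)
            child = p1[:cut] + p2[cut:]           # one-point crossover
            if rng.random() < pmut:               # mutation: reassign one VM
                child[rng.randrange(n_vms)] = rng.randrange(n_pms)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)
```

On a toy instance, co-locating VMs that exchange heavy traffic lets the GA switch off both a physical machine and the network cost at once, which is exactly the trade-off a server-only model misses.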

Relevance:

20.00%

Publisher:

Abstract:

Developing supportive, authentic and collaborative partnerships between all partners is crucial to inclusive school culture. This chapter highlights understandings of collaboration within such a culture. It also draws attention to what is involved in achieving these relationships, and identifies associated characteristics. In addition, it describes how successful collegial teams can be developed and ways in which teachers can work as collaborative members of these teams for students with disabilities within inclusive educational settings.

Relevance:

20.00%

Publisher:

Abstract:

The present research examined the effects of occupational stress in psychiatric nursing on employee well-being using the full Job Strain Model. The Job Strain Model was assessed for its ability to predict employee well-being in terms of job satisfaction and mental health. The original Job Strain Model was expanded to include social support, based on previous research concerning the impact of social support on well-being. In the present study, both work and non-work support were assessed for their contribution to well-being. The results indicate that the full Job Strain Model significantly predicts job satisfaction and mental health in this sample of Australian psychiatric nurses. Furthermore, social support was shown to be an important component of the Job Strain Model.

Relevance:

20.00%

Publisher:

Abstract:

The security of power transfer across a given transmission link is typically a steady-state assessment. This paper develops tools to assess machine angle stability as affected by a combination of faults and uncertainty of wind power, using probability analysis. The paper elaborates on the development of the theoretical assessment tool and demonstrates its efficacy using a single machine infinite bus system.
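A probabilistic angle-stability assessment of this kind can be illustrated with a classical equal-area-criterion sketch for a single machine infinite bus system. The fault model (bolted fault at the machine terminals), the truncated-normal wind-power distribution, and all parameter values below are illustrative assumptions, not the paper's actual model:

```python
import math
import random

def critical_clearing_angle(pm, pmax):
    """Equal-area criterion for a bolted fault at the machine terminals
    (electrical power = 0 during the fault), classical SMIB model."""
    d0 = math.asin(pm / pmax)          # pre-fault rotor angle
    dmax = math.pi - d0                # maximum recoverable angle
    cos_dcr = (pm / pmax) * (dmax - d0) + math.cos(dmax)
    return math.acos(max(-1.0, min(1.0, cos_dcr)))

def stability_probability(clearing_angle, pmax, wind_mean, wind_sd,
                          n=10000, seed=7):
    """Monte Carlo estimate of P(stable) when the mechanical input power
    varies with wind output (truncated normal, an illustrative choice)."""
    rng = random.Random(seed)
    stable = 0
    for _ in range(n):
        pm = min(pmax * 0.99, max(0.01, rng.gauss(wind_mean, wind_sd)))
        if clearing_angle < critical_clearing_angle(pm, pmax):
            stable += 1
    return stable / n
```

The estimate turns the usual deterministic yes/no stability verdict into a probability over the wind-power distribution, which is the shift in perspective the paper describes.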

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a combined structure for using real, complex, and binary valued vectors for semantic representation. The theory, implementation, and application of this structure are all significant. For the theory underlying quantum interaction, it is important to develop a core set of mathematical operators that describe systems of information, just as core mathematical operators in quantum mechanics are used to describe the behavior of physical systems. The system described in this paper enables us to compare more traditional quantum mechanical models (which use complex state vectors), alongside more generalized quantum models that use real and binary vectors. The implementation of such a system presents fundamental computational challenges. For large and sometimes sparse datasets, the demands on time and space are different for real, complex, and binary vectors. To accommodate these demands, the Semantic Vectors package has been carefully adapted and can now switch between different number types comparatively seamlessly. This paper describes the key abstract operations in our semantic vector models, and describes the implementations for real, complex, and binary vectors. We also discuss some of the key questions that arise in the field of quantum interaction and informatics, explaining how the wide availability of modelling options for different number fields will help to investigate some of these questions.
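The abstract operations the paper describes (random vector generation, binding and similarity, with demands that differ by number type) can be illustrated with one common instantiation: real-valued vectors with circular-convolution binding (Plate's Holographic Reduced Representations) and binary vectors with XOR binding (Kanerva's Binary Spatter Codes). This is a generic sketch of those operations, not the Semantic Vectors package's actual implementation:

```python
import random

def rand_real(n, rng):
    """Random real vector, components ~ N(0, 1/n) (HRR convention)."""
    return [rng.gauss(0, 1 / n ** 0.5) for _ in range(n)]

def rand_binary(n, rng):
    """Random dense binary vector."""
    return [rng.randrange(2) for _ in range(n)]

def bind_real(a, b):
    """Binding for real vectors: circular convolution."""
    n = len(a)
    return [sum(a[k] * b[(i - k) % n] for k in range(n)) for i in range(n)]

def bind_binary(a, b):
    """Binding for binary vectors: elementwise XOR (self-inverse)."""
    return [x ^ y for x, y in zip(a, b)]

def sim_real(a, b):
    """Similarity for real vectors: cosine."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb)

def sim_binary(a, b):
    """Similarity for binary vectors: fraction of matching bits."""
    return sum(x == y for x, y in zip(a, b)) / len(a)
```

In both number fields, binding yields a vector nearly orthogonal to its inputs while remaining (approximately or exactly) invertible, which is what lets the same abstract model run over real, complex or binary ground fields.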

Relevance:

20.00%

Publisher:

Abstract:

In urban residential environments in Australia and other developed countries, Internet access is on the verge of becoming a ubiquitous utility like gas or electricity. From an urban sociology and community informatics perspective, this article discusses new emerging social formations of urban residents that are based on networked individualism and the potential of Internet-based systems to support them. It proposes that one of the main reasons for the disappearance or nonexistence of urban residential communities is a lack of appropriate opportunities and instruments to encourage and support local interaction in urban neighborhoods. The article challenges the view that a mere reappropriation of applications used to support dispersed virtual communities is adequate to meet the place and proximity-based design requirements that community networks in urban neighborhoods pose. It argues that the key factors influencing the successful design and uptake of interactive systems to support social networks in urban neighborhoods include the swarming social behavior of urban dwellers; the dynamics of their existing communicative ecology; and the serendipitous, voluntary, and place-based quality of interaction between residents on the basis of choice, like-mindedness, mutual interest and support needs. Drawing on an analysis of these factors, the conceptual design framework of a prototype system — the urban tribe incubator — is presented.

Relevance:

20.00%

Publisher:

Abstract:

Broad, early definitions of sustainable development have caused confusion and hesitation among local authorities and planning professionals. This confusion has arisen because loosely defined principles of sustainable development have been employed when setting policies and planning projects, and when gauging the efficiencies of these policies in the light of designated sustainability goals. The question of how this theory-rhetoric-practice gap can be filled is the main focus of this chapter. It examines the triple bottom line approach (one of the sustainability accounting approaches widely employed by governmental organisations) and the applicability of this approach to sustainable urban development. The chapter introduces the ‘Integrated Land Use and Transportation Indexing Model’, which incorporates triple bottom line considerations with environmental impact assessment techniques via a geographic information systems-based decision support system. This model helps decision-makers select policy options according to their economic, environmental and social impacts. Its main purpose is to provide valuable knowledge about the spatial dimensions of sustainable development, and to provide fine-detail outputs on the possible impacts of urban development proposals on sustainability levels. In order to embrace sustainable urban development policy considerations, the model is sensitive to the relationship between urban form, travel patterns and socio-economic attributes. Finally, the model is useful in picturing the holistic state of urban settings in terms of their sustainability levels, and in assessing the degree of compatibility of selected scenarios with the desired sustainable urban future.
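A composite triple-bottom-line index of the kind such a model computes can be sketched as a weighted sum of normalised indicator scores per scenario. The weights, the indicator names and the higher-is-better convention below are illustrative assumptions, not the chapter's actual inputs:

```python
def normalise(values, benefit=True):
    """Min-max normalise indicator values to [0, 1]; flip for cost-type indicators."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [1.0] * len(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return scaled if benefit else [1 - s for s in scaled]

def tbl_index(scenarios, weights):
    """Weighted triple-bottom-line composite score per scenario.
    scenarios: {name: {'economic': x, 'environmental': y, 'social': z}}
    Higher raw values are treated as better (an illustrative convention)."""
    names = list(scenarios)
    dims = ['economic', 'environmental', 'social']
    cols = {d: normalise([scenarios[n][d] for n in names]) for d in dims}
    return {n: sum(weights[d] * cols[d][i] for d in dims)
            for i, n in enumerate(names)}
```

Ranking scenarios by such a score is the "selecting policy options according to their economic, environmental and social impacts" step; the spatial, GIS-based detail of the real model is of course not captured here.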

Relevance:

20.00%

Publisher:

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors grouped by their regulatory role and corresponding promoter strength.
Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Our study of E. coli σ70 promoters found support for this (alternative) hypothesis at the 0.1 significance level: the t-tests returned a p-value of 0.072, although this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available. Several of the potentially useful features identified proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships, and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied; this core set potentially identifies basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques, which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
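The spectrum kernel at the core of the SVM study computes the inner product of k-mer count vectors of two sequences. A minimal sketch of that feature map and kernel follows; the thesis's additional position feature attributes and classifier settings are not reproduced here:

```python
from itertools import product

def spectrum(seq, k, alphabet='ACGT'):
    """k-mer count vector of a DNA sequence (the k-spectrum feature map)."""
    idx = {''.join(p): i for i, p in enumerate(product(alphabet, repeat=k))}
    v = [0] * len(idx)
    for i in range(len(seq) - k + 1):
        v[idx[seq[i:i + k]]] += 1
    return v

def spectrum_kernel(x, y, k=3):
    """Spectrum kernel value: inner product of the two k-spectra."""
    return sum(a * b for a, b in zip(spectrum(x, k), spectrum(y, k)))
```

The Gram matrix of pairwise kernel values over a set of candidate binding sites can then be handed to any kernelised SVM implementation (for example, one accepting a precomputed kernel matrix) to separate true sites from false positives.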

Relevance:

20.00%

Publisher:

Abstract:

Background Heart failure (HF) remains a condition with high morbidity and mortality. We tested a telephone support strategy to reduce major events in rural and remote Australians with HF, who have limited healthcare access. Telephone support comprised an interactive telecommunication software tool (TeleWatch) with follow-up by trained cardiac nurses. Methods Patients with a general practice (GP) diagnosis of HF were randomised to usual care (UC) or UC and telephone support intervention (UC+I) using a cluster design involving 143 GPs throughout Australia. Patients were followed for 12 months. The primary end-point was the Packer clinical composite score. Secondary end-points included hospitalisation for any cause, death or hospitalisation, as well as HF hospitalisation. Results Four hundred and five patients were randomised into CHAT. Patients were well matched at baseline for key demographic variables. The primary end-point of the Packer Score was not different between the two groups (P=0.98), although more patients improved with UC+I. There were fewer patients hospitalised for any cause (74 versus 114, adjusted HR 0.67 [95% CI 0.50-0.89], p=0.006) and who died or were hospitalised (89 versus 124, adjusted HR 0.70 [95% CI 0.53 – 0.92], p=0.011), in the UC+I vs UC group. HF hospitalisations were reduced with UC+I (23 versus 35, adjusted HR 0.81 [95% CI 0.44 – 1.38]), although this was not significant (p=0.43). There were 16 deaths in the UC group and 17 in the UC+I group (p=0.43). Conclusions Although no difference was observed in the primary end-point of CHAT (Packer composite score), UC+I significantly reduced the number of HF patients hospitalised amongst a rural and remote cohort. These data suggest that telephone support may be an efficacious approach to improve clinical outcomes in rural and remote HF patients.

Relevance:

20.00%

Publisher:

Abstract:

Background/objectives This study estimates the economic outcomes of a nutrition intervention to at-risk patients compared with standard care in the prevention of pressure ulcer. Subjects/methods Statistical models were developed to predict ‘cases of pressure ulcer avoided’, ‘number of bed days gained’ and ‘change to economic costs’ in public hospitals in 2002–2003 in Queensland, Australia. Input parameters were specified and appropriate probability distributions fitted for: number of discharges per annum; incidence rate for pressure ulcer; independent effect of pressure ulcer on length of stay; cost of a bed day; change in risk of developing a pressure ulcer associated with nutrition support; and annual cost of the provision of a nutrition support intervention for at-risk patients. A total of 1,000 random re-samples were made and the results expressed as output probability distributions. Results The model predicts a mean of 2,896 (s.d. 632) cases of pressure ulcer avoided, 12,397 (s.d. 4,491) bed days released, and a corresponding mean economic cost saving of €2,869,526 (s.d. €2,078,715) with a nutrition support intervention, compared with standard care. Conclusion Nutrition intervention is predicted to be a cost-effective approach in the prevention of pressure ulcer in at-risk patients.
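The re-sampling procedure described, drawing each input parameter from a fitted distribution and propagating it through the cost model, can be sketched as follows. All distributions and numeric values below are illustrative placeholders, not the study's fitted inputs:

```python
import random

def simulate(n_samples=1000, seed=42):
    """Probabilistic model of a nutrition intervention vs standard care.
    Every distribution and constant here is an illustrative placeholder."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_samples):
        discharges = rng.gauss(340000, 10000)     # annual at-risk discharges
        incidence = rng.uniform(0.04, 0.06)       # pressure ulcer incidence rate
        risk_reduction = rng.uniform(0.10, 0.25)  # effect of nutrition support
        extra_los = rng.uniform(3.5, 5.0)         # extra bed days per ulcer case
        bed_day_cost = rng.gauss(450, 50)         # cost of one bed day
        programme_cost = 1_500_000                # annual intervention cost
        cases_avoided = discharges * incidence * risk_reduction
        bed_days = cases_avoided * extra_los
        net_saving = bed_days * bed_day_cost - programme_cost
        results.append((cases_avoided, bed_days, net_saving))
    return results
```

Summarising the 1,000 sampled triples (mean and standard deviation of each output) yields output probability distributions of the same shape as those the study reports.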

Relevance:

20.00%

Publisher:

Abstract:

Typical flow fields in a stormwater gross pollutant trap (GPT) with blocked retaining screens were experimentally captured and visualised. Particle image velocimetry (PIV) software was used to capture the flow field data by tracking neutrally buoyant particles with a high speed camera. A technique was developed to apply the Image Based Flow Visualization (IBFV) algorithm to the experimental raw dataset generated by the PIV software. The dataset consisted of scattered 2D point velocity vectors and the IBFV visualisation facilitates flow feature characterisation within the GPT. The flow features played a pivotal role in understanding gross pollutant capture and retention within the GPT. It was found that the IBFV animations revealed otherwise unnoticed flow features and experimental artefacts. For example, a circular tracer marker in the IBFV program visually highlighted streamlines to investigate specific areas and identify the flow features within the GPT.
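Applying a texture-advection scheme such as IBFV to scattered PIV point vectors requires first resampling them onto a regular grid. A minimal inverse-distance-weighting sketch of that resampling step follows; this is an assumed stand-in, and the thesis's actual technique for feeding the PIV dataset into the IBFV algorithm may differ:

```python
def grid_vectors(points, nx, ny, bounds, power=2.0):
    """Interpolate scattered 2D velocity samples (x, y, u, v) onto a regular
    nx-by-ny grid with inverse-distance weighting, so a texture-advection
    scheme such as IBFV can consume the result. bounds = (x0, x1, y0, y1)."""
    x0, x1, y0, y1 = bounds
    grid = []
    for j in range(ny):
        row = []
        gy = y0 + (y1 - y0) * j / (ny - 1)
        for i in range(nx):
            gx = x0 + (x1 - x0) * i / (nx - 1)
            wsum = usum = vsum = 0.0
            hit = None
            for (px, py, u, v) in points:
                d2 = (gx - px) ** 2 + (gy - py) ** 2
                if d2 == 0:          # grid node coincides with a PIV sample
                    hit = (u, v)
                    break
                w = d2 ** (-power / 2)
                wsum += w
                usum += w * u
                vsum += w * v
            row.append(hit if hit else (usum / wsum, vsum / wsum))
        grid.append(row)
    return grid
```

With the velocity field on a regular grid, each animation frame of IBFV then advects and blends a noise texture along these vectors, which is what makes streamlines and recirculation zones in the GPT visually apparent.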