397 results for Ambiguity success rate


Relevance:

20.00%

Publisher:

Abstract:

Shared services are a prominent organizational arrangement, particularly for support functions. The success (or failure) of shared services is a critical concern, as the move to shared services can entail large-scale investment and fundamental organizational change. The Higher Education (HE) sector is particularly well poised to benefit from shared services, as there is a need to improve organizational performance and strong potential for sharing. Through a multiple case study of shared services experiences in HE, this study identifies ten important antecedents of shared services success: (1) Understanding of shared services; (2) Organizational environment; (3) Top management support; (4) IT environment; (5) Governance; (6) Process-centric view; (7) Implementation strategy; (8) Project management; (9) Change management; and (10) Communication. The study then develops a preliminary model of shared services success that addresses the interdependencies between the success factors. As the first empirical success model for shared services, it provides valuable guidance for practice and future research.

Relevance: 20.00%

Abstract:

Coal Seam Gas (CSG) is a form of natural gas (mainly methane) sorbed in underground coal beds. To mine this gas, wells are drilled directly into an underground coal seam and groundwater (CSG water) is pumped out to the surface. This lowers the downhole piezometric pressure and enables gas desorption from the coal matrix. In the United States, this gas has been extracted commercially since the 1980s. The economic success of US CSG projects has inspired exploration and development in Australia and New Zealand. In Australia, Queensland’s Bowen and Surat basins have been the subject of increased CSG development over the last decade. CSG growth in other Australian basins has not matured to the same level, but exploration and development are taking place at an accelerated pace in the Sydney Basin (Illawarra and the Hunter Valley, NSW) and in the Gunnedah Basin. Similarly, CSG exploration in New Zealand has focused on the Waikato region (Maramarua and Huntly), the West Coast region (Buller, Reefton, and Greymouth), and Southland (Kaitangata, Mataura, and Ohai). Figure 1 shows a Schoeller diagram with CSG samples from selected basins in Australia, New Zealand, and the USA. CSG water from all of these basins exhibits the same geochemical signature – low calcium, low magnesium, high bicarbonate, low sulphate and, sometimes, high chloride. This water quality is a direct result of specific biological and geological processes that have taken place during the formation of CSG. In general, these processes include the weathering of rocks (carbonates, dolomite, and halite), cation exchange with clays (responsible for enhanced sodium and depleted calcium and magnesium), and biogenic processes (accounting for the presence of high bicarbonate concentrations). The salinity of CSG waters tends to be brackish (TDS < 30000 mg/l) with a fairly neutral pH. These particular characteristics need to be taken into consideration when assessing water management and disposal alternatives.
Environmental issues associated with CSG water disposal have been prominent in developed basins such as the Powder River Basin (PRB) in the United States. When disposed of on land or used for irrigation, water with a high dissolved-salt content may reduce water availability to crops, thus affecting crop yield. In addition, the high sodium, low calcium and low magnesium concentrations increase the potential to disperse soils and significantly reduce the water infiltration rate. Therefore, CSG waters need to be properly characterised, treated, and disposed of to safeguard the environment without compromising other natural resources.
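The soil-dispersion risk described above is commonly screened with the sodium adsorption ratio (SAR); a minimal sketch (the sample concentrations below are illustrative, not measured values from any basin in the text):

```python
import math

def sar(na_meql: float, ca_meql: float, mg_meql: float) -> float:
    """Sodium adsorption ratio; all concentrations in meq/L."""
    return na_meql / math.sqrt((ca_meql + mg_meql) / 2)

# Typical CSG-style water: high sodium, depleted calcium and magnesium (illustrative values).
print(round(sar(na_meql=30.0, ca_meql=0.5, mg_meql=0.3), 1))  # → 47.4
```

A high SAR combined with modest salinity is the combination most associated with reduced infiltration, which is why SAR is usually assessed alongside TDS when evaluating irrigation reuse.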

Relevance: 20.00%

Abstract:

Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trivial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation was made with regard to the relationship between transcription factors grouped by their regulatory role and the corresponding promoter strength.
Our study of E.coli σ70 promoters found support at the 0.1 significance level for our hypothesis: that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigations when more experimentally confirmed data are available. In our preliminary exploration of relationships between the key regulatory components in E.coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests for E.coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters where corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigations when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel.
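The hypothesis test described above can be sketched with a pooled two-sample t statistic; a minimal illustration (the promoter-strength scores below are invented for demonstration and are not the thesis data):

```python
from math import sqrt
from statistics import mean, variance

def two_sample_t(a, b):
    """Pooled (equal-variance) two-sample t statistic."""
    n1, n2 = len(a), len(b)
    sp2 = ((n1 - 1) * variance(a) + (n2 - 1) * variance(b)) / (n1 + n2 - 2)
    return (mean(a) - mean(b)) / sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical promoter-strength scores grouped by the dominant TF role at the promoter.
activator_dominated = [0.21, 0.25, 0.30, 0.28]   # hypothesised weaker promoters
repressor_dominated = [0.45, 0.52, 0.40, 0.48]   # hypothesised stronger promoters
print(round(two_sample_t(activator_dominated, repressor_dominated), 2))  # negative: first group weaker
```

The sign of t (and the corresponding one-sided p-value against the chosen significance level, 0.1 in the text) is what decides support for the alternative hypothesis.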
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E.coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation due to its potential active functionality in non-pathogens and its known participation in full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept.
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied; this core set potentially identifies basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y.pestis and P.aeruginosa respectively, but were not present in either E.coli or B.subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study.
We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
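The spectrum kernel used above reduces to an inner product of k-mer count vectors; a minimal sketch (k = 3 and the two sequences are illustrative, not binding sites from the study):

```python
from collections import Counter

def kmer_counts(seq: str, k: int) -> Counter:
    """Count all overlapping k-mers in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(s: str, t: str, k: int = 3) -> int:
    """k-spectrum kernel: inner product of the two k-mer count vectors."""
    cs, ct = kmer_counts(s, k), kmer_counts(t, k)
    return sum(cs[m] * ct[m] for m in cs)  # missing k-mers count as zero

# Two hypothetical sequences sharing flanking motifs: shared k-mers raise the kernel value.
print(spectrum_kernel("TGTGANNNNNNTCACA", "TGTGACGTATATCACA"))  # → 6
```

In an SVM this kernel value plays the role of the similarity between two candidate binding sites, so no explicit (and exponentially large) k-mer feature vector ever needs to be built.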

Relevance: 20.00%

Abstract:

In their paper, Lindberg and Ludvigsen (2012) have correctly identified the lack of evidence-based nurse-sensitive indicators measuring the quality of haemodialysis nursing care. The authors suggest that the intradialytic ultrafiltration rate (UFR) (total fluid removed divided by the total time of a single dialysis treatment, measured in litres per hour) may be one such indicator. Importantly, it is best practice to minimise high UFRs, as they are associated with a higher risk of cardiovascular events and vascular access complications (Curatola et al., 2011). However, this alone does not qualify UFR as a nurse-sensitive indicator of quality in the haemodialysis context. The aim of this response is to voice our concerns over the proposal to use haemodialysis treatment UFR as a haemodialysis nurse-sensitive quality indicator...
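The UFR definition quoted above is a single division; a minimal sketch (the session figures are invented):

```python
def ufr_l_per_h(fluid_removed_l: float, treatment_h: float) -> float:
    """Intradialytic ultrafiltration rate: litres removed per hour of a single treatment."""
    return fluid_removed_l / treatment_h

# Hypothetical session: 3.2 L of fluid removed over a 4-hour dialysis treatment.
print(ufr_l_per_h(3.2, 4.0))  # → 0.8
```

Note the rate depends on prescribed fluid removal and session length, which is part of the response's argument that it is not primarily nurse-sensitive.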

Relevance: 20.00%

Abstract:

A frame-rate stereo vision system, based on non-parametric matching metrics, is described. Traditional metrics, such as normalized cross-correlation, are expensive in terms of logic. Non-parametric measures require only simple, parallelizable, functions such as comparators, counters and exclusive-or, and are thus very well suited to implementation in reprogrammable logic.
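One widely used non-parametric matching metric of the kind described is the census transform scored with a Hamming distance; a minimal software sketch (the window contents are illustrative; in reprogrammable logic the comparison, XOR, and bit-count steps map directly to comparators, exclusive-or gates, and counters):

```python
def census(window, center):
    """Census transform: bit vector of (neighbour < centre) comparisons."""
    bits = 0
    for v in window:
        bits = (bits << 1) | (1 if v < center else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Matching cost: number of differing census bits (XOR + popcount)."""
    return bin(a ^ b).count("1")

# Two hypothetical 3x3 windows (centre pixel excluded), flattened row by row.
left = census([10, 50, 30, 80, 20, 60, 40, 70], center=45)
right = census([12, 48, 33, 79, 25, 58, 41, 72], center=44)
print(hamming(left, right))  # → 0: identical rank structure despite different intensities
```

Because only the ordering of pixel intensities matters, the metric is robust to gain and bias differences between cameras, which is another reason it suits low-cost hardware stereo.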

Relevance: 20.00%

Abstract:

There is a growing gap between engineering practice and engineering education that may be contributing to fewer engineers practising in industry. A coaching approach to learning and teaching has proven to be an effective way to develop people in the workplace. A pilot coaching program is being offered to Engineering and Technology students at Queensland University of Technology to enable holistic growth, in order to better integrate them into the workforce and society at large. The results and findings of this program will be published once the program has been completed.

Relevance: 20.00%

Abstract:

This paper examines the case of a procurement auction for a single project, in which the breakdown of the winning bid into its component items determines the value of payments subsequently made to the bidder as the work progresses. Unbalanced bidding, or bid skewing, involves the uneven distribution of mark-up among the component items in such a way as to derive increased benefit for the unbalancer without changing the total bid. One form of unbalanced bidding, termed Front Loading (FL), is thought to be widespread in practice. This involves overpricing the work items that occur early in the project and underpricing the work items that occur later in the project in order to enhance the bidder's cash flow. Naturally, auctioneers attempt to protect themselves from the effects of unbalancing, typically by reserving the right to reject a bid that has been detected as unbalanced. As a result, models have been developed both to unbalance bids and to detect unbalanced bids, but virtually nothing is known of their use, success or otherwise. This is of particular concern for the detection methods as, without testing, there is no way of knowing the extent to which unbalanced bids remain undetected or balanced bids are falsely detected as unbalanced. This paper reports on a simulation study aimed at demonstrating the likely effects of unbalanced bid detection models in a deterministic environment involving FL unbalancing in a Texas DOT detection setting, in which bids are deemed to be unbalanced if an item exceeds a maximum (or fails to reach a minimum) ‘cut-off’ value determined by the Texas method. A proportion of bids are automatically and maximally unbalanced over a long series of simulated contract projects, and the profits and detection rates of both the balancers and unbalancers are compared.
The results show that, as expected, balanced bids are often incorrectly detected as unbalanced, with the rate of (mis)detection increasing with the proportion of FL bidders in the auction. It is also shown that, while the profit for balanced bidders remains the same irrespective of the number of FL bidders involved, the FL bidder's profit increases with the proportion of FL bidders present in the auction. Sensitivity tests show the results to be generally robust, with (mis)detection rates increasing further when there are fewer bidders in the auction and when more data are averaged to determine the baseline value, but being smaller or larger with increased cut-off values and increased cost and estimate variability, depending on the number of FL bidders involved. The FL bidder's expected benefit from unbalancing, on the other hand, increases when there are fewer bidders in the auction. It also increases when the cut-off rate and discount rate are increased, when there is less variability in the costs and their estimates, and when less data are used in setting the baseline values.
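The cut-off test described can be sketched as a per-item bounds check against baseline averages; a minimal illustration (the tolerance band and all bid figures below are invented and are not the Texas DOT parameters):

```python
def is_unbalanced(bid_items, baseline, low=0.75, high=1.25):
    """Flag a bid if any item price falls outside [low, high] x its baseline average."""
    return any(not (low * base <= price <= high * base)
               for price, base in zip(bid_items, baseline))

baseline = [100.0, 200.0, 300.0]       # per-item averages from previous bids (hypothetical)
balanced = [105.0, 195.0, 300.0]       # total 600: mark-up spread evenly
front_loaded = [160.0, 200.0, 240.0]   # total 600: early item over-, late item under-priced
print(is_unbalanced(balanced, baseline))      # → False
print(is_unbalanced(front_loaded, baseline))  # → True
```

Both bids have the same total, which is exactly why detection has to work item by item, and also why estimate noise in the baselines can push an honestly balanced bid over a cut-off.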

Relevance: 20.00%

Abstract:

China is motorizing rapidly, with associated urban road development and extensive construction of motorways. Speeding accounts for about 10% of fatalities, a large decrease from a peak of 17.2% in 2004. Speeding has been addressed at a national level through the introduction of laws and procedural requirements in 2004, in provinces either across all road types or on motorways, and at city level. Typically, documentation of speed enforcement programmes has taken place when new technology (i.e. speed cameras) is introduced, and it is likely that many programmes have not been documented or widely reported. In particular, the national legislation of 2004 and its implementation was associated with a large reduction in fatalities attributed to speeding. In Guangdong Province, after speed detection equipment was deployed, motorway fatalities due to speeding decreased by 32.5% in 2005 compared with 2004. In Beijing, the number of traffic monitoring units used to photograph illegal traffic activities such as traffic light violations, speeding and illegal use of bus lanes had increased to 1958 by April 1, 2009; in future, such automated enforcement is expected to become the main means of enforcement, accounting for 60% of all traffic enforcement in Beijing. This paper provides a brief overview of the documented speeding enforcement programmes in China and their successes.

Relevance: 20.00%

Abstract:

This paper proposes the use of Bayesian approaches with the cross likelihood ratio (CLR) as a criterion for speaker clustering within a speaker diarization system, using eigenvoice modeling techniques. The CLR has previously been shown to be an effective decision criterion for speaker clustering using Gaussian mixture models. Recently, eigenvoice modeling has become an increasingly popular technique, due to its ability to adequately represent a speaker based on sparse training data, as well as to provide an improved capture of differences in speaker characteristics. The integration of eigenvoice modeling into the CLR framework to capitalize on the advantages of both techniques has also been shown to be beneficial for the speaker clustering task. Building on that success, this paper proposes the use of Bayesian methods to compute the conditional probabilities in computing the CLR, thus effectively combining the eigenvoice-CLR framework with the advantages of a Bayesian approach to the diarization problem. Results obtained on the 2002 Rich Transcription (RT-02) Evaluation dataset show an improved clustering performance, resulting in a 33.5% relative improvement in the overall Diarization Error Rate (DER) compared to the baseline system.
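The CLR criterion described can be sketched directly from its standard definition, CLR(a, b) = log p(Xa|Mb)/p(Xa|UBM) + log p(Xb|Ma)/p(Xb|UBM); a minimal illustration (the average log-likelihood values are invented, and the speaker models and universal background model are assumed to be trained elsewhere):

```python
def cross_likelihood_ratio(ll_a_given_b, ll_b_given_a, ll_a_ubm, ll_b_ubm):
    """CLR from average log-likelihoods of each cluster's data under the
    other cluster's model versus a universal background model (UBM)."""
    return (ll_a_given_b - ll_a_ubm) + (ll_b_given_a - ll_b_ubm)

# Hypothetical average log-likelihoods for two candidate clusters.
clr = cross_likelihood_ratio(ll_a_given_b=-41.2, ll_b_given_a=-40.8,
                             ll_a_ubm=-43.0, ll_b_ubm=-42.5)
print(clr > 0)  # → True: each cluster explains the other better than the UBM, so merge
```

In a diarization system this score is computed for every cluster pair and the pair with the largest CLR above a stopping threshold is merged at each iteration.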

Relevance: 20.00%

Abstract:

A telephone survey was conducted to describe current practices and policies of patient transport in Australian hospitals. The survey had a 94% response rate. Results showed considerable variability and ambiguity throughout the sample in both practice and policy. Findings also indicated that criteria used for transport practices were predominantly shaped by physiological and technological considerations. Factors related to human and financial resources, as well as psychological and emotional aspects of the patient's condition, received little attention.

Relevance: 20.00%

Abstract:

Recent literature in project management has urged a re-conceptualisation of projects as a value co-creation process. Contrary to the traditional output-focused project methodology, the value creation perspective argues for the importance of creating new knowledge, processes, and systems for suppliers and customers. Stakeholder involvement is important in this new perspective, as the balancing of competing stakeholder needs in mega projects becomes a major challenge in managing the value co-creation process. In this study we present interview data from three Australian defence mega projects to demonstrate that senior executives have a more complex understanding of project success than traditional iron triangle measures. In these defence mega projects, customers and other stakeholders actively engage in the value creation process, and over time both content and process value are created to increase defence and national capability. Value created and captured during and after projects is the key to true success.

Relevance: 20.00%

Abstract:

Ingredients:
- 1 cup Vision
- 100ml 'Real World' Application
- 100ml Unit Structure/Organisation
- 100ml Student-centric Approach [optional: add Social Media/Popular Culture for extra goodness]
- Large dollop of Passion + Enthusiasm
- Sprinkle of Approachability

Mix all ingredients well. Cover and leave to rise in a Lecture Theatre for 1.5 hours. Cook in a Classroom for 1.5 hours. Garnish with a dash of Humour before serving. Serves 170 Students.

Relevance: 20.00%

Abstract:

A long-running issue in appetite research concerns the influence of energy expenditure on energy intake. More than 50 years ago, Otto G. Edholm proposed that "the differences between the intakes of food [of individuals] must originate in differences in the expenditure of energy". However, a relationship between energy expenditure and energy intake within any one day could not be found, although there was a correlation over 2 weeks. This issue was never resolved before interest in integrative biology was replaced by molecular biochemistry. Using a psychobiological approach, we have studied appetite control in an energy balance framework using a multi-level experimental system on a single cohort of overweight and obese human subjects. This has disclosed relationships between variables in the domains of body composition [fat-free mass (FFM), fat mass (FM)], metabolism, gastrointestinal hormones, hunger and energy intake. In this Commentary, we review our own and other data, and discuss a new formulation whereby appetite control and energy intake are regulated by energy expenditure. Specifically, we propose that FFM (the largest contributor to resting metabolic rate), but not body mass index or FM, is closely associated with self-determined meal size and daily energy intake. This formulation has implications for understanding weight regulation and the management of obesity.
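The claim that FFM is the largest contributor to resting metabolic rate is often operationalised with a linear FFM model; a minimal sketch using a Cunningham-style equation (RMR = 370 + 21.6 × FFM in kcal/day, a commonly cited form; the coefficients and example values here are illustrative, not taken from the Commentary):

```python
def rmr_cunningham(ffm_kg: float) -> float:
    """Resting metabolic rate (kcal/day) from fat-free mass, Cunningham-style linear model."""
    return 370 + 21.6 * ffm_kg

# Two hypothetical individuals: higher FFM predicts higher RMR, hence (per the
# formulation above) larger self-determined meal size and daily energy intake.
print(rmr_cunningham(50.0))  # → 1450.0
print(rmr_cunningham(70.0))  # → 1882.0
```

The design point is that FFM, unlike BMI or FM, enters RMR roughly linearly, which is why it is the plausible tonic driver of energy intake in this formulation.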

Relevance: 20.00%

Abstract:

Background: There are strong logical reasons why energy expended in metabolism should influence the energy acquired in food-intake behavior. However, the relation has never been established, and it is not known why certain people experience hunger in the presence of large amounts of body energy. Objective: We investigated the effect of the resting metabolic rate (RMR) on objective measures of whole-day food intake and hunger. Design: We carried out a 12-wk intervention that involved 41 overweight and obese men and women [mean ± SD age: 43.1 ± 7.5 y; BMI (in kg/m2): 30.7 ± 3.9] who were tested under conditions of physical activity (sedentary or active) and dietary energy density (17 or 10 kJ/g). RMR, daily energy intake, meal size, and hunger were assessed within the same day and across each condition. Results: We obtained evidence that RMR is correlated with meal size and daily energy intake in overweight and obese individuals. Participants with high RMRs showed increased levels of hunger across the day (P < 0.0001) and greater food intake (P < 0.00001) than did individuals with lower RMRs. These effects were independent of sex and food energy density. The change in RMR was also related to energy intake (P < 0.0001). Conclusions: We propose that RMR (largely determined by fat-free mass) may be a marker of energy intake and could represent a physiologic signal for hunger. These results may have implications for additional research possibilities in appetite, energy homeostasis, and obesity. This trial was registered under international standard identification for controlled trials as ISRCTN47291569.