958 results for Applied current
Abstract:
This paper describes observational research and verbal protocol methods, how these methods are applied and integrated within different contexts, and how they complement each other. The first case study focuses on nurses' interaction during bandaging of patients' lower legs. To maintain research rigor, a triangulation approach was applied that links observations of current procedures, a 'talk-aloud' protocol during interaction and a retrospective protocol. Maps of interactions demonstrated that some nurses bandage more intuitively than others: nurses who bandage intuitively assemble long sequences of bandaging actions, while nurses who bandage less intuitively 'focus-shift' between bandaging actions. Thus, different levels of expertise were identified. The second case study consists of two laboratory experiments. It focuses on analysing and comparing software and product design teams and how they approached a design problem, based on observational and verbal data analysis. The coding scheme evolved during the analysis of each team's activity and is identical for all teams. The structure of knowledge captured from the analysis of the design teams' maps of interaction is identified. The significance of this work lies in its methodological approach. The maps of interaction are instrumental for understanding the activities and interactions of the people observed: by examining them, it is possible to draw conclusions about interactions, the structure of knowledge captured and the level of expertise. This research approach is transferable to other design domains, and designers will be able to transfer the interaction-map outcomes to the systems and services they design.
Abstract:
Clinical information systems have become important tools in contemporary clinical patient care. However, there is a question of whether current clinical information systems are able to effectively support clinicians in decision-making processes. We conducted a survey to identify some of the decision-making issues related to the use of existing clinical information systems. The survey was conducted among the end users of the cardiac surgery unit, quality and safety unit, intensive care unit and clinical costing unit at The Prince Charles Hospital (TPCH). Based on the survey results and the reviewed literature, we identified that support from the current information systems for decision making is limited. Survey results also showed that the majority of respondents considered lack of data integration to be one of the major issues, followed by limited access to various databases, lack of time, and a lack of efficient reporting and analysis tools. Furthermore, respondents pointed out that data quality is an issue; the three major data quality issues faced are lack of data completeness, lack of consistency and lack of data accuracy. Conclusion: Current clinical information systems' support for decision-making processes in cardiac surgery at this institution is limited, and this could be addressed by integrating the isolated clinical information systems.
Abstract:
Dual-mode vibration of nanowires has been reported experimentally through actuation of the nanowire at its resonance frequency, which is expected to open up a variety of new modalities for NEMS that could operate in the nonlinear regime. In the present work, we utilize large-scale molecular dynamics (MD) simulations to investigate the dual-mode vibration of <110> Ag nanowires (NWs) with triangular, rhombic and truncated-rhombic cross-sections. By incorporating the generalized Young-Laplace equation into Euler-Bernoulli beam theory, the influence of surface effects on the dual-mode vibration is studied. Because of the different lattice spacing along the two principal axes of inertia of the {110} atomic layers, the NW is also modeled as a discrete system to reveal the influence of this specific atomic arrangement. It is found that a <110> Ag NW will undergo dual-mode vibration if the actuation direction deviates from the two principal axes of inertia. The two first-mode natural frequencies predicted by the classical beam model appear underestimated compared with the MD results, and the discrete model improves these predictions. In particular, the predictions of beam theory including surface effects are uniformly larger than those of the classical beam model and exhibit better agreement with MD results for larger cross-sectional sizes; however, for ultrathin NWs, the current treatment of surface effects still shows a certain inaccuracy. Overall, for all cross-sections, the inclusion of surface effects is found to reduce the difference between the two first-mode natural frequencies, a trend consistent with the MD results. This study provides the first comprehensive investigation of the dual-mode vibration of <110>-oriented Ag NWs, which should benefit applications in which NWs act as resonating beams.
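As a point of reference for the abstract above, the classical Euler-Bernoulli result it builds on can be written compactly. The surface-effect and discrete-lattice corrections are specific to the paper, so only their general form is noted; notation here is illustrative.

```latex
% First-mode natural frequencies about the two principal axes of inertia
% (beta_1 is the mode eigenvalue set by the boundary conditions, L the
% length, E Young's modulus, rho*A the mass per unit length):
f_1^{(i)} = \frac{\beta_1^2}{2\pi L^2}\sqrt{\frac{E I_i}{\rho A}}, \qquad i = 1, 2.
% Because I_1 \neq I_2 for triangular and rhombic cross-sections, actuation
% off the principal axes excites both modes at once (dual-mode vibration).
% Surface effects are commonly folded in via an effective flexural rigidity,
% schematically (EI)^{*} = EI + (surface-elasticity term), plus a
% Young-Laplace distributed load; exact forms vary across the literature.
```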
Abstract:
Authentic assessment tasks enhance engagement, retention and the aspirations of students. This paper explores the discipline-generic features of authentic assessment, which reflect what students need to achieve in the real world. Some assessment tasks are more authentic than others, and this paper proposes a literature-supported framework that helps unit coordinators determine the level of authenticity of an assessment task. The framework is applied to three summative assessment tasks in a law unit: tutorial participation, an advocacy exercise and a problem-based exam. The level of authenticity of the assessment tasks is compared and opportunities to improve authenticity are identified.
Abstract:
Cotton is one of the most important irrigated crops in subtropical Australia. In recent years, cotton production has been severely affected by the worst drought in recorded history, with the 2007–08 growing season recording the lowest average cotton yield in 30 years. The use of a crop simulation model to simulate the long-term temporal distribution of cotton yields under different levels of irrigation and the marginal value for each unit of water applied is important in determining the economic feasibility of current irrigation practices. The objectives of this study were to: (i) evaluate the CROPGRO-Cotton simulation model for studying crop growth under deficit irrigation scenarios across ten locations in New South Wales (NSW) and Queensland (Qld); (ii) evaluate agronomic and economic responses to water inputs across the ten locations; and (iii) determine the economically optimal irrigation level. The CROPGRO-Cotton simulation model was evaluated using 2 years of experimental data collected at Kingsthorpe, Qld. The model was further evaluated using data from nine locations between northern NSW and southern Qld. Long-term simulations were based on the prevalent furrow-irrigation practice of refilling the soil profile when the plant-available soil water content is <50%. The model closely estimated lint yield for all locations evaluated. Our results showed that the amounts of water needed to maximise profit and maximise yield are different, which has economic and environmental implications. Irrigation needed to maximise profits varied with both agronomic and economic factors, which can be quite variable with season and location. Therefore, better tools and information that consider the agronomic and economic implications of irrigation decisions need to be developed and made available to growers.
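A minimal sketch of the economic argument above, assuming a hypothetical diminishing-returns yield curve (not CROPGRO-Cotton output) and invented prices: the water level that maximises profit sits below the one that maximises yield.

```python
# Hedged sketch, not the study's model: locate the irrigation level that
# maximises profit rather than yield. All parameter values are illustrative.
import numpy as np

LINT_PRICE = 2.0    # $/kg lint (assumed)
WATER_COST = 120.0  # $/ML applied (assumed)
FIXED_COST = 900.0  # $/ha other costs (assumed)

def lint_yield(water_ml):
    """Hypothetical diminishing-returns yield response (kg lint/ha)."""
    return np.maximum(0.0, 800.0 + 450.0 * water_ml - 30.0 * water_ml**2)

water = np.linspace(0, 10, 1001)  # ML/ha applied
profit = LINT_PRICE * lint_yield(water) - WATER_COST * water - FIXED_COST

w_max_yield = water[np.argmax(lint_yield(water))]
w_max_profit = water[np.argmax(profit)]
print(f"yield-maximising water:  {w_max_yield:.2f} ML/ha")   # 7.50 here
print(f"profit-maximising water: {w_max_profit:.2f} ML/ha")  # 6.50, lower
```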
Abstract:
Purpose – The purpose of this paper is to summarise a successfully defended doctoral thesis. The main purpose is to provide a summary of the scope and main issues raised in the thesis, so that readers undertaking studies in the same or connected areas may be aware of current contributions to the topic. The secondary aims are to frame the completed thesis in the context of doctoral-level research in project management and to offer ideas for further investigation that would extend scientific knowledge on the topic. Design/methodology/approach – The research reported in this paper is based on a quantitative study using inferential statistics, aimed at better understanding the actual and potential usage of earned value management (EVM) as applied to external projects under contract. Hypotheses derived from the literature review were tested using experiential data collected from 145 EVM practitioners with direct experience of one or more external projects under contract that applied the methodology. Findings – The results of this research suggest that EVM is an effective project management methodology. The principles of EVM were shown to be significant positive predictors of project success on contracted efforts, and to be relatively greater positive predictors of project success under fixed-price than under cost-plus (CP) contracts. Moreover, EVM's work breakdown structure (WBS) utility was shown to contribute positively to the formation of project contracts; this contribution was not significantly different between fixed-price and CP contracted projects, with exceptions in the areas of schedule planning and payment planning. EVM's "S" curve benefited the administration of project contracts, and its contribution was not significantly different between fixed-price and CP contracted projects. Furthermore, EVM metrics were shown to be important contributors to the administration of project contracts. The relative contribution of EVM metrics to projects under fixed-price versus CP contracts was not significantly different, with one exception in the area of evaluating and processing payment requests. Practical implications – These results have important implications for project practitioners, EVM advocates, and corporate and governmental policy makers. EVM should be considered for all projects, not only for its positive contribution to project contract development and administration but also for its contribution to project success, regardless of contract type. Contract type should not be the sole determining factor in the decision whether or not to use EVM. More particularly, the more fixed the contracted project cost, the more the principles of EVM explain the success of the project. EVM mechanics should also be used in all projects regardless of contract type. Payment planning using a WBS should be emphasized in fixed-price contracts using EVM in order to help mitigate performance risk, while schedule planning using a WBS should be emphasized in CP contracts using EVM in order to help mitigate financial risk. Similarly, EVM metrics should be emphasized in fixed-price contracts when evaluating and processing payment requests. Originality/value – This paper provides a summary of cutting-edge research work and a link to the published thesis, which researchers can use to understand how the research methodology was applied and how it can be extended.
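For readers unfamiliar with the EVM mechanics the thesis evaluates, the sketch below computes the conventional earned value metrics from planned value (PV), earned value (EV) and actual cost (AC). The formulas are the standard textbook definitions, not anything specific to the thesis, and the input figures are invented.

```python
# Standard earned value management metrics from PV, EV, AC and the
# budget at completion (BAC). Illustrative inputs only.
def evm_metrics(pv: float, ev: float, ac: float, bac: float) -> dict:
    cpi = ev / ac  # cost performance index
    spi = ev / pv  # schedule performance index
    return {
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,              # estimate at completion (CPI method)
    }

print(evm_metrics(pv=100_000, ev=90_000, ac=95_000, bac=400_000))
```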
Abstract:
Traffic safety studies demand more than current micro-simulation models can provide, as these models presume that all drivers of motor vehicles exhibit safe behaviours. Several car-following models are used in various micro-simulation models. This research compares the capabilities of the mainstream car-following models in emulating precise driver behaviour parameters such as headways and Times to Collision. The comparison first illustrates which model is more robust in reproducing these metrics. Second, the study conducted a series of sensitivity tests to further explore the behaviour of each model. Based on the outcomes of these two steps, a modified structure and parameter adjustments are proposed for each car-following model to simulate more realistic vehicle movements, particularly headways and Times to Collision below a certain critical threshold. NGSIM vehicle trajectory data are used to evaluate the modified models' performance in assessing critical safety events within traffic flow. The simulation test outcomes indicate that the proposed modified models reproduce the frequency of critical Times to Collision better than the generic models, while the improvement in headways is not significant. The outcome of this paper facilitates traffic safety assessment using microscopic simulation.
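A minimal sketch of the Time-to-Collision surrogate measure the study uses to flag critical safety events; the 1.5 s threshold and the trajectory values are illustrative assumptions, not the paper's calibration.

```python
# Time-to-Collision (TTC): defined only while the follower closes on the
# leader; otherwise no collision is projected.
import math

def time_to_collision(gap_m: float, v_follower: float, v_leader: float) -> float:
    """Return TTC in seconds for a bumper-to-bumper gap, or inf if not closing."""
    closing_speed = v_follower - v_leader
    if closing_speed <= 0:
        return math.inf
    return gap_m / closing_speed

CRITICAL_TTC = 1.5  # s, an assumed critical threshold

ttc = time_to_collision(gap_m=20.0, v_follower=18.0, v_leader=12.0)
print(f"TTC = {ttc:.2f} s, critical event: {ttc < CRITICAL_TTC}")
```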
Abstract:
Most current computer systems authorise the user at the start of a session and do not detect whether the current user is still the initial authorised user, a substitute user, or an intruder pretending to be a valid user. Therefore, a system that continuously checks the identity of the user throughout the session, without being intrusive to the end user, is necessary. Such a system is called a continuous authentication system (CAS). Researchers have applied several approaches to CAS, and most of these techniques are based on biometrics. These continuous biometric authentication systems (CBAS) are supplied by user traits and characteristics. One of the main types of biometrics is keystroke dynamics, which has been widely tried and accepted for providing continuous user authentication. Keystroke dynamics is appealing for many reasons. First, it is less obtrusive, since users will be typing on the computer keyboard anyway. Second, it does not require extra hardware. Finally, keystroke dynamics will be available after the authentication step at the start of the computer session. Currently, there is insufficient research in the field of CBAS with keystroke dynamics. To date, most existing schemes ignore the continuous authentication scenarios, which might affect their practicality in different real-world applications. Also, contemporary CBAS with keystroke dynamics approaches use character sequences as features representative of user typing behavior, but their feature-selection criteria do not guarantee features with strong statistical significance, which may cause a less accurate statistical user representation. Furthermore, their selected features do not inherently incorporate user typing behavior. Finally, existing CBAS based on keystroke dynamics typically depend on pre-defined user-typing models for continuous authentication. This dependency restricts the systems to authenticating only known users whose typing samples have been modelled. This research addresses these limitations by developing a generic model to better identify and understand the characteristics and requirements of each type of CBAS and continuous authentication scenario. The research also proposes four statistical feature-selection techniques that choose features with the highest statistical significance and encompass different user typing behaviors, representing user typing patterns effectively. Finally, the research proposes a user-independent threshold approach that can authenticate a user accurately without needing any predefined user typing model a priori, and enhances the technique to detect an impostor or intruder who may take over during the computer session.
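As a hedged illustration of the raw material such schemes work from (not the thesis's user-independent method), the sketch below extracts the basic dwell-time and flight-time features from a sequence of key press/release timestamps.

```python
# Basic keystroke-dynamics features: dwell time (how long a key is held)
# and flight time (release of one key to press of the next). Timestamps
# are in seconds; the sample session is invented.
from statistics import mean

def keystroke_features(events):
    """events: list of (key, press_time, release_time), in typing order."""
    dwells = [rel - prs for _, prs, rel in events]
    flights = [events[i + 1][1] - events[i][2]
               for i in range(len(events) - 1)]
    return {"mean_dwell": mean(dwells), "mean_flight": mean(flights)}

session = [("h", 0.00, 0.09), ("i", 0.21, 0.31), ("!", 0.48, 0.56)]
print(keystroke_features(session))
```

A continuous authenticator would compare such feature vectors, computed over a sliding window of recent keystrokes, against an acceptance threshold rather than a one-off login check.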
Abstract:
The practitioner lawyer of the past had little need to reflect on process. The doctrinal research methodology developed intuitively within the common law — a research method at the core of practice. There was no need to justify or classify it within a broader research framework. Modern academic lawyers are facing a different situation. At a time when competition for limited research funds is becoming more intense, and in which interdisciplinary work is highly valued and non-lawyers are involved in the assessment of grant applications, lawyer-applicants who engage in doctrinal research need to be able to explain their methodology more clearly. Doctrinal scholars need to be more open and articulate about their methods. These methods may be different in different contexts. This paper examines the doctrinal method used in legal research and its place in recent research dialogue. Some commentators are of the view that the doctrinal method is simply scholarship rather than a separate research methodology. Richard Posner even suggests that law is ‘not a field with a distinct methodology, but an amalgam of applied logic, rhetoric, economics and familiarity with a specialized vocabulary and a particular body of texts, practices, and institutions ...’.1 Therefore, academic lawyers are beginning to realise that the doctrinal research methodology needs clarification for those outside the legal profession and that a discussion about the standing and place of doctrinal research compared to other methodologies is required.
Abstract:
This article centres on a research project in which freehand drawings provided a richly creative and colourful data source of children’s imagined, ideal learning environments. Issues concerning the analysis of the visual data are discussed, in particular how imaginative content was analysed and how the analytical process was dependent on an accompanying, secondary data source comprising brief, explanatory written texts.
Abstract:
Background: Quality of work life (QWL) has been found to influence the commitment of health professionals, including nurses. However, reliable information on the QWL and turnover intention of primary health care (PHC) nurses is limited. The aim of this study was to examine the relationship between QWL and turnover intention of PHC nurses in Saudi Arabia. Methods: A cross-sectional survey was used. Data were collected using Brooks' Quality of Nursing Work Life (QNWL) survey, the Anticipated Turnover Scale and demographic questions. A total of 508 PHC nurses in the Jazan region, Saudi Arabia completed the questionnaire (response rate 87%). Descriptive statistics, t-tests, ANOVA, General Linear Model (GLM) univariate analysis, standard multiple regression (SMR) and hierarchical multiple regression (HMR) were applied using SPSS v17 for Windows. Results: Findings suggested that the respondents were dissatisfied with their work life, with almost 40% indicating an intention to leave their current PHC centres. Turnover intention was significantly related to QWL. Using SMR, 26% of the variance in turnover intention was explained by QWL (p < 0.001, R² = 0.263). Further analysis using HMR found that the total variance explained by the model as a whole (demographics and QWL) was 32.1% (p < 0.001); QWL explained an additional 19% of the variance in turnover intention after controlling for demographic variables. Conclusions: Creating and maintaining a healthy work life for PHC nurses is very important to improve their work satisfaction, reduce turnover, enhance productivity and improve nursing care outcomes.
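A minimal sketch of the hierarchical-regression step reported above, on synthetic data: fit a demographics-only model, add the QWL score, and read off the change in R². The column choices and coefficients are invented for illustration.

```python
# Hierarchical (two-step) multiple regression on synthetic data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 508
demographics = rng.normal(size=(n, 3))  # e.g. age, tenure, education (toy)
qwl = rng.normal(size=n)
turnover = demographics @ np.array([0.2, 0.1, -0.1]) - 0.5 * qwl \
    + rng.normal(size=n)

X1 = sm.add_constant(demographics)                         # step 1
X2 = sm.add_constant(np.column_stack([demographics, qwl])) # step 2

step1 = sm.OLS(turnover, X1).fit()
step2 = sm.OLS(turnover, X2).fit()
print(f"R2, demographics only: {step1.rsquared:.3f}")
print(f"R2, with QWL added:    {step2.rsquared:.3f}")
print(f"R2 change due to QWL:  {step2.rsquared - step1.rsquared:.3f}")
```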
Abstract:
Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics. When used in conjunction with comparative genomics, they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we attempted to explore the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false-positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features, some of which proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Of chief interest was the relationship observed between promoter strength and TFs grouped by their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggested support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Although the observations were specific to σ70, such suggestive results nevertheless strongly encourage additional investigations when more experimentally confirmed data become available.
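The promoter-strength comparison described above amounts to a two-sample t-test; the sketch below shows that step on invented strength scores grouped by regulatory role (the real study's scores and grouping are not reproduced here).

```python
# Two-sample t-test of (proxy) promoter-strength scores grouped by whether
# the associated binding sites are activators or repressors. Toy data only.
from scipy.stats import ttest_ind

activator_promoter_scores = [0.31, 0.42, 0.28, 0.35, 0.40, 0.25]  # invented
repressor_promoter_scores = [0.55, 0.61, 0.47, 0.66, 0.52, 0.58]  # invented

t_stat, p_value = ttest_ind(activator_promoter_scores,
                            repressor_promoter_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # compare against alpha = 0.1
```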
Much of the remainder of the thesis concerns a machine learning study of binding site prediction, using the SVM and kernel methods, principally the spectrum kernel. Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance, in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains. This work revealed interesting, strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information regarding changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the regulatory interactions predicted. In the present study, we distinguish between relationships found across the full set of genomes, the 'core-regulatory-set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory-set'. We found nine Fur target gene clusters present across the four genomes studied, this core set potentially identifying basic regulatory processes essential for survival.
Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were not present in either E. coli or B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. We identified a set of promising feature attributes; demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity; and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques, which are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
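The spectrum kernel mentioned in the abstract corresponds to a linear kernel over k-mer count vectors; the sketch below shows that feature map with an SVM on toy sequences. The study's E. coli CRP data and its additional position features are not reproduced here.

```python
# Spectrum-kernel feature map: each sequence becomes a vector of k-mer
# counts, so a linear kernel on these vectors equals the spectrum kernel.
from itertools import product
from sklearn.svm import SVC

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]

def spectrum(seq: str) -> list:
    """Count each of the 4^K possible k-mers in seq."""
    return [sum(seq[i:i + K] == km for i in range(len(seq) - K + 1))
            for km in KMERS]

train_seqs = ["TTGTGATCTAGATCACA", "AAATGTGATCTAGATCA",  # toy 'sites'
              "GGGGCCCCGGGGCCCCG", "ATATATATATATATATA"]  # toy background
labels = [1, 1, 0, 0]

clf = SVC(kernel="linear").fit([spectrum(s) for s in train_seqs], labels)
print(clf.predict([spectrum("TGTGATCTAGATCACAT")]))  # query resembling sites
```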
Abstract:
Food insecurity is the limited access to, or availability of, nutritious, culturally appropriate and safe foods, or the inability to access these foods by socially acceptable means. In Australia, the monitoring of food insecurity is limited to a single item included in the three-yearly National Health Survey (NHS). The current research comprised a) a review of the literature and available tools to measure food security, b) piloting and adaptation of the more comprehensive 16-item United States Department of Agriculture (USDA) Food Security Survey Module (FSSM), and c) a cross-sectional study comparing this more comprehensive tool, and its 10- and 6-item short forms, with the current single item used in the NHS, among a sample of households in disadvantaged urban areas of Brisbane, Australia. Findings showed that, internationally, the 16-item USDA-FSSM is the most widely used tool for the measurement of food insecurity. Furthermore, among the validated tools that exist to measure food insecurity, sensitivity and reliability decline as the number of questions in a tool decreases. Among an Australian sample, the current single measure utilised in the NHS yielded a significantly lower prevalence of food insecurity than the 16-item USDA-FSSM and its two shorter forms (four and two percentage points lower, respectively). These findings suggest that the current prevalence of food insecurity (estimated at 6% in the most recent NHS) may have been underestimated, and they have important implications for the development of an effective means of monitoring food security in the context of a developed country.
Abstract:
Objective: Food insecurity may be associated with a number of adverse health and social outcomes; however, our knowledge of its public health significance in Australia has been limited by the use of a single-item measure in the Australian National Health Surveys (NHS) and, more recently, the exclusion of food security items from these surveys. The current study compares prevalence estimates of food insecurity in disadvantaged urban areas of Brisbane using the one-item NHS measure with three adaptations of the United States Department of Agriculture Food Security Survey Module (USDA-FSSM). Design: Data were collected by postal survey (n = 505, 53% response). Food security status was ascertained by the measure used in the NHS and by the 6-, 10- and 18-item versions of the USDA-FSSM. Demographic characteristics of the sample, prevalence estimates of food insecurity and the different levels of food insecurity estimated by each tool were determined. Setting: Disadvantaged suburbs of Brisbane city, Australia, 2009. Subjects: Individuals aged ≥ 18 years. Results: Food insecurity was prevalent in socioeconomically disadvantaged urban areas, estimated at 19.5% using the single-item NHS measure. This was significantly less than the 24.6% (P < 0.01), 22.0% (P = 0.01) and 21.3% (P = 0.03) identified using the 18-item, 10-item and 6-item versions of the USDA-FSSM, respectively. The proportions of the sample reporting more severe levels of food insecurity were 10.7%, 10.0% and 8.6% for the 18-, 10- and 6-item USDA measures respectively; however, this degree of food insecurity could not be ascertained using the NHS measure. Conclusions: The measure of food insecurity employed in the NHS may underestimate its prevalence and public health significance. Future monitoring and surveillance efforts should employ a more accurate measure.
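One standard way to test the paired difference in prevalence reported above is McNemar's test on respondents' discordant classifications between two instruments. The 2x2 table below is invented, chosen only so its margins echo the reported 19.5% and 24.6% figures.

```python
# McNemar's test for paired binary classifications of the same respondents
# by two instruments (single NHS item vs 18-item USDA-FSSM). Counts invented.
from statsmodels.stats.contingency_tables import mcnemar

# rows: NHS item (insecure / secure); cols: USDA-FSSM (insecure / secure)
table = [[90, 8],    # both insecure / NHS-only insecure
         [34, 373]]  # USDA-only insecure / both secure  (total n = 505)
result = mcnemar(table, exact=True)
print(f"McNemar p-value: {result.pvalue:.5f}")
```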
Abstract:
Reviews have criticised universities for not embedding sufficient praxis in preparing preservice teachers for the profession. The Teacher Education Done Differently (TEDD) project explored praxis development for preservice teachers within existing university coursework. This mixed-method investigation involved an analysis of multiple case studies of preservice teacher involvement in university programs, namely: Ed Start for practicum I (n=26), III (n=23), and IV (n=12); Move It Use It (a Health and Physical Education program; n=38); Studies of Society and its Environment (SOSE, n=24); and Science in Schools (n=38). The project also included preservice teachers teaching primary students at the campus site in gifted education (the B-GR8 program, n=22). Preservice teacher agreement on their praxis development leading up to practicum I, III, and IV ranged from 91% to 100%, with a high mean score range (4.26-5.00). Other university units had similar findings, except for SOSE (percentage range: 10-86%; M range: 2.33-4.00; SD range: 0.55-1.32). Qualitative data presented an understanding of this praxis development, leading to the conclusion that additional applied learning experiences, as lead-up days for field experiences and as avenues for exploring the teaching of specific subject areas, presented opportunities for enhancing praxis.