855 results for Retaining


Relevance:

10.00%

Publisher:

Abstract:

Usability is a multi-dimensional characteristic of a computer system. This paper focuses on usability as a measure of the interaction between the user and the system. The research employs a task-oriented approach to evaluate the usability of a meta search engine, which encourages and accepts queries of unlimited size expressed in natural language. A variety of conventional metrics developed by academic and industrial research, including ISO standards, are applied to the information retrieval process, viewed as a sequence of tasks ranging from formulating (long) queries to interpreting and retaining search results. Results of the evaluation and analysis of the operation log indicate that advanced search engine results can be obtained while simultaneously enhancing the usability of the interactive process. In conclusion, we discuss implications for interactive information retrieval system design and directions for future usability research. © 2008 Academy Publisher.


Attracting and retaining a skilled labour force is a critical yet complex issue for rural and remote communities. This article reports the findings of a study investigating the current approaches to recruitment and retention in two separate Australian regions. Building on previously developed models, the research analyses the roles employers and the wider communities are playing, or potentially could play, in addressing issues that influence labour shortages. The findings of the research highlight the complexities of employee attraction and retention and emphasise the need for communities and businesses to work together to overcome labour shortages in rural and remote locations.


A simple phenomenological model for the relationship between structure and composition of the high-Tc cuprates is presented. The model is based on two simple crystal chemistry principles: unit cell doping and charge balance within unit cells. These principles are inspired by key experimental observations of how the materials accommodate large deviations from stoichiometry. Significant HTSC properties can be explained consistently without any additional assumptions, while valuable insight for geometric interpretation is retained. When these two chemical principles are combined with a review of Crystal Field Theory (CFT) and Ligand Field Theory (LFT), it becomes clear that the two oxidation states in the conduction planes (typically d8 and d9) belong to the most strongly divergent d-levels as a function of deformation from regular octahedral coordination. This observation offers a link to a range of coupling effects relating vibrations and spin waves through application of Hund's rules. An indication of this model's capacity to predict physical properties of HTSC materials is provided and will be elaborated in subsequent publications. Simple criteria for the relationship between structure and composition in HTSC systems may guide chemical syntheses within new material systems.


Contamination of packaged foods by micro-organisms entering through air leaks can cause serious public health issues and cost companies large sums of money through product recalls, compensation claims, consumer impact and subsequent loss of market share. The main source of contamination is leaks in packaging, which allow air, moisture and micro-organisms to enter the package. In the food processing and packaging industry worldwide, there is increasing demand for cost-effective, state-of-the-art inspection technologies capable of reliably detecting leaky seals and delivering products at six-sigma quality levels. This project will develop non-destructive testing technology using digital imaging and sensing, combined with a differential vacuum technique, to assess the seal integrity of food packages on a high-speed production line. The cost of leaky packages to Australian food industries is estimated at close to AUD $35 million per year. Flexible plastic packages are widely used and are the least expensive means of retaining product quality. These packages can be sealed to maximise the shelf life of both dry and moist products. The seals of food packages need to be airtight so that the contents are not contaminated by micro-organisms entering through air leaks. Airtight seals also extend the shelf life of packaged foods, and manufacturers attempt to prevent food products with leaky seals from being sold to consumers. Many current NDT (non-destructive testing) methods for testing the seals of flexible packages are best suited to random sampling and laboratory use.
The three most commonly used methods are vacuum/pressure decay, the bubble test, and helium leak detection. Although these methods can detect very fine leaks, they are limited by long processing times and are not viable on a production line. Two non-destructive in-line packaging inspection machines are currently available and are discussed in the literature review. The detailed design and development of the High-Speed Sensing and Detection System (HSDS) is the fundamental requirement of this project and of the future prototype and production unit. Successful laboratory testing was completed, and a methodical design procedure was followed to arrive at a successful concept. The mechanical tests confirmed the vacuum hypothesis and seal integrity with consistent results, and the electrical testing likewise provided solid results, allowing the project to move forward with confidence. The laboratory design testing allowed theoretical assumptions to be confirmed before moving into the detailed design phase. Discussion of the development of alternative concepts in both the mechanical and electrical disciplines enabled informed decisions to be made. Each major mechanical and electrical component is detailed through the research and design process, which works methodically through the various major functions from both a mechanical and an electrical perspective. Alternative ideas for the major components, although sometimes impractical in this application, show that the full range of engineering and functional options was explored. Further concepts were then designed and developed for the entire HSDS unit based on previous practice and theory. It is envisaged that both the prototype and production versions of the HSDS would use standard, locally manufactured and distributed industry components.
Future research and testing of the prototype unit could result in a successful trial unit being incorporated into a working food processing production environment. Recommendations and future work are discussed, along with options in other food processing and packaging disciplines and in other areas of the non-food processing industry.


The estimation of phylogenetic divergence times from sequence data is an important component of many molecular evolutionary studies. There is now a general appreciation that the procedure of divergence dating is considerably more complex than that initially described in the 1960s by Zuckerkandl and Pauling (1962, 1965). In particular, there has been much critical attention toward the assumption of a global molecular clock, resulting in the development of increasingly sophisticated techniques for inferring divergence times from sequence data. In response to the documentation of widespread departures from clocklike behavior, a variety of local- and relaxed-clock methods have been proposed and implemented. Local-clock methods permit different molecular clocks in different parts of the phylogenetic tree, thereby retaining the advantages of the classical molecular clock while casting off the restrictive assumption of a single, global rate of substitution (Rambaut and Bromham 1998; Yoder and Yang 2000).


Cultural tourism and the creative industries have intersecting policy agendas and economic interdependencies. Most studies of the creative industries have focused on Western countries, and cultural tourism is rarely included. However, the arrival of the creative economy and its movement through developing countries has changed the relationship, and supporters of the creative economy now see fit to include tourism. This thesis addresses the development of the creative economy in Malaysia through case studies of the animation and museum sectors. The study found that a top-down cultural management approach is being practised, but that Malaysia is now influenced by new ideas concerning innovation and technical creativity. The study examined whether or not technical innovation by itself is enough; the reference points here are the Multimedia Super Corridor in Cyberjaya and other similar projects in the region. The museum case study was situated in Malacca and showed that museums need to adopt new media and new experiences to remain relevant in today's world. In applying a case study approach, the thesis made use of interviews with key stakeholders, as well as consulting numerous policy documents and websites. Both case studies imitated similar products and services in the market but added local characteristics. This research project contributes significantly to the existing body of knowledge in the field of the creative economy within the context of developing countries. Finally, the thesis makes recommendations for Malaysia to better position itself in the regional economy while retaining its distinctive cultural identity.


This article focuses on the well documented, yet potentially contested, concept of rank-and-file police subculture to conceptualize police response to situations of domestic violence in Singapore. It argues that the utility of the concept in explaining police behavior is often undermined by an all-powerful, homogenous, and deterministic conception of it that fails to take into account the value of agency in police decision-making and the range of differentiated police responses in situations of domestic violence. Through reviewing the literature on police response to domestic violence, this study calls for the concept of police subculture to be reworked by treating it as having a relationship with, and responding to, the structural conditions of policing, while retaining a conception of the active role played by street-level officers in instituting a situational practice. Using Pierre Bourdieu's relational concepts of 'habitus' and 'field,' designating the cultural dispositions of police subculture and the structural conditions of policing respectively, the study reconceptualizes the problem of policing domestic violence with reference to the Singaporean context.


Australia's airline industry was built on connecting regional communities to major cities, but almost a century later many regional and remote communities face the prospect of losing their air transport services. The focus of this paper is to highlight key issues and concerns surrounding remote, rural and regional airports in Australia using a network governance framework. The contributions are directed towards regional and remote airport managers, decision makers and policy makers, to stimulate further discussion on retaining regional and remote air services to communities.


The purpose of this exploratory Australian study was to consider methods of retaining skilled and experienced staff within the domestic violence sector. The antecedents that might influence the turnover of practitioners were investigated and analysed; these broadly included work-related, organisational and professional factors. The changing nature of the domestic violence sector was also examined, in particular feminist identity and feminist practice frameworks. It became evident, however, that the primary reasons for the turnover of study participants can be described as parallel power processes. The concept of parallel power processes, as developed through this research, aims to capture how workplace behaviours can strongly mirror, or parallel, behaviours used by domestic violence perpetrators. As such, it appears that some domestic violence practitioners are experiencing their own abusive relationship, not within the confines of their home but within their workplace. Additionally, parallel power processes are compounded by ineffective conflict management processes within the workplace. These factors directly contribute to practitioners leaving their workplace and, sometimes, the sector. This qualitative study utilised a feminist research epistemology and focused strongly on practitioners' stories. Interviews were undertaken with fifteen domestic violence practitioners from three services within South-East Queensland, Australia. Two sets of semi-structured interviews provided in-depth information based on practitioners' experiences of working within this specialised sector. Analysis was conducted using a thematic analytical frame, drawing attention to the key themes mentioned above. From these findings, it is suggested that in order to retain practitioners, domestic violence services must identify and address parallel power processes through effective conflict management processes. 
In an operational sense, it is recommended that education and training be undertaken at all staffing levels, in particular within management committees. Lastly, it is recommended that the sector itself pay greater attention to re-invigorating the feminist principles and philosophy that have traditionally guided it.


Acoustic sensors provide an effective means of monitoring biodiversity at large spatial and temporal scales. They can continuously and passively record large volumes of data over extended periods; however, these data must be analysed to detect the presence of vocal species. Automated analysis of acoustic data for large numbers of species is complex and can be subject to high levels of false positive and false negative results. Manual analysis by experienced users can produce accurate results, but the time and effort required to process even small volumes of data can make manual analysis prohibitive. Our research examined the use of sampling methods to reduce the cost of analysing large volumes of acoustic sensor data while retaining high levels of species detection accuracy. Utilising five days of manually analysed acoustic sensor data from four sites, we examined a range of sampling rates and methods, including random, stratified and biologically informed. Our findings indicate that randomly selecting 120 one-minute samples from the three hours immediately following dawn was the most effective sampling method, detecting on average 62% of total species once the 120 one-minute samples were analysed, compared to 34% of total species from traditional point counts. Our results demonstrate that targeted sampling methods can provide an effective means of analysing large volumes of acoustic sensor data efficiently and accurately.
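The dawn-window sampling strategy described above can be illustrated with a small simulation. The sketch below is not the authors' method or data: it fabricates a synthetic day of per-minute species detections (with calling activity assumed to peak in a 05:00-08:00 dawn window), then compares the fraction of total species recovered by 120 one-minute samples drawn from the dawn window against 120 samples drawn from anywhere in the day. All rates, the species count, and the dawn time are illustrative assumptions.

```python
import random

random.seed(0)

MINUTES_PER_DAY = 24 * 60
DAWN_START, DAWN_END = 5 * 60, 8 * 60   # assumed dawn window: 05:00-08:00
SPECIES = [f"sp{i}" for i in range(40)]  # hypothetical species pool

def simulate_day():
    """Fabricate one day of per-minute detections; calling peaks after dawn."""
    day = []
    for minute in range(MINUTES_PER_DAY):
        rate = 0.04 if DAWN_START <= minute < DAWN_END else 0.002
        day.append({s for s in SPECIES if random.random() < rate})
    return day

def species_found(day, minutes):
    found = set()
    for m in minutes:
        found |= day[m]
    return found

day = simulate_day()
total = species_found(day, range(MINUTES_PER_DAY))

# 120 one-minute samples: dawn window only vs. anywhere in the day
dawn_minutes = random.sample(range(DAWN_START, DAWN_END), 120)
any_minutes = random.sample(range(MINUTES_PER_DAY), 120)

dawn_frac = len(species_found(day, dawn_minutes)) / len(total)
any_frac = len(species_found(day, any_minutes)) / len(total)
print(f"dawn sampling: {dawn_frac:.0%}, whole-day sampling: {any_frac:.0%}")
```

Under these assumed activity rates the dawn-window sample recovers a markedly larger share of the species pool for the same analysis effort, which is the qualitative effect the study reports.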


Formative assessment is increasingly being implemented through policy initiatives in Chinese educational contexts. As an approach to assessment, formative assessment derives many of its key principles from Western contexts, notably through the work of scholars in the UK, the USA and Australia. The question for this paper is how formative assessment has been interpreted in the teaching of College English in Chinese higher education. The paper reports on a research study that utilised a sociocultural perspective on learning and assessment to analyse how two Chinese universities – an urban-based Key University and a regional-based Non-Key University – interpreted and enacted a China Ministry of Education policy on formative assessment in College English teaching. Of particular interest for the research were the ways in which the sociocultural conditions of the Chinese context mediated understanding of Western principles and led to their adaptation. The findings from the two universities identified some consistency in localised interpretations of formative assessment, which included emphases on process and student participation. The differences related to the specific sociocultural conditions contextualising each university, including geographical location, socioeconomic status, and teacher and student roles, expectations and beliefs about English. The findings illustrate the sociocultural tensions in interpreting, adapting and enacting formative assessment in Chinese College English classes, and the consequent challenges to, and questions about, retaining the spirit of formative assessment as it was originally conceptualised.


To investigate the effects of adopting a pull system in assembly lines, in contrast to a push system, the simulation software "ARENA" is used as a tool to produce numerical results for both systems. Simulation scenarios are created to evaluate the effects of changing attributes in assembly systems, with influential factors including the change of manufacturing system (push to pull) and variation in demand. The pull system also introduces an additional attribute: the number of buffer storages. The analysis is based on a previous case study; process times and workflow are taken from "Optimising and simulating the assembly line balancing problem in a motorcycle manufacturing company: a case study" [2]. The pull system mechanism is implemented to improve the system in terms of the amount of Work-In-Process (WIP), the total time products spend in the system, and the amount of finished-product inventory, while retaining the same throughput.
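The push-versus-pull contrast can be sketched outside ARENA in a few lines of Python. The following toy two-station line is an illustrative assumption, not the case-study model: station 1 feeds a buffer, station 2 suffers random slowdowns, and in pull mode station 1 is blocked by a hypothetical kanban cap on the buffer. The sketch reproduces the qualitative claim above: the pull line holds far less WIP while delivering essentially the same throughput.

```python
import random

random.seed(1)

def simulate(pull, steps=10_000, kanban=3):
    """Toy two-station line; returns (finished units, average WIP in buffer)."""
    buffer = 0        # WIP waiting between station 1 and station 2
    finished = 0
    wip_total = 0
    for _ in range(steps):
        # Station 1: push mode always produces; pull mode produces only
        # while the downstream buffer is below the kanban cap.
        if not pull or buffer < kanban:
            buffer += 1
        # Station 2: completes a unit 80% of the time (random slowdowns).
        if buffer > 0 and random.random() < 0.8:
            buffer -= 1
            finished += 1
        wip_total += buffer
    return finished, wip_total / steps

push_tp, push_wip = simulate(pull=False)
pull_tp, pull_wip = simulate(pull=True)
print(f"push: throughput={push_tp}, avg WIP={push_wip:.1f}")
print(f"pull: throughput={pull_tp}, avg WIP={pull_wip:.1f}")
```

Because station 2, not station 1, is the bottleneck, capping the buffer costs no output; it only removes the WIP that the push policy piles up in front of the slower station.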


In the context of ambiguity resolution (AR) for Global Navigation Satellite Systems (GNSS), decorrelation of the entries of an ambiguity vector, the integer ambiguity search, and ambiguity validation are the three standard procedures for solving integer least-squares problems. This paper contributes to AR in three respects. Firstly, the orthogonality defect is introduced as a new measure of the performance of ambiguity decorrelation methods and is compared with the decorrelation number and the condition number, which are currently used as criteria for measuring the correlation of the ambiguity variance-covariance matrix. Numerically, the orthogonality defect performs slightly better than the condition number as a measure linking decorrelation impact and computational efficiency. Secondly, the paper examines the relationship of the decorrelation number, the condition number, the orthogonality defect and the size of the ambiguity search space to the number of ambiguity search candidates and search nodes. The size of the ambiguity search space can be estimated properly if the ambiguity matrix is well decorrelated, and it is shown to be a significant parameter in the ambiguity search process. Thirdly, a new ambiguity resolution scheme is proposed to improve ambiguity search efficiency by controlling the size of the ambiguity search space. The new AR scheme combines the LAMBDA search and validation procedures, which results in a much smaller search space and higher computational efficiency while retaining the same AR validation outcomes. In fact, the new scheme can deal with the case where there is only one candidate, whereas existing search methods require at least two; if there is more than one candidate, the scheme reverts to the usual ratio-test procedure. 
Experimental results indicate that this combined method can indeed improve ambiguity search efficiency for both single-constellation and dual-constellation cases, showing its potential for processing high-dimension integer parameters in multi-GNSS environments.
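For readers unfamiliar with the measure introduced above: in the lattice-reduction literature the orthogonality defect of a basis matrix B is commonly defined as the product of the column norms divided by the lattice volume, sqrt(det(BᵀB)). It equals 1 for a perfectly orthogonal (fully decorrelated) basis and grows as the columns become correlated. The snippet below is a minimal illustration with made-up 2×2 basis matrices, not the paper's implementation, and the exact definition used in the paper may differ in detail.

```python
import numpy as np

def orthogonality_defect(B):
    """Product of column norms over the lattice volume sqrt(det(B^T B)).

    Equals 1 when the columns are mutually orthogonal and grows as the
    columns become more correlated.
    """
    col_norms = np.linalg.norm(B, axis=0)
    volume = np.sqrt(np.linalg.det(B.T @ B))
    return np.prod(col_norms) / volume

# A strongly correlated basis vs. a nearly decorrelated one (made-up values)
correlated = np.array([[1.0, 0.9],
                       [0.9, 1.0]])
decorrelated = np.array([[1.0, 0.1],
                         [0.1, 1.0]])

d1 = orthogonality_defect(correlated)    # large: columns nearly parallel
d2 = orthogonality_defect(decorrelated)  # close to 1: columns nearly orthogonal
print(d1, d2)
```

A decorrelation step such as the Z-transformation in LAMBDA can then be judged by how far it drives this value back towards 1.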


Background: HIV-1 Pr55gag virus-like particles (VLPs) expressed by baculovirus in insect cells are considered a very promising HIV-1 vaccine candidate, as they have been shown to elicit broad cellular immune responses when tested in animals, particularly when used as a boost to DNA or BCG vaccines. However, it is important for the VLPs to retain their structure in order to be fully functional and effective. The medium in which the VLPs are formulated and the temperature at which they are stored are two important factors affecting their stability. Findings: We describe the screening of three readily available formulation media (sorbitol, sucrose and trehalose) for their ability to stabilise HIV-1 Pr55gag VLPs during prolonged storage. Transmission electron microscopy (TEM) was performed on VLPs stored at two different concentrations of each medium at three different temperatures (4°C, −20°C and −70°C) over different time periods, and the appearance of the VLPs was compared. VLPs stored in 15% trehalose at −70°C retained their original appearance most effectively over a period of 12 months. VLPs stored in 5% trehalose, sorbitol or sucrose were not all intact even after 1 month of storage at the temperatures tested. In addition, we showed that VLPs stored under these conditions could be frozen and re-thawed twice before showing changes in their appearance. Conclusions: Although the inclusion of other analytical tools is essential to validate these preliminary findings, storage in 15% trehalose at −70°C for 12 months is the most effective condition for retaining VLP stability.


Exponential growth of genomic data in the last two decades has made manual analyses impractical for all but trial studies. As genomic analyses have become more sophisticated, and move toward comparisons across large datasets, computational approaches have become essential. One of the most important biological questions is to understand the mechanisms underlying gene regulation. Genetic regulation is commonly investigated and modelled through the use of transcriptional regulatory network (TRN) structures. These model the regulatory interactions between two key components: transcription factors (TFs) and the target genes (TGs) they regulate. Transcriptional regulatory networks have proven to be invaluable scientific tools in bioinformatics, and when used in conjunction with comparative genomics they have provided substantial insights into the evolution of regulatory interactions. Current approaches to regulatory network inference, however, omit two additional key entities: promoters and transcription factor binding sites (TFBSs). In this study, we explored the relationships among these regulatory components in bacteria. Our primary goal was to identify relationships that can assist in reducing the high false positive rates associated with transcription factor binding site predictions and thereby enhance the reliability of the inferred transcription regulatory networks. In our preliminary exploration of relationships between the key regulatory components in Escherichia coli transcription, we discovered a number of potentially useful features. The combination of location score and sequence dissimilarity scores increased de novo binding site prediction accuracy by 13.6%. Another important observation concerned the relationship between transcription factors, grouped by their regulatory role, and the corresponding promoter strength. 
Our study of E. coli σ70 promoters found support at the 0.1 significance level for our hypothesis that weak promoters are preferentially associated with activator binding sites to enhance gene expression, whilst strong promoters have more repressor binding sites to repress or inhibit gene transcription. Although the observations were specific to σ70, they nevertheless strongly encourage additional investigation when more experimentally confirmed data become available. Some of the features discovered in this preliminary exploration proved successful in reducing the number of false positives when applied to re-evaluate binding site predictions. Of chief interest was the relationship observed between promoter strength and TFs with respect to their regulatory role. Based on the common assumption that promoter homology positively correlates with transcription rate, we hypothesised that weak promoters would have more transcription factors that enhance gene expression, whilst strong promoters would have more repressor binding sites. The t-tests assessed for E. coli σ70 promoters returned a p-value of 0.072, which at the 0.1 significance level suggests support for our (alternative) hypothesis, albeit this trend may only be present for promoters whose corresponding TFBSs are either all repressors or all activators. Nevertheless, such suggestive results strongly encourage additional investigation when more experimentally confirmed data become available. Much of the remainder of the thesis concerns a machine learning study of binding site prediction using the SVM and kernel methods, principally the spectrum kernel. 
Spectrum kernels have been successfully applied in previous studies of protein classification [91, 92], as well as to the related problem of promoter prediction [59], and we have here successfully applied the technique to refining TFBS predictions. The advantages provided by the SVM classifier were best seen in 'moderately' conserved transcription factor binding sites, as represented by our E. coli CRP case study. Inclusion of additional position feature attributes further increased accuracy by 9.1%, but more notable was the considerable decrease in false positive rate from 0.8 to 0.5 while retaining 0.9 sensitivity. Improved prediction of transcription factor binding sites is in turn extremely valuable in improving the inference of regulatory relationships, a problem notoriously prone to false positive predictions. Here, the number of false regulatory interactions inferred using the conventional two-component model was substantially reduced when we integrated de novo transcription factor binding site predictions as an additional criterion for acceptance in a case study of inference in the Fur regulon. This initial work was extended to a comparative study of the iron regulatory system across 20 Yersinia strains, which revealed interesting strain-specific differences, especially between pathogenic and non-pathogenic strains. Such differences were made clear through interactive visualisations using the TRNDiff software developed as part of this work, and would have remained undetected using conventional methods. This approach led to the nomination of the Yfe iron-uptake system as a candidate for further wet-lab experimentation, due to its potential active functionality in non-pathogens and its known participation in the full virulence of the bubonic plague strain. Building on this work, we introduced novel structures we have labelled 'regulatory trees', inspired by the phylogenetic tree concept. 
Instead of using gene or protein sequence similarity, the regulatory trees were constructed based on the number of similar regulatory interactions. While common phylogenetic trees convey information about changes in gene repertoire, which we might regard as analogous to 'hardware', the regulatory tree informs us of changes in regulatory circuitry, in some respects analogous to 'software'. In this context, we explored the 'pan-regulatory network' for the Fur system: the entire set of regulatory interactions found for the Fur transcription factor across a group of genomes. In the pan-regulatory network, emphasis is placed on how the regulatory network for each target genome is inferred from multiple sources instead of a single source, as is the common approach. The benefit of using multiple reference networks is a more comprehensive survey of the relationships and increased confidence in the predicted regulatory interactions. In the present study, we distinguish between relationships found across the full set of genomes, the 'core regulatory set', and interactions found only in a subset of the genomes explored, the 'sub-regulatory set'. We found nine Fur target gene clusters present across the four genomes studied; this core set potentially identifies basic regulatory processes essential for survival. Species-level differences are seen at the sub-regulatory-set level; for example, the known virulence factors YbtA and PchR were found in Y. pestis and P. aeruginosa respectively, but were present in neither E. coli nor B. subtilis. Such factors, and the iron-uptake systems they regulate, are ideal candidates for wet-lab investigation to determine whether or not they are pathogen-specific. In this study, we employed a broad range of approaches to address our goals and assessed these methods using the Fur regulon as our initial case study. 
We identified a set of promising feature attributes, demonstrated their success in increasing transcription factor binding site prediction specificity while retaining sensitivity, and showed the importance of binding site predictions in enhancing the reliability of regulatory interaction inferences. Most importantly, these outcomes led to the introduction of a range of visualisations and techniques that are applicable across the entire bacterial spectrum and can be utilised in studies beyond the understanding of transcriptional regulatory networks.
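The spectrum kernel named above compares two sequences through the inner product of their k-mer count vectors. The sketch below is a minimal, generic implementation of the standard k-spectrum kernel, not the thesis code; the example sequences are hypothetical strings chosen only to show that related binding sites share more k-mers than unrelated sequence.

```python
from collections import Counter

def spectrum_features(seq, k=3):
    """Count occurrences of every length-k substring (the k-spectrum)."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(a, b, k=3):
    """Inner product of the two k-spectrum count vectors."""
    fa, fb = spectrum_features(a, k), spectrum_features(b, k)
    return sum(fa[kmer] * fb[kmer] for kmer in fa)

site1 = "TGTGATCTAGATCACA"   # hypothetical CRP-like site
site2 = "TGTGACGTACGTCACA"   # hypothetical related site (shared flanks)
unrelated = "CCCCCCGGGGGG"

s12 = spectrum_kernel(site1, site2)
s1u = spectrum_kernel(site1, unrelated)
print(s12, s1u)
```

In an SVM, this kernel plugs in directly as the similarity function between candidate binding sites; additional feature attributes (such as the position features mentioned above) can be appended to the feature map before taking inner products.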