952 results for Non-Archimedean Real Closed Fields
Abstract:
In the case of industrial relations research, particularly that which sets out to examine practices within workplaces, the best way to study this real-life context is to work for the organisation. Studies conducted by researchers working within the organisation comprise some of the (broad) field’s classic research (cf. Roy, 1954; Burawoy, 1979). Participant and non-participant ethnographic research provides an opportunity to investigate workplace behaviour beyond the scope of questionnaires and interviews. However, we suggest that the data collected outside a workplace can be just as important as the data collected inside the organisation’s walls. In recent years the introduction of anti-smoking legislation in Australia has meant that people who smoke cigarettes are no longer allowed to do so inside buildings. Not only are smokers forced outside to engage in their habit, but they have to smoke at prescribed distances from doorways or, in some workplaces, outside the property line. This chapter considers the importance of cigarette-smoking employees in ethnographic research. Through data collected across three separate research projects, the chapter argues that smokers, as social outcasts in the workplace, can provide a wealth of important research data. We suggest that smokers also appear more likely to provide stories that contradict the ‘management’ or ‘organisational’ position. Thus, within the haze of smoke, researchers can uncover a level of discontent with the ‘corporate line’ presented inside the workplace. There are several aspects to the increased propensity of smokers to provide a contradictory or discontented story. It may be that the researcher is better able to establish a rapport with smokers, because the artificial wall that a researcher presents as an outsider is removed. It may also be that a research location physically outside the boundaries of the organisation provides workers with the freedom to express their discontent. The authors offer no definitive answers; rather, this chapter is intended to extend our knowledge of workplace research by highlighting the methodological value of using smokers as research subjects. We present the experience of three separate case studies in which interactions with cigarette smokers provided either important organisational data or, alternatively, a means of entering what Cunnison (1966) referred to as the ‘gossip circle’. The final section of the chapter draws on this evidence to demonstrate how the community of smokers, as social outcasts, is valuable in investigating workplace issues. For researchers and practitioners, these social outcasts may very well prove to be an important barometer of employee attitudes; attitudes that perhaps cannot be measured through traditional staff surveys.
Abstract:
In this paper, a new power sharing control method for a microgrid with several distributed generation units is proposed. The presence of both inertial and non-inertial sources with different power ratings, maximum power point tracking, and various types of loads poses a great challenge for power sharing and system stability. The conventional droop control method is modified to achieve the desired power sharing while ensuring system stability in a highly resistive network. A transformation matrix is formed to derive the equivalent real and reactive power output of the converter and the equivalent feedback gain matrix for the modified droop equation. The proposed control strategy, designed for the prototype microgrid planned at Queensland University of Technology, is validated through extensive simulation results using the PSCAD/EMTDC software.
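For reference, the conventional droop relations that schemes of this kind modify are commonly written as below; the notation is generic rather than the paper's own, and the transformed, modified droop described in the abstract is not reproduced here.

```latex
% Conventional frequency and voltage droop for a converter-interfaced source
% (generic notation): the set-points fall linearly with measured real and
% reactive power output.
\begin{align}
  \omega &= \omega^{*} - m_{p}\,(P - P^{*}),\\
  V      &= V^{*}      - n_{q}\,(Q - Q^{*}),
\end{align}
% where \omega^{*} and V^{*} are the nominal frequency and voltage set-points,
% P^{*} and Q^{*} are the power references, and m_{p}, n_{q} are the droop
% gains. In a highly resistive network the usual P--\omega and Q--V pairings
% weaken, which motivates working with transformed (equivalent) real and
% reactive powers as the abstract describes.
```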
Abstract:
Modern Engineering Asset Management (EAM) requires the accurate assessment of current asset health and the prediction of future asset health condition. Suitable mathematical models that are capable of predicting Time-to-Failure (TTF) and the probability of failure at future times are essential. In traditional reliability models, the lifetime of assets is estimated using failure time data. However, in most real-life situations and industry applications, the lifetime of assets is influenced by different risk factors, which are called covariates. The fundamental notions in reliability theory are the failure time of a system and its covariates. These covariates change stochastically and may influence and/or indicate the failure time. Many statistical models have been developed to estimate the hazard of assets or individuals with covariates, and an extensive literature on hazard models with covariates (also termed covariate models), covering both theory and practical applications, has emerged. This paper is a state-of-the-art review of the existing literature on these covariate models in both the reliability and biomedical fields. One of the major purposes of this expository paper is to synthesise these models from both the industrial reliability and biomedical fields and then contextually group them into non-parametric and semi-parametric models. Comments on their merits and limitations are also presented. Another main purpose of this paper is to comprehensively review and summarise the current research on the development of covariate models so as to facilitate the application of more covariate modelling techniques in prognostics and asset health management.
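A representative semi-parametric covariate model of the kind grouped in such reviews is the proportional hazards (Cox-type) model, sketched below in generic notation; this is standard background rather than a formulation taken from the paper.

```latex
% Proportional hazards form of a semi-parametric covariate model:
% h_0(t) is an unspecified (non-parametric) baseline hazard and
% z(t) = (z_1(t), ..., z_p(t)) is the covariate vector with coefficients beta.
\begin{equation}
  h\bigl(t \mid z(t)\bigr) \;=\; h_{0}(t)\,\exp\!\bigl(\beta^{\top} z(t)\bigr)
\end{equation}
% The covariates act multiplicatively on the baseline hazard, so beta can be
% estimated without specifying h_0(t); this is the sense in which such
% covariate models are "semi-parametric".
```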
Abstract:
Extended spectrum β-lactamases, or ESBLs, which are derived from non-ESBL precursors by point mutation of β-lactamase genes (bla), are spreading rapidly all over the world and have caused considerable problems in the treatment of infections caused by bacteria which harbour them. The mechanism of this resistance is not fully understood, and a better understanding of these mechanisms might significantly impact the choice of proper diagnostic and treatment strategies. Previous work on the SHV β-lactamase gene, blaSHV, has shown that only Klebsiella pneumoniae strains which contain plasmid-borne blaSHV are able to mutate to phenotypically ESBL-positive strains, and there was also evidence of an increase in blaSHV copy number. Therefore, it was hypothesised that although a specific point mutation is essential for acquisition of ESBL activity, it is not sufficient on its own, and that blaSHV copy number amplification is also essential for an ESBL-positive phenotype, with homologous recombination being the likely mechanism of blaSHV copy number expansion. In this study, we investigated the mutation rate of non-ESBL-expressing K. pneumoniae isolates to an ESBL-positive status by using the MSS maximum likelihood method. Our data showed that the blaSHV mutation rate of a non-ESBL-expressing isolate is lower than the mutation rate of other single base changes on the chromosome, even with a plasmid-borne blaSHV gene. On the other hand, the mutation rate from a low-MIC ESBL-positive status (≤ 8 µg/mL for cefotaxime) to a high-MIC ESBL-positive status (≥ 16 µg/mL for cefotaxime) is very high. This is because only a gene copy number increase is needed, which is probably mediated by homologous recombination, a process that typically takes place at much higher frequencies than point mutation. Using a subinhibitory concentration of novobiocin as a homologous recombination inhibitor revealed that this is the case.
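The MSS maximum likelihood method referred to above estimates the expected number of mutations per culture from the mutant counts of a Luria–Delbrück fluctuation assay. Below is a minimal sketch of that estimator, assuming the standard Ma–Sandri–Sarkar recursion for the mutant-count distribution; the function names, the toy counts and the final population size are illustrative assumptions, not data or code from this study.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ld_pmf(m, r_max):
    """Luria-Delbrueck mutant-count probabilities p_0..p_{r_max} for expected
    mutations per culture m, via the Ma-Sandri-Sarkar recursion."""
    p = np.zeros(r_max + 1)
    p[0] = np.exp(-m)
    for r in range(1, r_max + 1):
        p[r] = (m / r) * sum(p[i] / (r - i + 1) for i in range(r))
    return p

def mss_mle(mutant_counts):
    """Maximum likelihood estimate of m from observed mutant counts per culture."""
    counts = np.asarray(mutant_counts)
    def neg_log_likelihood(m):
        p = ld_pmf(m, counts.max())
        return -np.sum(np.log(p[counts] + 1e-300))
    return minimize_scalar(neg_log_likelihood, bounds=(1e-6, 50.0), method="bounded").x

# Illustrative fluctuation-assay data: mutant colonies per parallel culture.
counts = [0, 0, 1, 0, 2, 0, 5, 1, 0, 12, 0, 3]
m_hat = mss_mle(counts)
final_cells = 2e8  # hypothetical final cells per culture
print(f"m = {m_hat:.2f}; mutation rate ~ {m_hat / final_cells:.2e} per cell per generation")
```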
Abstract:
In most of the work done in developing association rule mining, the primary focus has been on the efficiency of the approach, and the quality of the derived rules has received less emphasis. Often, for a dataset, a huge number of rules can be derived, but many of them can be redundant with respect to other rules and thus are useless in practice. The extremely large number of rules makes it difficult for end users to comprehend, and therefore effectively use, the discovered rules, and thus significantly reduces the effectiveness of rule mining algorithms. If the extracted knowledge cannot be effectively used in solving real world problems, the effort of extracting the knowledge is worth little. This is a serious problem that has not yet been solved satisfactorily. In this paper, we propose a concise representation called the Reliable Approximate basis for representing non-redundant approximate association rules. We prove that redundancy elimination based on the proposed basis does not reduce the belief in the extracted rules. We also prove that all approximate association rules can be deduced from the Reliable Approximate basis. Therefore, the basis is a lossless representation of approximate association rules.
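To make the redundancy problem concrete, the toy sketch below enumerates association rules from a handful of transactions and flags those whose confidence is no higher than that of a more general rule; the transactions and the redundancy test are illustrative assumptions, not the algorithm proposed in the paper.

```python
from itertools import combinations

# Toy transaction database (illustrative only).
transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"a", "b", "c"}, {"b", "c"}]

def support(itemset):
    """Fraction of transactions containing the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

items = sorted(set().union(*transactions))
rules = []
for k in range(2, len(items) + 1):
    for combo in combinations(items, k):
        full = set(combo)
        if support(full) == 0:
            continue
        for i in range(1, len(combo)):
            for ante in combinations(combo, i):
                antecedent, consequent = set(ante), full - set(ante)
                rules.append((antecedent, consequent, support(full) / support(antecedent)))

# Flag a rule as redundant if a more general rule (smaller antecedent, larger
# consequent) has confidence at least as high: the longer rule adds conditions
# without adding predictive strength.
redundant = [
    (a, c, conf) for (a, c, conf) in rules
    if any(a2 <= a and c <= c2 and conf2 >= conf and (a2, c2) != (a, c)
           for (a2, c2, conf2) in rules)
]
print(f"{len(rules)} rules extracted, {len(redundant)} of them flagged as redundant")
```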
Abstract:
This paper synthesises the existing literature on the contemporary conception of ‘real world’ and compares it with similar notions such as ‘authentic’ and ‘work integrated learning’. While the term ‘real world’ may be partly dependent on the discipline, it does not necessarily follow that the criterion-referenced assessment of ‘real world’ assessment must involve criteria and performance descriptors that are discipline specific. Two examples of summative assessment (a court report and a trial process exercise) from a final year core subject at the Queensland University of Technology, LWB432 Evidence, emphasise real world learning, are authentic and innovative, and better prepare students for the transition into the workplace than more generic forms of assessment such as tutorial participation or oral presentations. The court report requires students to attend a criminal trial in a Queensland Court and complete a two-page report on what they saw in practice compared with what they learned in the classroom. The trial process exercise is a 50-minute written, closed-book activity conducted in tutorials, in which students plan questions that they would ask their witness in examination-in-chief, plan questions that they would ask their opponent’s witness in cross-examination, plan questions that they would ask in re-examination given what their opponent asked in cross-examination, and prepare written objections to their opponent’s questions. The trial process exercise simulates the real world, whereas the court report involves observing the real world, and both assessment items are important to the role of counsel. The design of the criterion-referenced assessment rubrics for the court report and trial process exercise is justified by the literature. Notably, the criteria and performance descriptors are not necessarily law specific, and this paper highlights the parts that may be easily transferred to other disciplines.
Abstract:
The refractive error of a human eye varies across the pupil and therefore may be treated as a random variable. The probability distribution of this random variable provides a means for assessing the main refractive properties of the eye without the need for a traditional functional representation of wavefront aberrations. To demonstrate this approach, the statistical properties of refractive error maps are investigated. Closed-form expressions are derived for the probability density function (PDF) and its statistical moments for the general case of rotationally symmetric aberrations. A closed-form expression of the PDF for a general non-rotationally symmetric wavefront aberration is difficult to derive. However, for specific cases, such as astigmatism, a closed-form expression of the PDF can be obtained. Further, interpretation of the distribution of the refractive error map, as well as its moments, is provided for a range of wavefront aberrations measured in real eyes. These are evaluated using kernel density and sample moment estimators. It is concluded that the refractive error domain allows non-functional analysis of wavefront aberrations based on simple statistics in the form of sample moments. Clinicians may find this approach to wavefront analysis easier to interpret due to the clinical familiarity and intuitive appeal of refractive error maps.
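As an illustration of the non-functional analysis described above, the sketch below builds a synthetic refractive error map over a circular pupil and summarises it by its sample moments and a kernel density estimate; the map, its coefficients and the pupil radius are made-up assumptions, not data from the study.

```python
import numpy as np
from scipy import stats

# Sample points uniformly over a circular pupil (radius in mm, illustrative).
rng = np.random.default_rng(0)
pupil_radius = 3.0
r = pupil_radius * np.sqrt(rng.uniform(size=20000))
theta = rng.uniform(0.0, 2.0 * np.pi, size=r.size)

# Synthetic refractive error map in dioptres: a defocus-like term growing
# quadratically with radius plus an astigmatism-like cos(2*theta) term.
# The coefficients are arbitrary and purely illustrative.
error_map = (-0.75 * (r / pupil_radius) ** 2
             + 0.25 * (r / pupil_radius) ** 2 * np.cos(2.0 * theta))

# Treat the refractive error across the pupil as a random variable: its sample
# moments summarise the map without any functional (e.g. Zernike) fit.
mean = error_map.mean()               # first moment
variance = error_map.var()            # second central moment
skewness = stats.skew(error_map)      # third standardised moment
kurtosis = stats.kurtosis(error_map)  # fourth standardised moment (excess)

# Kernel density estimate of the refractive error distribution (the PDF for
# which the paper derives closed-form expressions in special cases).
kde = stats.gaussian_kde(error_map)
grid = np.linspace(error_map.min(), error_map.max(), 200)
density = kde(grid)

print(f"mean={mean:.3f} D, var={variance:.3f} D^2, skew={skewness:.3f}, kurt={kurtosis:.3f}")
```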
Abstract:
Purpose: To compare the eye and head movements and lane-keeping of drivers with hemianopia and quadrantanopia with those of age-matched controls when driving under real world conditions. Methods: Participants included 22 hemianopes and 8 quadrantanopes (M age 53 yrs) and 30 persons with normal visual fields (M age 52 yrs) who were ≥ 6 months from the brain injury date and either current drivers or aiming to resume driving. All participants drove an instrumented dual-brake vehicle along a 14-mile route in traffic that included non-interstate city driving and interstate driving. Driving performance was scored using a standardised assessment system by two “backseat” raters and by the Vigil Vanguard system, which provides objective measures of speed, braking and acceleration, and cornering, and video-based footage from which eye and head movements and lane-keeping can be derived. Results: Compared to drivers with normal visual fields, drivers with hemianopia or quadrantanopia on average were significantly more likely to drive more slowly, to exhibit less excessive cornering force or acceleration, and to execute more shoulder movements off the seat. Those hemianopic and quadrantanopic drivers rated as safe to drive by the backseat evaluator made significantly more excursive eye movements, exhibited more stable lane positioning and fewer sudden braking events, and drove at higher speeds than those rated as unsafe, while there was no difference between safe and unsafe drivers in head movements. Conclusions: Persons with hemianopic and quadrantanopic field defects rated as safe to drive have different driving characteristics compared to those rated as unsafe when assessed using objective measures of driving performance.
Abstract:
Discusses the role of legislation and codes of conduct in influencing the behaviour of non-executive directors. Outlines the functions of a board of directors and considers the role of non-executive directors in particular. Traces the development of the standards of skill required of non-executive directors both under the Australian Corporations Act 2001 and under common law. Questions whether these have brought about a real change in behaviour. Considers whether professionalisation of directorship could be more effective.
Abstract:
Asset management in local government is an emerging discipline and, over the past decade, has become a crucial aspect of building a more efficient and effective organisation. One crucial feature of public asset management is performance measurement of public real estate. Such measurement looks critically at an important component of public wealth and seeks to apply standards of economic efficiency and effective organisational management, especially under global financial crisis conditions. This paper aims to identify the effects of the global economic crisis and proposes an alternative solution for local governments to soften the impact of the crisis on their organisations. The study found that the most suitable way for local governments in Indonesia to respond to the global economic crisis is the application of performance measurement in their asset management. Thus, it is important to develop a performance measurement system within the local government asset management process. The study draws its suggestions from published documents and literature. The paper also discusses the elements of public real estate performance measurement. The measurement of performance has become an essential component of the strategic thinking of asset owners and managers. Without a formal performance measurement system, it is difficult to plan, control and improve a local government real estate management system. A close look at best practices in the public sector reveals that in most cases these practices were transferred from private sector real estate management under the direction of real estate experts retained by government. One of the most significant advances in government property performance measurement resulted from the recognition that the methodology used by private sector, non-real-estate corporations for managing their real property offered a valuable prototype for local governments. In general, there are two approaches most frequently used to measure the performance of public organisations: subjective and objective measures. Finally, the findings of this study provide useful input for local government policy makers, scholars and asset management practitioners in establishing a public real estate performance measurement system, making local governments more efficient and effective in managing their assets and improving public service quality, so as to soften the impact of the global financial crisis.
Abstract:
Campylobacter jejuni followed by Campylobacter coli contribute substantially to the economic and public health burden attributed to food-borne infections in Australia. Genotypic characterisation of isolates has provided new insights into the epidemiology and pathogenesis of C. jejuni and C. coli. However, currently available methods are not conducive to large scale epidemiological investigations that are necessary to elucidate the global epidemiology of these common food-borne pathogens. This research aims to develop high resolution C. jejuni and C. coli genotyping schemes that are convenient for high throughput applications. Real-time PCR and High Resolution Melt (HRM) analysis are fundamental to the genotyping schemes developed in this study and enable rapid, cost effective, interrogation of a range of different polymorphic sites within the Campylobacter genome. While the sources and routes of transmission of campylobacters are unclear, handling and consumption of poultry meat is frequently associated with human campylobacteriosis in Australia. Therefore, chicken derived C. jejuni and C. coli isolates were used to develop and verify the methods described in this study. The first aim of this study describes the application of MLST-SNP (Multi Locus Sequence Typing Single Nucleotide Polymorphisms) + binary typing to 87 chicken C. jejuni isolates using real-time PCR analysis. These typing schemes were developed previously by our research group using isolates from campylobacteriosis patients. This present study showed that SNP + binary typing alone or in combination are effective at detecting epidemiological linkage between chicken derived Campylobacter isolates and enable data comparisons with other MLST based investigations. SNP + binary types obtained from chicken isolates in this study were compared with a previously SNP + binary and MLST typed set of human isolates. Common genotypes between the two collections of isolates were identified and ST-524 represented a clone that could be worth monitoring in the chicken meat industry. In contrast, ST-48, mainly associated with bovine hosts, was abundant in the human isolates. This genotype was, however, absent in the chicken isolates, indicating the role of non-poultry sources in causing human Campylobacter infections. This demonstrates the potential application of SNP + binary typing for epidemiological investigations and source tracing. While MLST SNPs and binary genes comprise the more stable backbone of the Campylobacter genome and are indicative of long term epidemiological linkage of the isolates, the development of a High Resolution Melt (HRM) based curve analysis method to interrogate the hypervariable Campylobacter flagellin encoding gene (flaA) is described in Aim 2 of this study. The flaA gene product appears to be an important pathogenicity determinant of campylobacters and is therefore a popular target for genotyping, especially for short term epidemiological studies such as outbreak investigations. HRM curve analysis based flaA interrogation is a single-step closed-tube method that provides portable data that can be easily shared and accessed. Critical to the development of flaA HRM was the use of flaA specific primers that did not amplify the flaB gene. HRM curve analysis flaA interrogation was successful at discriminating the 47 sequence variants identified within the 87 C. jejuni and 15 C. coli isolates and correlated to the epidemiological background of the isolates. 
In the combinatorial format, the resolving power of flaA was additive to that of SNP + binary typing and CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats) HRM, and fits the PHRANA (Progressive hierarchical resolving assays using nucleic acids) approach to genotyping. The use of statistical methods to analyse the HRM data enhanced the sophistication of the method. Therefore, flaA HRM is a rapid and cost-effective alternative to gel- or sequence-based flaA typing schemes. Aim 3 of this study describes the development of a novel bioinformatics-driven method to interrogate Campylobacter MLST gene fragments using HRM, called ‘SNP Nucleated Minim MLST’ or ‘Minim typing’. The method involves HRM interrogation of MLST fragments that encompass highly informative “Nucleating SNPs” to ensure high resolution. Selection of fragments potentially suited to HRM analysis was conducted in silico using i) “Minimum SNPs” and ii) the new ‘HRMtype’ software packages. Species-specific sets of six “Nucleating SNPs” and six HRM fragments were identified for both C. jejuni and C. coli to ensure high typeability and resolution relevant to the MLST database. ‘Minim typing’ was tested empirically by typing 15 C. jejuni and five C. coli isolates. The clonal complexes (CCs) assigned to each isolate by ‘Minim typing’ and by SNP + binary typing were used to compare the two MLST interrogation schemes. The CCs linked with each C. jejuni isolate were consistent for both methods. Thus, ‘Minim typing’ is an efficient and cost-effective method to interrogate MLST genes. However, it is not expected to be independent of, or to meet the resolution of, sequence-based MLST gene interrogation. ‘Minim typing’ in combination with flaA HRM is envisaged to comprise a highly resolving combinatorial typing scheme developed around the HRM platform that is amenable to automation and multiplexing. The genotyping techniques described in this thesis involve the combinatorial interrogation of differentially evolving genetic markers on the unified real-time PCR and HRM platform. They provide high resolution and are simple, cost-effective and ideally suited to rapid, high-throughput genotyping of these common food-borne pathogens.
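One common way to score the resolving power of such marker sets is Simpson's index of diversity over the genotypes they assign; the sketch below shows that calculation for a hypothetical panel comparison. It is not the ‘Minimum SNPs’ or ‘HRMtype’ software itself, and the isolate profiles are invented.

```python
from collections import Counter

def simpsons_diversity(genotypes):
    """Simpson's index of diversity for a typing scheme: the probability that
    two isolates drawn at random without replacement receive different types."""
    counts = Counter(genotypes)
    n = sum(counts.values())
    if n < 2:
        return 0.0
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical genotype labels assigned to ten isolates by two candidate SNP panels.
panel_a = ["AGT", "AGT", "ACT", "ACT", "AGT", "GGT", "AGT", "ACT", "GGT", "AGT"]
panel_b = ["AGT", "AGA", "ACT", "ACC", "AGT", "GGT", "AGC", "ACT", "GGA", "TGT"]

print(f"panel A: D = {simpsons_diversity(panel_a):.3f}")
print(f"panel B: D = {simpsons_diversity(panel_b):.3f}")  # higher D -> more resolving
```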
Abstract:
We have developed a bioreactor vessel design which has the advantages of simplicity and ease of assembly and disassembly and, with an appropriately determined flow rate, even allows a scaffold to be suspended freely regardless of its weight. This article reports our experimental and numerical investigations to evaluate the performance of a newly developed non-perfusion conical bioreactor by visualizing the flow through scaffolds with 45° and 90° fiber lay-down patterns. The experiments were conducted at Reynolds numbers (Re) of 121, 170, and 218, based on the local velocity and the width of the scaffolds. The flow fields were captured using short-time exposures of 60 µm particles suspended in the bioreactor and illuminated using a thin laser sheet. The effects of scaffold fiber lay-down pattern and Reynolds number were obtained and compared to results obtained from a computational fluid dynamics (CFD) software package. The objectives of this article are twofold: first, to investigate the hypothesis that there may be an insufficient exchange of medium within the interior of the scaffold when using our non-perfusion bioreactor; and second, to compare the flows within and around scaffolds of 45° and 90° fiber lay-down patterns. Scaffold porosity was also found to influence flow patterns. It was therefore shown that fluidic transport could be achieved within scaffolds using our bioreactor design, even though it is a non-perfusion vessel. Fluid velocities were generally of the same order of magnitude as, or one order of magnitude lower than, the inlet flow velocity. Additionally, the 90° fiber lay-down pattern scaffold was found to allow slightly higher fluid velocities within it than the 45° fiber lay-down pattern scaffold. This was due to the architecture and pore arrangement of the 90° fiber lay-down pattern scaffold, which allows fluid to flow directly through it (channel-like flow).
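The Reynolds numbers quoted above are defined on the local velocity and scaffold width; the small sketch below shows that relation with stand-in fluid properties and scaffold width, which are assumptions for illustration rather than values from the article.

```python
# Reynolds number based on local velocity U and scaffold width L: Re = rho * U * L / mu.
rho = 1000.0    # fluid density, kg/m^3 (generic water-like value, assumed)
mu = 1.0e-3     # dynamic viscosity, Pa*s (assumed)
width = 5.0e-3  # scaffold width L, m (illustrative, not the article's value)

for re in (121, 170, 218):
    velocity = re * mu / (rho * width)  # local velocity implied by the quoted Re
    print(f"Re = {re}: implied local velocity ~ {velocity * 1000:.1f} mm/s")
```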
Abstract:
The idealised theory for the quasi-static flow of granular materials which satisfy the Coulomb-Mohr hypothesis is considered. This theory arises in the limit that the angle of internal friction approaches $\pi/2$, and accordingly these materials may be referred to as being `highly frictional'. In this limit, the stress field for both two-dimensional and axially symmetric flows may be formulated in terms of a single nonlinear second order partial differential equation for the stress angle. To obtain an accompanying velocity field, a flow rule must be employed. Assuming the non-dilatant double-shearing flow rule, a further partial differential equation may be derived in each case, this time for the streamfunction. Using Lie symmetry methods, a complete set of group-invariant solutions is derived for both systems, and through this process new exact solutions are constructed. Only a limited number of exact solutions for gravity driven granular flows are known, so these results are potentially important in many practical applications. The problem of mass flow through a two-dimensional wedge hopper is examined as an illustration.
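For orientation, the Coulomb-Mohr yield condition and the `highly frictional' limit referred to above can be written as follows; this is standard background in generic notation rather than the paper's own formulation.

```latex
% Coulomb-Mohr yield condition in terms of the major and minor principal
% stresses sigma_1 >= sigma_3 (compression taken positive), cohesion c and
% angle of internal friction phi:
\begin{equation}
  \frac{\sigma_{1} - \sigma_{3}}{2}
    \;=\; \frac{\sigma_{1} + \sigma_{3}}{2}\,\sin\phi \;+\; c\cos\phi .
\end{equation}
% In the highly frictional limit phi -> pi/2 we have sin(phi) -> 1 and
% cos(phi) -> 0, so the condition forces sigma_3 -> 0: the minor principal
% stress becomes negligible compared with the major one, which is consistent
% with the reduction of the stress problem to a single equation for the
% stress angle noted in the abstract.
```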
Abstract:
There has been much conjecture of late as to whether the patentable subject matter standard contains a physicality requirement. The issue came to a head when the Federal Circuit introduced the machine-or-transformation test in In re Bilski and declared it to be the sole test for determining subject matter eligibility. Many commentators criticized the test, arguing that it is inconsistent with Supreme Court precedent and the need for the patent system to respond appropriately to all new and useful innovation in whatever form it arises. Those criticisms were vindicated when, on appeal, the Supreme Court in Bilski v. Kappos dispensed with any suggestion that the patentable subject matter test involves a physicality requirement. In this article, the issue is addressed from a normative perspective: it asks whether the patentable subject matter test should contain a physicality requirement. The conclusion reached is that it should not, because such a limitation is not an appropriate means of encouraging much of the valuable innovation we are likely to witness during the Information Age. It is contended that it is not only traditionally-recognized mechanical, chemical and industrial manufacturing processes that are patent eligible, but that patent eligibility extends to include non-machine implemented and non-physical methods that do not have any connection with a physical device and do not cause a physical transformation of matter. Concerns raised that there is a trend of overreaching commoditization or propertization, where the boundaries of patent law have been expanded too far, are unfounded since the strictures of novelty, nonobviousness and sufficiency of description will exclude undeserving subject matter from patentability. The argument made is that introducing a physicality requirement will have unintended adverse effects in various fields of technology, particularly those emerging technologies that are likely to have a profound social effect in the future.
Abstract:
Association rule mining has contributed to many advances in the area of knowledge discovery. However, the quality of the discovered association rules is a big concern and has drawn more and more attention recently. One problem with the quality of the discovered association rules is the huge size of the extracted rule set. Often, for a dataset, a huge number of rules can be extracted, but many of them can be redundant with respect to other rules and thus useless in practice. Mining non-redundant rules is a promising approach to solving this problem. In this paper, we first propose a definition of redundancy and then propose a concise representation, called a Reliable basis, for representing non-redundant association rules. The Reliable basis contains a set of non-redundant rules which are derived using frequent closed itemsets and their generators, instead of the frequent itemsets usually used by traditional association rule mining approaches. An important contribution of this paper is that we propose to use the certainty factor as the criterion for measuring the strength of the discovered association rules. Using this criterion, we can ensure the elimination of as many redundant rules as possible without reducing the inference capacity of the remaining extracted non-redundant rules. We prove that the redundancy elimination, based on the proposed Reliable basis, does not reduce the strength of belief in the extracted rules. We also prove that all association rules, together with their supports and confidences, can be retrieved from the Reliable basis without accessing the dataset. Therefore, the Reliable basis is a lossless representation of association rules. Experimental results show that the proposed Reliable basis can significantly reduce the number of extracted rules. We also conduct experiments on the application of association rules to the area of product recommendation. The experimental results show that the non-redundant association rules extracted using the proposed method retain the same inference capacity as the entire rule set. This result indicates that using only the non-redundant rules is sufficient to solve real problems, without needing to use the entire rule set.
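The certainty factor criterion mentioned above has a standard form for a rule X → Y; the sketch below computes support, confidence and that certainty factor on a toy transaction set. The data and helper functions are illustrative assumptions rather than the paper's implementation.

```python
# Support, confidence and certainty factor for an association rule X -> Y,
# computed on a toy transaction set (illustrative only).
transactions = [
    {"bread", "milk"}, {"bread", "butter"}, {"bread", "milk", "butter"},
    {"milk"}, {"bread", "milk"}, {"butter"},
]

def support(itemset):
    """Fraction of transactions containing the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    return support(antecedent | consequent) / support(antecedent)

def certainty_factor(antecedent, consequent):
    """Standard certainty-factor form: how much the rule improves on (or falls
    below) simply predicting the consequent from its base support."""
    conf, supp_y = confidence(antecedent, consequent), support(consequent)
    if conf > supp_y:
        return (conf - supp_y) / (1.0 - supp_y)
    if conf < supp_y:
        return (conf - supp_y) / supp_y
    return 0.0

x, y = {"bread"}, {"milk"}
print(f"supp={support(x | y):.2f}, conf={confidence(x, y):.2f}, CF={certainty_factor(x, y):.2f}")
```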