78 results for STANDARD AUTOMATED PERIMETRY
Abstract:
Recent research has shown that Lighthill–Ford spontaneous gravity wave generation theory, when applied to numerical model data, can help predict areas of clear-air turbulence. It is hypothesized that this is the case because spontaneously generated atmospheric gravity waves may initiate turbulence by locally modifying the stability and wind shear. As an improvement on the original research, this paper describes the creation of an ‘operational’ algorithm (ULTURB) with three modifications to the original method: (1) extending the altitude range for which the method is effective downward to the top of the boundary layer, (2) adding turbulent kinetic energy production from the environment to the locally produced turbulent kinetic energy production, and (3) transforming turbulent kinetic energy dissipation to eddy dissipation rate, the metric that is becoming the worldwide ‘standard’ for reporting turbulence. In a comparison of ULTURB with the original method and with the Graphical Turbulence Guidance second version (GTG2) automated procedure for forecasting mid- and upper-level aircraft turbulence, ULTURB performed better for all turbulence intensities. Since ULTURB, unlike GTG2, is founded on a self-consistent dynamical theory, it may offer forecasters better insight into the causes of clear-air turbulence and may ultimately enhance its predictability.
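The transformation in modification (3) can be made explicit. The eddy dissipation rate widely adopted as the standard aviation turbulence metric is conventionally defined as the cube root of the turbulent kinetic energy dissipation rate ε; this is the standard definition, not a formula quoted from the paper itself:

```latex
\mathrm{EDR} = \varepsilon^{1/3}, \qquad
[\mathrm{EDR}] = \mathrm{m}^{2/3}\,\mathrm{s}^{-1}
```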
Abstract:
Automatic keyword or keyphrase extraction is concerned with assigning keyphrases to documents based on words from within the document. Previous studies have shown that in a significant number of cases author-supplied keywords are not appropriate for the document to which they are attached. This can be either because they represent what the author believes a paper is about, not what it actually is, or because they include keyphrases which are more classificatory than explanatory, e.g. “University of Poppleton” instead of “Knowledge Discovery in Databases”. Thus, there is a need for a system that can generate an appropriate and diverse range of keyphrases that reflect the document. This paper proposes two possible solutions that examine the synonyms of words and phrases in the document to find the underlying themes, and presents these as appropriate keyphrases. Using three different freely available thesauri, the work undertaken examines two different methods of producing keywords and compares the outcomes across multiple experimental strands. The primary method explores taking n-grams of the source document phrases and examining the synonyms of these, while the secondary considers grouping outputs by their synonyms. The experiments undertaken show that the primary method produces good results and that the secondary method produces both good results and potential for future work. In addition, the different qualities of the thesauri are examined, and it is concluded that the more entries a thesaurus has, the better it is likely to perform; neither the age of the thesaurus nor the size of each entry correlates with performance.
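The primary (n-gram synonym) method can be sketched minimally as follows. This is an illustration under assumed simplifications: the thesaurus here is a toy dictionary mapping words and phrases to synonym groups (“themes”), and the scoring heuristic (counting how many document n-grams map to each theme) is hypothetical, not the paper’s exact procedure.

```python
# Sketch: map document words and n-grams to thesaurus synonym groups
# ("themes") and rank themes by how often they are hit.
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, joined as strings."""
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def theme_keyphrases(text, thesaurus, n=2, top=3):
    tokens = text.lower().split()
    counts = Counter()
    for gram in tokens + ngrams(tokens, n):
        # Each word/phrase votes for the synonym group(s) it belongs to.
        for theme in thesaurus.get(gram, []):
            counts[theme] += 1
    return [theme for theme, _ in counts.most_common(top)]

# Toy thesaurus; a real run would use a full resource such as WordNet.
toy_thesaurus = {
    "data mining": ["knowledge discovery"],
    "kdd": ["knowledge discovery"],
    "databases": ["knowledge discovery"],
}
print(theme_keyphrases("kdd and data mining in databases", toy_thesaurus))
```

Grouping the extracted phrases by shared synonym sets (the secondary method) would operate on the output of the same lookup step.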
Abstract:
A recent article in this journal challenged claims that a human rights framework should be applied to drug control. This article questions the author’s assertions and reframes them in the context of socio-legal drug scholarship, aiming to build on the discourse concerning human rights and drug use. It is submitted that a rights-based approach is a necessary, indeed obligatory, ethical and legal framework through which to address drug use and that international human rights law provides the proper scope for determining where interferences with individual human rights might be justified on certain, limited grounds.
Abstract:
The proteome of Salmonella enterica serovar Typhimurium was characterized by 2-dimensional HPLC mass spectrometry to provide a platform for subsequent proteomic investigations of low level multiple antibiotic resistance (MAR). Bacteria (2.15 ± 0.23 × 10^10 cfu; mean ± s.d.) were harvested from liquid culture and proteins differentially fractionated, on the basis of solubility, into preparations representative of the cytosol, cell envelope and outer membrane proteins (OMPs). These preparations were digested by treatment with trypsin and peptides separated into fractions (n = 20) by strong cation exchange chromatography (SCX). Tryptic peptides in each SCX fraction were further separated by reversed-phase chromatography and detected by mass spectrometry. Peptides were assigned to proteins and consensus rank listings compiled using SEQUEST. A total of 816 ± 11 individual proteins were identified, which included 371 ± 33, 565 ± 15 and 262 ± 5 from the cytosolic, cell envelope and OMP preparations, respectively. A significant correlation was observed (r^2 = 0.62 ± 0.10; P < 0.0001) between consensus rank position for duplicate cell preparations, and an average of 74 ± 5% of proteins were common to both replicates. A total of 34 outer membrane proteins were detected, 20 of these from the OMP preparation. A range of proteins (n = 20) previously associated with the mar locus in E. coli were also found, including the key MAR effectors AcrA, TolC and OmpF.
Abstract:
Aims: Quinolone antibiotics are the agents of choice for treating systemic Salmonella infections. Resistance to quinolones is usually mediated by mutations in the DNA gyrase gene gyrA. Here we report the evaluation of standard HPLC equipment for the detection of mutations (single nucleotide polymorphisms; SNPs) in gyrA, gyrB, parC and parE by denaturing high performance liquid chromatography (DHPLC). Methods: A panel of Salmonella strains was assembled which comprised those with known different mutations in gyrA (n = 8) and fluoroquinolone-susceptible and -resistant strains (n = 50) that had not been tested for mutations in gyrA. Additionally, antibiotic-susceptible strains of serotypes other than Salmonella enterica serovar Typhimurium strains were examined for serotype-specific mutations in gyrB (n = 4), parC (n = 6) and parE (n = 1). Wild-type (WT) control DNA was prepared from Salmonella Typhimurium NCTC 74. The DNA of respective strains was amplified by PCR using Optimase® proofreading DNA polymerase. Duplex DNA samples were analysed using an Agilent A1100 HPLC system with a Varian Helix™ DNA column. Sequencing was used to validate mutations detected by DHPLC in the strains with unknown mutations. Results: Using this HPLC system, mutations in gyrA, gyrB, parC and parE were readily detected by comparison with control chromatograms. Sequencing confirmed the gyrA mutations predicted by DHPLC in the unknown strains and also confirmed serotype-associated sequence changes in non-Typhimurium serotypes. Conclusions: The results demonstrated that a non-specialist standard HPLC machine fitted with a generally available column can be used to detect SNPs in the gyrA, gyrB, parC and parE genes by DHPLC. Wider applications should be possible.
Abstract:
This research presents a novel multi-functional system for medical Imaging-enabled Assistive Diagnosis (IAD). Although the IAD demonstrator has focused on abdominal images and supports the clinical diagnosis of kidneys using CT/MRI imaging, it can be adapted to work on image delineation, annotation and 3D real-size volumetric modelling of other organ structures such as the brain, spine, etc. The IAD provides advanced real-time 3D visualisation and measurements with fully automated functionalities, as developed in two stages. In the first stage, via the clinically driven user interface, specialist clinicians use CT/MRI imaging datasets to accurately delineate and annotate the kidneys and their possible abnormalities, thus creating “3D Golden Standard Models”. Based on these models, in the second stage, clinical support staff, i.e. medical technicians, interactively define model-based rules and parameters for the integrated “Automatic Recognition Framework” to achieve results that are closest to those of the clinicians. These specific rules and parameters are stored in “Templates” and can later be used by any clinician to automatically identify organ structures, i.e. kidneys, and their possible abnormalities. The system also supports the transmission of these “Templates” to another expert for a second opinion. A 3D model of the body, the organs and their possible pathology with real metrics is also integrated. The automatic functionality was tested on eleven MRI datasets (comprising 286 images) and the 3D models were validated by comparing them with the metrics from the corresponding “3D Golden Standard Models”. The system provides metrics for the evaluation of the results, in terms of Accuracy, Precision, Sensitivity, Specificity and Dice Similarity Coefficient (DSC), so as to enable benchmarking of its performance.
The first IAD prototype has produced promising results: its accuracy on the most widely deployed evaluation metric, DSC, is 97% for the recognition of kidneys and 96% for their abnormalities, whilst across all the above evaluation metrics its performance ranges between 96% and 100%. Further development of the IAD system is in progress to extend and evaluate its clinical diagnostic support capability through the development and integration of additional algorithms offering fully computer-aided identification of other organs and their abnormalities based on CT/MRI/ultrasound imaging.
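The headline metric above, the Dice Similarity Coefficient, is defined as DSC = 2|A ∩ B| / (|A| + |B|) for a segmentation A and a reference B. A minimal sketch, with the masks simplified to sets of pixel coordinates (real evaluations operate on 3D voxel masks):

```python
# Dice Similarity Coefficient over two binary masks represented as
# coordinate sets: 2 * |A ∩ B| / (|A| + |B|).
def dice(a, b):
    if not a and not b:
        return 1.0  # two empty masks agree perfectly by convention
    return 2 * len(a & b) / (len(a) + len(b))

auto = {(0, 0), (0, 1), (1, 0), (1, 1)}   # automatic segmentation (toy data)
gold = {(0, 1), (1, 0), (1, 1), (2, 1)}   # reference "Golden Standard" mask
print(dice(auto, gold))  # → 0.75
```

A DSC of 0.97, as reported for kidney recognition, therefore indicates near-total overlap between the automatic result and the clinician-drawn model.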
Abstract:
Red tape is undesirable as it impedes business growth, and relief from the administrative burdens that businesses face due to legislation can benefit the whole economy, especially in times of recession. Recent governmental initiatives aimed at reducing administrative burdens, however, have met with both successes and failures. This article compares three national initiatives - in the Netherlands, the UK and Italy - aimed at cutting red tape by using the Standard Cost Model. The findings highlight the factors affecting the outcomes of measurement and reduction plans, and ways to improve the Standard Cost Model methodology.
Abstract:
Keyphrases are added to documents to help identify the areas of interest they contain. However, in a significant proportion of papers author selected keyphrases are not appropriate for the document they accompany: for instance, they can be classificatory rather than explanatory, or they are not updated when the focus of the paper changes. As such, automated methods for improving the use of keyphrases are needed, and various methods have been published. However, each method was evaluated using a different corpus, typically one relevant to the field of study of the method’s authors. This not only makes it difficult to incorporate the useful elements of algorithms in future work, but also makes comparing the results of each method inefficient and ineffective. This paper describes the work undertaken to compare five methods across a common baseline of corpora. The methods chosen were Term Frequency, Inverse Document Frequency, the C-Value, the NC-Value, and a Synonym based approach. These methods were analysed to evaluate performance and quality of results, and to provide a future benchmark. It is shown that Term Frequency and Inverse Document Frequency were the best algorithms, with the Synonym approach following them. Following these findings, a study was undertaken into the value of using human evaluators to judge the outputs. The Synonym method was compared to the original author keyphrases of the Reuters’ News Corpus. The findings show that authors of Reuters’ news articles provide good keyphrases but that more often than not they do not provide any keyphrases.
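The two best-performing baselines above can be sketched as a combined TF-IDF score. This is a generic illustration over a toy corpus, not the paper’s exact implementation or evaluation setup; the smoothing term in the IDF denominator is one common convention among several.

```python
# Term Frequency and Inverse Document Frequency, combined as TF-IDF.
import math
from collections import Counter

def tf(term, doc):
    """Relative frequency of a term within one document."""
    words = doc.lower().split()
    return words.count(term) / len(words)

def idf(term, corpus):
    """Penalize terms that appear in many documents (smoothed)."""
    n_containing = sum(term in d.lower().split() for d in corpus)
    return math.log(len(corpus) / (1 + n_containing))

corpus = [
    "keyphrase extraction from documents",
    "synonym based keyphrase methods",
    "stock markets fell sharply today",
]
doc = corpus[0]
scores = {w: tf(w, doc) * idf(w, corpus) for w in set(doc.split())}
print(sorted(scores, key=scores.get, reverse=True))
```

Note that “keyphrase”, appearing in two of the three documents, scores lowest here: IDF deliberately discounts terms that fail to discriminate between documents.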
Abstract:
This chapter introduces the latest practices and technologies in the interactive interpretation of environmental data. With environmental data becoming ever larger, more diverse and more complex, there is a need for a new generation of tools that provides new capabilities over and above those of the standard workhorses of science. These new tools aid the scientist in discovering interesting new features (and also problems) in large datasets by allowing the data to be explored interactively using simple, intuitive graphical tools. In this way, new discoveries are made that are commonly missed by automated batch data processing. This chapter discusses the characteristics of environmental science data, common current practice in data analysis and the supporting tools and infrastructure. New approaches are introduced and illustrated from the points of view of both the end user and the underlying technology. We conclude by speculating as to future developments in the field and what must be achieved to fulfil this vision.
Abstract:
The paper seeks to explore in depth the ways in which rhetorical strategies are employed in the international accounting standard setting process. The study proposes that rather than simply detailing new accounting requirements, the texts and drafts of accounting standards are artefacts, i.e. deliberately and carefully crafted products, that construct, persuade and encourage certain beliefs and behaviours. The persuasive and constructive strategies are also employed by the constituents submitting comment letters on the regulatory proposals. Consequently, the international accounting standard setting process is an ‘interactive process of meaning making’ (Fairclough, 1989). The study regards accounting as a social construct based on intersubjectivity (Searle, 1995; Davidson, 1990, 1994) and posits language as a constitutive factor in the process (Saussure, 1916; Peirce, 1931-58). This approach to the use of language, and to the role of rhetoric as a persuasive tool to convince others of our perception of ‘accounting reality’, is supported by the sociological work of Bourdieu (1990, 1991). Bourdieu has drawn our attention to how language becomes used, controlled, reformed and reconstituted by social agents for the purposes of establishing their dominance. In our study we explore in particular the joint IASB and FASB proposals and subsequent regulations on the scope of consolidation and relevant disclosures that address issues of off-balance sheet financing, a subject that is both timely and of great topical importance. The analysis has revealed sophisticated rhetorical devices used both by the Boards and by the lobbyists. These reflect Aristotelian ethos, pathos and logos. The research demonstrates that those using accounting standards, as well as those reading comment letters on the proposals for new standards, should be aware of the normative nature of these documents and the subjectivity inherent in the nature of the text.
Abstract:
Sea ice contains flaws including frictional contacts. We aim to describe quantitatively the mechanics of those contacts, providing local physics for geophysical models. With a focus on the internal friction of ice, we review standard micro-mechanical models of friction. The solid's deformation under normal load may be ductile or elastic. The shear failure of the contact may be by ductile flow, brittle fracture, or melting and hydrodynamic lubrication. Combinations of these give a total of six rheological models. When the material under study is ice, several of the rheological parameters in the standard models are not constant, but depend on the temperature of the bulk, on the normal stress under which samples are pressed together, or on the sliding velocity and acceleration. This has the effect of making the shear stress required for sliding dependent on sliding velocity, acceleration, and temperature. In some cases, it also perturbs the exponent in the normal-stress dependence of that shear stress away from the value that applies to most materials. We unify the models by a principle of maximum displacement for normal deformation, and of minimum stress for shear failure, reducing the controversy over the mechanism of internal friction in ice to the choice of values of four parameters in a single model. The four parameters represent, for a typical asperity contact, the sliding distance required to expel melt-water, the sliding distance required to break contact, the normal strain in the asperity, and the thickness of any ductile shear zone.
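A schematic form of the normal-stress dependence discussed above (a generic friction law, not the authors’ exact model) is

```latex
\tau \;=\; \mu\!\left(T, v, \dot{v}\right)\,\sigma_n^{\,q},
```

where $\tau$ is the shear stress required for sliding, $\sigma_n$ the normal stress, and $q \approx 1$ for most materials (Amontons' law). For ice, the abstract notes that the prefactor $\mu$ depends on bulk temperature $T$, sliding velocity $v$ and acceleration $\dot{v}$, and that $q$ may be perturbed away from its usual value.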
Abstract:
Purpose – This paper extends the increasing debates about the role of international experience gained through mechanisms other than standard expatriation packages, in particular through the use of short-term assignments. It explores the different forms of short-term assignments (project work, commuter assignments, virtual international working and development assignments) and the different sets of positive and negative implications these can have for the company and the individuals concerned. The integration-differentiation debate is reflected here, as elsewhere in IHRM, with the company moving towards greater centralization and control of its use of these assignments. Design/methodology/approach – Since the research is exploratory, we adopted a qualitative approach to gain a more in-depth understanding of the realities the corporations and the assignees are facing. The study was implemented as a single case study in which the data were collected through interviews (n = 20) with line managers, human resource management (HRM) staff and the assignees themselves. In addition, corporate documentation and other materials were reviewed. Findings – The present case study provides evidence about the characteristics of short-term assignments as well as about the management of such assignments. The paper identifies various benefits and challenges involved in the use of short-term assignments, both from the perspective of the company and of the assignees. Furthermore, the findings support the view that the recent increase in the popularity of short-term assignments has not been matched by the development of HRM policies for such assignments. Research limitations/implications – As a single case study, limitations in the generalizability of the findings should be kept in mind.
More large-scale research evidence is needed on different forms of international assignments beyond standard expatriation in order to fully capture the realities faced by international HRM specialists. Practical implications – The paper identifies many challenges but also benefits of using short-term assignments. The paper reports in-depth findings on the HR development needs that organizations face when expanding the use of such assignments. Social implications – The paper identifies many challenges but also benefits of using short-term assignments. The paper reports in-depth findings on the HR development needs that organizations face when expanding the use of such assignments. Originality/value – Empirical research on short-term assignments is still very limited. The paper therefore provides much-needed in-depth evidence on why such assignments are used, what challenges are involved in their use and what kinds of HR development needs they entail.
Abstract:
In recent years both developed and developing countries have experienced an increasing number of government initiatives dedicated to reducing the administrative costs (AC) imposed on businesses by regulation. We use a bi-linear fixed-effects model to analyze the extent to which government initiatives to reduce AC through the Standard Cost Model (SCM) attract Foreign Direct Investment (FDI) among 32 developing countries. Controlling for standard determinants of FDI, we find that the SCM in most cases leads to higher FDI and that the benefits are more significant where the SCM has been implemented for a longer period.
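The fixed-effects logic behind such panel estimates can be sketched with a simple within-transformation. The variable names, toy numbers and one-way (country-only) specification below are illustrative assumptions; the paper’s exact bi-linear specification and its controls are not reproduced here.

```python
# Within (fixed-effects) estimator: demean each variable by group so that
# time-invariant country effects drop out, then run OLS on the residuals.
import numpy as np

def within_demean(x, groups):
    """Subtract each group's mean (the fixed-effect 'within' transform)."""
    out = x.astype(float).copy()
    for g in np.unique(groups):
        mask = groups == g
        out[mask] -= out[mask].mean()
    return out

# Toy balanced panel: 4 countries observed for 3 years each.
country = np.repeat(np.arange(4), 3)
scm = np.tile([0.0, 1.0, 1.0], 4)                # SCM adopted after year 0
alpha = np.array([1.0, 2.0, 3.0, 4.0])[country]  # unobserved country effects
fdi = alpha + 0.5 * scm                          # true SCM effect: +0.5

x = within_demean(scm, country)
y = within_demean(fdi, country)
beta = (x @ y) / (x @ x)                         # within (FE) slope estimate
print(round(beta, 3))  # → 0.5
```

Because the country effects are constant within each group, demeaning removes them exactly, and the regression recovers the true coefficient on the SCM dummy.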
Abstract:
Studies of code-switching in writing are very limited in comparison with the numerous investigations of this phenomenon in oral communication. Recent research has revealed that in text-based computer-mediated communication internet users bring into play the various languages available in their linguistic repertoire and, consequently, switch between them. In this case study, I investigate digital code-switching between Cypriot and Standard Greek, the two varieties of Greek spoken on the island of Cyprus. Following Auer’s conversation analytic approach and Gafaranga’s view that conversational structure coexists with social structure, I investigate code-switching in online interactions. The data to be analysed here, unlike those considered in most studies of code-switching, are written data, obtained from channel #Cyprus of Internet Relay Chat. The results suggest that code-switching in writing is influenced not only by macro-sociolinguistic factors but is also shaped by the medium- and social-specific characteristics of Internet Relay Chat. This, in turn, allows internet users to gain access to different roles and perform various identities within this online context.
Abstract:
Cypriot Greek (CG), a variety of Greek spoken on the island of Cyprus, is relatively distinct from Standard Greek in all linguistic domains. The regional variety does not have a standard, official orthography and is rarely used for everyday written purposes. Following technological development and the emergence of computer-mediated communication, a Romanized version of written CG is now widely used in online text-based communication among teenagers and young adults (Themistocleous, C. (2008), The use of Cypriot-Greek in synchronous computer-mediated communication (PhD thesis), University of Manchester). In this study, I present the innovative ways in which Greek-Cypriots use Roman characters in an effort to represent features of their spoken language in their online writings. By analysing data obtained from channel #Cyprus of Internet Relay Chat, I demonstrate how the choice of writing in CG affects the ways that Roman characters are used. I argue that this practice is not just a response to technological constraints but actually has a wider social significance.