365 results for atk-ohjelmat - LSP - Library software package
Abstract:
Colorimetric analysis of roadway dust is currently a method for monitoring the incombustible content of mine roadways within Australian underground coal mines. To test the accuracy of this method, and to eliminate errors of judgement introduced by human operators in the analysis procedure, a number of samples were tested using scanning software to determine absolute greyscale values. High variability and unpredictability of results were noted during this testing, indicating that colorimetric testing is sensitive to parameters within the mine that are not currently reproduced in the preparation of reference samples. This was linked to the dependence of colour on particle surface area, and hence also to the size distribution of coal particles within the mine environment. (C) 2001 Elsevier Science Ltd. All rights reserved.
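The abstract does not name the scanning software used; as a minimal sketch of the absolute-greyscale measurement step, assuming a scanned sample image on disk (the file name is hypothetical) and using Pillow and NumPy as stand-ins:

```python
# Minimal sketch of the absolute-greyscale measurement step. The scanning
# software used in the study is not named; "dust_sample.png" is a
# hypothetical file name, and Pillow + NumPy stand in for that software.
import numpy as np
from PIL import Image

def mean_greyscale(path: str) -> float:
    """Return the mean 8-bit greyscale value of a scanned sample image."""
    img = Image.open(path).convert("L")      # "L" = single-channel greyscale
    return float(np.asarray(img, dtype=np.float64).mean())

print(mean_greyscale("dust_sample.png"))     # higher value = lighter sample
```

A lighter (higher) mean greyscale would indicate a higher incombustible (stone-dust) content in the sample.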
Abstract:
Qualitative data analysis (QDA) is often a time-consuming and laborious process, usually involving the management of large quantities of textual data. Recently developed computer programs offer great advances in the efficiency of QDA processes. In this paper we report on an innovative use of a combination of existing computer software technologies to further enhance and simplify QDA. Used in appropriate circumstances, we believe this innovation greatly increases the speed with which theoretical and descriptive ideas can be abstracted from rich, complex, and chaotic qualitative data. © 2001 Human Sciences Press, Inc.
Abstract:
The development of cropping systems simulation capabilities world-wide, combined with easy access to powerful computing, has resulted in a plethora of agricultural models and, consequently, model applications. Nonetheless, the scientific credibility of such applications and their relevance to farming practice are still being questioned. Our objective in this paper is to highlight some of the model applications from which benefits for farmers were or could be obtained via changed agricultural practice or policy. Changed on-farm practice due to the direct contribution of modelling, while keenly sought after, may in some cases be less achievable than a contribution via agricultural policies. This paper is intended to give some guidance for future model applications. It is not a comprehensive review of model applications, nor is it intended to discuss modelling in the context of social science or extension policy. Rather, we take snapshots around the globe to 'take stock' and to demonstrate that well-defined financial and environmental benefits can be obtained on-farm from the use of models. We highlight the importance of 'relevance' and hence the importance of true partnerships between all stakeholders (farmers, scientists, advisers) for the successful development and adoption of simulation approaches. Specifically, we address some key points that are essential for successful model applications: (1) issues to be addressed must be neither trivial nor obvious; (2) a modelling approach must reduce complexity rather than proliferate choices in order to aid the decision-making process; and (3) the cropping systems must be sufficiently flexible to allow management interventions based on insights gained from models. The pros and cons of normative approaches (e.g. decision support software that can reach a wide audience quickly but is often poorly contextualized for any individual client) versus model applications within the context of an individual client's situation are also discussed. We suggest that a tandem approach is necessary, whereby the latter is used in the early stages of model application for confidence building amongst client groups. This paper focuses on five specific regions that differ fundamentally in terms of environment and socio-economic structure and hence in their requirements for successful model applications. Specifically, we give examples from Australia and South America (high climatic variability, large areas, low input, technologically advanced); Africa (high climatic variability, small areas, low input, subsistence agriculture); India (high climatic variability, small areas, medium-level inputs, technologically progressing); and Europe (relatively low climatic variability, small areas, high input, technologically advanced). The contrast between Australia and Europe further demonstrates how successful model applications are strongly influenced by the policy framework within which producers operate. We suggest that this might eventually lead to better adoption of fully integrated systems approaches and result in the development of resilient farming systems that are in tune with current climatic conditions and are adaptable to biophysical and socioeconomic variability and change. (C) 2001 Elsevier Science Ltd. All rights reserved.
Abstract:
We recently demonstrated that Saccharomyces cerevisiae protoplasts can take up bovine papillomavirus type 1 (BPV1) virions and that viral episomal DNA is replicated after uptake. Here we demonstrate that BPV virus-like particles are assembled in infected S. cerevisiae cultures from newly synthesized capsid proteins and also package newly synthesized DNA, including full-length and truncated viral DNA and S. cerevisiae-derived DNA. Virus particles prepared in S. cerevisiae are able to convey packaged DNA to Cos1 cells and to transform C127 cells. Infectivity was blocked by antisera to BPV1 L1 but not by antisera to BPV1 E4. We conclude that S. cerevisiae is permissive for the replication of BPV1.
Abstract:
Motivation: This paper introduces the software EMMIX-GENE, which has been developed for the specific purpose of a model-based approach to the clustering of microarray expression data, in particular of tissue samples on a very large number of genes. The latter is a nonstandard problem in parametric cluster analysis because the dimension of the feature space (the number of genes) is typically much greater than the number of tissues. A feasible approach is to first select a subset of the genes relevant for clustering the tissue samples by fitting mixtures of t distributions to rank the genes in order of increasing size of the likelihood ratio statistic for the test of one versus two components in the mixture model. Imposing a threshold on the likelihood ratio statistic, used in conjunction with a threshold on the size of a cluster, allows the selection of a relevant set of genes. However, even this reduced set of genes will usually be too large for a normal mixture model to be fitted directly to the tissues, so mixtures of factor analyzers are used to effectively reduce the dimension of the feature space of genes. Results: The usefulness of the EMMIX-GENE approach for the clustering of tissue samples is demonstrated on two well-known data sets on colon and leukaemia tissues. For both data sets, relevant subsets of the genes can be selected that reveal interesting clusterings of the tissues, clusterings that are consistent either with the external classification of the tissues or with background biological knowledge of these sets.
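EMMIX-GENE itself fits mixtures of t distributions; the sketch below substitutes scikit-learn's Gaussian mixtures as a simplification and illustrates only the gene-selection step, ranking genes by the -2 log likelihood-ratio statistic for one versus two components (the data and threshold are synthetic):

```python
# Illustrative sketch of the gene-selection step only: rank each gene by
# the -2 log likelihood-ratio statistic for one versus two mixture
# components fitted to its expression values across tissues. EMMIX-GENE
# fits mixtures of t distributions; scikit-learn Gaussian mixtures are
# substituted here as a simplification, and the data and threshold are
# synthetic/hypothetical.
import numpy as np
from sklearn.mixture import GaussianMixture

def lr_statistic(values: np.ndarray) -> float:
    """-2 log LR for the test of g=1 versus g=2 components on one gene."""
    x = values.reshape(-1, 1)
    ll1 = GaussianMixture(n_components=1).fit(x).score(x) * len(x)
    ll2 = GaussianMixture(n_components=2, n_init=5).fit(x).score(x) * len(x)
    return -2.0 * (ll1 - ll2)

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 62))            # genes x tissues (synthetic)
stats = np.array([lr_statistic(gene) for gene in data])
threshold = 8.0                              # hypothetical cut-off
selected = np.where(stats > threshold)[0]    # genes retained for clustering
print(f"{selected.size} genes retained")
```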
Abstract:
This paper presents the results of my action research. I was involved in establishing and running a digital library founded by the government of South Korea. The process involved understanding the relationship between the national IT infrastructure and the success factors of the digital library. In building the national IT infrastructure, a digital library system was implemented; it combines all existing digitized university libraries and can provide overseas information, such as foreign journal articles, instantly and freely to every Korean researcher. An empirical survey was conducted as part of the action research to determine user satisfaction with the newly established national digital library. After obtaining the survey results, I suggested that the current way of running the nationwide government-owned digital library be retained. (C) 2002 Elsevier Science B.V. All rights reserved.
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to today's object-oriented schemas. Unfortunately, researchers have been able to offer practitioners only limited theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty of applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
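The paper presents its meta model in a graphical meta language familiar to IS professionals; purely as an illustrative sketch (not the authors' actual meta model), a few core BWW constructs might be rendered in a class-based form:

```python
# Illustrative sketch only: a few core BWW ontological constructs rendered
# as Python dataclasses. The paper's meta model is expressed in a graphical
# meta language; this rendering and the attribute choices are assumptions.
from dataclasses import dataclass, field

@dataclass
class Property:          # BWW: things possess properties
    name: str

@dataclass
class Thing:             # BWW: the elementary unit of the ontology
    name: str
    properties: list[Property] = field(default_factory=list)

@dataclass
class State:             # BWW: the values of a thing's properties at a moment
    thing: Thing
    values: dict[str, object]

@dataclass
class Transformation:    # BWW: a lawful mapping from one state to another
    before: State
    after: State

customer = Thing("Customer", [Property("creditLimit")])
change = Transformation(State(customer, {"creditLimit": 0}),
                        State(customer, {"creditLimit": 5000}))
```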
Abstract:
We have previously reported successful trans-complementation of defective Kunjin virus genomic RNAs with a range of large lethal deletions in the nonstructural genes NS1, NS3, and NS5 (A. A. Khromykh et al., J. Virol. 74:3253-3263, 2000). In this study we have further mapped the minimal region in the NS5 gene essential for efficient trans-complementation of genome-length RNAs in repBHK cells to the first 316 of the 905 codons. To allow amplification and easy detection of complemented defective RNAs with deletions apparently affecting virus assembly, we developed a dual replicon complementation system. In this system, defective replicon RNAs with a deletion(s) in the nonstructural genes also encoded the puromycin resistance gene (PAC gene) and the reporter gene for beta-galactosidase (beta-Gal). Complementation of these defective replicon RNAs in repBHK cells resulted in expression of PAC and beta-Gal, which allowed the establishment of cell lines stably producing replicating defective RNAs by selection with puromycin and the comparison of replication efficiencies of complemented defective RNAs by beta-Gal assay. Using this system, we demonstrated that deletions in the C-terminal 434 codons of NS3 (codons 178 to 611) were complemented for RNA replication, while any deletions in the first 178 codons were not. None of the genome-length RNAs containing deletions in NS3 shown to be complementable for RNA replication produced secreted defective viruses during complementation in repBHK cells. In contrast, structural proteins produced from these complemented defective RNAs were able to package helper replicon RNA. The results define minimal regions in the NS3 and NS5 genes essential for the formation of a complementable replication complex and show a requirement for NS3 in cis for virus assembly.
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by examining how those factors affect the system's stability. The research builds on a previously developed two-stage model of information system change, whereby an information system is either in a stable state of evolution, in which its functionality is evolving, or in a state of revolution, in which it is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of the language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions, except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
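The abstract does not give the formula for the volatility index; as a hedged sketch of the kind of association test described, using a hypothetical proxy (user change requests per system-year) and synthetic data:

```python
# Hedged sketch of the kind of association test described above. The
# paper's definition of the volatility index is not given here, so a
# hypothetical proxy is used (user change requests per system-year), and
# all data are synthetic.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_systems = 40
system_size = rng.uniform(10, 500, n_systems)       # e.g. function points
change_requests = rng.poisson(system_size * 0.15)   # synthetic request counts
years_in_service = rng.uniform(2, 10, n_systems)

volatility = change_requests / years_in_service     # hypothetical proxy index

rho, p = spearmanr(system_size, volatility)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")     # tests size-vs-volatility link
```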
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
It is common for a real-time system to contain a nonterminating process monitoring an input and controlling an output. Hence, a real-time program development method needs to support nonterminating repetitions. In this paper we develop a general proof rule for reasoning about possibly nonterminating repetitions. The rule makes use of a Floyd-Hoare-style loop invariant that is maintained by each iteration of the repetition, a Jones-style relation between the pre- and post-states on each iteration, and a deadline specifying an upper bound on the starting time of each iteration. The general rule is proved correct with respect to a predicative semantics. In the case of a terminating repetition the rule reduces to the standard rule extended to handle real time. Other special cases include repetitions whose bodies are guaranteed to terminate, nonterminating repetitions with the constant true as a guard, and repetitions whose termination is guaranteed by the inclusion of a fixed deadline. (C) 2002 Elsevier Science B.V. All rights reserved.
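As an informal illustration of the rule's three proof obligations, a monitoring loop can be instrumented with runtime checks for the loop invariant, the Jones-style pre/post relation, and the iteration-start deadline; all names, the I/O stubs, and the deadline value below are hypothetical:

```python
# Informal illustration of the proof rule's obligations as runtime checks
# on a (potentially nonterminating) monitoring loop. All names, the I/O
# stubs, and the 0.1 s deadline are hypothetical.
import time

DEADLINE = 0.1              # upper bound on the start time of each iteration (s)

def read_input() -> float:  # stub for the monitored input
    return 0.0

def write_output(v: float) -> None:  # stub for the controlled output
    pass

prev = None
next_start = time.monotonic()
while True:                                           # guard is the constant true
    assert time.monotonic() <= next_start + DEADLINE  # deadline obligation
    value = read_input()
    assert value is not None                          # Floyd-Hoare-style invariant
    if prev is not None:
        assert abs(value - prev) < 1000.0             # Jones-style pre/post relation
    write_output(value)
    prev = value
    next_start += DEADLINE                            # next iteration starts by here
    time.sleep(max(0.0, next_start - time.monotonic()))
```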
Abstract:
The personal computer revolution has resulted in the widespread availability of low-cost image analysis hardware. At the same time, new graphic file formats have made it possible to handle and display images at resolutions beyond the capability of the human eye. Consequently, there has been a significant research effort in recent years aimed at making use of these hardware and software technologies for flotation plant monitoring. Computer-based vision technology is now moving out of the research laboratory and into the plant to become a useful means of monitoring and controlling flotation performance at the cell level. This paper discusses the metallurgical parameters that influence surface froth appearance and examines the progress that has been made in image analysis of flotation froths. The texture spectrum and pixel tracing techniques developed at the Julius Kruttschnitt Mineral Research Centre are described in detail. The commercial implementation, JKFrothCam, is one of a number of froth image analysis systems now reaching maturity. In plants where it is installed, JKFrothCam has shown a number of performance benefits. Flotation runs more consistently, meeting product specifications while maintaining high recoveries. The system has also shown secondary benefits in that reagent costs have been significantly reduced as a result of improved flotation control. (C) 2002 Elsevier Science B.V. All rights reserved.
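As a rough sketch of one common formulation of a texture spectrum (after He and Wang's texture units, on which such techniques build), assuming a 2-D greyscale froth frame; JKFrothCam's actual implementation and its pixel-tracing technique are not reproduced here:

```python
# Rough sketch of a texture spectrum computation (after He and Wang's
# texture units) on a 2-D greyscale froth frame. JKFrothCam's actual
# implementation and its pixel-tracing technique are not reproduced here.
import numpy as np

def texture_spectrum(img: np.ndarray) -> np.ndarray:
    """Histogram of texture unit numbers (0 .. 3**8 - 1) over the image."""
    h, w = img.shape
    centre = img[1:-1, 1:-1].astype(np.int64)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]      # 8-neighbourhood
    ntu = np.zeros(centre.shape, dtype=np.int64)
    for k, (dy, dx) in enumerate(offsets):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(np.int64)
        e = np.where(nb < centre, 0, np.where(nb == centre, 1, 2))
        ntu += e * 3**k                               # base-3 texture unit number
    return np.bincount(ntu.ravel(), minlength=3**8)

frame = np.random.default_rng(2).integers(0, 256, size=(480, 640))  # synthetic
spectrum = texture_spectrum(frame)    # one feature vector per froth frame
```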
Abstract:
Radio-frequency (RF) coils are a necessary component of magnetic resonance imaging (MRI) systems. In transmit operation they generate a homogeneous RF magnetic field within a volume of interest, and in receive operation they pick up the nuclear magnetic resonance signal from the RF-excited specimen. This paper outlines a procedure for the design of open RF coils using the time-harmonic inverse method. This method entails the calculation of an ideal current density on a multipaned planar surface that would generate a specified magnetic field within the volume of interest. Because of the averaging effect of the regularization technique in the matrix solution, the specified magnetic field is shaped within an iterative procedure until the generated magnetic field matches the desired magnetic field. The stream-function technique is used to ascertain conductor positions, and a method-of-moments package is then used to finalize the design. An open head/neck coil was designed to operate in a clinical 2 T MRI system, and the results presented demonstrate the efficacy of this design methodology.
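The matrix solution at the heart of the inverse method reduces to a regularized least-squares problem; a minimal sketch with Tikhonov regularization, using a random stand-in for the field matrix (in practice it comes from Biot-Savart integration of each current basis function over the coil surface):

```python
# Minimal sketch of the regularised matrix step of the inverse method:
# solve for current-basis coefficients x so that the generated field A @ x
# matches a target field b, with Tikhonov regularisation providing the
# averaging/smoothing effect mentioned above. A is a random stand-in; in
# practice it comes from Biot-Savart integration of each current basis
# function over the planar coil surface.
import numpy as np

rng = np.random.default_rng(3)
n_field_points, n_basis = 200, 50
A = rng.normal(size=(n_field_points, n_basis))  # field matrix (stand-in)
b = rng.normal(size=n_field_points)             # desired field in the volume

lam = 1e-2                                      # regularisation parameter
x = np.linalg.solve(A.T @ A + lam * np.eye(n_basis), A.T @ b)

rel_err = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(f"relative field error: {rel_err:.3f}")
# The coefficients x define a stream function whose contour lines give
# the conductor positions on the coil surface.
```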