943 results for automated full waveform logging system
Abstract:
The quality of sample inoculation is critical for achieving an optimal yield of discrete colonies in both monomicrobial and polymicrobial samples to perform identification and antibiotic susceptibility testing. Consequently, we compared the performance of the InoqulA (BD Kiestra), the WASP (Copan), and manual inoculation methods. Defined mono- and polymicrobial samples of 4 bacterial species and cloudy urine specimens were inoculated on chromogenic agar by the InoqulA, the WASP, and manual methods. Images taken with ImagA (BD Kiestra) were analyzed with the VisionLab version 3.43 image analysis software to assess the quality of growth and to prevent subjective interpretation of the data. A 3- to 10-fold higher yield of discrete colonies was observed following automated inoculation with both the InoqulA and WASP systems than with manual inoculation. The difference in performance between automated and manual inoculation was mainly observed at concentrations of >10^6 bacteria/ml. Inoculation with the InoqulA system yielded significantly more discrete colonies than the WASP system at concentrations of >10^7 bacteria/ml, although the magnitude of the difference was bacterial species dependent. Discrete colonies of bacteria present at 100- to 1,000-fold lower concentrations than the most concentrated populations in defined polymicrobial samples were not reproducibly recovered, even with the automated systems. The analysis of cloudy urine specimens showed that InoqulA inoculation provided a statistically significantly higher number of discrete colonies than WASP and manual inoculation. Consequently, automated InoqulA inoculation greatly decreased the requirement for bacterial subculture and thus significantly reduced the time to results, laboratory workload, and laboratory costs.
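The abstract relies on automated image analysis to count discrete colonies objectively. As an illustration only (the VisionLab 3.43 software is proprietary and its algorithms are not described here), the following minimal Python sketch shows one generic way to estimate a discrete-colony count from a plate image; the threshold and size parameters are assumptions.

```python
# Minimal, illustrative colony-count sketch (not the VisionLab 3.43 algorithm).
# Assumes a reasonably uniform chromogenic-agar image; parameters are guesses.
from skimage import io, color, filters, measure, morphology

def count_discrete_colonies(image_path, min_area_px=30):
    img = io.imread(image_path)
    gray = color.rgb2gray(img)                    # plate image to grayscale
    thresh = filters.threshold_otsu(gray)         # global Otsu threshold
    mask = gray < thresh                          # colonies assumed darker than agar
    mask = morphology.remove_small_objects(mask, min_size=min_area_px)
    labels = measure.label(mask)                  # connected components = candidate colonies
    regions = measure.regionprops(labels)
    # Keep roughly round blobs; touching colonies are not separated in this sketch.
    discrete = [r for r in regions if r.eccentricity < 0.9]
    return len(discrete)

# Example usage (hypothetical file name):
# print(count_discrete_colonies("plate_image.png"))
```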
Abstract:
Internationalization and the rapid growth that follows it have created a need to consolidate the IT systems of many small-to-medium-sized production companies. Enterprise Resource Planning (ERP) systems are a common solution for such companies. Deployment of these ERP systems consists of many steps, one of which is the implementation of the same shared system at all international subsidiaries. From the IT point of view, this is also one of the most important steps in the company's internationalization strategy. The mechanical process of creating the required connections for the off-shore sites is the easiest and best-documented step along the way, but the actual value of the system, once operational, is perceived in its operational reliability. The operational reliability of an ERP system is a combination of many factors, ranging from hardware- and connectivity-related issues to administrative tasks and communication between decentralized administrative units and sites. To analyze the operational reliability of such a system accurately, one must take into consideration the full functionality of the system, including not only the mechanical and systematic processes but also the users and their administration. Operational reliability in an international environment relies heavily on adequate hardware and telecommunications, so it is imperative to dimension resources with regard to planned usage. Still, with poorly maintained communication and administration schemes, no amount of bandwidth or memory will be enough to maintain a productive level of reliability. This thesis analyzes the implementation of a shared ERP system at an international subsidiary of a Finnish production company. The system is Microsoft Dynamics AX, currently being introduced at a Slovakian facility, a subsidiary of Peikko Finland Oy. The primary task is to create a feasible basis of analysis against which the operational reliability of the system can be evaluated precisely. Based on this analysis, the aim is to give recommendations on how future implementations should be managed.
Abstract:
The main target of the study was to examine how Fortum's tax reporting system could be developed so that it collects the required information in a form that is also easily transferable to the financial statements. This included examining the disclosure requirements for income taxes under IFRS and US GAAP. By benchmarking selected Finnish, European and US companies, the purpose was to gain perspective on the extent to which they present tax information in their financial statements. The existence of material weaknesses was also examined. The research method was qualitative, descriptive and normative. The research material included articles and literature on tax reporting and the standards relating to it. The interviews conducted were of notable significance. The study pointed out that Fortum's tax reporting is in good shape and does not require major changes. The biggest renewal of the tax reporting system is that there is now a single model for all of Fortum's companies. The system is also more automated, quicker and more efficient, and its format more closely resembles the notes to the financial statements. In addition, it has more internal controls to improve the quality and efficiency of the reporting process.
Abstract:
Automation was introduced many years ago in several diagnostic disciplines such as chemistry, haematology and molecular biology. The first laboratory automation system for clinical bacteriology was released in 2006, and it rapidly proved its value by increasing productivity, allowing a continuous increase in sample volumes despite limited budgets and personnel shortages. Today, two major manufacturers, BD Kiestra and Copan, are commercializing partial or complete laboratory automation systems for bacteriology. The laboratory automation systems are rapidly evolving to provide improved hardware and software solutions to optimize laboratory efficiency. However, the complex parameters of the laboratory and automation systems must be considered to determine the best system for each given laboratory. We address several topics on laboratory automation that may help clinical bacteriologists to understand the particularities and operative modalities of the different systems. We present (a) a comparison of the engineering and technical features of the various elements composing the two different automated systems currently available, (b) the system workflows of partial and complete laboratory automation, which define the basis for laboratory reorganization required to optimize system efficiency, (c) the concept of digital imaging and telebacteriology, (d) the connectivity of laboratory automation to the laboratory information system, (e) the general advantages and disadvantages as well as the expected impacts provided by laboratory automation and (f) the laboratory data required to conduct a workflow assessment to determine the best configuration of an automated system for the laboratory activities and specificities.
Abstract:
A 10-year experience with our automated molecular diagnostic platform, which carries out 91 different real-time PCRs, is described. Progress and future perspectives in molecular diagnostic microbiology are reviewed: why automation is important; how our platform was implemented; how homemade PCRs were developed; and the advantages and disadvantages of homemade PCRs, including the critical aspects of troubleshooting and the need to further reduce the turnaround time for specific samples, at least in defined clinical settings such as emergencies. The future of molecular diagnosis depends on automation, and from a novel perspective it is now time to fully acknowledge the true contribution of molecular diagnostics and to reconsider the indications for PCR, also by using these tests as first-line assays.
Abstract:
Recent standardization efforts in e-learning technology have resulted in a number of specifications; however, the automation process that is considered essential in a learning management system (LMS) remains less explored. As learning technology becomes more widespread and more heterogeneous, there is a growing need to specify processes that cross the boundaries of a single LMS or learning resource repository. This article proposes a specification oriented to automation that takes on board the heterogeneity of systems and formats and provides a language for specifying complex and generic interactions. With this goal in mind, a technique based on three steps is suggested: semantic conformance profiles, a business process management (BPM) diagram, and its translation into the Business Process Execution Language (BPEL) appear suitable for achieving it.
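The abstract does not show what the translated BPEL looks like. As a purely illustrative sketch (the step names, operations and partner links are hypothetical, and a real executable process would also need partnerLinks, variables and WSDL bindings), the following Python fragment shows how an ordered list of BPM steps could be serialized into a skeletal BPEL process with a sequence of invoke activities.

```python
# Illustrative only: turn an ordered list of BPM steps into a skeletal
# WS-BPEL <process> with a <sequence> of <invoke> activities.
# Step names and operations are hypothetical; a real, executable BPEL
# process would also require partnerLinks, variables and WSDL bindings.
import xml.etree.ElementTree as ET

BPEL_NS = "http://docs.oasis-open.org/wsbpel/2.0/process/executable"

def bpm_steps_to_bpel(process_name, steps):
    ET.register_namespace("bpel", BPEL_NS)
    process = ET.Element(f"{{{BPEL_NS}}}process", {"name": process_name})
    sequence = ET.SubElement(process, f"{{{BPEL_NS}}}sequence")
    for step in steps:
        ET.SubElement(sequence, f"{{{BPEL_NS}}}invoke", {
            "name": step["name"],
            "operation": step["operation"],   # operation exposed by the target system
            "partnerLink": step["partner"],   # e.g. an LMS or repository endpoint
        })
    return ET.tostring(process, encoding="unicode")

# Example usage with made-up step definitions:
steps = [
    {"name": "FetchLearnerProfile", "operation": "getProfile", "partner": "lmsPartner"},
    {"name": "QueryRepository", "operation": "searchObjects", "partner": "repositoryPartner"},
]
print(bpm_steps_to_bpel("CrossLMSEnrolment", steps))
```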
Abstract:
The purpose of this bachelor's thesis is the development of an online community. Nowadays the Internet lets users collaborate and share information online. The Internet is also full of communities, and the number of community users is continuously rising. Companies have noticed this as well and want to make use of it. The result of the work was an online community for the use of the PROFCOM research project. At the same time, information was gathered about what kinds of platforms are available as a backbone for an online community. Designing and developing the online community provided experience with the Drupal environment and revealed the pros and cons of Drupal's features. Drupal is multifunctional software that can handle large online communities, yet its installation and maintenance are reasonably simple.
Abstract:
The reduction of quantum scattering leads to the suppression of shot noise. In this Letter, we analyze the crossover from the quantum transport regime with universal shot noise to the classical regime where noise vanishes. By making use of the stochastic path integral approach, we find the statistics of transport and the transmission properties of a chaotic cavity as a function of a system parameter controlling the crossover. We identify three different scenarios of the crossover.
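For readers less familiar with the terminology, the following LaTeX fragment recalls standard background relations from the shot-noise literature; they are given only as context and are not claimed to be the exact formulas derived in this Letter.

```latex
% Standard background relations from the shot-noise literature
% (context only; not necessarily the exact formulas of this Letter).
\[
  S = 2e\langle I\rangle F, \qquad
  F = \frac{\sum_n T_n\,(1-T_n)}{\sum_n T_n},
\]
\[
  F_{\text{quantum}} = \tfrac{1}{4}
  \quad\text{(symmetric chaotic cavity, universal regime)},
\]
\[
  F \;\simeq\; \tfrac{1}{4}\, e^{-\tau_E/\tau_D}
  \quad\text{(semiclassical crossover; } \tau_E \text{ Ehrenfest time, } \tau_D \text{ dwell time)}.
\]
```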
Abstract:
The spectrophotometric determination of Cd(II) using a flow injection system equipped with a solid-phase reactor for cadmium preconcentration and on-line reagent preparation is described. It is based on the formation of a dithizone-Cd complex in basic medium. The calibration curve is linear between 6 and 300 µg L⁻¹ Cd(II), with a detection limit of 5.4 µg L⁻¹, an RSD of 3.7% (10 replicates in duplicate) and a sampling frequency of 11.4 h⁻¹. The proposed method was satisfactorily applied to the determination of Cd(II) in surface, well and drinking waters.
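The abstract does not state how the calibration and detection limit were computed. As a generic illustration only (the absorbance and blank values below are invented, and the 3·s_blank/slope convention is one common definition of the detection limit, not necessarily the one used by the authors), a least-squares calibration could be evaluated as follows.

```python
# Generic calibration sketch for a spectrophotometric FIA method.
# The data below are invented placeholders, not the authors' measurements.
import numpy as np

conc = np.array([6, 25, 50, 100, 200, 300], dtype=float)    # µg/L Cd(II), within the linear range
absorb = np.array([0.012, 0.048, 0.095, 0.19, 0.38, 0.57])  # hypothetical absorbances

slope, intercept = np.polyfit(conc, absorb, 1)               # linear calibration A = m*C + b

blank = np.array([0.001, 0.002, 0.001, 0.003, 0.002,
                  0.001, 0.002, 0.002, 0.001, 0.003])        # hypothetical blank replicates
lod = 3 * blank.std(ddof=1) / slope                          # one common LOD convention: 3*s_blank / slope

print(f"slope = {slope:.4f} AU per µg/L, LOD ≈ {lod:.1f} µg/L")
```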
Abstract:
We show how certain N-dimensional dynamical systems are able to exploit the full instability capabilities of their fixed points to undergo Hopf bifurcations, and how such behavior produces complex time evolutions based on the nonlinear combination of the oscillation modes that emerge from these bifurcations. For widely separated oscillation frequencies, the evolutions describe robust wave-form structures, usually periodic, in which self-similarity with respect to both the time scale and the system dimension is clearly appreciated. For closer frequencies, the evolution signals usually appear irregular but are still based on the repetition of complex wave-form structures. The study is developed by considering vector fields with a scalar-valued nonlinear function of a single variable that is a linear combination of the N dynamical variables. In this case, linear stability analysis can be used to design N-dimensional systems in which the fixed points of a saddle-node pair experience up to N-1 Hopf bifurcations with preselected oscillation frequencies. The secondary processes occurring in the phase-space region where the variety of limit cycles appears may be rather complex and difficult to characterize, but they produce the nonlinear mixing of oscillation modes with relatively generic features.
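The class of vector fields is described only verbally in the abstract. One common way to write such a field (the notation A, b, c and f is ours, not the authors') is sketched below, together with the Jacobian on which the linear stability analysis operates.

```latex
% One way to write an N-dimensional vector field whose only nonlinearity is a
% scalar function f of a single linear combination of the state variables.
% The symbols A, \mathbf{b}, \mathbf{c} and f are our notation, not the authors'.
\[
  \dot{\mathbf{x}} \;=\; A\,\mathbf{x} \;+\; \mathbf{b}\, f\!\left(\mathbf{c}^{\mathsf T}\mathbf{x}\right),
  \qquad \mathbf{x}\in\mathbb{R}^{N},
\]
% so that the Jacobian at a fixed point \mathbf{x}^{*} is
\[
  J \;=\; A \;+\; f'\!\left(\mathbf{c}^{\mathsf T}\mathbf{x}^{*}\right)\,\mathbf{b}\,\mathbf{c}^{\mathsf T}.
\]
% Linear stability analysis of J is what allows the design of systems whose
% saddle-node pair undergoes up to N-1 Hopf bifurcations with preselected frequencies.
```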
Abstract:
The present paper reports a prototype of an autonomous, controlled bacteria concentrator with a user-friendly interface for bench-top applications. It is based on a microfluidic lab-on-a-chip and its associated custom instrumentation, which consists of a dielectrophoretic actuator to pre-concentrate the sample and an impedance analyser to measure the concentrated bacteria levels. The system is composed of a single microfluidic chamber with interdigitated electrodes and instrumentation with custom electronics. The prototype is supported by a real-time platform connected to a remote computer, which automatically controls the system and displays the impedance data used to monitor the status of bacteria accumulation on-chip. The system automates the whole concentration operation. Performance has been studied for controlled volumes of Escherichia coli (E. coli) samples injected into the microfluidic chip at a constant flow rate of 10 μL/min. A media-conductivity correction protocol was developed, because preliminary results showed distortion of the impedance analyser measurements caused by variations in the conductivity of the bacterial media over time. With the correction protocol, the measured impedance values were related to the quantity of concentrated bacteria with a correlation of 0.988 and a coefficient of variation of 3.1%. The feasibility of automated on-chip concentration of E. coli using the miniaturized system has been demonstrated. Furthermore, the impedance monitoring protocol has been adjusted and optimized to handle changes in the electrical properties of the bacteria media over time.
Abstract:
A full two-level factorial design was employed to study the influence of PEG molar mass (MM PEG), PEG concentration (C PEG) and phosphate concentration (C PHOSPH) on the partitioning of proteases from Lentinus citrinus DPUA 1535 in a PEG/phosphate aqueous two-phase system (ATPS). For all ATPS studied, the proteases partitioned to the top phase, and the best protease extraction condition was obtained with MM PEG = 6000 g mol⁻¹, C PEG = 17.5% (w/w) and C PHOSPH = 25% (w/w), with a purification factor of 1.1 and an activity yield of 151%. The findings reported here demonstrate a practical strategy that serves as a first step for protease purification from the crude extract of L. citrinus.
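A full two-level factorial design for three factors has 2^3 = 8 runs. The sketch below only illustrates how such a design matrix can be generated; the low levels shown are hypothetical, since the abstract reports only the best condition (MM PEG = 6000 g mol⁻¹, C PEG = 17.5%, C PHOSPH = 25%).

```python
# Generate a full two-level (2^3 = 8 runs) factorial design matrix.
# The low levels below are hypothetical; only the best condition reported in
# the abstract (MM PEG = 6000, C PEG = 17.5 %, C PHOSPH = 25 %) is taken from it.
from itertools import product

factors = {
    "MM_PEG (g/mol)":   (1500, 6000),   # low level hypothetical
    "C_PEG (% w/w)":    (12.5, 17.5),   # low level hypothetical
    "C_PHOSPH (% w/w)": (15.0, 25.0),   # low level hypothetical
}

names = list(factors)
runs = list(product(*(factors[n] for n in names)))   # all 2^3 combinations

print("run  " + "  ".join(names))
for i, run in enumerate(runs, start=1):
    print(f"{i:>3}  " + "  ".join(f"{v:>10}" for v in run))
```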
Abstract:
During the past decades, testing has matured from an ad hoc activity into an integral part of the development process. The benefits of testing are obvious for modern communication systems, which operate in heterogeneous environments amongst devices from various manufacturers. The increased demand for testing also creates demand for tools and technologies that support and automate testing activities. This thesis discusses the applicability of visualization techniques in the result analysis part of the testing process. In particular, the primary focus of this work is the visualization of test execution logs produced by a TTCN-3 test system. TTCN-3 is an internationally standardized test specification and implementation language. The TTCN-3 standard suite includes a specification of a test logging interface and a graphical presentation format, but no immediate relationship between them. This thesis presents a technique for mapping the log events to the graphical presentation format, along with a concrete implementation integrated with the Eclipse Platform and the OpenTTCN Tester toolchain. The results of this work indicate that for the majority of the log events, a visual representation may be derived from the TTCN-3 standard suite. The remaining events were analysed, and three categories relevant to either log analysis or the implementation of the visualization tool were identified: events indicating insertion of something into the incoming queue of a port, events indicating a mismatch, and events describing the control flow during the execution. The applicability of the results is limited to the domain of TTCN-3, but the developed mapping and the implementation may be utilized with any TTCN-3 tool that is able to produce the execution log in the standardized XML format.
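The thesis's actual mapping is not reproduced in the abstract, and the element and attribute names of the standardized TTCN-3 XML log format are not quoted here; the names used below are hypothetical placeholders. The sketch only illustrates the kind of categorization described above (port-enqueue events, mismatch events and control-flow events).

```python
# Illustrative categorization of TTCN-3 execution-log events into the three
# groups mentioned in the abstract. The XML element/attribute names below
# ("event", "type") are hypothetical placeholders, NOT the standardized schema.
import xml.etree.ElementTree as ET

# Hypothetical event-type keywords for each category.
PORT_ENQUEUE = {"enqueue", "portqueue"}
MISMATCH = {"mismatch"}
CONTROL_FLOW = {"testcase-start", "testcase-stop", "altstep", "function-call"}

def categorize(log_xml: str):
    counts = {"port_enqueue": 0, "mismatch": 0, "control_flow": 0, "other": 0}
    root = ET.fromstring(log_xml)
    for event in root.iter("event"):               # placeholder element name
        etype = (event.get("type") or "").lower()  # placeholder attribute name
        if etype in PORT_ENQUEUE:
            counts["port_enqueue"] += 1
        elif etype in MISMATCH:
            counts["mismatch"] += 1
        elif etype in CONTROL_FLOW:
            counts["control_flow"] += 1
        else:
            counts["other"] += 1
    return counts

# Example with a made-up log fragment:
sample = """<log>
  <event type="testcase-start"/><event type="enqueue"/><event type="mismatch"/>
</log>"""
print(categorize(sample))
```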
Abstract:
Forest inventories are used to estimate forest characteristics and the condition of forests for many different applications: operational tree logging for the forest industry, forest health estimation, carbon balance estimation, and land-cover and land-use analysis to avoid forest degradation, among others. Recent inventory methods are strongly based on remote sensing data combined with field sample measurements, which are used to produce estimates covering the whole area of interest. Remote sensing data from satellites, aerial photographs or aerial laser scanning are used, depending on the scale of the inventory. To be applicable in operational use, forest inventory methods need to be easily adjusted to the local conditions of the study area at hand. All data handling and parameter tuning should be objective and automated as far as possible, and the methods need to be robust when applied to different forest types. Since there generally are no comprehensive, direct physical models connecting the remote sensing data from different sources to the forest parameters being estimated, the mathematical estimation models are of the "black-box" type, connecting the independent auxiliary data to the dependent response data with arbitrary linear or nonlinear models. To avoid redundant complexity and over-fitting of a model based on up to hundreds of possibly collinear variables extracted from the auxiliary data, variable selection is needed. To connect the auxiliary data to the inventory parameters being estimated, field work must be performed. In larger study areas with dense forests, field work is expensive and should therefore be minimized. To obtain cost-efficient inventories, field work could partly be replaced with information from previously measured sites stored in databases. The work in this thesis is devoted to the development of automated, adaptive computation methods for aerial forest inventory. The parameter definition steps of the mathematical models are automated, and cost-efficiency is improved by setting up a procedure that utilizes databases in the estimation of the characteristics of new areas.
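The abstract calls for automated variable selection over hundreds of possibly collinear auxiliary variables but does not name the method used in the thesis. As a generic, hedged illustration only, one standard way to do this automatically is an L1-penalized (lasso) regression with cross-validated penalty selection; the synthetic data below stand in for remote-sensing features and a field-measured forest attribute.

```python
# Generic, illustrative variable selection for a remote-sensing regression.
# This is NOT the thesis's method; it only shows one automated, objective way
# to prune hundreds of possibly collinear auxiliary variables.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_plots, n_features = 200, 300                 # field plots x auxiliary variables (synthetic)
X = rng.normal(size=(n_plots, n_features))
true_coef = np.zeros(n_features)
true_coef[:5] = [3.0, -2.0, 1.5, 1.0, -0.5]    # only a few features truly matter
y = X @ true_coef + rng.normal(scale=0.5, size=n_plots)   # e.g. stem volume per plot

X_std = StandardScaler().fit_transform(X)              # put all features on a common scale
model = LassoCV(cv=5, random_state=0).fit(X_std, y)    # penalty chosen by cross-validation

selected = np.flatnonzero(model.coef_ != 0)
print(f"kept {selected.size} of {n_features} variables:", selected[:10])
```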
Abstract:
This master's thesis focuses on the commissioning of an active magnetic bearing (AMB) system. The scope of the work is to test the existing procedures with old and new prototypes of an AMB system and, additionally, to automate the necessary steps instead of tuning them by hand, because determining the rotor clearances and finding the effective rotor origins is time-consuming and error-prone. The final goal is a documented and largely automated step-by-step methodology for efficient commissioning of the system.
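The abstract does not describe how the clearances and effective rotor origins are actually determined in the thesis; the sketch below only illustrates the arithmetic one could automate once displacement extremes have been measured (for example, by gently driving the rotor against the touchdown bearings in each direction). The sensor readings used here are invented.

```python
# Illustrative only: derive per-axis clearance and an effective rotor origin
# from measured displacement extremes (e.g. rotor pushed against the touchdown
# bearings in each direction). The numbers are invented; this is not the
# thesis's commissioning procedure, just the arithmetic one could automate.

def clearance_and_origin(extremes):
    """extremes: {axis: (min_reading, max_reading)} in sensor units (e.g. µm)."""
    result = {}
    for axis, (lo, hi) in extremes.items():
        clearance = (hi - lo) / 2.0        # half of the total travel on this axis
        origin = (hi + lo) / 2.0           # midpoint taken as the effective origin
        result[axis] = {"clearance": clearance, "origin": origin}
    return result

# Hypothetical measurements at one radial bearing plane, in micrometres:
measured = {"x": (-198.0, 202.0), "y": (-205.0, 195.0)}
for axis, vals in clearance_and_origin(measured).items():
    print(f"{axis}: clearance ≈ {vals['clearance']:.1f} µm, origin offset ≈ {vals['origin']:.1f} µm")
```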