925 results for library automated system


Relevance:

100.00%

Publisher:

Abstract:

The instrument described in this report is an updated version of the high-precision, automated Winkler titration system described by Friederich et al. (1984). The original instrument was based on the work of Bryan et al. (1976), who developed a colorimetric endpoint detector, and on the work of Williams and Jenkinson (1982), who produced an automated system that used this detector. The goals of our updated version of the device described by Friederich et al. (1984) were as follows: 1) move control of the system to the MS-DOS environment, because HP-85 computers are no longer in production and because more user-friendly programs could be written using the IBM XT or AT computers that control the new device; 2) use more "off the shelf" components and reduce the parts count in the new system so that it could be easily constructed and maintained. This report describes how to construct and use the new automated Winkler titration device. It also includes information on the chemistry of the Winkler titration, and detailed instructions on how to prepare reagents, collect samples, standardize and perform the titrations (Appendix I: Codispoti, L.A. 1991. On the determination of dissolved oxygen in sea water, 15 pp.). A disk containing the program needed to operate the new device is also included. (pdf contains 33 pages)
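The endpoint volume recorded by such a titrator is converted to an oxygen concentration through fixed Winkler stoichiometry: one mole of dissolved O2 ultimately consumes four moles of thiosulfate titrant. A minimal sketch of that conversion (illustrative only; the function name and the sample figures are not taken from the report or its program disk):

```python
def winkler_oxygen_umol_per_l(titrant_ml, titrant_molarity, blank_ml, sample_ml):
    """Illustrative Winkler stoichiometry: 1 mol O2 liberates enough iodine
    to consume 4 mol of thiosulfate titrant (O2 : S2O3^2- = 1 : 4)."""
    mol_thiosulfate = (titrant_ml - blank_ml) / 1000.0 * titrant_molarity
    mol_o2 = mol_thiosulfate / 4.0
    return mol_o2 / (sample_ml / 1000.0) * 1e6  # micromoles per litre

# e.g. 0.8 mL of 0.1 M thiosulfate to the endpoint, 0.01 mL blank, 125 mL sample
oxygen = winkler_oxygen_umol_per_l(0.8, 0.1, 0.01, 125.0)
```

A production routine would also correct for reagent displacement of sample volume and for oxygen added with the reagents, as detailed in the report's Appendix I.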

Relevance:

100.00%

Publisher:

Abstract:

Screening for residues of anabolic steroids frequently requires extraction from tissues and fluids before analysis. Chemical procedures for these extractions can be complicated, expensive to perform and not ideal for the simultaneous extraction of analytes with different solubilities. Extraction by multi-immunoaffinity chromatography (MIAC) may be used as an alternative. Samples are passed through a column containing a range of antibodies immobilized on an inert support. The desired analytes are bound to their respective antibodies, washed and then eluted by a suitable solvent. The purified extracts can then be incorporated into the analytical tests. The analytes that can be extracted at present are alpha-nortestosterone, zeranol, trenbolone, diethylstilboestrol, boldenone and dexamethasone. Manually, the MIAC procedure is limited to about six columns per operator, but by automating the process using a robotic sample processor (RSP), 48 columns can be run simultaneously during the day or night. The RSP has also been adapted to transfer extracts and reagents on to ELISA plates. The automated system has proved to be a robust and reliable means of screening large numbers of samples for anabolic agents with minimal manual input.

Relevance:

100.00%

Publisher:

Abstract:

The Central Library of Cochin University of Science and Technology (CUSAT) had been automated with proprietary software (Adlib Library) since 2000. After 11 years, in 2011, the university authorities decided to shift to Koha, an open source software (OSS) integrated library management system (ILMS), for automating the library housekeeping operations. In this context, this study attempts to share the experiences in cataloging with both types of software. The features of the cataloging modules of both software packages are analysed on the basis of certain check points. It is found that the cataloging module of Koha is almost on par with that of proven proprietary software that has been in the market for the past 25 years. Some suggestions made by this study may be incorporated for the further development and perfection of Koha.

Relevance:

100.00%

Publisher:

Abstract:

Yeasts are becoming a common cause of nosocomial fungal infections that affect immunocompromised patients. Such infections can evolve into sepsis, whose mortality rate is high. This study aimed to evaluate the viability of Candida species identification by the automated system Vitek-Biomerieux (Durham, USA). Ninety-eight medical charts referencing the Candida spp. samples available for the study were retrospectively analyzed. The Vitek-Biomerieux system with the Candida identification card is recommended for laboratory routine use and presents 80.6% agreement with the reference method. In the analysis by species, 13.5% of C. parapsilosis samples differed from the reference method, the Vitek system wrongly identifying them as C. tropicalis, C. lusitaniae or Candida albicans. C. glabrata presented a discrepancy in only one sample (25%), which was identified by Vitek as C. parapsilosis. C. guilliermondii also differed in only one sample (33.3%), being identified as Candida spp. All C. albicans, C. tropicalis and C. lusitaniae samples were identified correctly.

Relevance:

90.00%

Publisher:

Abstract:

The use of nitrification inhibitors, in combination with ammonium-based fertilisers, has recently been promoted as an effective method to reduce nitrous oxide (N2O) emissions from fertilised agricultural fields while increasing yield and nitrogen use efficiency. Vegetable cropping systems are often characterised by high inputs of nitrogen fertiliser, so elevated N2O emissions can be expected. However, to date only limited data are available on the use of nitrification inhibitors in sub-tropical vegetable systems. A field experiment investigated the effect of the nitrification inhibitors DMPP and 3MP+TZ on N2O emissions and yield from a typical vegetable production system in sub-tropical Australia. Soil N2O fluxes were monitored continuously over an entire year with a fully automated system. Measurements were taken from three subplots for each treatment within a randomized complete block design. There was a significant inhibition effect of DMPP and 3MP+TZ on N2O emissions and soil mineral N content directly following the application of the fertiliser over the vegetable cropping phase. However, this mitigation was offset by elevated N2O emissions from the inhibitor treatments over the post-harvest fallow period. Cumulative annual N2O emissions amounted to 1.22 kg-N/ha, 1.16 kg-N/ha, 1.50 kg-N/ha and 0.86 kg-N/ha in the conventional fertiliser (CONV), DMPP, 3MP+TZ and zero fertiliser (0N) treatments, respectively. Corresponding fertiliser-induced emission factors (EFs) were low, with only 0.09-0.20% of the total applied fertiliser lost as N2O. There was no significant effect of the nitrification inhibitors on yield compared to the CONV treatment for the three vegetable crops (green beans, broccoli, lettuce) grown over the experimental period.
This study highlights that N2O emissions from such vegetable cropping systems are primarily controlled by post-harvest emissions following the incorporation of vegetable crop residues into the soil. It also shows that the use of nitrification inhibitors can lead to elevated N2O emissions by storing N in the soil profile that remains available to soil microbes during the decomposition of the vegetable residues over the post-harvest phase. Hence, the use of nitrification inhibitors in vegetable systems has to be treated carefully, and fertiliser rates need to be adjusted to avoid excess soil nitrogen during the post-harvest phase.
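The fertiliser-induced emission factors quoted above follow from a simple formula: EF = (cumulative N2O-N from a fertilised treatment minus that from the 0N control) / N applied × 100. A sketch using the cumulative emissions from the abstract; the application rate of 400 kg-N/ha is a hypothetical placeholder, since the actual rate is not stated here:

```python
def fertiliser_induced_ef(n2o_treatment_kg_ha, n2o_zero_n_kg_ha, n_applied_kg_ha):
    """Fertiliser-induced N2O emission factor: % of applied N lost as N2O-N,
    relative to the zero-fertiliser control."""
    return (n2o_treatment_kg_ha - n2o_zero_n_kg_ha) / n_applied_kg_ha * 100.0

# Cumulative annual emissions (kg-N/ha) from the abstract; 400 kg-N/ha is a
# hypothetical application rate for illustration only.
ef_conv = fertiliser_induced_ef(1.22, 0.86, 400.0)
ef_dmpp = fertiliser_induced_ef(1.16, 0.86, 400.0)
ef_3mptz = fertiliser_induced_ef(1.50, 0.86, 400.0)
```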

Relevance:

90.00%

Publisher:

Abstract:

In order to meet the ever-growing demand for the prediction of oceanographic parameters in the Indian Ocean for a variety of applications, the Indian National Centre for Ocean Information Services (INCOIS) has recently set up an operational ocean forecast system, viz. the Indian Ocean Forecast System (INDOFOS). This fully automated system, based on a state-of-the-art ocean general circulation model, issues six-hourly forecasts of the sea-surface temperature, surface currents and depths of the mixed layer and the thermocline up to five days of lead time. A brief account of INDOFOS and a statistical validation of the forecasts of these parameters using in situ and remote sensing data are presented in this article. The accuracy of the sea-surface temperature forecasts by the system is high in the Bay of Bengal and the Arabian Sea, whereas it is moderate in the equatorial Indian Ocean. On the other hand, the accuracy of the thermocline depth, isothermal layer depth and surface current forecasts is higher near the equatorial region and relatively lower in the Bay of Bengal.
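The statistical validation mentioned above typically reduces to error metrics such as mean bias and root-mean-square error between forecast and observed values. A generic sketch (not INCOIS code; the SST values below are hypothetical):

```python
import math

def validation_stats(forecast, observed):
    """Mean bias and RMSE between paired forecasts and observations."""
    errors = [f - o for f, o in zip(forecast, observed)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Forecast SST vs. hypothetical in situ observations (deg C)
bias, rmse = validation_stats([28.1, 28.4, 27.9], [28.0, 28.2, 28.1])
```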

Relevance:

90.00%

Publisher:

Abstract:

The Listener is an automated system that unintrusively performs knowledge acquisition from informal input. The Listener develops a coherent internal representation of a description from an initial set of disorganized, imprecise, incomplete, ambiguous, and possibly inconsistent statements. The Listener can produce a summary document from its internal representation to facilitate communication, review, and validation. A special purpose Listener, called the Requirements Apprentice (RA), has been implemented in the software requirements acquisition domain. Unlike most other requirements analysis tools, which start from a formal description language, the focus of the RA is on the transition between informal and formal specifications.

Relevance:

90.00%

Publisher:

Abstract:

Classifying novel terrain or objects from sparse, complex data may require the resolution of conflicting information from sensors working at different times, locations, and scales, and from sources with different goals and situations. Information fusion methods can help resolve inconsistencies, as when evidence variously suggests that an object's class is car, truck, or airplane. The methods described here consider a complementary problem, supposing that information from sensors and experts is reliable though inconsistent, as when evidence suggests that an object's class is car, vehicle, and man-made. Underlying relationships among objects are assumed to be unknown to the automated system or the human user. The ARTMAP information fusion system uses distributed code representations that exploit the neural network's capacity for one-to-many learning in order to produce self-organizing expert systems that discover hierarchical knowledge structures. The system infers multi-level relationships among groups of output classes, without any supervised labeling of these relationships.

Relevance:

90.00%

Publisher:

Abstract:

Traditional motion capture techniques, such as those employing optical technology, have long been used in rehabilitation, sports medicine and performance analysis, where accurately capturing bio-mechanical data is of crucial importance. However, their size, cost, complexity and lack of portability mean that their use is often impractical. Low-cost MEMS inertial sensors, when combined and assembled into a Wireless Inertial Measurement Unit (WIMU), present a possible solution for low-cost and highly portable motion capture. However, due to the large variability inherent to MEMS sensors, such a system needs extensive characterization to calibrate each sensor and ensure good-quality data capture. A completely calibrated WIMU system would allow for motion capture in a wider range of real-world, non-laboratory applications. Calibration can be a complex task, particularly for newer inertial sensors capable of multiple sensing ranges. As such, we present an automated system for quickly and easily calibrating inertial sensors in a packaged WIMU, demonstrating some of the improvements in accuracy attainable.
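One common building block of such a calibration is the two-position test, in which an accelerometer axis is pointed with and then against gravity to solve for its scale factor and bias. A simplified sketch (the paper's automated procedure is far more extensive, covering multiple sensing ranges; the raw readings below are hypothetical):

```python
def calibrate_axis(reading_plus_1g, reading_minus_1g, g=9.81):
    """Two-position calibration for one accelerometer axis.
    With the axis aligned with (+1g) and against (-1g) gravity,
    solve reading = scale * accel + bias for scale and bias."""
    scale = (reading_plus_1g - reading_minus_1g) / (2.0 * g)
    bias = (reading_plus_1g + reading_minus_1g) / 2.0
    return scale, bias

def apply_calibration(raw, scale, bias):
    """Convert a raw reading to a physical acceleration (m/s^2)."""
    return (raw - bias) / scale

# Hypothetical raw ADC-style readings for one axis
scale, bias = calibrate_axis(512.0, 312.0)
accel = apply_calibration(512.0, scale, bias)  # should recover ~+1g
```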

Relevance:

90.00%

Publisher:

Abstract:

Objective
Preliminary assessment of an automated weaning system (SmartCare™/PS) compared to usual management of weaning from mechanical ventilation performed in the absence of formal protocols.


Design and setting
A randomised, controlled pilot study in one Australian intensive care unit.


Patients
A total of 102 patients were equally divided between SmartCare/PS and Control.

Interventions
The automated system titrated pressure support, conducted a spontaneous breathing trial and provided notification of success (“separation potential”).

Measurements and results
The median time from the first identified point of suitability for weaning commencement to the state of “separation potential” using SmartCare/PS was 20 h (interquartile range, IQR, 2–40) compared to 8 h (IQR 2–43) with Control (log-rank P = 0.3). The median time to successful extubation was 43 h (IQR 6–169) using SmartCare/PS and 40 (14–87) with Control (log-rank P = 0.6). Unadjusted, the estimated probability of reaching “separation potential” was 21% lower (95% CI, 48% lower to 20% greater) with SmartCare/PS compared to Control. Adjusted for other covariates (age, gender, APACHE II, SOFAmax, neuromuscular blockade, corticosteroids, coma and elevated blood glucose), these estimates were 31% lower (95% CI, 56% lower to 9% greater) with SmartCare/PS. The study groups showed comparable rates of reintubation, non-invasive ventilation post-extubation, tracheostomy, sedation, neuromuscular blockade and use of corticosteroids.

Conclusions
Substantial reductions in weaning duration previously demonstrated were not confirmed when the SmartCare/PS system was compared to weaning managed by experienced critical care specialty nurses, using a 1:1 nurse-to-patient ratio. The effect of SmartCare/PS may be influenced by the local clinical organisational context.

Relevance:

90.00%

Publisher:

Abstract:

The development of an automated system for the quality assessment of aerodrome ground lighting (AGL), in accordance with associated standards and recommendations, is presented. The system is composed of an image sensor, placed inside the cockpit of an aircraft to record images of the AGL during a normal descent to an aerodrome. A model-based methodology is used to ascertain the optimum match between a template of the AGL and the actual image data in order to calculate the position and orientation of the camera at the instant the image was acquired. The camera position and orientation data are used, along with the pixel grey level for each imaged luminaire, to estimate a value for the luminous intensity of a given luminaire. This can then be compared with the expected brightness for that luminaire to ensure it is operating to the required standards. As such, a metric for the quality of the AGL pattern is determined. Experiments on real image data are presented to demonstrate the application and effectiveness of the system.
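The intensity estimate described above can be sketched with an inverse-square model: illuminance at the sensor scales as intensity over distance squared, and the pixel grey level stands in for illuminance via a radiometric calibration constant. The constant k and the figures below are hypothetical, not values from the paper:

```python
def estimate_intensity(pixel_grey, distance_m, k):
    """Estimate luminous intensity (candela) of a luminaire from its imaged
    grey level, assuming illuminance at the sensor falls off with the square
    of distance (I = E * d^2) and grey level is proportional to E via k."""
    return k * pixel_grey * distance_m ** 2

# k is a hypothetical sensor calibration constant; distance comes from the
# camera position computed by the model-based template match.
intensity = estimate_intensity(pixel_grey=180.0, distance_m=500.0, k=4e-4)
# compare `intensity` against the expected value for that luminaire class
```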

Relevance:

90.00%

Publisher:

Abstract:

Measures of icon designs rely heavily on surveys of the perceptions of population samples. Thus, measuring the extent to which changes in the structure of an icon will alter its perceived complexity can be costly and slow. An automated system capable of producing reliable estimates of perceived complexity could reduce development costs and time. Measures of icon complexity developed by Garcia, Badre, and Stasko (1994) and McDougall, Curry, and de Bruijn (1999) were correlated with six icon properties measured using Matlab (MathWorks, 2001) software, which uses image-processing techniques to measure icon properties. The six icon properties measured were icon foreground, the number of objects in an icon, the number of holes in those objects, and two calculations of icon edges and homogeneity in icon structure. The strongest correlates with human judgments of perceived icon complexity (McDougall et al., 1999) were structural variability (r(s) = .65) and edge information (r(s) =.64).
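The image-derived properties were measured with Matlab routines in the original study; the following Python sketch (illustrative, using a toy binary icon) shows how three of them, foreground proportion, object count and hole count, can be computed by connected-component counting:

```python
from collections import deque

def count_regions(grid, target):
    """Count 4-connected regions of cells equal to `target` in a binary grid."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == target and not seen[r][c]:
                regions += 1
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:  # breadth-first flood fill
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and \
                           grid[ny][nx] == target and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return regions

def icon_metrics(grid):
    """Foreground proportion, object count and hole count for a binary icon."""
    cells = sum(len(row) for row in grid)
    foreground = sum(sum(row) for row in grid)
    objects = count_regions(grid, 1)
    # Pad with a background border so the outer background is one region;
    # holes are then the remaining background regions.
    padded = [[0] * (len(grid[0]) + 2)] + \
             [[0] + list(row) + [0] for row in grid] + \
             [[0] * (len(grid[0]) + 2)]
    holes = count_regions(padded, 0) - 1
    return foreground / cells, objects, holes

icon = [
    [1, 1, 1, 0],
    [1, 0, 1, 0],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
]
fg, objects, holes = icon_metrics(icon)  # a ring plus a dot: 2 objects, 1 hole
```

Edge and homogeneity measures would require additional filtering steps beyond this sketch.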

Relevance:

90.00%

Publisher:

Abstract:

Temperature, pressure, gas stoichiometry, and residence time were varied to control the yield and product distribution of the palladium-catalyzed aminocarbonylation of aromatic bromides in both a silicon microreactor and a packed-bed tubular reactor. Automation of the system set points and product sampling enabled facile and repeatable reaction analysis with minimal operator supervision. It was observed that the reaction was divided into two temperature regimes. An automated system was used to screen steady-state conditions for offline analysis by gas chromatography to fit a reaction rate model. Additionally, a transient temperature ramp method utilizing online infrared analysis was used, leading to more rapid determination of the reaction activation energy of the lower temperature regime. The entire reaction spanning both regimes was modeled in good agreement with the experimental data.
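Extracting an activation energy from rate data, whether from steady-state screening or a temperature ramp, commonly uses the Arrhenius relation ln k = ln A − Ea/(RT), which is linear in 1/T. A generic least-squares sketch with synthetic data (the kinetic values are illustrative, not the paper's):

```python
import math

def arrhenius_fit(temps_k, rate_constants):
    """Least-squares fit of ln k = ln A - Ea/(R*T).
    Returns (Ea in J/mol, pre-exponential factor A)."""
    R = 8.314  # gas constant, J/(mol*K)
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(k) for k in rate_constants]
    n = len(xs)
    x_mean, y_mean = sum(xs) / n, sum(ys) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
            sum((x - x_mean) ** 2 for x in xs)
    intercept = y_mean - slope * x_mean
    return -slope * R, math.exp(intercept)

# Synthetic rate constants generated with Ea = 50 kJ/mol, A = 1e7
temps = [373.15, 393.15, 413.15]
ks = [1e7 * math.exp(-50000.0 / (8.314 * t)) for t in temps]
ea, a = arrhenius_fit(temps, ks)  # recovers the generating parameters
```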

Relevance:

90.00%

Publisher:

Abstract:

Automated negotiation systems can do better than human beings in many aspects, and thus are applied in many domains ranging from business to computer science. However, little work has been done so far on automating the negotiation of complex business contracts, although this is among the most important kinds of negotiation in business. In order to address this issue, in this paper we developed an automated system for this kind of negotiation. This system is based on the principled negotiation theory, which is the most effective method of negotiation in the domain of business. The system is developed as a knowledge-based one because a negotiating agent in business has to be economically intelligent and capable of making effective decisions based on business experience and knowledge. Finally, the validity of the developed system is shown in a real negotiation scenario where, on behalf of human users, the system successfully performed a negotiation of a complex business contract between a wholesaler and a retailer. © 2013 Springer-Verlag Berlin Heidelberg.
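The paper's system is knowledge-based and built on principled negotiation; as a much simpler point of contrast, automated negotiation is often illustrated with time-dependent concession tactics in an alternating-offers protocol. A minimal sketch of that baseline (all figures hypothetical, and not the paper's method):

```python
def concession_offer(start, reservation, step, rounds_elapsed):
    """Linear time-dependent concession from an opening offer towards a
    reservation value (a simple tactic, not knowledge-based reasoning)."""
    moved = start + step * rounds_elapsed
    return min(moved, reservation) if step > 0 else max(moved, reservation)

def negotiate(buyer_open, buyer_limit, seller_open, seller_limit, step,
              max_rounds=50):
    """Alternating offers: a deal forms when the buyer's bid meets the
    seller's ask; the remaining difference is split evenly."""
    for t in range(max_rounds):
        bid = concession_offer(buyer_open, buyer_limit, step, t)     # buyer concedes upward
        ask = concession_offer(seller_open, seller_limit, -step, t)  # seller concedes downward
        if bid >= ask:
            return (bid + ask) / 2.0
    return None  # no agreement within the deadline

price = negotiate(buyer_open=60.0, buyer_limit=90.0,
                  seller_open=120.0, seller_limit=85.0, step=5.0)
```

A contract negotiation over many interdependent terms, as in the paper, replaces this single price axis with multi-issue offers evaluated against business knowledge.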

Relevance:

90.00%

Publisher:

Abstract:

Communication has become an essential function in our civilization. With the increasing demand for communication channels, it is now necessary to find ways to optimize the use of their bandwidth. One way to achieve this is by transforming the information before it is transmitted. This transformation can be performed by several techniques. One of the newest of these techniques is the use of wavelets. Wavelet transformation refers to the act of breaking down a signal into components called details and trends by using small waveforms that have a zero average in the time domain. After this transformation the data can be compressed by discarding the details and transmitting only the trends. At the receiving end, the trends are used to reconstruct the image. In this work, the wavelet used for the transformation of an image is selected from a library of available bases. The accuracy of the reconstruction, after the details are discarded, depends on the wavelets chosen from the wavelet basis library. The system developed in this thesis takes a 2-D image and decomposes it using a wavelet bank. A digital signal processor is used to achieve near real-time performance in this transformation task. A contribution of this thesis project is the development of a DSP-based test bed for the future development of new real-time wavelet transformation algorithms.
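The trend/detail split described above is easiest to see with the Haar wavelet, whose trends are pairwise averages and whose details are pairwise differences. A minimal sketch of one decomposition level and the lossy "discard the details" reconstruction (illustrative only; the thesis selects wavelets from a larger basis library and runs on a DSP):

```python
def haar_step(signal):
    """One level of the Haar transform: pairwise trends (averages) and
    details (differences). The trend signal is half the original length."""
    trends = [(signal[i] + signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    details = [(signal[i] - signal[i + 1]) / 2.0 for i in range(0, len(signal), 2)]
    return trends, details

def haar_reconstruct(trends, details):
    """Invert the step; with the details zeroed this is the lossy
    reconstruction from trends alone."""
    out = []
    for t, d in zip(trends, details):
        out.extend([t + d, t - d])
    return out

signal = [10.0, 12.0, 9.0, 11.0, 30.0, 28.0, 5.0, 7.0]
trends, details = haar_step(signal)
# "compress" by discarding the details, keeping only the trends
lossy = haar_reconstruct(trends, [0.0] * len(trends))
```

Keeping the details gives exact reconstruction; discarding them halves the data at the cost of fine structure, which is the bandwidth trade-off the thesis exploits.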