988 results for Handling time


Relevance:

30.00%

Publisher:

Abstract:

An automatic email handling system (AutoRouter) was introduced at a national counselling service in Australia. In 2003, counsellors responded to a total of 7421 email messages. Over nine days in early May 2004, the administrator responsible for the management of the manual email counselling service recorded the time spent on managing email messages. The AutoRouter was then introduced. Since the implementation of the AutoRouter, the administrator's management role has become redundant and an average of 12 h 5 min per week of staff time has been saved. There have been further savings in supervisor time. Counsellors had been taking an average of 6.2 days to respond to email messages (n=4307), with an average delay of 1.2 days from the time counsellors wrote the email to when the email was sent. Thus the response was sent on average 7.4 days after receipt of the original client email message. A significant decrease in response time has been noted since implementation of the AutoRouter, with client responses now taking an average of 5.4 days, a decrease of 2.0 days. Automatic message handling appears to be a promising method of managing the administration of a steadily growing email counselling service.

Relevance:

30.00%

Publisher:

Abstract:

In this paper we describe a novel, extensible visualization system currently under development at Aston University. We introduce modern programming methods, such as the use of data-driven programming, design patterns, and the careful definition of interfaces to allow easy extension using plug-ins, to 3D landscape visualization software. We combine this with modern developments in computer graphics, such as vertex and fragment shaders, to create an extremely flexible, extensible, real-time, near-photorealistic visualization system. We show the design of the system and its main sub-components. We stress the role of modern programming practices and illustrate the benefits these bring to 3D visualization. © 2006 Springer-Verlag Berlin Heidelberg.
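
To make the plug-in idea above concrete, the following minimal Python sketch shows one common way such an extensible design can be organised: the core engine depends only on a small, carefully defined layer interface, and concrete layers register themselves as plug-ins. The class and method names (LayerPlugin, TerrainLayer, Visualizer) are illustrative assumptions, not the actual Aston system API.

```python
from abc import ABC, abstractmethod


class LayerPlugin(ABC):
    """Interface that every landscape-layer plug-in must implement (hypothetical)."""

    @abstractmethod
    def load(self, scene_config: dict) -> None:
        """Read the layer's own entries from a data-driven scene description."""

    @abstractmethod
    def render(self, frame_time: float) -> None:
        """Draw the layer for the current frame (e.g. issue shader draw calls)."""


class TerrainLayer(LayerPlugin):
    def load(self, scene_config: dict) -> None:
        self.heightmap = scene_config.get("heightmap", "flat")

    def render(self, frame_time: float) -> None:
        print(f"terrain from {self.heightmap} at t={frame_time:.2f}")


class Visualizer:
    """Core engine: knows nothing about concrete layers, only the interface."""

    def __init__(self) -> None:
        self.layers: list[LayerPlugin] = []

    def register(self, layer: LayerPlugin) -> None:
        self.layers.append(layer)

    def run_frame(self, scene_config: dict, frame_time: float) -> None:
        for layer in self.layers:
            layer.load(scene_config)
            layer.render(frame_time)


viz = Visualizer()
viz.register(TerrainLayer())
viz.run_frame({"heightmap": "aston_valley.png"}, frame_time=0.016)
```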

Relevance:

30.00%

Publisher:

Abstract:

Lyophilisation, or freeze drying, is the preferred dehydration method for pharmaceuticals liable to thermal degradation. Most biologics are unstable in aqueous solution and may rely on freeze drying to prolong their shelf life. Lyophilisation is, however, expensive, and much work has been aimed at reducing its cost. This thesis is motivated by the potential cost savings foreseen with the adoption of a cost-efficient bulk drying approach for large and small molecules. Initial studies identified ideal formulations that adapted well to bulk drying and to further powder handling requirements downstream in production. Low-cost techniques were used to disrupt large dried cakes into powder, while the effects of carrier agent concentration on powder flowability were investigated using standard pharmacopoeia methods. This revealed the superiority of crystalline mannitol over amorphous sucrose matrices and established that the cohesive, very poorly flowing nature of freeze-dried powders was a potential barrier to success. Powder characterisation studies showed that increased powder densification was mainly responsible for significant improvements in flow behaviour, and an initial bulking agent concentration of 10-15% w/v was recommended. Further optimisation studies evaluated the effects of freezing rates and thermal treatment on powder flow behaviour. Slow cooling (0.2 °C/min) with a -25 °C annealing hold (2 h) provided adequate mechanical strength and densification at 0.5-1 M mannitol concentrations. Stable bulk powders require powder transfer into either final vials or intermediate storage closures. The targeted dosing of powder formulations using volumetric and gravimetric powder dispensing systems was evaluated using Immunoglobulin G (IgG), Lactate Dehydrogenase (LDH) and Beta Galactosidase models. Final protein content uniformity in dosed vials was assessed using activity and protein recovery assays to draw conclusions from deviations and pharmacopoeia acceptance values. A correlation between very poor flowability (p<0.05), solute concentration, dosing time and accuracy was revealed. LDH and IgG lyophilised in 0.5 M and 1 M mannitol passed pharmacopoeia acceptance value criteria (0.1-4), while formulations with micro-collapse showed the best dose accuracy (0.32-0.4% deviation). Bulk mannitol content above 0.5 M provided no additional benefit to dosing accuracy or content uniformity of dosed units. This study identified considerations including the type of protein, annealing, the cake disruption process, the physical form of the phases present and humidity control, and recommended gravimetric transfer as optimal for dispensing powder. Dosing lyophilised powders from bulk was demonstrated to be practical, time efficient and economical, and it met regulatory requirements in the cases examined. Finally, the use of a new non-destructive technique, X-ray micro-computed tomography (MCT), was explored for cake and particle characterisation. These studies demonstrated good correlation with traditional gas porosimetry (R2 = 0.93) and with morphology studies using microscopy. Flow characterisation from sample sizes of less than 1 mL was demonstrated using three-dimensional X-ray quantitative image analyses. A platinum-mannitol dispersion model revealed a relationship between freezing rate, ice nucleation sites and variations in homogeneity between the top and bottom segments of a formulation.
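
The flowability assessment mentioned above relies on standard pharmacopoeia methods; one widely used pair of such measures is Carr's compressibility index and the Hausner ratio, both computed from bulk and tapped densities. The sketch below illustrates only that calculation; the density values are hypothetical and are not taken from the thesis.

```python
def carr_index(bulk_density: float, tapped_density: float) -> float:
    """Carr's compressibility index (%) = 100 * (tapped - bulk) / tapped."""
    return 100.0 * (tapped_density - bulk_density) / tapped_density


def hausner_ratio(bulk_density: float, tapped_density: float) -> float:
    """Hausner ratio = tapped / bulk; values well above ~1.35 indicate poor flow."""
    return tapped_density / bulk_density


# Hypothetical bulk and tapped densities (g/mL) for a freeze-dried mannitol powder.
bulk, tapped = 0.32, 0.48
print(f"Carr index:    {carr_index(bulk, tapped):.1f} %")
print(f"Hausner ratio: {hausner_ratio(bulk, tapped):.2f}")
```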

Relevance:

30.00%

Publisher:

Abstract:

Deciphering the driving mechanisms of Earth system processes, including the climate dynamics expressed as paleoceanographic events, requires a complete, continuous, and high-resolution stratigraphy that is very accurately dated. In this study, we construct a robust astronomically calibrated age model for the middle Eocene to early Oligocene interval (31-43 Ma) in order to permit more detailed study of the exceptional climatic events that occurred during this time, including the Middle Eocene Climate Optimum and the Eocene/Oligocene transition. A goal of this effort is to accurately date the middle Eocene to early Oligocene composite section cored during the Pacific Equatorial Age Transect (PEAT, IODP Exp. 320/321). The stratigraphic framework for the new time scale is based on the identification of the stable long eccentricity cycle in published and new high-resolution records encompassing bulk and benthic stable isotope, calibrated XRF core scanning, and magnetostratigraphic data from ODP Sites 171B-1052, 189-1172, 199-1218, and 207-1260 as well as IODP Sites 320-U1333 and -U1334, spanning magnetic polarity Chrons C12n to C20n. Subsequently, we applied orbital tuning of the records to the La2011 orbital solution. The resulting new time scale revises and refines the existing orbitally tuned age model and the Geomagnetic Polarity Time Scale from 31 to 43 Ma. Our newly defined absolute age for the Eocene/Oligocene boundary validates the astronomically tuned age of 33.89 Ma identified at the Massignano (Italy) global stratotype section and point. Our compilation of geochemical records of climate-controlled variability in sedimentation through the middle-to-late Eocene and early Oligocene demonstrates strong power in the eccentricity band that is readily tuned to the latest astronomical solution. Obliquity-driven cyclicity is only apparent during very long eccentricity cycle minima around 35.5 Ma, 38.3 Ma and 40.1 Ma.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Moderate-to-vigorous physical activity (MVPA) is an important determinant of children's physical health, and is commonly measured using accelerometers. A major limitation of accelerometers is non-wear time, which is the time the participant did not wear their device. Given that non-wear time is traditionally discarded from the dataset prior to estimating MVPA, final estimates of MVPA may be biased. Therefore, alternate approaches should be explored. OBJECTIVES: The objectives of this thesis were to 1) develop and describe an imputation approach that uses the socio-demographic, time, health, and behavioural data from participants to replace non-wear time accelerometer data, 2) determine the extent to which imputation of non-wear time data influences estimates of MVPA, and 3) determine if imputation of non-wear time data influences the associations between MVPA, body mass index (BMI), and systolic blood pressure (SBP). METHODS: Seven days of accelerometer data were collected using Actical accelerometers from 332 children aged 10-13 years. Three methods for handling missing accelerometer data were compared: 1) the "non-imputed" method, wherein non-wear time was deleted from the dataset; 2) imputation dataset I, wherein the imputation of MVPA during non-wear time was based upon socio-demographic factors of the participant (e.g., age), health information (e.g., BMI), and time characteristics of the non-wear period (e.g., season); and 3) imputation dataset II, wherein the imputation of MVPA was based upon the same variables as imputation dataset I, plus organized sport information. Associations between MVPA and health outcomes for each method were assessed using linear regression. RESULTS: Non-wear time accounted for 7.5% of epochs during waking hours. The average minutes/day of MVPA was 56.8 (95% CI: 54.2, 59.5) in the non-imputed dataset, 58.4 (95% CI: 55.8, 61.0) in imputed dataset I, and 59.0 (95% CI: 56.3, 61.5) in imputed dataset II. Estimates between datasets were not significantly different. The strengths of the relationships of MVPA with BMI and SBP were comparable between all three datasets. CONCLUSION: These findings suggest that studies that achieve high accelerometer compliance with unsystematic patterns of missing data can use the traditional approach of deleting non-wear time from the dataset to obtain MVPA measures without substantial bias.
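
As a rough illustration of the regression-based imputation idea described in the methods (predicting MVPA during non-wear periods from participant and time characteristics), the sketch below fits a simple linear model on complete days and fills in the missing ones. The column names and values are hypothetical, and the model is far simpler than the thesis's actual imputation procedure.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical per-day accelerometer summaries; NaN marks days dominated by non-wear.
df = pd.DataFrame({
    "age":      [10, 11, 12, 13, 10, 12],
    "bmi":      [17.2, 18.0, 19.5, 21.1, 16.8, 20.3],
    "season":   [0, 0, 1, 1, 0, 1],            # 0 = winter, 1 = summer
    "mvpa_min": [55.0, 61.0, np.nan, 48.0, 67.0, np.nan],
})

predictors = ["age", "bmi", "season"]
observed = df.dropna(subset=["mvpa_min"])
missing = df[df["mvpa_min"].isna()]

# Fit on days with complete wear, then predict MVPA for the non-wear days.
model = LinearRegression().fit(observed[predictors], observed["mvpa_min"])
df.loc[missing.index, "mvpa_min"] = model.predict(missing[predictors])

print(df)
```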

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a solution to part of the problem of making robotic or semi-robotic digging equipment less dependent on human supervision. A method is described for identifying rocks of a certain size that may affect digging efficiency or require special handling. The process involves three main steps. First, by using range and intensity data from a time-of-flight (TOF) camera, a feature descriptor is used to rank points and separate the regions surrounding high-scoring points. This allows a wide range of rocks to be recognized, because features can represent a whole or just part of a rock. Second, these points are filtered to extract only points thought to belong to the large object. Finally, a check is carried out to verify that the resultant point cloud actually represents a rock. Results are presented from field testing on piles of fragmented rock. Note to Practitioners—This paper presents an algorithm to identify large boulders in a pile of broken rock as a step towards an autonomous mining dig planner. In mining, piles of broken rock can contain large fragments that may need to be specially handled. To assess rock piles for excavation, we make use of a TOF camera that does not rely on external lighting to generate a point cloud of the rock pile. We then segment large boulders from its surface by using a novel feature descriptor and distinguish between real and false boulder candidates. Preliminary field experiments show promising results, with the algorithm performing nearly as well as human test subjects.
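
The three-step structure described above (score points with a feature descriptor, keep high-scoring regions, then verify the candidate) can be sketched as follows. The descriptor here is a simple local height-variance stand-in rather than the paper's actual feature descriptor, and all thresholds and data are hypothetical.

```python
import numpy as np


def score_points(points, k=20):
    """Stand-in descriptor: local height variance within the k nearest points (by XY)."""
    scores = np.empty(len(points))
    for i, p in enumerate(points):
        d = np.linalg.norm(points[:, :2] - p[:2], axis=1)
        neighbours = points[np.argsort(d)[:k]]
        scores[i] = neighbours[:, 2].var()
    return scores


def extract_candidate(points, scores, score_thresh, min_points):
    """Keep high-scoring points; accept the cluster only if it is large enough."""
    candidate = points[scores > score_thresh]
    return candidate if len(candidate) >= min_points else None


rng = np.random.default_rng(0)
cloud = rng.normal(size=(500, 3))          # placeholder for a TOF-camera point cloud
rock = extract_candidate(cloud, score_points(cloud), score_thresh=1.5, min_points=30)
print("rock detected" if rock is not None else "no large rock found")
```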

Relevance:

30.00%

Publisher:

Abstract:

Real-time data on key performance enablers in logistics warehouses are of growing importance, as they permit decision-makers to react instantaneously to alerts, deviations and damages. Several technologies appear to be adequate data sources for collecting the information required to achieve this goal. In the present research paper, the load status of the fork of a forklift is to be recognized with the help of a sensor-based and a camera-based solution approach. The comparison of initial experimentation results yields a statement about which direction to pursue for promising further research.

Relevance:

30.00%

Publisher:

Abstract:

Survival of seal pups may be affected by their ability to respond appropriately to stress. Chronic stress can adversely affect secretion of cortisol and thyroid hormones, which contribute to the control of fuel utilisation. Repeated handling could disrupt the endocrine response to stress and/or negatively impact mass changes during fasting. Here we investigated the effects of handling regime on cortisol and thyroid hormone levels, and on body mass changes, in fasting male and female grey seal pups (Halichoerus grypus). Females had higher thyroid hormone levels than males throughout fasting and showed a reduction in cortisol midway through the fast that was not seen in males. This may reflect sex-specific fuel allocation or development. Neither handling frequency nor cumulative contact time affected plasma cortisol or thyroid hormone levels, the rate of increase in cortisol over the first five minutes of physical contact, or the pattern of mass loss during fasting in either sex. The endocrine response to stress and the control of energy balance in grey seal pups appear to be robust to repeated, short periods of handling. Our results suggest that routine handling should have no greater impact on these animals than the general disturbance caused by researchers moving around the colony.

Relevance:

30.00%

Publisher:

Abstract:

The handling and processing of fish in Uganda has until recently been carried out exclusively by artisanal fishermen and fish processors. Their operations have left much to be desired, as the product is often of low quality and its keeping time is limited. Fresh fish has been handled without refrigeration, but with the recent establishment of commercial fish processing plants a cold chain of fish distribution is being set up for domestic and export markets. Some of the fishermen are beginning to ice their catch immediately after reaching the shore. It is hoped that fishmongers will increasingly find it more profitable to market their products iced. This will make fish available to a larger sector of the population, and in the process post-harvest losses will be reduced.

Relevance:

30.00%

Publisher:

Abstract:

Background: Improper handling has been identified as one of the major reasons for the decline in vaccine potency at the time of administration. Loss of potency becomes evident when immunised individuals contract the diseases the vaccines were meant to prevent. Objective: To assess the factors associated with vaccine handling and storage practices. Methods: This was a cross-sectional study. Three-stage sampling was used to recruit 380 vaccine handlers from 273 health facilities in 11 Local Government Areas in Ibadan. Data were analysed using SPSS version 16. Results: Seventy-three percent were aware of vaccine handling and storage guidelines, with 68.4% having ever read such guidelines. Only 15.3% had read a guideline less than 1 month prior to the study. About 65.0% had received training on vaccine management. Incorrect handling practices reported included storing injections with vaccines (13.7%) and maintaining vaccine temperature using ice blocks (7.6%). About 43.0% had good knowledge of vaccine management, while 66.1% had good vaccine management practices. Respondents who had good knowledge of vaccine handling and storage [OR=10.0, 95% CI (5.28 – 18.94), p < 0.001] and who had received formal training on vaccine management [OR=5.3, 95% CI (2.50 – 11.14), p < 0.001] were more likely to have good vaccine handling and storage practices. Conclusion: Regular training is recommended to enhance vaccine handling and storage practices.
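
The odds ratios reported above are the standard measure derived from a 2x2 table of exposure versus outcome; the short sketch below shows how such an OR and its Wald 95% confidence interval are computed. The counts used are purely illustrative and are not the study's data.

```python
import math

# Purely illustrative 2x2 table (not the study's data):
# rows = good knowledge yes/no, columns = good practice yes/no.
a, b = 120, 30    # good knowledge: good practice / poor practice
c, d = 80, 150    # poor knowledge: good practice / poor practice

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)     # SE of ln(OR)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f} - {ci_high:.2f})")
```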

Relevance:

30.00%

Publisher:

Abstract:

In this work, we further extend the recently developed adaptive data analysis method, the Sparse Time-Frequency Representation (STFR) method. This method is based on the assumption that many physical signals inherently contain AM-FM representations. We propose a sparse optimization method to extract the AM-FM representations of such signals. We prove the convergence of the method for periodic signals under certain assumptions and provide practical algorithms specifically for the non-periodic STFR, which extends the method to tackle problems that former STFR methods could not handle, including stability to noise and non-periodic data analysis. This is a significant improvement, since many adaptive and non-adaptive signal processing methods are not fully capable of handling non-periodic signals. Moreover, we propose a new STFR algorithm to study intrawave signals with strong frequency modulation and analyze the convergence of this new algorithm for periodic signals. Such signals have previously remained a bottleneck for all signal processing methods. Furthermore, we propose a modified version of STFR that facilitates the extraction of intrawaves that have overlapping frequency content. We show that the STFR methods can be applied to the realm of dynamical systems and cardiovascular signals. In particular, we present a simplified and modified version of the STFR algorithm that is potentially useful for the diagnosis of some cardiovascular diseases. We further explain some preliminary work on the nature of Intrinsic Mode Functions (IMFs) and how they can have different representations in different phase coordinates. This analysis shows that the uncertainty principle is fundamental to all oscillating signals.
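
For readers unfamiliar with AM-FM representations, the sketch below extracts an instantaneous amplitude and frequency from a synthetic AM-FM signal using the Hilbert transform. This is only a simple stand-in for the concept of a(t)·cos(θ(t)) decomposition; it is not the sparse optimization method proposed in the work.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)

# Synthetic AM-FM signal: a(t) * cos(theta(t)) with slowly varying amplitude and frequency.
amplitude = 1.0 + 0.5 * np.cos(2 * np.pi * 2 * t)
phase = 2 * np.pi * (50 * t + 5 * np.sin(2 * np.pi * 3 * t))
signal = amplitude * np.cos(phase)

analytic = hilbert(signal)
inst_amplitude = np.abs(analytic)                                       # recovered a(t)
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)   # recovered theta'(t)/2*pi

print(f"mean instantaneous frequency ~ {inst_freq.mean():.1f} Hz")
```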

Relevance:

30.00%

Publisher:

Abstract:

At the HL-LHC, proton bunches will cross each other every 25 ns, producing an average of 140 pp collisions per bunch crossing. To operate in such an environment, the CMS experiment will need a L1 hardware trigger able to identify interesting events within a latency of 12.5 μs. The future L1 trigger will also make use of data coming from the silicon tracker to control the trigger rate. The architecture that will be used in future to process tracker data is still under discussion. One interesting proposal makes use of the Time Multiplexed Trigger concept, already implemented in the CMS calorimeter trigger for the Phase I trigger upgrade. The proposed track finding algorithm is based on the Hough Transform method. The algorithm has been tested using simulated pp-collision data. Results show a very good tracking efficiency. The algorithm will be demonstrated in hardware in the coming months using the MP7, a μTCA board with a powerful FPGA capable of handling data rates approaching 1 Tb/s.
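
As a toy illustration of Hough-Transform track finding, the sketch below lets tracker hits in the r-phi plane vote in a (phi0, curvature) parameter space and looks for the bin with the most votes. The binning, units and the small-angle parameterisation are simplifying assumptions and do not reflect the actual CMS implementation on the MP7.

```python
import numpy as np


def hough_track_finder(hits, n_phi_bins=64, n_curv_bins=32, curv_max=1e-3):
    """Accumulate (r, phi) hits into a (phi0, curvature) histogram.

    Uses the small-angle r-phi approximation phi ~ phi0 + curvature * r,
    so each hit votes for one phi0 per trial curvature value.
    """
    acc = np.zeros((n_phi_bins, n_curv_bins), dtype=int)
    curvatures = np.linspace(-curv_max, curv_max, n_curv_bins)
    for r, phi in hits:
        phi0 = phi - curvatures * r            # one candidate phi0 per curvature bin
        phi_bins = ((phi0 % (2 * np.pi)) / (2 * np.pi) * n_phi_bins).astype(int)
        acc[phi_bins, np.arange(n_curv_bins)] += 1
    return acc, curvatures


# Hypothetical hits from one track (phi0 = 1.0 rad, curvature = 5e-4) at six radii (mm).
radii = np.array([200, 400, 600, 800, 1000, 1080], dtype=float)
hits = [(r, 1.0 + 5e-4 * r) for r in radii]

acc, curvatures = hough_track_finder(hits)
peak = np.unravel_index(acc.argmax(), acc.shape)
print(f"peak votes={acc.max()}, phi0 bin={peak[0]}, curvature~{curvatures[peak[1]]:.1e}")
```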

Relevance:

30.00%

Publisher:

Abstract:

To provide a basis for a study of water quality in fish-farming ponds, a 166-day experiment was carried out with a native species, pacu (Piaractus mesopotamicus). Two different dietary protein levels (16% and 34% crude protein) and three stocking densities (0.25, 0.50 and 0.77 fish/m²) were tested in the ponds. The results showed that the interaction between stocking density and experiment duration affected bicarbonate and alkalinity, while the interaction between stocking density and protein percentage affected free and total CO2 concentrations, conductivity and pH (P < 0.05). Water temperature in the ponds varied significantly over the study period (P < 0.05), decreasing gradually from summer to winter. There was no significant difference in the water residence time in the ponds (P > 0.05) during the experiment. The remaining parameters were not affected by the treatments over the study period.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, we have witnessed the substantial growth of real-time streaming applications, such as video surveillance systems at road crossings in a city. So far, real-world applications mainly rely on the traditional, well-known client-server and peer-to-peer schemes as the fundamental mechanism for communication. However, due to the limited resources on each terminal device in these applications, the two schemes cannot fully leverage the processing capability between the source and destination of the video traffic, which leads to limited streaming services. For this reason, many QoS-sensitive applications cannot be supported in the real world. In this paper, we are motivated to address this problem by proposing a novel multi-server based framework. In this framework, multiple servers collaborate with each other to form a virtual server (also called a cloud server) and provide high-quality services such as real-time stream delivery and storage. Based on this framework, we further introduce a (1-ε) approximation algorithm to solve the NP-complete "maximum services" (MS) problem, with the intention of handling the large number of streaming flows originated by the networks and maximizing the total number of services. Moreover, in order to back up the streaming data for later retrieval, an algorithm based on the framework is proposed to implement backups and maximize streaming flows simultaneously. We conduct a series of simulation-based experiments to evaluate the performance of the newly proposed framework, and we compare our scheme to several traditional solutions. The results suggest that our proposed scheme significantly outperforms the traditional solutions.
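
To give a feel for the "maximum services" objective, the sketch below greedily admits streams onto cloud servers subject to capacity, maximising the number of streams served. This is only an illustrative heuristic, not the paper's (1-ε) approximation algorithm, and all bandwidth figures are hypothetical.

```python
def max_services_greedy(streams, servers):
    """Greedily admit streams (smallest bandwidth need first) onto the server with
    the most remaining capacity; returns the number of streams served.

    A simple illustration of the 'maximum services' objective, not the paper's
    (1-epsilon) approximation algorithm.
    """
    capacity = dict(servers)                  # server name -> remaining capacity (Mb/s)
    served = 0
    for demand in sorted(streams):            # admitting small flows first maximises the count
        best = max((s for s in capacity if capacity[s] >= demand),
                   key=lambda s: capacity[s], default=None)
        if best is not None:
            capacity[best] -= demand
            served += 1
    return served


streams = [4, 8, 2, 6, 10, 3]                 # hypothetical stream bandwidths (Mb/s)
servers = [("s1", 10), ("s2", 12)]            # hypothetical cloud-server capacities
print(max_services_greedy(streams, servers))  # -> 4
```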

Relevance:

30.00%

Publisher:

Abstract:

Due to the high standards expected of diagnostic medical imaging, the analysis of information regarding waiting lists across different information systems is of the utmost importance. Such analysis may, on the one hand, improve diagnostic quality and, on the other hand, lead to the reduction of waiting times, with a concomitant increase in the quality of services and a reduction of the inherent financial costs. Hence, the purpose of this study is to assess the waiting time in the delivery of diagnostic medical imaging services, such as computed tomography and magnetic resonance imaging. Thereby, this work is focused on the development of a decision support system to assess waiting times in diagnostic medical imaging with recourse to operational data for selected attributes extracted from distinct information systems. The computational framework is built on top of a Logic Programming Case-based Reasoning approach to Knowledge Representation and Reasoning that caters for the handling of incomplete, unknown, or even self-contradictory information.
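
As a minimal illustration of the case-based reasoning step (retrieving similar past cases and reusing their outcomes to estimate a waiting time), consider the Python sketch below. The attributes, similarity measure and cases are hypothetical, and the sketch does not capture the Logic Programming treatment of incomplete or self-contradictory information described in the paper.

```python
import math

# Hypothetical past cases: (exam type, priority, scanners available) -> waiting time (days).
cases = [
    ({"exam": "CT",  "priority": 2, "scanners": 3}, 12.0),
    ({"exam": "MRI", "priority": 1, "scanners": 1}, 30.0),
    ({"exam": "MRI", "priority": 2, "scanners": 2}, 21.0),
    ({"exam": "CT",  "priority": 3, "scanners": 3}, 5.0),
]


def similarity(a: dict, b: dict) -> float:
    """Crude similarity: exact match on exam type plus distance on numeric attributes."""
    same_exam = 1.0 if a["exam"] == b["exam"] else 0.0
    dist = math.hypot(a["priority"] - b["priority"], a["scanners"] - b["scanners"])
    return same_exam + 1.0 / (1.0 + dist)


def estimate_waiting_time(query: dict, k: int = 2) -> float:
    """Retrieve the k most similar cases and reuse the mean of their outcomes."""
    ranked = sorted(cases, key=lambda c: similarity(query, c[0]), reverse=True)
    return sum(wait for _, wait in ranked[:k]) / k


print(estimate_waiting_time({"exam": "MRI", "priority": 1, "scanners": 2}))  # -> 25.5
```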