Abstract:
Garment information tracking is required for clean-room garment management. In this paper, we present a robust camera-based system that applies Optical Character Recognition (OCR) techniques to garment label recognition. In the system, a camera is used for image capture; an adaptive thresholding algorithm is employed to generate binary images; Connected Component Labelling (CCL) is then adopted for object detection in the binary image as part of finding the Region of Interest (ROI); Artificial Neural Networks (ANNs) with the Back Propagation (BP) learning algorithm are used for digit recognition; and finally the system's output is verified against a system database. Testing shows that the system copes with variations in lighting, digit twisting, background complexity, and font orientation. The digit recognition rate meets the design requirement, and the system achieved real-time, error-free garment information tracking during testing.
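The CCL step described above can be sketched with a simple flood-fill labelling pass over a binary image. This is a minimal, illustrative stand-in (the paper does not specify its CCL variant); the function name and 0/1 image encoding are assumptions.

```python
from collections import deque

def label_components(binary, connectivity=4):
    """Label connected foreground regions (1s) in a binary image.

    Returns a label image (0 = background) and the number of components.
    A toy flood-fill sketch of the CCL step; not the paper's implementation.
    """
    h, w = len(binary), len(binary[0])
    labels = [[0] * w for _ in range(h)]
    if connectivity == 4:
        nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:  # 8-connectivity
        nbrs = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0)]
    current = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 1 and labels[y][x] == 0:
                current += 1                     # start a new component
                queue = deque([(y, x)])
                labels[y][x] = current
                while queue:                     # flood-fill its pixels
                    cy, cx = queue.popleft()
                    for dy, dx in nbrs:
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny][nx] == 1
                                and labels[ny][nx] == 0):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current
```

Each labelled component is a candidate object; in a pipeline like the one above, the components' bounding boxes would feed the ROI-finding stage.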
Abstract:
This case study uses log-linear modelling to investigate the interrelationships between factors that may contribute to the late submission of coursework by undergraduate students. A class of 86 computing students was considered. These students were exposed to traditional teaching methods supported by e-learning via a Managed Learning Environment (MLE). The MLE warehouses detailed data about student usage of the various areas of the environment, which can be used to interpret the approach taken to learning. The study investigates the interrelationship between these factors and whether the student handed in their coursework on time or late. The results from the log-linear modelling technique show an interaction between participating in Discussions within the MLE and the timely submission of coursework, indicating that participants are more likely to hand in on time than students who do not participate.
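The kind of association reported above can be illustrated with a 2x2 cross-tabulation and an odds ratio. The counts below are hypothetical, invented for illustration only; they are not the study's data.

```python
def odds_ratio(table):
    """Odds ratio for a 2x2 table [[a, b], [c, d]].

    A ratio above 1 means the first row's outcome odds are higher.
    """
    (a, b), (c, d) = table
    return (a * d) / (b * c)

# Hypothetical counts for a class of 86 students:
# rows = participated in Discussions (yes / no),
# columns = coursework submitted on time (yes / no).
table = [[30, 10],
         [22, 24]]

ratio = odds_ratio(table)  # > 1: participants more likely to submit on time
```

A full log-linear analysis would fit models to the multi-way contingency table and test which interaction terms are needed; the odds ratio is just the simplest summary of one such interaction.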
Abstract:
For efficient collaboration between participants, eye gaze is seen as critical for interaction. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work that tested whether the focus of gaze could be more accurately gauged if tracked eye movement was added to that of the head of an avatar observed in an immersive virtual environment. An experiment was conducted to assess the difference between users' abilities to judge which objects an avatar is looking at when only head movements are displayed, with the eyes remaining static, and when both eye gaze and head movement are displayed. The results show that eye gaze is of vital importance to subjects correctly identifying what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following these positive results. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to convert the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This is to be used in the creation of an immersive collaborative virtual environment supporting eye gaze, and in its ongoing experiments. Copyright (C) 2009 John Wiley & Sons, Ltd.
Abstract:
We introduce a classification-based approach to finding occluding texture boundaries. The classifier is composed of a set of weak learners, which operate on image intensity discriminative features that are defined on small patches and are fast to compute. A database that is designed to simulate digitized occluding contours of textured objects in natural images is used to train the weak learners. The trained classifier score is then used to obtain a probabilistic model for the presence of texture transitions, which can readily be used for line search texture boundary detection in the direction normal to an initial boundary estimate. This method is fast and therefore suitable for real-time and interactive applications. It works as a robust estimator, which requires a ribbon-like search region and can handle complex texture structures without requiring a large number of observations. We demonstrate results both in the context of interactive 2D delineation and of fast 3D tracking and compare its performance with other existing methods for line search boundary detection.
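The line-search step described above can be sketched as follows: sample the classifier's boundary score at evenly spaced offsets along the normal to the initial boundary estimate, then take the offset with the highest score. The function name and sampling convention are assumptions; this is a minimal stand-in for the paper's probabilistic model.

```python
def line_search_boundary(scores):
    """Pick the offset along the normal with the highest boundary score.

    `scores` holds boundary probabilities sampled at evenly spaced
    offsets along the normal to an initial estimate; index
    len(scores) // 2 corresponds to zero displacement. Returns the
    signed displacement of the detected boundary.
    """
    best = max(range(len(scores)), key=lambda i: scores[i])
    return best - len(scores) // 2

# Scores peaking two samples outside the initial estimate:
offset = line_search_boundary([0.1, 0.2, 0.3, 0.4, 0.6, 0.9, 0.5])
```

Because only a short ribbon of samples per normal is needed, this kind of search is cheap enough for the interactive and real-time tracking uses mentioned above.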
Abstract:
This paper presents a new strategy for controlling rigid-robot manipulators in the presence of parametric uncertainties or un-modelled dynamics. The strategy combines an adaptation law with a well known robust controller proposed by Spong, which is derived using Lyapunov's direct method. Although the tracking problem of manipulators has been successfully solved with different strategies, there are some conditions under which their efficiency is limited. Specifically, their performance decreases when unknown loading masses or model disturbances are introduced. The aim of this work is to show that the proposed strategy performs better than existing algorithms, as verified with real-time experimental results with a Puma-560 robot. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
In the UK, participation in higher education has risen over the past two decades, along with a shift of the costs of higher education onto the individual and a move to widening participation among previously underrepresented groups. This has led to changes in the way individuals fund their higher education, in particular a rise in the incidence of term time employment. Term time employment potentially plays a much bigger role than in the past, both as a means for individuals to fund their education and reduce debt, and as a way to gain valuable work experience and increase employability. With the increase in the number of graduates in the UK labour market it is now more important for individuals to be able to differentiate themselves in the labour market.
Abstract:
Dynamic relationships between technologies and organizations are investigated through research on digital visualization technologies and their use in the construction sector. Theoretical work highlights mutual adaptation between technologies and organizations but does not explain instances of sustained, sudden, or increasing maladaptation. By focusing on the technological field, I draw attention to hierarchical structuring around inter-dependent levels of technology; technological priorities of diverse groups; power asymmetries and disjunctures between contexts of development and use. For complex technologies, such as digital technologies, I argue these field-level features explain why organizations peripheral to the field may experience difficulty using emerging technology.
Abstract:
Bayesian Model Averaging (BMA) is used to test for multiple break points in univariate series using conjugate normal-gamma priors. This approach can test for the number of structural breaks and produce posterior probabilities for a break at each point in time. Results are averaged over specifications including stationary, trend-stationary, and unit-root models, each containing different types and numbers of breaks and different lag lengths. The procedures are used to test for structural breaks in 14 annual macroeconomic series and 11 natural resource price series. The results indicate structural breaks in all of the natural resource series and most of the macroeconomic series; many of the series had multiple breaks. Our findings regarding the existence of unit roots, having allowed for structural breaks in the data, are largely consistent with previous work.
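The idea of posterior probabilities over break dates can be sketched with a deliberately simplified toy: an equal prior over single-break locations in a mean-shift model with known variance and plug-in segment means. This is a rough stand-in for the conjugate normal-gamma analysis above, not the paper's procedure, and all names and values are illustrative.

```python
import math

def break_point_weights(y, sigma=1.0):
    """Posterior-style weights for a single mean break at each date.

    Equal prior over break locations; Gaussian likelihood with known
    sigma and plug-in segment means. weights[k-1] corresponds to a
    break after observation k.
    """
    n = len(y)
    logliks = []
    for k in range(1, n):                       # break after observation k
        ll = 0.0
        for seg in (y[:k], y[k:]):
            m = sum(seg) / len(seg)             # plug-in segment mean
            ll += sum(-((v - m) ** 2) / (2 * sigma ** 2) for v in seg)
        logliks.append(ll)
    mx = max(logliks)                           # stabilise the exponentials
    w = [math.exp(l - mx) for l in logliks]
    s = sum(w)
    return [x / s for x in w]

# Series with an obvious mean shift after the fifth observation:
weights = break_point_weights([0, 0, 0, 0, 0, 5, 5, 5, 5, 5])
```

A genuine BMA treatment would integrate out means and variances under the normal-gamma prior and average over model classes (trend, unit root, lag length) as well as break locations.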
Abstract:
The night-time tropospheric chemistry of two stress-induced volatile organic compounds (VOCs), (Z)-pent-2-en-1-ol and pent-1-en-3-ol, has been studied at room temperature. Rate coefficients for reactions of the nitrate radical (NO3) with these pentenols were measured using the discharge-flow technique. Because of the relatively low volatility of these compounds, we employed off-axis continuous-wave cavity-enhanced absorption spectroscopy for detection of NO3, in order to be able to work under pseudo-first-order conditions with the pentenols in large excess over NO3. The rate coefficients were determined to be (1.53 ± 0.23) × 10^-13 and (1.39 ± 0.19) × 10^-14 cm^3 molecule^-1 s^-1 for the reactions of NO3 with (Z)-pent-2-en-1-ol and pent-1-en-3-ol, respectively. An attempt to study the kinetics of these reactions with a relative-rate technique, using N2O5 as the source of NO3, resulted in significantly higher apparent rate coefficients. Performing relative-rate experiments in known excesses of NO2 allowed us to determine the rate coefficients for the N2O5 reactions to be (5.0 ± 2.8) × 10^-19 cm^3 molecule^-1 s^-1 for (Z)-pent-2-en-1-ol and (9.1 ± 5.8) × 10^-19 cm^3 molecule^-1 s^-1 for pent-1-en-3-ol. We show that these relatively slow N2O5 reactions can indeed interfere with rate determinations in conventional relative-rate experiments.
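The pseudo-first-order analysis above works as follows: with the pentenol in large excess, [NO3](t) = [NO3]0 · exp(-k′t) with k′ = k·[pentenol], so k′ is the slope of -ln([NO3]/[NO3]0) versus t, and dividing by the known excess concentration recovers the bimolecular k. A minimal sketch with synthetic, illustrative numbers (not the paper's data):

```python
import math

def bimolecular_rate(times, no3_signal, pentenol_conc):
    """Recover k from a pseudo-first-order NO3 decay.

    Fits -ln(signal / signal[0]) = k' * t by least squares through the
    origin, then returns k = k' / [pentenol]. Illustrative only.
    """
    y = [-math.log(s / no3_signal[0]) for s in no3_signal]
    kprime = (sum(t * v for t, v in zip(times, y))
              / sum(t * t for t in times))
    return kprime / pentenol_conc

# Synthetic decay with k = 1.5e-13 cm^3 molecule^-1 s^-1 and
# [pentenol] = 1e14 molecule cm^-3, i.e. k' = 15 s^-1:
conc = 1e14
k_true = 1.5e-13
times = [0.0, 0.02, 0.04, 0.06, 0.08]
signal = [math.exp(-k_true * conc * t) for t in times]
k_est = bimolecular_rate(times, signal, conc)
```

In a real discharge-flow experiment the decay is measured over reaction distance converted to contact time, and several excess concentrations are used to confirm the linearity of k′ in [pentenol].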
Abstract:
Determination of varicella zoster virus (VZV) immunity in healthcare workers without a history of chickenpox is important for identifying those in need of vOka vaccination. Post-immunisation, healthcare workers in the UK who work with high-risk patients are tested for seroconversion. To assess the performance of the time-resolved fluorescence immunoassay (TRFIA) for the detection of antibody in vaccinated as well as unvaccinated individuals, a cut-off was first calculated. VZV-IgG-specific avidity and titres six weeks after the first dose of vaccine were used to identify subjects with pre-existing immunity among a cohort of 110 healthcare workers. Those with high avidity (≥60%) were considered to have previous immunity to VZV, and those with low or equivocal avidity (<60%) were considered naive; the former had antibody levels ≥400 mIU/mL and the latter had levels <400 mIU/mL. Comparison of the baseline values of the naive and immune groups allowed the estimation of a TRFIA cut-off value of >130 mIU/mL, which best discriminated between the two groups; this was confirmed by ROC analysis. Using this value, the sensitivity and specificity of the TRFIA cut-off were 90% (95% CI 79-96) and 78% (95% CI 61-90), respectively, in this population. A subset of samples tested by the gold-standard Fluorescence Antibody to Membrane Antigen (FAMA) test showed 84% (54/64) agreement with TRFIA.
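Choosing a cut-off that "best discriminates" two groups, as above, is commonly done by maximising Youden's J (sensitivity + specificity − 1) over candidate thresholds, which is equivalent to picking the ROC point furthest from the diagonal. A minimal sketch; the titres below are hypothetical, not the study's data.

```python
def best_cutoff(immune, naive):
    """Choose the antibody cut-off maximising sensitivity + specificity.

    `immune` and `naive` are baseline titres (mIU/mL) for the two
    avidity-defined groups. A value is called positive if it exceeds
    the cut-off. Returns the cut-off with the largest Youden's J.
    """
    best_c, best_j = None, -1.0
    for c in sorted(set(immune + naive)):
        sens = sum(v > c for v in immune) / len(immune)
        spec = sum(v <= c for v in naive) / len(naive)
        j = sens + spec - 1          # Youden's J statistic
        if j > best_j:
            best_j, best_c = j, c
    return best_c

# Hypothetical titres for the two avidity-defined groups:
cutoff = best_cutoff(immune=[150, 300, 420, 500, 260],
                     naive=[40, 80, 120, 130, 90])
```

With real data one would also report the sensitivity and specificity at the chosen threshold with confidence intervals, as the abstract does.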
Abstract:
In this paper, Prony's method is applied to time-domain waveform data modelling in the presence of noise. Three problems encountered in this work are studied: (1) determination of the order of the waveform; (2) determination of the number of multiple roots; (3) determination of the residues. Methods for solving these problems are given and simulated on a computer. Finally, an output pulse of a model PG-10N signal generator, and the distorted waveform obtained by transmitting that pulse through a piece of coaxial cable, are modelled, with satisfactory results. The effectiveness of Prony's method for waveform data modelling in the presence of noise is thus confirmed.
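Prony's method proceeds in three steps: fit linear-prediction coefficients, find the roots of the characteristic polynomial, then solve for the residues. The sketch below shows the noise-free two-exponential case with real, distinct roots; the paper's contribution (order selection, multiple roots, noise) is not reproduced here.

```python
import math

def prony_two_term(x):
    """Fit x[n] = c1*z1**n + c2*z2**n to four noise-free samples.

    Minimal Prony's method for model order p = 2 with real, distinct
    roots; an illustration of the method's three steps only.
    """
    # Step 1: linear-prediction coefficients from
    #   x[2] = a1*x[1] + a2*x[0]
    #   x[3] = a1*x[2] + a2*x[1]
    det = x[1] * x[1] - x[0] * x[2]
    a1 = (x[2] * x[1] - x[0] * x[3]) / det
    a2 = (x[1] * x[3] - x[2] * x[2]) / det
    # Step 2: roots of the characteristic polynomial z**2 - a1*z - a2
    disc = math.sqrt(a1 * a1 + 4.0 * a2)
    z1, z2 = (a1 + disc) / 2.0, (a1 - disc) / 2.0
    # Step 3: residues from x[0] = c1 + c2 and x[1] = c1*z1 + c2*z2
    c1 = (x[1] - x[0] * z2) / (z1 - z2)
    c2 = x[0] - c1
    return (c1, z1), (c2, z2)

# Noise-free samples of 3*(0.9)**n + 2*(0.5)**n:
samples = [3 * 0.9 ** n + 2 * 0.5 ** n for n in range(4)]
(c1, z1), (c2, z2) = prony_two_term(samples)
```

With noisy data the linear-prediction step becomes an overdetermined least-squares problem, and choosing the model order and handling repeated roots become the central difficulties, which is exactly what the paper addresses.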
Abstract:
An AHRC-funded project titled Picturing Ideas? Visualising and Synthesising Ideas as Art (2009-10). Outputs include: 4 exhibitions; 4 publications; 3 papers; 2 large-scale backlit digital prints; 1 commissioned print. (See Additional Information.) ----ABSTRACT: Utilising the virtuality of digital imagery, this practice-led project explored the possibility of cross-articulation between text and image and the bridging or synthesising potential of the visual affect of ideas. A series of digital images was produced 'picturing' or 'visualising' philosophical ideas derived from the writings of the philosopher Gilles Deleuze, as remodellings of pre-existing philosophical ideas, developed through dialogues and consultation with specialists in the fields from which the ideas were drawn (philosophy, psychology, film) as well as artists and theorists concerned with ideas of 'mental imagery' and visualisation. Final images were produced as a synthesis (or combination) of these visualisations and presented in the format of large-scale, backlit digital prints at a series of prestigious international exhibitions (see details above). Evaluation took the form of a four-page illustrated text in Frieze magazine (August 2009) and three papers delivered at the University of Ulster, Goldsmiths College of Art and Loughborough University. The project also included the publication of a catalogue essay (EAST 09) and an illustrated poem (in the Dark Monarch publication). A print version of the image was commissioned by Invisible Exports Gallery, New York, and subsequently exhibited in The Devos Art Museum, School of Art & Design at Northern Michigan University, and in a publication edited by Cedar Lewisohn for Tate Publishing. The project was funded by an AHRC practice-led grant (17K) and an Arts Council of England award (1.5K).
The outputs, including high profile, publicly accessible exhibitions, prestigious publications and conference papers ensured the dissemination of the research to a wide range of audiences, including scholars/researchers across the arts and humanities engaged in practice-based and interdisciplinary theoretical work (in particular in the fields of contemporary art and art theory and those working on the integration of art and theory/philosophy/psychology) but also the wider audience for contemporary art.
Abstract:
In 1967 a novel scheme was proposed for controlling processes with large pure time delay (Fellgett et al., 1967), and some of the constituent parts of the scheme were investigated (Swann, 1970; Atkinson et al., 1973). At that time the available computational facilities were inadequate for the scheme to be implemented practically, but with the advent of modern microcomputers the scheme has become feasible. This paper describes recent work (Mitchell, 1987) implementing the scheme in a new multi-microprocessor configuration and shows the improved performance it provides compared with conventional three-term controllers.
Abstract:
In this paper, a discrete-time dynamic integrated system optimisation and parameter estimation algorithm is applied to the solution of the nonlinear tracking optimal control problem. A version of the algorithm with a linear-quadratic model-based problem is developed and implemented in software. The implemented algorithm is tested with simulation examples.
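The flavour of linear-quadratic model-based tracking can be sketched with a one-step-ahead LQ tracking law for a scalar plant x[t+1] = a·x[t] + b·u[t]: at each step, u[t] minimises q·(x[t+1] − ref[t+1])² + r·u[t]², giving u = q·b·(ref − a·x)/(r + q·b²). This is a deliberately simplified stand-in, not the paper's integrated optimisation and parameter estimation algorithm; all plant and weight values are illustrative.

```python
def one_step_lq_tracker(a, b, q, r, x0, ref):
    """Simulate a one-step-ahead LQ tracking law for x[t+1] = a*x[t] + b*u[t].

    At each step the control minimises the one-step quadratic cost
    q*(x_next - target)**2 + r*u**2. Returns the state trajectory.
    """
    xs = [x0]
    for target in ref:
        x = xs[-1]
        u = q * b * (target - a * x) / (r + q * b * b)  # optimal one-step control
        xs.append(a * x + b * u)
    return xs

# Track a constant reference of 1.0 from x0 = 0:
traj = one_step_lq_tracker(a=0.8, b=1.0, q=10.0, r=0.1, x0=0.0, ref=[1.0] * 20)
```

A full finite-horizon LQ tracker would instead solve a backward Riccati recursion over the whole reference, and the paper's algorithm additionally iterates between this model-based problem and parameter estimation to handle the nonlinear plant.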
Abstract:
In response to increasing atmospheric concentrations of greenhouse gases, the rate of time-dependent climate change is determined jointly by the strength of climate feedbacks and the efficiency of processes which remove heat from the surface into the deep ocean. This work examines the vertical heat transport processes in the ocean of the HADCM2 atmosphere-ocean general circulation model (AOGCM) in experiments with CO2 held constant (control) and increasing at 1% per year (anomaly). The control experiment shows that global average heat exchanges between the upper and lower ocean are dominated by the Southern Ocean, where heat is pumped downwards by the wind-driven circulation and diffuses upwards along sloping isopycnals. This is the reverse of the low-latitude balance used in upwelling-diffusion ocean models, the global average upward diffusive transport being against the temperature gradient. In the anomaly experiment, weakened convection at high latitudes leads to reduced diffusive and convective heat loss from the deep ocean, and hence to net heat uptake, since the advective heat input is less affected. Reduction of deep water production at high latitudes results in reduced upwelling of cold water at low latitudes, giving a further contribution to net heat uptake. On the global average, high-latitude processes thus have a controlling influence. The important role of diffusion highlights the need to ensure that the schemes employed in AOGCMs give an accurate representation of the relevant sub-grid-scale processes.