76 results for work time tracking
Abstract:
An AHRC-funded project titled: Picturing ideas? Visualising and Synthesising Ideas as Art (2009-10). Outputs include: 4 exhibitions; 4 publications; 3 papers; 2 large-scale backlit digital prints; 1 commissioned print. (See Additional Information) ----ABSTRACT: Utilising the virtuality of digital imagery, this practice-led project explored the possibility of cross-articulation between text and image and the bridging or synthesising potential of the visual affect of ideas. A series of digital images was produced 'picturing' or 'visualising' philosophical ideas derived from the writings of the philosopher Gilles Deleuze, as remodellings of pre-existing philosophical ideas; these were developed through dialogues and consultation with specialists in the fields from which the ideas were drawn (philosophy, psychology, film), as well as artists and theorists concerned with ideas of 'mental imagery' and visualisation. Final images were produced as a synthesis (or combination) of these visualisations and presented in the format of large-scale, backlit digital prints at a series of prestigious international exhibitions (see details above). Evaluation took the form of a four-page illustrated text in Frieze magazine (August 2009) and three papers delivered at the University of Ulster, Goldsmiths College of Art and Loughborough University. The project also included the publication of a catalogue essay (EAST 09) and an illustrated poem (in the Dark Monarch publication). A print version of the image was commissioned by Invisible Exports Gallery, New York, and subsequently exhibited in The Devos Art Museum, School of Art & Design at Northern Michigan University, and in a publication edited by Cedar Lewisohn for Tate Publishing. The project was funded by an AHRC practice-led grant (17K) and an Arts Council of England award (1.5K).
The outputs, including high profile, publicly accessible exhibitions, prestigious publications and conference papers ensured the dissemination of the research to a wide range of audiences, including scholars/researchers across the arts and humanities engaged in practice-based and interdisciplinary theoretical work (in particular in the fields of contemporary art and art theory and those working on the integration of art and theory/philosophy/psychology) but also the wider audience for contemporary art.
Abstract:
In 1967 a novel scheme was proposed for controlling processes with large pure time delay (Fellgett et al, 1967) and some of the constituent parts of the scheme were investigated (Swann, 1970; Atkinson et al, 1973). At that time the available computational facilities were inadequate for the scheme to be implemented practically, but with the advent of modern microcomputers the scheme becomes feasible. This paper describes recent work (Mitchell, 1987) in implementing the scheme in a new multi-microprocessor configuration and shows the improved performance it provides compared with conventional three-term controllers.
Abstract:
In this paper, a discrete time dynamic integrated system optimisation and parameter estimation algorithm is applied to the solution of the nonlinear tracking optimal control problem. A version of the algorithm with a linear-quadratic model-based problem is developed and implemented in software. The algorithm implemented is tested with simulation examples.
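The abstract does not reproduce the authors' integrated optimisation and parameter estimation algorithm, but its linear-quadratic model-based subproblem can be illustrated with a minimal finite-horizon discrete-time LQ tracking solver. The sketch below uses a backward Riccati-style recursion with a feedforward term; the scalar system and all numbers are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative scalar plant: x_{k+1} = x_k + 0.1 u_k, tracking reference r = 2.
A, B, Q, R = 1.0, 0.1, 1.0, 0.01
N = 100
r = np.full(N + 1, 2.0)

# Backward recursion: value function V_k(x) = P_k x^2 + 2 q_k x + const.
P = np.empty(N + 1); q = np.empty(N + 1)
K = np.empty(N); kf = np.empty(N)
P[N] = Q
q[N] = -Q * r[N]
for k in range(N - 1, -1, -1):
    S = R + B * P[k + 1] * B
    K[k] = B * P[k + 1] * A / S      # feedback gain
    kf[k] = B * q[k + 1] / S         # feedforward term from the reference
    Acl = A - B * K[k]
    P[k] = Q + A * P[k + 1] * Acl
    q[k] = Acl * q[k + 1] - Q * r[k]

# Forward simulation with the tracking control law u_k = -K_k x_k - kf_k.
x = np.empty(N + 1); x[0] = 0.0
for k in range(N):
    u = -K[k] * x[k] - kf[k]
    x[k + 1] = A * x[k] + B * u
```

With this horizon and weighting the state settles close to the reference; the integrated algorithm in the paper would additionally iterate between this LQ subproblem and a parameter estimation step.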
Abstract:
In response to increasing atmospheric concentrations of greenhouse gases, the rate of time-dependent climate change is determined jointly by the strength of climate feedbacks and the efficiency of processes which remove heat from the surface into the deep ocean. This work examines the vertical heat transport processes in the ocean of the HADCM2 atmosphere-ocean general circulation model (AOGCM) in experiments with CO2 held constant (control) and increasing at 1% per year (anomaly). The control experiment shows that global average heat exchanges between the upper and lower ocean are dominated by the Southern Ocean, where heat is pumped downwards by the wind-driven circulation and diffuses upwards along sloping isopycnals. This is the reverse of the low-latitude balance used in upwelling-diffusion ocean models, the global average upward diffusive transport being against the temperature gradient. In the anomaly experiment, weakened convection at high latitudes leads to reduced diffusive and convective heat loss from the deep ocean, and hence to net heat uptake, since the advective heat input is less affected. Reduction of deep water production at high latitudes results in reduced upwelling of cold water at low latitudes, giving a further contribution to net heat uptake. On the global average, high-latitude processes thus have a controlling influence. The important role of diffusion highlights the need to ensure that the schemes employed in AOGCMs give an accurate representation of the relevant sub-grid-scale processes.
Abstract:
This work provides a framework for the approximation of a dynamic system of the form ẋ = f(x) + g(x)u by a dynamic recurrent neural network. This extends previous work in which approximate realisation of autonomous dynamic systems was proven. Given certain conditions, the first p output neural units of a dynamic n-dimensional neural model approximate at a desired proximity a p-dimensional dynamic system with n > p. The neural architecture studied is then successfully implemented in a nonlinear multivariable system identification case study.
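The abstract does not give the network equations, so the sketch below is only an assumed, conventional form of a dynamic recurrent model: an n-dimensional continuous-time network τẋ̂ = -x̂ + W tanh(x̂) + Bu·u whose first p units serve as the output, integrated by forward Euler. The weights here are random (untrained) and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 6, 2          # n-dimensional network; first p units form the output
tau, dt, T = 0.5, 0.01, 500

W = rng.normal(0.0, 0.5 / np.sqrt(n), (n, n))   # recurrent weights (untrained)
Bu = rng.normal(0.0, 0.5, (n, 1))               # input weights (untrained)

def step(xhat, u):
    # One Euler step of  tau * dx/dt = -x + W tanh(x) + Bu u
    dx = (-xhat + W @ np.tanh(xhat) + Bu @ u) / tau
    return xhat + dt * dx

xhat = np.zeros((n, 1))
ys = []
for k in range(T):
    u = np.array([[np.sin(2 * np.pi * 0.5 * k * dt)]])  # test input signal
    xhat = step(xhat, u)
    ys.append(xhat[:p, 0].copy())
ys = np.array(ys)    # (T, p) trajectory of the first p (output) units
```

In the identification setting of the paper, W and Bu would be trained so that the p output units track the target system's states; the leak term -x̂ and bounded tanh nonlinearity keep the untrained network's states finite.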
Abstract:
PV only generates electricity during daylight hours and primarily generates over summer. In the UK, the carbon intensity of grid electricity is higher during the daytime and over winter. This work investigates whether the grid electricity displaced by PV is high or low carbon compared to the annual mean carbon intensity using carbon factors at higher temporal resolutions (half-hourly and daily). UK policy for carbon reporting requires savings to be calculated using the annual mean carbon intensity of grid electricity. This work offers an insight into whether this technique is appropriate. Using half-hourly data on the generating plant supplying the grid from November 2008 to May 2010, carbon factors for grid electricity at half-hourly and daily resolution have been derived using technology-specific generation emission factors. Applying these factors to generation data from PV systems installed on schools, it is possible to assess the variation in the carbon savings from displacing grid electricity with PV generation using carbon factors with different time resolutions. The data has been analyzed for a period of 363 to 370 days and so cannot account for inter-year variations in the relationship between PV generation and carbon intensity of the electricity grid. This analysis suggests that PV displaces more carbon intensive electricity using half-hourly carbon factors than using daily factors but less compared with annual ones. A similar methodology could provide useful insights on other variable renewable and demand-side technologies and in other countries where PV performance and grid behavior are different.
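The core calculation can be sketched in a few lines: derive a half-hourly carbon factor as the generation-weighted mean of technology-specific emission factors, then compare PV savings computed with half-hourly factors against savings computed with a flat mean factor. The emission factors, generation mix and PV output below are illustrative assumptions, not the paper's data.

```python
# Illustrative emission factors (kgCO2/kWh) per generating technology.
factors = {"coal": 0.9, "gas": 0.4, "wind": 0.0}

# Half-hourly generation mix (kWh) supplying the grid; two periods shown.
mix = [
    {"coal": 60.0, "gas": 0.0, "wind": 40.0},
    {"coal": 20.0, "gas": 0.0, "wind": 80.0},
]

def carbon_intensity(gen):
    """Generation-weighted mean emission factor for one half-hour."""
    total = sum(gen.values())
    return sum(factors[t] * g for t, g in gen.items()) / total

ci = [carbon_intensity(g) for g in mix]   # half-hourly carbon factors
pv = [2.0, 4.0]                           # PV generation (kWh) per half-hour

saving_hh = sum(p * c for p, c in zip(pv, ci))    # half-hourly factors
saving_mean = sum(pv) * (sum(ci) / len(ci))       # flat mean factor
```

In this toy mix the two conventions already disagree (1.8 vs 2.16 kgCO2), which is exactly the sensitivity to temporal resolution the paper quantifies with real grid data.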
Abstract:
Construction planning plays a fundamental role in construction project management that requires team working among planners from a diverse range of disciplines and in geographically dispersed working situations. Model-based four-dimensional (4D) computer-aided design (CAD) groupware, though considered a possible approach to supporting collaborative planning, is still short of effective collaborative mechanisms for teamwork due to methodological, technological and social challenges. Targeting this problem, this paper proposes a model-based groupware solution to enable a group of multidisciplinary planners to perform real-time collaborative 4D planning across the Internet. In the light of the interactive definition method, and its computer-supported collaborative work (CSCW) design analysis, the paper discusses the realization of interactive collaborative mechanisms from software architecture, application mode, and data exchange protocol. These mechanisms have been integrated into a groupware solution, which was validated by a planning team in a truly geographically dispersed condition. Analysis of the validation results revealed that the proposed solution is feasible for real-time collaborative 4D planning to gain a robust construction plan through collaborative teamwork. The realization of this solution triggers further considerations about its enhancement for wider groupware applications.
Abstract:
Existing research on synchronous remote working in CSCW has highlighted the troubles that can arise because actions at one site are (partially) unavailable to remote colleagues. Such ‘local action’ is routinely characterised as a nuisance, a distraction, subordinate and the like. This paper explores interconnections between ‘local action’ and ‘distributed work’ in the case of a research team virtually collocated through ‘MiMeG’. MiMeG is an e-Social Science tool that facilitates ‘distributed data sessions’ in which social scientists are able to remotely collaborate on the real-time analysis of video data. The data are visible and controllable in a shared workspace and participants are additionally connected via audio conferencing. The findings reveal that whilst the (partial) unavailability of local action is at times problematic, it is also used as a resource for coordinating work. The paper considers how local action is interactionally managed in distributed data sessions and concludes by outlining implications of the analysis for the design and study of technologies to support group-to-group collaboration.
Abstract:
This paper introduces the Hilbert Analysis (HA), a novel digital signal processing technique, for the investigation of tremor. The HA is formed by two complementary tools, namely the Empirical Mode Decomposition (EMD) and the Hilbert Spectrum (HS). In this work we show that the EMD can automatically detect and isolate tremulous and voluntary movements from experimental signals collected from 31 patients with different conditions. Our results also suggest that the tremor may be described by a new class of mathematical functions defined in the HA framework. In a further study, the HS was employed for visualization of the energy activities of signals. This tool introduces the concept of instantaneous frequency in the field of tremor. In addition, it could provide, in a time-frequency-energy plot, a clear visualization of local activities of tremor energy over time. The HA proved very useful for performing objective measurements of any kind of tremor and can therefore be used for functional assessment.
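The instantaneous-frequency idea at the heart of the Hilbert Spectrum can be sketched without the full EMD step: form the analytic signal via the FFT, then differentiate its unwrapped phase. The sketch below is a minimal, numpy-only illustration (the analytic-signal construction mirrors scipy.signal.hilbert); the 5 Hz tone standing in for a tremor recording is an assumption, and a real analysis would first decompose the signal into intrinsic mode functions with EMD.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (numpy-only; N assumed even here)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[N // 2] = 1.0      # Nyquist bin
    h[1:N // 2] = 2.0    # double positive frequencies, zero negative ones
    return np.fft.ifft(X * h)

fs = 500.0                              # sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5.0 * t)         # stand-in "tremor" tone at 5 Hz

z = analytic_signal(x)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)   # instantaneous frequency (Hz)
amplitude = np.abs(z)                           # instantaneous energy envelope
```

Plotting inst_freq and amplitude² against time gives the time-frequency-energy picture the abstract describes; for this pure tone the instantaneous frequency sits flat at 5 Hz.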
Abstract:
Intelligent viewing systems are required if efficient and productive teleoperation is to be applied to dynamic manufacturing environments. These systems must automatically provide remote views to an operator which assist in the completion of the task. This assistance increases the productivity of the teleoperation task if the robot controller is responsive to the unpredictable dynamic evolution of the workcell. Behavioral controllers can be utilized to give reactive 'intelligence.' The inherent complex structure of current systems, however, places considerable time overheads on any redesign of the emergent behavior. In industry, where the remote environment and task frequently change, this continual redesign process becomes inefficient. We introduce a novel behavioral controller, based on an 'ego-behavior' architecture, to command an active camera (a camera mounted on a robot) within a remote workcell. Using this ego-behavioral architecture the responses from individual behaviors are rapidly combined to produce an 'intelligent' responsive viewing system. The architecture is single-layered, each behavior being autonomous with no explicit knowledge of the number, description or activity of other behaviors present (if any). This lack of imposed structure decreases the development time as it allows each behavior to be designed and tested independently before insertion into the architecture. The fusion mechanism for the behaviors provides the ability for each behavior to compete and/or co-operate with other behaviors for full or partial control of the active viewing camera. Each behavior continually reassesses this degree of competition or co-operation by measuring its own success in controlling the active camera against pre-defined constraints. The ego-behavioral architecture is demonstrated through simulation and experimentation.
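The abstract does not specify the fusion mechanism, so the following is only a minimal sketch of one plausible reading: each behavior independently emits a camera-velocity vote plus a self-assessed weight, and the fused command is the normalized weighted sum, so behaviors compete or cooperate through their weights without knowing about each other. Both behaviors, their weighting rules, and the state fields are hypothetical.

```python
import numpy as np

def track_target(state):
    # Vote to re-center the target; weight grows with target offset (assumed rule).
    offset = state["target_offset"]
    return -1.5 * offset, min(1.0, float(np.linalg.norm(offset)))

def avoid_obstacle(state):
    # Vote to move away from the nearest obstacle; weight grows as it nears.
    away = -state["obstacle_dir"]
    w = max(0.0, 1.0 - state["obstacle_dist"] / 2.0)
    return away, w

behaviours = [track_target, avoid_obstacle]   # no behavior knows the others exist

def fuse(state):
    """Normalized weighted sum of independent behavior votes."""
    votes = [b(state) for b in behaviours]
    wsum = sum(w for _, w in votes)
    if wsum == 0.0:
        return np.zeros(2)
    return sum(w * v for v, w in votes) / wsum

state = {"target_offset": np.array([0.4, -0.2]),
         "obstacle_dist": 0.5,
         "obstacle_dir": np.array([1.0, 0.0])}
cmd = fuse(state)   # fused camera velocity command
```

Because behaviors only publish a vote and a weight, a new behavior can be added to the list without redesigning the others, which is the development-time advantage the abstract claims for the single-layered architecture.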
Abstract:
The collection of wind speed time series by means of digital data loggers occurs in many domains, including civil engineering, environmental sciences and wind turbine technology. Since averaging intervals are often significantly larger than typical system time scales, the information lost has to be recovered in order to reconstruct the true dynamics of the system. In the present work we present a simple algorithm capable of generating a real-time wind speed time series from data logger records containing the average, maximum, and minimum values of the wind speed in a fixed interval, as well as the standard deviation. The signal is generated from a generalized random Fourier series. The spectrum can be matched to any desired theoretical or measured frequency distribution. Extreme values are specified through a postprocessing step based on the concept of constrained simulation. Applications of the algorithm to 10-min wind speed records logged at a test site at 60 m height above the ground show that the recorded 10-min values can be reproduced by the simulated time series to a high degree of accuracy.
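The generation step described above can be sketched directly: synthesize a random Fourier series with a chosen amplitude roll-off and random phases, then rescale so the sample mean and standard deviation match the logged interval statistics. The spectrum shape below is only a Kaimal-like illustrative choice, and the constrained post-processing that imposes the logged max/min values is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

def synth_wind(avg, std, duration=600.0, fs=1.0, n_modes=200):
    """Wind-speed series from a generalized random Fourier series, rescaled
    so its sample mean and standard deviation match the logged values."""
    t = np.arange(0.0, duration, 1.0 / fs)
    f = rng.uniform(1.0 / duration, fs / 2.0, n_modes)    # mode frequencies
    # Kaimal-like amplitude roll-off as an illustrative target spectrum.
    a = 1.0 / (1.0 + 33.0 * f) ** (5.0 / 6.0)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_modes)          # random phases
    x = (a[:, None] * np.cos(2 * np.pi * f[:, None] * t + phi[:, None])).sum(axis=0)
    # Rescale to the logged statistics; the logged extremes would be imposed
    # in a separate constrained-simulation post-processing step.
    return avg + (x - x.mean()) / x.std() * std

u = synth_wind(avg=8.2, std=1.4)   # one 10-min record: mean 8.2 m/s, sd 1.4 m/s
```

Any measured or theoretical spectrum can be substituted for the amplitude law `a`, which is the matching flexibility the abstract refers to.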
Abstract:
This dissertation deals with aspects of sequential data assimilation (in particular ensemble Kalman filtering) and numerical weather forecasting. In the first part, the recently formulated Ensemble Kalman-Bucy (EnKBF) filter is revisited. It is shown that the previously used numerical integration scheme fails when the magnitude of the background error covariance grows beyond that of the observational error covariance in the forecast window. Therefore, we present a suitable integration scheme that handles the stiffening of the differential equations involved and does not incur additional computational expense. Moreover, a transform-based alternative to the EnKBF is developed: under this scheme, the operations are performed in the ensemble space instead of in the state space. Advantages of this formulation are explained. For the first time, the EnKBF is implemented in an atmospheric model. The second part of this work deals with ensemble clustering, a phenomenon that arises when performing data assimilation using deterministic ensemble square root filters (EnSRFs) in highly nonlinear forecast models. Namely, an M-member ensemble separates into an outlier and a cluster of M-1 members. Previous works may suggest that this issue represents a failure of EnSRFs; this work dispels that notion. It is shown that ensemble clustering can also be reversed by nonlinear processes, in particular the alternation between nonlinear expansion and compression of the ensemble in different regions of the attractor. Some EnSRFs that use random rotations have been developed to overcome this issue; these formulations are analyzed and their advantages and disadvantages with respect to common EnSRFs are discussed. The third and last part contains the implementation of the Robert-Asselin-Williams (RAW) filter in an atmospheric model.
The RAW filter is an improvement to the widely popular Robert-Asselin filter that successfully suppresses spurious computational waves while avoiding any distortion in the mean value of the function. Using statistical significance tests both at the local and field level, it is shown that the climatology of the SPEEDY model is not modified by the changed time stepping scheme; hence, no retuning of the parameterizations is required. It is found that the accuracy of the medium-term forecasts is increased by using the RAW filter.
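The RAW modification itself is compact: compute the usual Robert-Asselin filter displacement, apply a fraction α of it to the filtered current step, and the remaining fraction (α − 1) to the new step, so that α = 1 recovers the classical Robert-Asselin filter. The sketch below applies it to leapfrog integration of a simple damped test problem; the parameter values ν = 0.2 and α = 0.53 are common illustrative choices, not taken from the SPEEDY experiments.

```python
import math

def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog time stepping with the Robert-Asselin-Williams (RAW) filter.
    alpha = 1 recovers the classical Robert-Asselin filter."""
    x_prev = x0                      # filtered value at step n-1
    x_cur = x0 + dt * f(x0)          # first step: forward Euler
    for _ in range(nsteps - 1):
        x_next = x_prev + 2.0 * dt * f(x_cur)            # leapfrog step
        d = 0.5 * nu * (x_prev - 2.0 * x_cur + x_next)   # filter displacement
        x_prev = x_cur + alpha * d            # filtered current step
        x_next = x_next + (alpha - 1.0) * d   # compensation on the new step
        x_cur = x_next
    return x_cur

# dx/dt = -x: the filter controls leapfrog's spurious computational mode,
# while the RAW correction limits the damping of the true solution.
x1 = leapfrog_raw(lambda x: -x, 1.0, 0.01, 100)   # approximates exp(-1)
```

The compensation term on x_next is what distinguishes RAW from the plain Robert-Asselin filter and is the reason the scheme's distortion of the mean is reduced.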
Abstract:
The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. The method is independent of the boundary condition and is applicable for limited aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data is collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong, Lubich. In contrast to pure frequency domain methods here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper will describe the Time-Domain Probe Method and relate it to previous frequency-domain approaches on sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method for the rough surface scattering case and provide numerical simulations and examples.
Abstract:
The issue of diversification in direct real estate investment portfolios has been widely studied in academic and practitioner literature. Most work, however, has been done using either partially aggregated data or data for small samples of individual properties. This paper reports results from tests of both risk reduction and diversification that use the records of 10,000+ UK properties tracked by Investment Property Databank. It provides, for the first time, robust estimates of the diversification gains attainable given the returns, risks and cross‐correlations across the individual properties available to fund managers. The results quantify the number of assets and amount of money needed to construct both ‘balanced’ and ‘specialist’ property portfolios by direct investment. Target numbers will vary according to the objectives of investors and the degree to which tracking error is tolerated. The top‐level results are consistent with previous work, showing that a large measure of risk reduction can be achieved with portfolios of 30–50 properties, but full diversification of specific risk can only be achieved in very large portfolios. However, the paper extends previous work by demonstrating on a single, large dataset the implications of different methods of calculating risk reduction, and also by showing more disaggregated results relevant to the construction of specialist, sector‐focussed funds.
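The headline finding, that 30-50 properties capture most of the achievable risk reduction while full diversification of specific risk needs very large portfolios, follows the standard textbook decomposition of portfolio variance. The sketch below uses the equally-weighted, equal-risk, common-correlation case; the risk and correlation figures are illustrative assumptions, not IPD estimates.

```python
import math

def portfolio_risk(n, sigma=10.0, rho=0.2):
    """Std dev (%) of an equally weighted portfolio of n assets, each with
    individual risk sigma and average pairwise correlation rho:
    var = sigma^2 * (1/n + (1 - 1/n) * rho)."""
    var = sigma ** 2 * (1.0 / n + (1.0 - 1.0 / n) * rho)
    return math.sqrt(var)

risks = {n: portfolio_risk(n) for n in (1, 10, 30, 50, 200, 10000)}
# Specific risk shrinks roughly as 1/n; the floor sqrt(rho) * sigma is the
# undiversifiable (systematic) component that no portfolio size removes.
```

With these numbers, risk falls from 10% for a single property to about 4.8% at 30 properties, but the systematic floor of sqrt(0.2)·10 ≈ 4.47% is only approached asymptotically, mirroring the paper's distinction between substantial risk reduction at 30-50 assets and full diversification only in very large portfolios.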