15 results for WORK METHODS

in CentAUR: Central Archive, University of Reading - UK


Relevance:

60.00%

Publisher:

Abstract:

Purpose – Facilities managers have less visibility of how buildings are being used due to flexible working and unpredictable workers. The purpose of this paper is to examine the current issues in workspace management and an automated solution, based on radio frequency identification (RFID), that could provide real-time information on the volume and capacity of buildings. Design/methodology/approach – The study described in this paper is based on a case study at a facilities management (FM) department. The department is examining a ubiquitous technology in the form of innovative RFID for security and workspace management. Interviews and observations were conducted within the facilities department for the initial phase of the implementation of RFID technology. Findings – Research suggests that work methods are evolving and becoming more flexible. With this in mind, facilities managers face new challenges in creating a suitable environment for an unpredictable workforce. RFID is one solution that could provide facilities managers with an automatic way of examining space in real time and over a wider area than currently possible. Using RFID for space management alone is financially expensive, but extending the application to other areas makes better business sense. Practical implications – This paper provides practising FM professionals and academics with the knowledge gained from the application of RFID in this organisation. While the concept of flexible working seems attractive, there is an emerging need to provide various forms of spaces that enable employees' satisfaction and enhance the productivity of the organisation. Originality/value – The paper introduces new thinking on the subject of "workspace management". It highlights the current difficulties in workspace management and how an RFID solution can benefit workspace methods.

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work was to study the effects of drying methods and conditions (i.e., ambient drying, hot air drying at 40 °C, vacuum drying and low-pressure superheated steam drying within the temperature range of 70-90 °C at an absolute pressure of 10 kPa) as well as the concentration of galangal extract on the antimicrobial activity of edible chitosan films against Staphylococcus aureus. Galangal extract was added to the film-forming solution as a natural antimicrobial agent in the concentration range of 0.3-0.9 g/100 g. Fourier transform infrared (FTIR) spectra and swelling of the films were also evaluated to investigate the interaction between chitosan and the galangal extract. The antimicrobial activity of the films was evaluated by the disc diffusion and viable cell count methods, while the morphology of bacteria treated with the antimicrobial films was observed via transmission electron microscopy (TEM). The antimicrobial activity, swelling and functional group interaction of the antimicrobial films were found to be affected by the drying methods and conditions as well as the concentration of the galangal extract. The electron microscopic observations revealed that the cell wall and cell membrane of S. aureus treated by the antimicrobial films were significantly damaged. © 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

This case study uses log-linear modelling to investigate the interrelationships between factors that may contribute to the late submission of coursework by undergraduate students. A class of 86 computing students is considered. These students were exposed to traditional teaching methods supported by e-learning via a Managed Learning Environment (MLE). The MLE warehouses detailed data about student usage of the various areas of the environment, which can be used to interpret the approach taken to learning. The study investigates the interrelationship between these factors and whether students handed their coursework in on time or late. The results from the log-linear modelling technique show an interaction between participating in Discussions within the MLE and the timely submission of coursework, indicating that participants are more likely to hand in on time than students who do not participate.
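
As an illustrative sketch only (the study's actual data and factor levels are not reproduced here), a log-linear model of this kind can be fitted in Python as a Poisson GLM on contingency-table counts; the cell counts below are hypothetical.

```python
# Minimal log-linear modelling sketch on a 2x2 contingency table,
# using hypothetical counts; not the study's actual data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "discussion": ["yes", "yes", "no", "no"],   # participated in MLE discussions
    "on_time":    ["yes", "no", "yes", "no"],   # coursework handed in on time
    "count":      [30, 8, 25, 23],              # hypothetical cell counts
})

# Saturated log-linear model; the discussion:on_time coefficient
# captures the interaction between participation and timely submission.
fit = smf.glm("count ~ discussion * on_time", data=data,
              family=sm.families.Poisson()).fit()
print(fit.summary())
```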

Relevance:

30.00%

Publisher:

Abstract:

Modelling the interaction of terahertz (THz) radiation with biological tissue poses many interesting problems. THz radiation is neither obviously described by an electric field distribution nor by an ensemble of photons, and biological tissue is an inhomogeneous medium with an electronic permittivity that is both spatially and frequency dependent, making it a complex system to model. A three-layer system of parallel-sided slabs has been used as the system through which the passage of THz radiation has been simulated. Two modelling approaches have been developed: a thin film matrix model and a Monte Carlo model. The source data for each of these methods, taken at the same time as the data recorded to experimentally verify them, was a THz spectrum that had passed through air only. Experimental verification of these two models was carried out using a three-layered in vitro phantom. Simulated transmission spectrum data were compared to experimental transmission spectrum data, first to determine and then to compare the accuracy of the two methods. Good agreement was found, with typical results having a correlation coefficient of 0.90 for the thin film matrix model and 0.78 for the Monte Carlo model over the full THz spectrum. Further work is underway to improve the models above 1 THz.
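
For orientation, a minimal transfer-matrix sketch for normal-incidence transmission through parallel-sided slabs is shown below; the layer refractive indices and thicknesses are illustrative assumptions, not the phantom's actual parameters.

```python
# Transfer-matrix (thin film matrix) sketch for normal-incidence
# transmission through a stack of parallel-sided slabs.
import numpy as np

def transmission(freq_hz, layers, n_in=1.0, n_out=1.0):
    """layers: list of (refractive_index, thickness_m); returns amplitude t."""
    c = 3.0e8  # speed of light (m/s)
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2 * np.pi * freq_hz * n * d / c  # phase thickness of the layer
        Mj = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                       [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ Mj
    B, C = M @ np.array([1.0, n_out])
    return 2 * n_in / (n_in * B + C)

# Example: a three-layer phantom probed at 1 THz (hypothetical parameters)
t = transmission(1e12, [(1.5, 1e-4), (2.0, 2e-4), (1.5, 1e-4)])
print(abs(t) ** 2)  # power transmission
```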

Relevance:

30.00%

Publisher:

Abstract:

The physical and empirical relationships used by microphysics schemes to control the rate at which vapor is transferred to ice crystals growing in supercooled clouds are compared with laboratory data to evaluate the realism of various model formulations. Ice crystal growth rates predicted from capacitance theory are compared with measurements from three independent laboratory studies. When the growth is diffusion-limited, the predicted growth rates are consistent with the measured values to within about 20% in 14 of the experiments analyzed, over the temperature range −2.5° to −22°C. Only two experiments showed significant disagreement with theory (growth rate overestimated by about 30%–40% at −3.7° and −10.6°C). Growth predictions using various ventilation factor parameterizations were also calculated and compared with supercooled wind tunnel data. It was found that neither of the standard parameterizations used for ventilation adequately described both needle and dendrite growth; however, by choosing habit-specific ventilation factors from previous numerical work it was possible to match the experimental data in both regimes. The relationships between crystal mass, capacitance, and fall velocity were investigated based on the laboratory data. It was found that for a given crystal size the capacitance was significantly overestimated by two of the microphysics schemes considered here, yet for a given crystal mass the growth rate was underestimated by those same schemes because of unrealistic mass/size assumptions. The fall speed for a given capacitance (controlling the residence time of a crystal in the supercooled layer relative to its effectiveness as a vapor sink, and the relative importance of ventilation effects) was found to be overpredicted by all the schemes in which fallout is permitted, implying that the modeled crystals reside for too short a time within the cloud layer and that the parameterized ventilation effect is too strong.
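
For reference, the diffusional growth law under test is the standard capacitance (electrostatic analogy) equation; this is the textbook form with ventilation, not quoted from the paper:

```latex
\frac{dm}{dt} \;=\; \frac{4\pi\, C\, f_v\, (S_i - 1)}{F_k + F_d}
```

Here $C$ is the crystal capacitance (units of length), $f_v$ the ventilation factor, $S_i$ the ice saturation ratio, and $F_k$, $F_d$ the usual heat-conduction and vapour-diffusion terms; the schemes compared differ chiefly in their assumed capacitance, mass/size, and fall-speed relationships.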

Relevance:

30.00%

Publisher:

Abstract:

There are many published methods for creating keyphrases for documents. Previous work in the field has shown that, in a significant proportion of cases, author-selected keyphrases are not appropriate for the document they accompany: often they are not updated when the focus of a paper changes, or they are more classificatory than explanatory. This motivates the use of automated methods. The published methods are all evaluated using different corpora, typically one relevant to their field of study. This not only makes it difficult to incorporate the useful elements of algorithms in future work but also makes comparing the results of each method inefficient and ineffective. This paper describes work undertaken to compare five methods across a common baseline of six corpora. The methods chosen were term frequency, inverse document frequency, the C-Value, the NC-Value, and a synonym-based approach. These methods were compared to evaluate performance and quality of results, and to provide a future benchmark. It is shown that, with the comparison metric used in this study, term frequency and inverse document frequency were the best algorithms, with the synonym-based approach following them. Further work in the area is required to determine a more appropriate comparison metric.
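
As a minimal sketch of the two best-performing baselines, term frequency (TF) and inverse document frequency (IDF), scored over a toy corpus that stands in for the six evaluation corpora:

```python
# TF and IDF baseline sketch over a tiny illustrative corpus.
import math
from collections import Counter

corpus = [
    "rfid workspace management facilities management rfid",
    "keyphrase extraction term frequency methods",
    "boundary element methods heat equation",
]
docs = [doc.split() for doc in corpus]

def tf_scores(doc):
    """Term frequency, normalised by document length."""
    counts = Counter(doc)
    return {term: n / len(doc) for term, n in counts.items()}

def idf_scores(docs):
    """Inverse document frequency over the whole corpus."""
    n_docs = len(docs)
    df = Counter(term for doc in docs for term in set(doc))
    return {term: math.log(n_docs / n) for term, n in df.items()}

tf = tf_scores(docs[0])
idf = idf_scores(docs)
print(sorted(tf, key=tf.get, reverse=True)[:3])   # top TF terms in doc 0
print(sorted(idf, key=idf.get, reverse=True)[:3])  # most discriminative terms
```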

Relevance:

30.00%

Publisher:

Abstract:

We investigate the error dynamics for cycled data assimilation systems, such that the inverse problem of state determination is solved at time $t_k$, $k = 1, 2, 3, \ldots$, with a first guess given by the state propagated via a dynamical system model from time $t_{k-1}$ to time $t_k$. In particular, for nonlinear dynamical systems that are Lipschitz continuous with respect to their initial states, we provide deterministic estimates for the development of the error $\|e_k\| := \|x_k^{(a)} - x_k^{(t)}\|$ between the estimated state $x^{(a)}$ and the true state $x^{(t)}$ over time. Clearly, observation error of size $\delta > 0$ leads to an estimation error in every assimilation step. These errors can accumulate if they are not (a) controlled in the reconstruction and (b) damped by the dynamical system under consideration. A data assimilation method is called stable if the error in the estimate is bounded in time by some constant $C$. The key task of this work is to provide estimates for the error $\|e_k\|$, depending on the size $\delta$ of the observation error, the reconstruction operator $R_\alpha$, the observation operator $H$ and the Lipschitz constants $K^{(1)}$ and $K^{(2)}$ on the lower and higher modes controlling the damping behaviour of the dynamics. We show that systems can be stabilized by choosing $\alpha$ sufficiently small, but the bound $C$ will then depend on the data error $\delta$ in the form $c\|R_\alpha\|\delta$ with some constant $c$. Since $\|R_\alpha\| \to \infty$ for $\alpha \to 0$, the constant might be large. Numerical examples for this behaviour in the nonlinear case are provided using a (low-dimensional) Lorenz '63 system.
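
Schematically, stability results of this kind follow from an error recursion of the shape below, in which $\Lambda$ bundles the reconstruction operator and the Lipschitz constants of the dynamics; this is an illustrative form, not the paper's exact statement:

```latex
\|e_{k+1}\| \;\le\; \Lambda\,\|e_k\| + c\,\|R_\alpha\|\,\delta,
\qquad\text{so}\qquad
\|e_k\| \;\le\; \Lambda^{k}\,\|e_0\| + \frac{c\,\|R_\alpha\|\,\delta}{1-\Lambda}
\quad\text{for } \Lambda < 1.
```

Unrolling the recursion gives a geometric-series bound: the cycled scheme is stable when $\Lambda < 1$, with the constant $C$ inheriting the $c\,\|R_\alpha\|\,\delta$ term, which grows as $\alpha \to 0$.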

Relevance:

30.00%

Publisher:

Abstract:

The author developed two GUIs, one for asymptotic Bode plots and one for identification from such plots, aimed at improving the learning of frequency response methods; these were presented at UKACC Control 2012. Student feedback and reflection by the author suggested various improvements to these GUIs, which have now been implemented. This paper reviews the earlier work, describes the improvements, and includes positive feedback from the students on the GUIs and how they have helped their understanding of the methods.
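
As a minimal sketch of what an asymptotic Bode magnitude plot is (the first-order transfer function below is an illustrative assumption, not one of the GUIs' built-in examples):

```python
# Exact vs asymptotic Bode magnitude for a first-order lag K / (1 + s/w0).
import numpy as np
import matplotlib.pyplot as plt

K, w0 = 10.0, 100.0                # gain and corner frequency (rad/s)
w = np.logspace(0, 4, 400)         # frequency axis, 1 to 10^4 rad/s

exact = 20 * np.log10(K / np.sqrt(1 + (w / w0) ** 2))
# Asymptote: flat at 20*log10(K) below w0, then -20 dB/decade above it
asymptote = np.where(w < w0, 20 * np.log10(K),
                     20 * np.log10(K) - 20 * np.log10(w / w0))

plt.semilogx(w, exact, label="exact")
plt.semilogx(w, asymptote, "--", label="asymptote")
plt.xlabel("frequency (rad/s)")
plt.ylabel("magnitude (dB)")
plt.legend()
plt.show()
```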

Relevance:

30.00%

Publisher:

Abstract:

European labour markets are increasingly divided between insiders in full-time permanent employment and outsiders in precarious work or unemployment. Using quantitative as well as qualitative methods, this thesis investigates the determinants and consequences of labour market policies that target these outsiders in three separate papers. The first paper looks at Active Labour Market Policies (ALMPs) that target the unemployed. It shows that left and right-wing parties choose different types of ALMPs depending on the policy and the welfare regime in which the party is located. These findings reconcile the conflicting theoretical expectations from the Power Resource approach and the insider-outsider theory. The second paper considers the regulation and protection of the temporary work sector. It solves the puzzle of temporary re-regulation in France, which contrasts with most other European countries that have deregulated temporary work. Permanent workers are adversely affected by the expansion of temporary work in France because of general skills and low wage coordination. The interests of temporary and permanent workers for re-regulation therefore overlap in France and left governments have an incentive to re-regulate the sector. The third paper then investigates what determines inequality between median and bottom income workers. It shows that non-inclusive economic coordination increases inequality in the absence of compensating institutions such as minimum wage regulation. The deregulation of temporary work as well as spending on employment incentives and rehabilitation also has adverse effects on inequality. Thus, policies that target outsiders have important economic effects on the rest of the workforce. Three broader contributions can be identified. First, welfare state policies may not always be in the interests of labour, so left parties may not always promote them. Second, the interests of insiders and outsiders are not necessarily at odds. Third, economic coordination may not be conducive to egalitarianism where it is not inclusive.

Relevance:

30.00%

Publisher:

Abstract:

Recent studies have shown that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem via the one-versus-one strategy. Although several approaches based on Regions of Interest (ROIs) extraction exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen the prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machine-based wrapper improves the accuracy of binary classification.
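
A minimal sketch of the filter-then-wrapper combination described, using scikit-learn with synthetic data standing in for the MRI ROI features (all dimensions and thresholds below are illustrative assumptions):

```python
# Random Forest importance filter followed by an SVM-driven wrapper (RFE).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE, SelectFromModel
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Synthetic stand-in for ROI features: 200 subjects, 100 features
X, y = make_classification(n_samples=200, n_features=100, n_informative=10,
                           random_state=0)

pipe = Pipeline([
    # Filter: keep features above median Random Forest importance
    ("filter", SelectFromModel(
        RandomForestClassifier(n_estimators=200, random_state=0),
        threshold="median")),
    # Wrapper: recursive feature elimination driven by a linear SVM
    ("wrapper", RFE(SVC(kernel="linear"), n_features_to_select=10)),
    ("clf", SVC(kernel="linear")),
])
pipe.fit(X, y)
print(pipe.score(X, y))
```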

Relevance:

30.00%

Publisher:

Abstract:

Objective: To clarify how infection control requirements are represented, communicated, and understood in work interactions through the medical facility construction project life cycle. To assist project participants with effective infection control management by highlighting the nature of such requirements and presenting recommendations to aid practice. Background: A 4-year study regarding client requirement representation and use on National Health Service construction projects in the United Kingdom provided empirical evidence of infection control requirement communication and understanding through design and construction work interactions. Methods: An analysis of construction project resources (e.g., infection control regulations and room data sheets) was combined with semi-structured interviews with hospital client employees and design and construction professionals to provide valuable insights into the management of infection control issues. Results: Infection control requirements are representationally indistinct but also omnipresent through all phases of the construction project life cycle: Failure to recognize their nature, relevance, and significance can result in delays, stoppages, and redesign work. Construction project resources (e.g., regulatory guidance and room data sheets) can mask or obscure the meaning of infection control issues. Conclusions: A preemptive identification of issues combined with knowledge sharing activities among project stakeholders can enable infection control requirements to be properly understood and addressed. Such initiatives should also reference existing infection control regulatory guidance and advice.

Relevance:

30.00%

Publisher:

Abstract:

Of the many sources of urban greenhouse gas (GHG) emissions, solid waste is the only one for which management decisions are undertaken primarily by municipal governments themselves and is hence often the largest component of cities' corporate inventories. It is essential that decision-makers select an appropriate quantification methodology and have an appreciation of methodological strengths and shortcomings. This work compares four different waste emissions quantification methods, including Intergovernmental Panel on Climate Change (IPCC) 1996 guidelines, IPCC 2006 guidelines, U.S. Environmental Protection Agency (EPA) Waste Reduction Model (WARM), and the Federation of Canadian Municipalities-Partners for Climate Protection (FCM-PCP) quantification tool. Waste disposal data for the greater Toronto area (GTA) in 2005 are used for all methodologies; treatment options (including landfill, incineration, compost, and anaerobic digestion) are examined where available in methodologies. Landfill was shown to be the greatest source of GHG emissions, contributing more than three-quarters of total emissions associated with waste management. Results from the different landfill gas (LFG) quantification approaches ranged from an emissions source of 557 kt carbon dioxide equivalents (CO2e) (FCM-PCP) to a carbon sink of −53 kt CO2e (EPA WARM). Similar values were obtained between IPCC approaches. The IPCC 2006 method was found to be more appropriate for inventorying applications because it uses a waste-in-place (WIP) approach, rather than a methane commitment (MC) approach, despite perceived onerous data requirements for WIP. MC approaches were found to be useful from a planning standpoint; however, uncertainty associated with their projections of future parameter values limits their applicability for GHG inventorying. MC and WIP methods provided similar results in this case study; however, this is case specific because of similarity in assumptions of present and future landfill parameters and quantities of annual waste deposited in recent years being relatively consistent.
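
For orientation, the waste-in-place (WIP) approach follows the first-order-decay structure sketched below; all parameter values are illustrative placeholders, not GTA inventory figures.

```python
# First-order-decay (waste-in-place) methane estimate, in the spirit of the
# IPCC 2006 approach; parameter values are hypothetical placeholders.
import math

def ch4_generated(deposits, k=0.05, L0=0.08):
    """deposits: {year: tonnes of waste landfilled}.
    k: decay constant (1/yr); L0: CH4 yield (t CH4 per t waste).
    Returns tonnes of CH4 generated in each year."""
    years = range(min(deposits), max(deposits) + 30)
    out = {}
    for t in years:
        q = 0.0
        for x, w in deposits.items():
            if t > x:
                # First-order decay of the remaining degradable carbon
                q += w * L0 * (math.exp(-k * (t - x - 1)) - math.exp(-k * (t - x)))
        out[t] = q
    return out

print(ch4_generated({2000: 1_000_000, 2001: 1_100_000})[2005])
```

A methane commitment (MC) approach, by contrast, would assign the full lifetime yield `w * L0` to the year of disposal, which is why the two methods diverge when deposition rates change over time.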

Relevance:

30.00%

Publisher:

Abstract:

Replacement, expansion and upgrading of assets in the electricity network represent financial investment for the distribution utilities. Network Investment Deferral (NID) is a well-discussed benefit of wider adoption of Distributed Generation (DG), and there have been many attempts to quantify and evaluate the financial benefit for the distribution utilities. While the carbon benefits of NID are commonly mentioned, there has been little attempt to quantify these impacts. This paper explores the quantitative methods previously used to evaluate financial benefits in order to discuss the carbon impacts. These carbon impacts are important for companies owning DG equipment, both for internal reporting and for emissions reduction ambitions. Currently, a GB-wide approach is taken as a means for discussing more regional and local methods to be used in future work. By investigating these principles, the paper offers a novel approach to quantifying carbon emissions from various DG technologies.
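
A minimal sketch of the kind of carbon accounting implied, assuming DG output displaces grid-average generation; both figures below are hypothetical, not the paper's GB values.

```python
# Displaced-emissions estimate for a DG unit (hypothetical inputs).
dg_output_mwh = 4_000            # annual DG generation (assumed)
grid_factor_tco2_per_mwh = 0.2   # grid-average emission factor (assumed)

displaced_tco2 = dg_output_mwh * grid_factor_tco2_per_mwh
print(f"Displaced emissions: {displaced_tco2:.0f} tCO2e/year")
```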

Relevance:

30.00%

Publisher:

Abstract:

The goal of this work is the efficient solution of the heat equation with Dirichlet or Neumann boundary conditions using the Boundary Element Method (BEM). Efficiently solving the heat equation is useful, as it is a simple model problem for other types of parabolic problems. In complicated spatial domains, as often found in engineering, BEM can be beneficial since only the boundary of the domain has to be discretised. This makes BEM easier to apply than domain methods such as finite elements and finite differences, which are conventionally combined with time-stepping schemes to solve this problem. The contribution of this work is to further decrease the complexity of solving the heat equation, leading both to speed gains (in CPU time) and to smaller memory requirements for solving the same problem. To do this we combine the complexity gains of boundary reduction by integral equation formulations with a discretisation using wavelet bases. This reduces the total work to O(h
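
For orientation, the Dirichlet problem is typically recast as a boundary integral equation via the single-layer heat potential; this is a textbook formulation, not quoted from this abstract:

```latex
u(x,t) \;=\; \int_0^t \int_{\Gamma} G(x-y,\, t-\tau)\, q(y,\tau)\, \mathrm{d}s_y\, \mathrm{d}\tau,
\qquad
G(z,\theta) \;=\; \frac{e^{-|z|^2/(4\theta)}}{(4\pi\theta)^{d/2}},
```

where $\Gamma$ is the boundary of the spatial domain, $G$ is the heat kernel, and the density $q$ is determined by matching the boundary data, so only $\Gamma$ (not the volume) needs to be discretised.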