34 results for personnel and shift scheduling
in CentAUR: Central Archive, University of Reading - UK
Abstract:
The main aim of the present article is to test hypotheses derived from the model of contact-induced language change formulated in Thomason and Kaufman (1988 et seq.). As the model correctly predicts the asymmetries between the mutual influences of the Germanic and Romance varieties in Brussels and Strasbourg, it is a very powerful tool for describing the contact patterns in these cities. The analysis shows that the contact patterns are very similar, from both a quantitative and a qualitative point of view, despite important differences in the sociolinguistic situations of the two cities. The striking similarities in the outcome of language contact seem to find a plausible explanation in the fact that the language contact situations in the two cities are similar from a typological point of view: in each city a variety of French is in contact with a Germanic variety (Alsatian and Brussels Dutch, respectively). Thus, the claim of the present article is that the structure of the languages plays a more prominent role in the outcome of language contact than the sociolinguistic history of the speakers.
Abstract:
The annual survey of corporate real estate practices has been conducted by CREMRU since 1993, in collaboration with Johnson Controls Inc. since 1997. This year the survey forms the first stage of a broader research project, the International Survey of Corporate Real Estate Practices: longitudinal study 1993-2002, being undertaken for the Innovative Construction Research Centre at the University of Reading, funded by the Engineering and Physical Sciences Research Council. The survey has been endorsed by CoreNet, the leading professional association concerned with corporate real estate, which has opened it to a wider audience. This summary of the ten annual surveys focuses on the incidence of corporate real estate management (CREM) policies, functions and activities, as well as the assessment of the knowledge and skills relevant to the CREM function in the future. Both are of vital interest to educational institutions concerned with this field, as well as to the personnel and training functions within organisations concerned with better management of their property.
Abstract:
Background: Shifting gaze and attention ahead of the hand is a natural component in the performance of skilled manual actions. Very few studies have examined the precise co-ordination between the eye and hand in children with Developmental Coordination Disorder (DCD). Methods: This study directly assessed the maturity of eye-hand co-ordination in children with DCD. A double-step pointing task was used to investigate the coupling of the eye and hand in 7-year-old children with and without DCD. Sequential targets were presented on a computer screen, and eye and hand movements were recorded simultaneously. Results: There were no differences between typically developing (TD) and DCD groups when completing fast single-target tasks. There were very few differences in the completion of the first movement in the double-step tasks, but differences did occur during the second sequential movement. One factor appeared to be the propensity for the DCD children to delay their hand movement until some period after the eye had landed on the target. This resulted in a marked increase in eye-hand lead during the second movement, disrupting the close coupling and leading to a slower and less accurate hand movement among children with DCD. Conclusions: In contrast to skilled adults, both groups of children preferred to foveate the target prior to initiating a hand movement if time allowed. The TD children, however, were more able to reduce this foveation period and shift towards a feedforward mode of control for hand movements. The children with DCD persevered with a look-then-move strategy, which led to an increase in error. For the group of DCD children in this study, there was no evidence of a problem in speed or accuracy of simple movements, but there was a difficulty in concatenating the sequential shifts of gaze and hand required for the completion of everyday tasks or typical assessment items.
Abstract:
Increasingly, the UK's Private Finance Initiative (PFI) has created a demand for construction companies to transfer knowledge from one organisation or project to another. Knowledge transfer processes in such contexts face many challenges, owing to the many resulting discontinuities in the involvement of organisations, personnel and information flow. This paper empirically identifies the barriers and enablers that hinder or enhance the transfer of knowledge in PFI contexts, drawing upon a questionnaire survey of construction firms. The main findings show that knowledge transfer processes in PFIs are hindered by time constraints, lack of trust, and the policies, procedures, rules and regulations attached to the projects. Nevertheless, the processes of knowledge transfer are enhanced by emphasising the value and importance of supportive leadership, participation and commitment from the relevant parties, and good communication between those parties. The findings have considerable relevance to understanding the mechanism of knowledge transfer between organisations, projects and individuals within PFI contexts, and to overcoming the barriers and strengthening the enablers. Practitioners and managers can use the findings to design knowledge transfer frameworks that overcome the barriers encountered while enhancing the enablers, thereby improving knowledge transfer processes.
Abstract:
The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.
Abstract:
We compared output from 3 dynamic process-based models (DMs: ECOSSE, MILLENNIA and the Durham Carbon Model) and 9 bioclimatic envelope models (BCEMs, including the BBOG ensemble and PEATSTASH) ranging from simple threshold to semi-process-based models. Model simulations were run at 4 British peatland sites using historical climate data and climate projections under a medium (A1B) emissions scenario from the 11-RCM (regional climate model) ensemble underpinning UKCP09. The models showed that blanket peatlands are vulnerable to projected climate change; however, predictions varied between models as well as between sites. All BCEMs predicted a shift from presence to absence of a climate associated with blanket peat, with the sites with the lowest total annual precipitation closest to the presence/absence threshold. The DMs showed a more variable response. ECOSSE predicted a decline in the net C sink and a shift to a net C source by the end of this century. The Durham Carbon Model predicted a smaller decline in net C sink strength, but no shift to a net C source. MILLENNIA predicted a slight overall increase in the net C sink. In contrast to the BCEM projections, the DMs predicted that the sites with the coolest temperatures and greatest total annual precipitation would show the largest change in carbon sinks. In this model inter-comparison, the greatest variation in model output in response to climate change projections was not between the BCEMs and DMs but between the DMs themselves, because of different approaches to modelling soil organic matter pools and decomposition, among other processes. The difference in the sign of the response has major implications for future climate feedbacks, climate policy and peatland management. Enhanced data collection, in particular monitoring of peatland response to current change, would significantly improve model development and projections of future change.
Abstract:
The practice of partial depopulation or 'thinning', i.e. early removal of a proportion of birds from a commercial broiler flock, is a reported risk factor for Campylobacter colonization of residual birds because of the difficulty in maintaining biosecurity during the process. Therefore, the effect of this practice was studied in detail for 51 target flocks, each at a different growing farm belonging to one of seven major poultry companies throughout the United Kingdom. On 21 of these farms, the target flock was already colonized by Campylobacter and at slaughter all cecal samples examined were positive, with a mean of log10 8 cfu/g. A further 27 flocks became positive within 2–6 days of the start of thinning and had similarly high levels of cecal carriage at slaughter. Just prior to the thinning process, Campylobacter could be isolated frequently from the farm driveways, transport vehicles, equipment and personnel. Strains from seven such farms on which flocks became colonized after thinning were examined by PFGE typing. The study demonstrated an association between strains occurring at specific sampling sites and those isolated subsequently from the thinned flocks. There were also indications that particular strains had spread from one farm to another when the farms were jointly company-owned and served by the same bird-catching teams and/or vehicles. The results highlighted the need for better hygiene control in relation to catching equipment and personnel, and more effective cleaning and disinfection of vehicles and bird-transport crates.
Abstract:
A numerical study of fluid mechanics and heat transfer in a scraped surface heat exchanger with non-Newtonian power-law fluids is undertaken. Numerical results are generated for 2D steady-state conditions using finite element methods. The effects of blade design and material properties, and especially the independent effects of shear thinning and heat thinning on the flow and heat transfer, are studied. The results show that the gaps at the root of the blades, where the blades are connected to the inner cylinder, remove the stagnation points, reduce the net force on the blades and shift the location of the central stagnation point. The shear thinning property of the fluid reduces the local viscous dissipation close to the singularity corners, i.e. near the tips of the blades, and as a result the local fluid temperature is regulated. The heat thinning effect is greatest for Newtonian fluids, where the viscous dissipation and the local temperature are highest at the tips of the blades. Where comparison is possible, very good agreement is found between the numerical results and the available data. Aspects of scraped surface heat exchanger design are assessed in the light of the results.
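To make the constitutive assumptions concrete: a power-law (Ostwald-de Waele) fluid has an apparent viscosity that falls with shear rate when the flow index n < 1 (shear thinning), and a temperature-dependent consistency, often modelled with an Arrhenius factor, captures heat thinning. The sketch below illustrates these two effects with assumed parameter values (K, n, E/R are illustrative); it is a minimal rendering of the rheological model, not the paper's finite element computation.

```python
import math

def apparent_viscosity(shear_rate, K=10.0, n=0.6,
                       T=300.0, T_ref=300.0, E_over_R=2000.0):
    """Power-law (Ostwald-de Waele) viscosity with an Arrhenius factor.

    mu = K * exp(E/R * (1/T - 1/T_ref)) * shear_rate**(n - 1)

    n < 1 gives shear thinning: viscosity falls where shear rates are
    highest, e.g. near the blade tips. The exponential factor models
    heat thinning: viscosity falls as the local temperature rises.
    All parameter values here are illustrative assumptions.
    """
    K_T = K * math.exp(E_over_R * (1.0 / T - 1.0 / T_ref))
    return K_T * shear_rate ** (n - 1.0)

# Viscosity drops both with shear rate (n < 1) and with temperature:
print(apparent_viscosity(10.0))            # moderate shear, reference T
print(apparent_viscosity(100.0))           # higher shear -> lower viscosity
print(apparent_viscosity(100.0, T=330.0))  # hotter fluid -> lower still
```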
Abstract:
A Fractal Quantizer is proposed that replaces the expensive division operation in scalar quantization with the cheaper and more widely available multiplication, addition and shift operations. Although the proposed method is iterative in nature, simulations show virtually undetectable distortion to the naked eye for JPEG-compressed images using a single iteration. The method requires a change to the usual tables used in JPEG algorithms, but the new tables are of similar size. For practical purposes, performing quantization is reduced to a multiply-plus-add operation that is easily programmed on low-end embedded processors and suitable for efficient, very high speed implementation in ASIC or FPGA hardware. An FPGA hardware implementation shows up to 15× area-time savings compared to standard solutions for devices with dedicated multipliers. The method can also be immediately extended to perform adaptive quantization.
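The central trick, replacing each division by a multiply, an add, and a shift, can be sketched with fixed-point reciprocal quantization: precompute a reciprocal table of the same size as the usual JPEG quantization table, then quantize coefficients without any division. This is a minimal sketch of the generic technique, assuming a 16-bit fixed-point width and a simple rounding scheme; the paper's fractal, iterative refinement is not reproduced, and the helper names are hypothetical.

```python
def make_recip_table(qtable, frac_bits=16):
    """Precompute fixed-point reciprocals of the quantization steps.

    The per-coefficient division x // q is replaced by the
    multiply-add-shift (x * recip + half) >> frac_bits.
    """
    return [round((1 << frac_bits) / q) for q in qtable]

def quantize(coeffs, recip_table, frac_bits=16):
    """Scalar quantization using only multiply, add and shift."""
    half = 1 << (frac_bits - 1)            # rounding offset
    out = []
    for c, r in zip(coeffs, recip_table):
        sign = -1 if c < 0 else 1          # handle negative DCT coefficients
        out.append(sign * ((abs(c) * r + half) >> frac_bits))
    return out

# Illustrative use with the first few entries of the standard JPEG
# luminance quantization table.
qtable = [16, 11, 10, 16, 24, 40, 51, 61]
recips = make_recip_table(qtable)
print(quantize([128, -55, 30, 17, 200, -90, 52, 60], recips))
```

In hardware the reciprocal table is stored once, so each coefficient costs a single multiply-accumulate and a shift, which is what makes the ASIC/FPGA mapping attractive.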
Abstract:
Epitaxial ultrathin titanium dioxide films of 0.3 to ~7 nm thickness on a metal single crystal substrate have been investigated by high resolution vibrational and electron spectroscopies. The data complement previous morphological data provided by scanned probe microscopy and low energy electron diffraction to provide a very complete characterization of this system. The thicker films display electronic structure consistent with a stoichiometric TiO2 phase. The thinner films appear nonstoichiometric due to band bending and charge transfer from the metal substrate, while work function measurements also show a marked thickness dependence. The vibrational spectroscopy shows three clear phonon bands at 368, 438, and 829 cm^-1 (at 273 K), which confirms a rutile structure. The phonon band intensities scale linearly with film thickness and shift slightly to lower frequencies with increasing temperature, in accord with results for single crystals.
Abstract:
Oxford University Press’s response to technological change in printing and publishing processes in this period can be considered in three phases: an initial period when the computerization of typesetting was seen as offering both cost savings and the ability to produce new editions of existing works more quickly; an intermediate phase when the emergence of standards in desktop computing allowed experiments with the sale of software as well as packaged electronic publications; and a third phase when the availability of the world wide web as a means of distribution allowed OUP to return to publishing in its traditional areas of strength albeit in new formats. Each of these phases demonstrates a tension between a desire to develop centralized systems and expertise, and a recognition that dynamic publishing depends on distributed decision-making and innovation. Alongside these developments in production and distribution lay developments in computer support for managerial and collaborative publishing processes, often involving the same personnel and sometimes the same equipment.
Abstract:
Cross-layer techniques represent efficient means to enhance throughput and increase the transmission reliability of wireless communication systems. In this paper, a cross-layer design of aggressive adaptive modulation and coding (A-AMC), truncated automatic repeat request (T-ARQ), and user scheduling is proposed for multiuser multiple-input-multiple-output (MIMO) maximal ratio combining (MRC) systems, where the impacts of feedback delay (FD) and limited feedback (LF) on channel state information (CSI) are also considered. The A-AMC and T-ARQ mechanism selects the appropriate modulation and coding schemes (MCSs) to achieve higher spectral efficiency while satisfying the service requirement on the packet loss rate (PLR), profiting from the possibility of using different MCSs to retransmit a packet. Each packet is destined to a scheduled user, selected so as to exploit multiuser diversity and enhance the system's performance in terms of both transmission efficiency and fairness. The system's performance is evaluated in terms of the average PLR, average spectral efficiency (ASE), outage probability, and average packet delay, all derived in closed form for transmissions over Rayleigh-fading channels. Numerical results and comparisons show that A-AMC combined with T-ARQ yields higher spectral efficiency than the conventional scheme based on adaptive modulation and coding (AMC), while keeping the achieved PLR closer to the system's requirement and reducing delay. Furthermore, the effects of the number of ARQ retransmissions, the numbers of transmit and receive antennas, the normalized FD, and the cardinality of the beamforming weight vector codebook are studied and discussed.
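As a rough illustration of the cross-layer logic: AMC picks the highest-rate MCS whose SNR threshold meets the PLR target; the "aggressive" variant adds a positive margin and relies on truncated ARQ to recover failed packets, possibly with a more robust MCS; and the scheduler serves the user with the best instantaneous channel. The MCS table, thresholds, and margin in the sketch below are assumptions for illustration, not the paper's parameters or its closed-form analysis.

```python
# Illustrative MCS table: (name, spectral efficiency in bit/s/Hz,
# minimum SNR in dB assumed to meet the target packet loss rate).
# All values are assumptions for this sketch.
MCS_TABLE = [
    ("BPSK 1/2",  0.5,  2.0),
    ("QPSK 1/2",  1.0,  5.0),
    ("QPSK 3/4",  1.5,  8.0),
    ("16QAM 1/2", 2.0, 11.0),
    ("16QAM 3/4", 3.0, 14.0),
    ("64QAM 3/4", 4.5, 18.0),
]

def schedule(user_snrs_db):
    """Max-SNR scheduling: serve the user with the best instantaneous
    channel, exploiting multiuser diversity."""
    return max(range(len(user_snrs_db)), key=user_snrs_db.__getitem__)

def select_mcs(snr_db, margin_db=0.0):
    """Choose the highest-rate MCS whose SNR threshold is met.

    A positive margin makes the choice 'aggressive' (A-AMC): it accepts
    a higher initial loss rate and relies on truncated ARQ to retransmit
    failed packets, possibly with a more robust (lower-rate) MCS.
    """
    chosen = MCS_TABLE[0]                  # most robust fallback
    for mcs in MCS_TABLE:
        if snr_db + margin_db >= mcs[2]:
            chosen = mcs
    return chosen

user_snrs = [7.3, 12.1, 9.8]               # instantaneous SNRs in dB
u = schedule(user_snrs)
print(u, select_mcs(user_snrs[u], margin_db=2.0))
```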