93 results for: whether sufficient element of compromise

in CentAUR: Central Archive, University of Reading - UK


Relevance: 100.00%

Abstract:

During deglaciation of the North American Laurentide Ice Sheet, large proglacial lakes developed in positions where proglacial drainage was impeded by the ice margin. For some of these lakes, it is known that subsequent drainage had an abrupt and widespread impact on North Atlantic Ocean circulation and climate, but less is known about the impact that the lakes exerted on ice sheet dynamics. This paper reports palaeogeographic reconstructions of the evolution of proglacial lakes during deglaciation across the northwestern Canadian Shield, covering an area in excess of 1,000,000 km² as the ice sheet retreated some 600 km. The interactions between proglacial lakes and ice sheet flow are explored, with a particular emphasis on whether the disposition of lakes may have influenced the location of the Dubawnt Lake ice stream. This ice stream falls outside the existing paradigm for ice streams in the Laurentide Ice Sheet because it did not operate over fine-grained till or lie in a topographic trough. Ice margin positions and a digital elevation model are utilised to predict the geometry and depth of proglacial lakes impounded at the margin at 30-km increments during deglaciation. Palaeogeographic reconstructions match well with previous independent estimates of lake coverage inferred from field evidence, and results suggest that the development of a deep lake in the Thelon drainage basin may have been influential in initiating the ice stream by inducing calving, drawing down ice and triggering fast ice flow. This is the only location alongside this sector of the ice sheet where large (>3000 km²), deep (~120 m) lakes are impounded for a significant length of time, and it exactly matches the location of the ice stream. It is speculated that the commencement of calving at the ice sheet margin may have taken the system beyond a threshold and was sufficient to trigger rapid motion, but that once initiated, calving processes and losses were insignificant to the functioning of the ice stream. It is thus concluded that proglacial lakes are likely to have been an important control on ice sheet dynamics during deglaciation of the Laurentide Ice Sheet. (C) 2004 Elsevier B.V. All rights reserved.
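
The reconstruction step described here can be illustrated with a minimal sketch: given a gridded digital elevation model and the elevation of the lowest outlet not blocked by ice, every ponded cell below that spill elevation holds water up to the spill level. The function name, the filling rule and the toy numbers below are illustrative assumptions, not the authors' GIS workflow.

```python
import numpy as np

def lake_depth(dem, spill_elevation, lake_mask):
    """Depth of water impounded against an ice margin.

    dem             : 2-D array of bed elevations (m)
    spill_elevation : elevation (m) of the lowest outlet not blocked by ice
    lake_mask       : boolean array marking cells ponded against the margin
    Returns an array of water depths (m); zero outside the lake.
    """
    return np.where(lake_mask & (dem < spill_elevation),
                    spill_elevation - dem, 0.0)

# Toy example: a small basin dammed to a 120 m spill level.
dem = np.array([[100.0, 90.0], [80.0, 110.0]])
mask = dem < 120.0
print(lake_depth(dem, 120.0, mask))  # depths of 20, 30, 40 and 10 m
```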

Relevance: 100.00%

Abstract:

This paper describes the use of pH and calcium ion electrodes for investigating factors affecting the heat stability of UHT milk with added calcium chloride. Calcium chloride was added to raw milk to manipulate ionic calcium and pH to within the range that may typically be encountered in raw milk of different compositions and microbial quality. Addition of only 5 mM calcium chloride was sufficient to induce considerable changes in pH, ionic calcium and ethanol stability, and to alter the milk's stability to UHT treatment. There was a strong relationship between the decrease in pH and the increase in ionic calcium, whether the pH was reduced by addition of calcium chloride or by acidification. Calcium chloride addition was found to increase sediment formation in UHT-treated milk. However, sediment could be reduced by addition of stabilizers; the most effective were those that decreased ionic calcium and increased pH, such as trisodium citrate and disodium hydrogen phosphate. Sediment formation following UHT treatment was only slight for milk samples whose ethanol stability was greater than 80%.

Relevance: 100.00%

Abstract:

Samples of Norway spruce wood were impregnated with a water-soluble melamine formaldehyde resin, using either short-term vacuum treatment or long-term immersion. By means of Fourier transform infrared (FTIR) spectroscopy and UV microspectrophotometry, it was shown that only diffusion during long-term immersion leads to sufficient penetration of melamine resin into the wood structure, the flow of liquids in Norway spruce wood during vacuum treatment being greatly hindered by aspirated pits. After immersion in aqueous melamine resin solution for 3 days, the resin had penetrated to a depth > 4 mm, which, after polymerization of the resin, resulted in an improvement in hardness comparable to that of the hardwood beech. A finite element model describing the effect of increasing depth of modification on hardness demonstrated that, under the test conditions chosen for this study, a minimum impregnation depth of 2 mm is necessary to achieve an optimum increase in hardness. (C) 2004 Wiley Periodicals, Inc.
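
As a rough consistency check on the reported penetration depth, one can invert the familiar Fickian scaling x ≈ √(4Dt) for an effective diffusion coefficient. The abstract does not report such a calculation, so both the scaling law and the resulting number are purely illustrative.

```python
import math

# Observed: resin penetrated beyond ~4 mm after 3 days of immersion.
x = 4e-3                 # penetration depth (m)
t = 3 * 24 * 3600        # immersion time (s)

# Fickian scaling x ~ sqrt(4 D t)  =>  D ~ x^2 / (4 t)
D = x**2 / (4 * t)
print(f"Effective diffusivity ~ {D:.2e} m^2/s")  # ~1.5e-11 m^2/s
```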

Relevance: 100.00%

Abstract:

In collaborative situations, eye gaze is a critical element of behavior which supports and fulfills many activities and roles. In current computer-supported collaboration systems, eye gaze is poorly supported. Even in a state-of-the-art video conferencing system such as the Access Grid, although one can see the face of the user, much of the communicative power of eye gaze is lost. This article gives an overview of some preliminary work that looks towards integrating eye gaze into an immersive collaborative virtual environment and assessing the impact that this would have on interaction between the users of such a system. Three experiments were conducted to assess the efficacy of eye gaze within immersive virtual environments. In each experiment, subjects observed the eye-gaze behavior of an avatar on a large screen. The eye-gaze behavior of that avatar had previously been recorded from a user wearing a head-mounted eye tracker. The first experiment assessed the difference between users' ability to judge which objects an avatar is looking at when only head gaze is displayed and when both eye- and head-gaze data are displayed. The results show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. The second experiment examined whether a monocular or binocular eye tracker would be required, by testing subjects' ability to identify where an avatar was looking from eye direction alone, or from eye direction combined with convergence. This experiment showed that convergence had a significant impact on the subjects' ability to identify where the avatar was looking. The final experiment examined the effects of stereo and mono viewing of the scene, with the subjects again asked to identify where the avatar was looking; it showed no difference in the subjects' ability to detect where the avatar was gazing. The article closes with a description of how the eye-tracking system has been integrated into an immersive collaborative virtual environment and some preliminary results from the use of such a system.
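
The convergence cue tested in the second experiment can be made concrete: given an origin and a direction for each eye, the fixated point is approximately the point of closest approach of the two gaze rays. The sketch below is a generic geometric construction, not the actual implementation used in the system described.

```python
import numpy as np

def convergence_point(p1, d1, p2, d2):
    """Midpoint of closest approach of two gaze rays (origin p, direction d)."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w = p1 - p2
    b = d1 @ d2
    denom = 1.0 - b * b                  # with unit directions, a = c = 1
    if abs(denom) < 1e-9:                # rays (nearly) parallel: no convergence
        return None
    d, e = d1 @ w, d2 @ w
    s = (b * e - d) / denom              # parameter along the first ray
    t = (e - b * d) / denom              # parameter along the second ray
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# Eyes 6 cm apart, both verging on a point 1 m ahead.
left, right = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
print(convergence_point(left, target - left, right, target - right))  # ~[0, 0, 1]
```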

Relevance: 100.00%

Abstract:

Research in the late 1980s showed that many corporate real estate users were not fully aware of the full extent of their property holdings. In many cases, not only was the value of the holdings unknown, but there was uncertainty over the actual extent of ownership within the portfolio. This resulted in a large number of corporate occupiers reviewing their property holdings during the 1990s, initially to create a definitive asset register, but also to benefit from a more efficient use of space. Good management of corporately owned property assets is as important as the management of the other principal resources within the company. A comprehensive asset register can be seen as the first step towards a rational property audit. For the effective, efficient and economic delivery of services, it is vital that all property holdings are utilised to the best advantage. This requires that the property provider and the property user are both fully conversant with the value of the property holding, and that an asset/internal rent/charge is made accordingly. The advantages of internal rent charging are twofold. First, it requires the occupying department to "contribute" an amount to the business equivalent to the open market rental value of the space that it occupies. This prevents space being treated as a free good and, as individual profit centres, each department will then rationalise its holdings to minimise its costs. The second advantage is strategic: by charging an asset rent, the holding department can identify the performance of its real estate holdings. This can then be compared to an internal or external benchmark to help determine whether the company has adopted the most efficient tenure pattern for its properties. This paper investigates the use of internal rents by UK-based corporate businesses and explains internal rents as a form of transfer pricing in the context of management and responsibility accounting. The research finds that the majority of charging organisations introduced internal rents primarily to help calculate true profits at the business unit level. However, less than 10% of the charging organisations introduced internal rents primarily to capture the return on assets within the business. There was also a sizeable element of the market that had no plans to introduce internal rents. Here, it appears that, despite academic and professional views that internal rents are beneficial in improving the efficient use of property, opinion at the business and operational level has not universally accepted this proposition.
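
The mechanics of an internal rent charge are easy to illustrate. The figures below are hypothetical and are not drawn from the survey; they simply show a department being charged the open market rental value of the space it occupies, so that its reported profit reflects the true cost of that space.

```python
# Hypothetical illustration of an internal (asset) rent charge.
floor_area_m2 = 500          # space occupied by the department
market_rent_per_m2 = 250     # open market rental value (GBP per m2 per year)

internal_rent = floor_area_m2 * market_rent_per_m2
operating_profit = 400_000   # profit before any property charge (GBP)

true_profit = operating_profit - internal_rent
print(f"Internal rent charge: GBP {internal_rent:,}")         # GBP 125,000
print(f"Profit after property charge: GBP {true_profit:,}")   # GBP 275,000
```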

Relevance: 100.00%

Abstract:

There is considerable interest in the use of porous asphalt (PA) surfacing on highways, since physical and subjective assessments of noise have indicated a significant advantage over conventional non-porous surfaces such as hot rolled asphalt (HRA), which is used widely for motorway surfacing in the UK. However, it was not known whether the benefit of the PA surface is affected by the presence of roadside barriers. Noise predictions have been made using the Boundary Element Method (BEM) approach to determine the extent to which the noise-reducing benefits of PA can be added to the screening effects of noise barriers in order to obtain the overall reduction in noise levels.
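
The question being tested can be phrased simply: if the porous surface gives a reduction ΔL_s and the barrier gives an insertion loss ΔL_b, the naive assumption is that the decibel reductions add. The sketch below encodes only that naive additive assumption, which is precisely what the BEM calculations examine; all numbers are illustrative.

```python
# Naive additive assumption for combined noise reduction (dB).
# The BEM study tests whether this simple addition actually holds.
L_ref = 75.0        # level over non-porous HRA with no barrier, dB(A) (illustrative)
dL_surface = 4.0    # benefit of porous asphalt over HRA, dB (illustrative)
dL_barrier = 10.0   # barrier insertion loss, dB (illustrative)

L_naive = L_ref - dL_surface - dL_barrier
print(f"Naive combined level: {L_naive:.1f} dB(A)")  # 61.0 dB(A)
```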

Relevance: 100.00%

Abstract:

In Kazakhstan, a transitional nation in Central Asia, the development of public–private partnerships (PPPs) is at an early stage and increasingly of strategic importance. This case study investigates risk allocation in an ongoing project: the construction and operation of 11 kindergartens in the city of Karaganda under a 14-year concession. Drawing on a conceptual framework of effective risk allocation, the study identifies the principal PPP risks, provides a critical assessment of how and in what way each partner bears a certain risk, highlights the reasons underpinning risk allocation decisions and delineates the lessons learned. The findings show that the government has effectively transferred most risks to the private sector partner, whilst both partners share the demand risk of childcare services and the project default risk. The strong elements of risk allocation include clear assignment of parties' responsibilities, streamlined financing schemes and incentives to complete the main project phases on time. However, risk allocation has missed an opportunity to create incentives for service quality improvements and to take advantage of economies of scale. The most controversial element of risk allocation, as the study finds, is a revenue stream that the operator is supposed to receive from the provision of services unrelated to childcare, as neither partner is able to mitigate this revenue risk. The article concludes that in the kindergartens' PPP the government has achieved almost complete transfer of risks to the private sector partner. However, the cost of this transfer is extensive government financial outlays that seriously compromise the PPP's value for money.

Relevance: 100.00%

Abstract:

This chapter re-evaluates the diachronic, evolutionist model that establishes the Second World War as a watershed between classical and modern cinemas, and ‘modernity’ as the political project of ‘slow cinema’. I will start by historicising the connection between cinematic speed and modernity, going on to survey the veritable obsession with the modern that continues to beset film studies despite the vagueness and contradictions inherent in the term. I will then attempt to clarify what is really at stake within the modern-classical debate by analysing two canonical examples of Japanese cinema, drawn from the geidomono genre (films on the lives of theatre actors), Kenji Mizoguchi’s Story of the Late Chrysanthemums (Zangiku monogatari, 1939) and Yasujiro Ozu’s Floating Weeds (Ukigusa, 1954), with a view to investigating the role of the long take or, conversely, classical editing, in the production or otherwise of a supposed ‘slow modernity’. By resorting to Ozu and Mizoguchi, I hope to demonstrate that the best narrative films in the world have always combined a ‘classical’ quest for perfection with the ‘modern’ doubt of its existence, hence the futility of classifying cinema in general according to an evolutionary and Eurocentric model based on the classical-modern binary. Rather than on a confusing politics of the modern, I will draw on Bazin’s prophetic insight of ‘impure cinema’, a concept he forged in defence of literary and theatrical screen adaptations. Anticipating by more than half a century the media convergence on which the near totality of our audiovisual experience is currently based, ‘impure cinema’ will give me the opportunity to focus on the confluence of film and theatre in these Mizoguchi and Ozu films as the site of a productive crisis where established genres dissolve into self-reflexive stasis, ambiguity of expression and the revelation of the reality of the film medium, all of which, I argue, are more reliable indicators of a film’s political programme than historical teleology. At the end of the journey, some answers may emerge to whether the combination of the long take and the long shot are sufficient to account for a film’s ‘slowness’ and whether ‘slow’ is indeed the best concept to signify resistance to the destructive pace of capitalism.

Relevance: 100.00%

Abstract:

In recent decades, research on knowledge economies has taken centre stage. Within this broader field, research on the role of digital technologies and the creative industries has become increasingly important for researchers, academics and policy makers, with particular focus on their development, supply chains and models of production. Furthermore, many have recognised that, despite the important role played by digital technologies and innovation in the development of the creative industries, these dynamics are hard to capture and quantify. Digital technologies are embedded in the production and market structures of the creative industries yet are also partially distinct and discernible from them. They also seem to play a key role in innovation in the access and delivery of creative content. This chapter tries to assess the role played by digital technologies by focusing on a key element of their implementation and application: human capital. Using student micro-data collected by the Higher Education Statistics Agency (HESA) in the United Kingdom, we explore the characteristics and location patterns of graduates who entered the creative industries, specifically comparing graduates in the creative arts with graduates from digital technology subjects. We highlight patterns of geographical specialisation, but also how different contexts are better able to integrate creativity and innovation in their workforce. The chapter deals specifically with understanding whether these skills are uniformly embedded across the creative sector or are concentrated in specific sub-sectors of the creative industries. Furthermore, it explores the role that these graduates play in different sub-sectors of the creative economy, their economic rewards and their geographical determinants.
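
One standard way to express the geographical specialisation discussed here is the location quotient: a region's share of creative-industry graduates relative to the national share. The abstract does not name its measure, so the sketch below, with made-up counts, is only one plausible formalisation.

```python
def location_quotient(regional_creative, regional_total,
                      national_creative, national_total):
    """LQ > 1 means the region is more specialised than the nation as a whole."""
    return (regional_creative / regional_total) / (national_creative / national_total)

# Hypothetical counts of graduates entering the creative industries.
lq = location_quotient(regional_creative=1_200, regional_total=10_000,
                       national_creative=50_000, national_total=800_000)
print(lq)  # 0.12 / 0.0625 = 1.92: the region is strongly specialised
```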

Relevance: 100.00%

Abstract:

There are at least three distinct time scales that are relevant for the evolution of atmospheric convection. These are the time scale of the forcing mechanism, the time scale governing the response to a steady forcing, and the time scale of the response to variations in the forcing. The last of these, t_mem, is associated with convective life cycles, which provide an element of memory in the system. A highly simplified model of convection is introduced, which allows for investigation of the character of convection as a function of the three time scales. For short t_mem, the convective response is strongly tied to the forcing as in conventional equilibrium parameterization. For long t_mem, the convection responds only to the slowly evolving component of forcing, and any fluctuations in the forcing are essentially suppressed. At intermediate t_mem, convection becomes less predictable: conventional equilibrium closure breaks down and current levels of convection modify the subsequent response.
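
The role of t_mem can be illustrated with a toy model in the spirit of the one described (the paper's actual model may differ): the convective response C relaxes toward a forcing signal that has itself been low-pass filtered with timescale t_mem, so fast fluctuations in the forcing are suppressed when t_mem is long.

```python
import numpy as np

def toy_convection(forcing, dt, t_adj, t_mem):
    """Toy convection model: relax toward a memory-filtered forcing.

    forcing : array of forcing values at each time step
    t_adj   : adjustment timescale of the convective response
    t_mem   : memory timescale that filters the forcing
    """
    f_eff, c = forcing[0], 0.0
    out = np.empty_like(forcing)
    for i, f in enumerate(forcing):
        f_eff += dt * (f - f_eff) / t_mem   # memory: low-pass filter the forcing
        c += dt * (f_eff - c) / t_adj       # relaxation toward filtered forcing
        out[i] = c
    return out

# Steady forcing plus fast fluctuations; long t_mem suppresses the fluctuations.
t = np.arange(0, 48, 0.1)                          # hours
forcing = 1.0 + 0.5 * np.sin(2 * np.pi * t / 3.0)  # 3-hour oscillation
short = toy_convection(forcing, 0.1, t_adj=1.0, t_mem=0.2)
long_ = toy_convection(forcing, 0.1, t_adj=1.0, t_mem=12.0)
print(short.std(), long_.std())  # fluctuations strongly damped for long t_mem
```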

Relevance: 100.00%

Abstract:

The piriform cortex (PC) is highly prone to epileptogenesis, particularly in immature animals, where decreased muscarinic modulation of PC intrinsic fibre excitatory neurotransmission is implicated as a likely cause. However, whether higher levels of acetylcholine (ACh) release occur in immature vs. adult PC remains unclear. We investigated this using in vitro extracellular electrophysiological recording techniques. Intrinsic fibre-evoked extracellular field potentials (EFPs) were recorded from layers II to III in PC brain slices prepared from immature (P14-18) and adult (P>40) rats. Adult and immature PC EFPs were suppressed by eserine (1 µM) or neostigmine (1 µM) application, with a greater suppression in immature (approximately 40%) than adult (approximately 30%) slices. Subsequent application of atropine (1 µM) reversed EFP suppression, producing supranormal (approximately 12%) recovery in adult slices, suggesting that suppression was solely muscarinic ACh receptor-mediated and that some 'basal' cholinergic 'tone' was present. Conversely, atropine only partially reversed anticholinesterase effects in immature slices, suggesting the presence of additional non-muscarinic modulation. Accordingly, nicotine (50 µM) caused immature field suppression (approximately 30%) that was further enhanced by neostigmine, whereas it had no effect on adult EFPs. Unlike atropine, the nicotinic antagonists mecamylamine and methyllycaconitine induced immature supranormal field recovery (approximately 20%) following anticholinesterase-induced suppression (with no effect on adult slices), confirming that basal cholinergic 'tone' was also present. We suggest that nicotinic inhibitory cholinergic modulation occurs in the immature rat PC intrinsic excitatory fibre system, possibly to complement the existing, weak muscarinic modulation, and could be another important developmentally regulated system governing immature PC susceptibility towards epileptogenesis.

Relevance: 100.00%

Abstract:

In this article, we use the no-response test idea, introduced in Luke and Potthast (2003) and Potthast (Preprint) for the inverse obstacle problem, to identify the interface of discontinuity of the coefficient γ of the equation ∇ · γ(x)∇ + c(x), with piecewise regular γ and bounded function c(x). We use infinitely many Cauchy data as measurements and give a reconstructive method to localize the interface. We base this multiwave version of the no-response test on two different proofs. The first contains a pointwise estimate as used by the singular sources method. The second is built on an energy (or integral) estimate, which is the basis of the probe method. As a consequence, the probe and singular sources methods are equivalent with regard to their convergence, and the no-response test can be seen as a unified framework for these methods. As a further contribution, we provide a formula to reconstruct the values of the jump of γ(x), x ∈ ∂D, at the boundary. A second consequence of this formula is that the blow-up rate of the indicator functions of the probe and singular sources methods at the interface is given by the order of the singularity of the fundamental solution.
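
For readers unfamiliar with the setting, the forward problem can be written out explicitly. The following is a sketch of the standard formulation under the stated assumptions (piecewise regular γ, bounded c); the symbols u, Ω and ν are supplied here for illustration and do not appear in the abstract.

```latex
% Sketch of the forward problem and data (standard formulation; details assumed).
\[
  \nabla \cdot \bigl( \gamma(x) \nabla u \bigr) + c(x)\, u = 0
  \quad \text{in } \Omega ,
\]
% gamma is piecewise regular and jumps across the unknown interface
% \partial D; c is bounded. The data are infinitely many Cauchy pairs
% on the outer boundary \partial\Omega:
\[
  \bigl( u\rvert_{\partial\Omega},\;
         \gamma\, \partial_\nu u \rvert_{\partial\Omega} \bigr),
\]
% from which the no-response test localises \partial D and reconstructs
% the boundary values of the jump [\gamma](x), x \in \partial D.
```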

Relevance: 100.00%

Abstract:

The validity of convective parametrization breaks down at the resolution of mesoscale models, and the success of parametrized versus explicit treatments of convection is likely to depend on the large-scale environment. In this paper we examine the hypothesis that a key feature determining the sensitivity to the environment is whether the forcing of convection is sufficiently homogeneous and slowly varying that the convection can be considered to be in equilibrium. Two case studies of mesoscale convective systems over the UK, one where equilibrium conditions are expected and one where equilibrium is unlikely, are simulated using a mesoscale forecasting model. The time evolution of area-average convective available potential energy and the time evolution and magnitude of the timescale of convective adjustment are consistent with the hypothesis of equilibrium for case 1 and non-equilibrium for case 2. For each case, three experiments are performed with different partitionings between parametrized and explicit convection: fully parametrized convection, fully explicit convection and a simulation with significant amounts of both. In the equilibrium case, bulk properties of the convection such as area-integrated rain rates are insensitive to the treatment of convection. However, the detailed structure of the precipitation field changes; the simulation with parametrized convection behaves well and produces a smooth field that follows the forcing region, and the simulation with explicit convection has a small number of localized intense regions of precipitation that track with the mid-level flow. For the non-equilibrium case, bulk properties of the convection such as area-integrated rain rates are sensitive to the treatment of convection. The simulation with explicit convection behaves similarly to the equilibrium case with a few localized precipitation regions. In contrast, the cumulus parametrization fails dramatically and develops intense propagating bows of precipitation that were not observed. The simulations with both parametrized and explicit convection follow the pattern seen in the other experiments, with a transition over the duration of the run from parametrized to explicit precipitation. The impact of convection on the large-scale flow, as measured by upper-level wind and potential-vorticity perturbations, is very sensitive to the partitioning of convection for both cases. © Royal Meteorological Society, 2006. Contributions by P. A. Clark and M. E. B. Gray are Crown Copyright.
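
The "timescale of convective adjustment" used here to diagnose equilibrium can be estimated from model output. One common form, given below as an assumption because the abstract does not state the exact definition used, divides the area-averaged CAPE by the rate at which convection removes it.

```python
def convective_adjustment_timescale(cape, cape_removal_rate):
    """Estimate tau_c = CAPE / (rate of CAPE removal by convection).

    cape              : area-averaged CAPE (J/kg)
    cape_removal_rate : rate at which convection consumes CAPE (J/kg per s)
    A small tau_c (up to a few hours) suggests convective quasi-equilibrium.
    """
    return cape / cape_removal_rate

# Illustrative numbers: 1000 J/kg consumed at 0.1 J/kg per second.
tau = convective_adjustment_timescale(1000.0, 0.1)
print(f"tau_c = {tau / 3600:.1f} h")  # 2.8 h -> near-equilibrium regime
```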

Relevance: 100.00%

Abstract:

Halberda (2003) demonstrated that 17-month-old infants, but not 14- or 16-month-olds, use a strategy known as mutual exclusivity (ME) to identify the meanings of new words. When 17-month-olds were presented with a novel word in an intermodal preferential looking task, they preferentially fixated a novel object over an object for which they already had a name. We explored whether the development of this word-learning strategy is driven by children's experience of hearing only one name for each referent in their environment by comparing the behavior of infants from monolingual and bilingual homes. Monolingual infants aged 17–22 months showed clear evidence of using an ME strategy, in that they preferentially fixated the novel object when they were asked to "look at the dax." Bilingual infants of the same age and vocabulary size failed to show a similar pattern of behavior. We suggest that children who are raised with more than one language fail to develop an ME strategy in parallel with monolingual infants because development of the bias is a consequence of the monolingual child's everyday experiences with words.