42 results for SAY

in CentAUR: Central Archive, University of Reading - UK


Relevance:

20.00%

Publisher:

Abstract:

Recent developments in contracting practice in the UK have built upon recommendations contained in high-profile reports, such as those by Latham and Egan. However, the New Engineering Contract (NEC), endorsed by Latham, is based upon principles of contract drafting that seem open to question. Any contract operates in the context of its legislative environment and current working practices. This report identifies eight contentious hypotheses in the literature on construction contracts and tests their validity in a sample survey that attracted 190 responses. The survey shows, among other things, that while partnership is a positive and useful idea, authoritative contract management is considered more effective, and that "win-win" contracts, while desirable, are basically impractical. Further, precision and fairness in contracts are not easy to achieve simultaneously. While participants should know what is in their contracts, they should not routinely resort to legal action; and standard-form contracts should not seek to be universally applicable. Fundamental changes to drafting policy should be undertaken within the context of current legal contract doctrine and with sensitivity to the way that contracts are used in contemporary practice. Attitudes to construction contracting may seem to be changing on the surface, but detailed analysis of what lies behind apparent agreement on new ways of working reveals that attitudes are changing much more slowly than they appear to be.

Relevance:

20.00%

Publisher:

Abstract:

Taking a generative perspective, we divide aspects of language into three broad categories: those that cannot be learned (because they are inherent in Universal Grammar), those that are derived from Universal Grammar, and those that must be learned from the input. Using this framework to clarify the "what" of learning, we take the acquisition of null (and overt) subjects in languages like Spanish as an example of how to apply it. We demonstrate which properties of a null-subject grammar cannot be learned explicitly and which can, but argue that it remains an open empirical question whether the latter properties are in fact learned through explicit processes, showing how linguistic and psychological approaches may intersect to better understand acquisition.

Relevance:

10.00%

Publisher:

Abstract:

This paper presents results from a project designed to explore the meaning and function of partnership within the Catholic Church development chain. The geography literature has had little to say about such aid chains, especially those founded on faith-based groups. The relationships of three Catholic Church-based donors - referred to as A, B and C - with development personnel of the dioceses of the Abuja Ecclesiastical Province (AEP), as well as with other Catholic Church structures in Nigeria, were analysed. The aim was to explore the forces behind these relationships and how 'patchy' they were across the AEP. Respondents were asked to give each of the donors a score on four questions covering their relationship with that donor. Results suggest that the modus operandi of donor 'A' allows it to be perceived as the 'best' partner, while 'B' was scored less favourably because of a perception that it attempts to act independently of existing structures in Nigeria rather than work through them. There was significant variation between dioceses in this regard, as well as between the dioceses and other structures of the Church (Provinces, Inter-Provinces and the National Secretariat). Thus 'partnership' in the Catholic Church aid chain is a highly complex, contested and 'visioned' term, and the development of an analytical framework has to take account of these fundamentals.
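The scoring exercise described above lends itself to a simple aggregation. The Python sketch below averages respondent scores for donors A, B and C across four relationship questions; the respondents, questions and score values are invented for illustration and are not the project's data.

```python
# Illustrative sketch only: aggregating hypothetical respondent scores for the
# three donors (A, B, C) across four relationship questions. All data invented.
from statistics import mean

# Each respondent scores each donor on four questions (1 = poor ... 5 = excellent).
responses = [
    {"donor": "A", "scores": [5, 4, 5, 4]},
    {"donor": "B", "scores": [3, 2, 3, 3]},
    {"donor": "C", "scores": [4, 4, 3, 4]},
    {"donor": "A", "scores": [4, 5, 4, 5]},
    {"donor": "B", "scores": [2, 3, 2, 3]},
    {"donor": "C", "scores": [4, 3, 4, 3]},
]

def mean_score_by_donor(responses):
    """Average each donor's scores over all respondents and all four questions."""
    totals = {}
    for r in responses:
        totals.setdefault(r["donor"], []).extend(r["scores"])
    return {donor: round(mean(scores), 2) for donor, scores in totals.items()}

print(mean_score_by_donor(responses))  # e.g. {'A': 4.5, 'B': 2.62, 'C': 3.62}
```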

Relevance:

10.00%

Publisher:

Abstract:

The El Niño–Southern Oscillation (ENSO) is a naturally occurring fluctuation that originates in the tropical Pacific region and affects ecosystems, agriculture, freshwater supplies, hurricanes and other severe weather events worldwide. Under the influence of global warming, the mean climate of the Pacific region will probably undergo significant changes. The tropical easterly trade winds are expected to weaken; surface ocean temperatures are expected to warm fastest near the equator and more slowly farther away; the equatorial thermocline that marks the transition between the wind-mixed upper ocean and deeper layers is expected to shoal; and the temperature gradients across the thermocline are expected to become steeper. Year-to-year ENSO variability is controlled by a delicate balance of amplifying and damping feedbacks, and one or more of the physical processes that are responsible for determining the characteristics of ENSO will probably be modified by climate change. Therefore, despite considerable progress in our understanding of the impact of climate change on many of the processes that contribute to El Niño variability, it is not yet possible to say whether ENSO activity will be enhanced or damped, or if the frequency of events will change.

Relevance:

10.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate DTMs of floodplains for use as model bathymetry. Spatial resolutions of 0.5 m or less are possible, with a height accuracy of 0.15 m. LiDAR gives a Digital Surface Model (DSM), so vegetation-removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art will be the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data into a ground DTM and a separate vegetation-height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation-height map, a DTM with bridges removed, etc.

Most vegetation-removal software ignores short vegetation less than, say, 1 m high, yet such vegetation may typically cover most of a floodplain. We have attempted to extend vegetation-height measurement to short vegetation using local height texture. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying; this obviates the need to calibrate a global floodplain friction coefficient. It is not yet clear whether the method is useful, but it is worth testing further.

The LiDAR DTM is usually determined by looking for local minima in the raw data and then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern-recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (MasterMap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation; the problems involved in doing this will be discussed. A related problem, how best to merge historic river cross-section data with a LiDAR DTM, will also be considered.

LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that, for example, hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc., as well as trees and hedges. A dominant-points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes; however, the mesh generated may be useful in allowing a high-resolution finite element model to act as a benchmark for a more practical lower-resolution model.

A further problem discussed will be how best to exploit the data redundancy that arises because the LiDAR resolution is much higher than that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5 m-wide embankment within a raster grid model with a 15 m cell size, the maximum local height of the embankment could be assigned to each cell covering the embankment, but how could a 5 m-wide ditch be represented? This redundancy has also been exploited to improve wetting/drying algorithms, using the sub-grid-scale LiDAR heights within finite elements at the waterline.
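As a rough illustration of the friction-assignment idea described above, the Python sketch below maps a per-cell vegetation-height grid to a spatially varying Manning's n value; the height classes and roughness values are illustrative assumptions, not those used in the project.

```python
# Minimal sketch (not the authors' code) of deriving a spatially varying friction
# coefficient from a LiDAR vegetation-height map, so that a single global
# floodplain friction coefficient need not be calibrated.
# The height thresholds and Manning's n values below are illustrative assumptions.
import numpy as np

def friction_from_vegetation_height(veg_height):
    """Map per-cell vegetation height (m) to a Manning's n roughness value."""
    n = np.empty_like(veg_height, dtype=float)
    n[veg_height < 0.1] = 0.03                            # bare ground / very short grass
    n[(veg_height >= 0.1) & (veg_height < 1.0)] = 0.05    # short vegetation (grass, crops)
    n[(veg_height >= 1.0) & (veg_height < 5.0)] = 0.08    # hedges, scrub
    n[veg_height >= 5.0] = 0.12                           # trees
    return n

# Example: a 4 x 4 vegetation-height grid (m) at the LiDAR/model resolution.
veg = np.array([[0.05, 0.3, 0.4, 2.5],
                [0.05, 0.2, 1.5, 6.0],
                [0.0,  0.0, 0.8, 7.2],
                [0.0,  0.1, 0.9, 8.0]])
print(friction_from_vegetation_height(veg))
```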

Relevance:

10.00%

Publisher:

Abstract:

The vibration-rotation Raman spectrum of the ν2 and ν5 fundamentals of CH3F is reported, from 1320 to 1640 cm−1, with a resolution of about 0.3 cm−1. The Coriolis resonance between the two bands leads to many perturbation-allowed transitions. Where the resonance is still sufficiently weak that the quantum number K′ retains its meaning, perturbation-allowed transitions are observed for all values of ΔK from +4 to −4; in regions of strong resonance, however, we can only say that the observed transitions obey the selection rule Δ(k−l) = 0 or ±3. The spectrum has been analyzed by band contour simulation using a computer program based on exact diagonalization of the Hamiltonian within the ν2, ν5 vibrational levels, and improved vibration-rotation constants for these bands are reported. The relative magnitudes and relative signs of the polarizability derivatives involved in these vibrations are also reported.
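As an illustration of the principle behind such a band-contour calculation, the Python sketch below exactly diagonalises a minimal 2 x 2 effective Hamiltonian in which two levels are coupled by an off-diagonal Coriolis term; the matrix size and all numerical values are invented for demonstration and do not reproduce the constants or the full rotational basis of the actual analysis.

```python
# Illustrative sketch only: diagonalising a 2 x 2 effective Hamiltonian in which
# the nu2 and nu5 levels of a given rotational state interact through an
# off-diagonal Coriolis coupling W. All numbers are invented for demonstration.
import numpy as np

def coriolis_mixed_levels(E_nu2, E_nu5, W):
    """Return eigenvalues (cm^-1) and eigenvectors of the coupled two-level system."""
    H = np.array([[E_nu2, W],
                  [W,     E_nu5]])
    return np.linalg.eigh(H)   # exact diagonalisation of the symmetric matrix

# Example: two unperturbed levels 10 cm^-1 apart, coupled by W = 4 cm^-1.
vals, vecs = coriolis_mixed_levels(1470.0, 1480.0, 4.0)
print(vals)   # perturbed level positions
print(vecs)   # mixing coefficients between the nu2 and nu5 basis states
```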

Relevance:

10.00%

Publisher:

Abstract:

Capillary electrophoresis (CE) offers the analyst a number of key advantages for the analysis of the components of foods. CE offers better resolution than, say, high-performance liquid chromatography (HPLC), and is more adept at the simultaneous separation of a number of components of different chemistries within a single matrix. In addition, CE requires less rigorous sample cleanup procedures than HPLC, while offering the same degree of automation. However, despite these advantages, CE remains under-utilized by food analysts. Therefore, this review consolidates and discusses the currently reported applications of CE that are relevant to the analysis of foods. Some discussion is also devoted to the development of these reported methods and to the advantages/disadvantages compared with the more usual methods for each particular analysis. It is the aim of this review to give practicing food analysts an overview of the current scope of CE.

Relevance:

10.00%

Publisher:

Abstract:

It is well known that conversationalists often imitate each other's body language as a sign of closeness and empathy. This study shows that in spontaneous, unplanned conversation, speakers go as far as emulating each other's grammar. The use of a family of focusing constructions (namely, the cleft), such as 'it was my mother who rang the other day' or 'what I meant to say was that he should go Thursday', was investigated in a corpus of conversation excerpts in New Zealand English. Findings show that clefting is contagious. In other words, if one speaker uses a cleft, others will be likely to do so too.
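A crude way to quantify such 'contagion' is to compare how often a cleft appears in turns that immediately follow a cleft with how often it appears elsewhere. The Python sketch below does this on a toy exchange; the detection rule and the example turns are invented for illustration and are not the study's corpus or method.

```python
# Rough sketch (not the study's method): compare the rate of clefts in turns that
# immediately follow a cleft with the rate in other turns. Toy data, naive detector.
import re

turns = [
    "he rang again last night",
    "oh right what did he want",
    "it was my mother who rang actually",
    "what she wanted to say was that we should visit Thursday",
    "right that makes sense",
    "he always leaves it to her",
]

def is_cleft(turn):
    """Very crude detector for it-clefts and wh-(pseudo)clefts."""
    return bool(re.search(r"\bit (was|is) .+ (who|that)\b", turn)
                or re.search(r"\bwhat .+ (was|is) (that\b|to\b)", turn))

flags = [is_cleft(t) for t in turns]
after_cleft = [b for a, b in zip(flags, flags[1:]) if a]        # turns following a cleft
after_non_cleft = [b for a, b in zip(flags, flags[1:]) if not a]
rate = lambda xs: sum(xs) / len(xs) if xs else float("nan")
print(f"cleft rate after a cleft:     {rate(after_cleft):.2f}")   # 0.50 in this toy corpus
print(f"cleft rate after a non-cleft: {rate(after_non_cleft):.2f}")  # 0.33 in this toy corpus
```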

Relevance:

10.00%

Publisher:

Abstract:

Greek speakers say "ουρά", Germans "Schwanz" and the French "queue" to describe what English speakers call a 'tail', but all of these languages use a related form of 'two' to describe the number after one. Among more than 100 Indo-European languages and dialects, the words for some meanings (such as 'tail') evolve rapidly, being expressed across languages by dozens of unrelated words, while others evolve much more slowly - such as the number 'two', for which all Indo-European language speakers use the same related word-form [1]. No general linguistic mechanism has been advanced to explain this striking variation in rates of lexical replacement among meanings. Here we use four large and divergent language corpora (English [2], Spanish [3], Russian [4] and Greek [5]) and a comparative database of 200 fundamental vocabulary meanings in 87 Indo-European languages [6] to show that the frequency with which these words are used in modern language predicts their rate of replacement over thousands of years of Indo-European language evolution. Across all 200 meanings, frequently used words evolve at slower rates and infrequently used words evolve more rapidly. This relationship holds separately and identically across parts of speech for each of the four language corpora, and accounts for approximately 50% of the variation in historical rates of lexical replacement. We propose that the frequency with which specific words are used in everyday language exerts a general and law-like influence on their rates of evolution. Our findings are consistent with social models of word change that emphasize the role of selection, and suggest that owing to the ways that humans use language, some words will evolve slowly and others rapidly across all languages.
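The reported relationship can be illustrated schematically with a simple regression of replacement rate on log usage frequency, as in the Python sketch below; the data points, units and fit are invented for demonstration and are not the study's.

```python
# Schematic illustration (with invented data, not the study's): regress a meaning's
# rate of lexical replacement on the log of its modern usage frequency and report
# the variance explained (R^2).
import numpy as np

# Hypothetical (frequency per million words, replacement rate) pairs.
freq = np.array([2500.0, 900.0, 300.0, 120.0, 40.0, 12.0, 5.0, 1.5])   # e.g. 'two' ... 'tail'
rate = np.array([0.2,    0.4,   0.7,   1.1,  1.6,  2.3, 2.9, 3.6])     # replacements per 10 kyr

x = np.log10(freq)
slope, intercept = np.polyfit(x, rate, 1)            # ordinary least-squares fit
predicted = slope * x + intercept
r2 = 1 - np.sum((rate - predicted) ** 2) / np.sum((rate - rate.mean()) ** 2)

print(f"slope = {slope:.2f} (negative: more frequent words are replaced more slowly)")
print(f"R^2   = {r2:.2f}")
```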

Relevance:

10.00%

Publisher:

Abstract:

Intelligent buildings should provide a multi-sensory experience so that visual, aural, tactile, olfactory and gustatory senses are stimulated appropriately. A lack of environmental stimuli produces a boring and unsatisfying environment. It is now known that the environment affects people at deeper levels than, say, health and safety, and consequently it can modify moods and work performance. A holistic approach is proposed which recognizes that the physical environment together with social, organizational and personal factors can enhance the productivity of occupants. This approach provides a footprint for the design of healthier and more sustainable workplaces.

Relevance:

10.00%

Publisher:

Abstract:

This paper demonstrates that recent influential contributions to monetary policy imply an emerging consensus whereby neither rigid rules nor complete discretion is found optimal. Instead, middle-ground monetary regimes are gaining support in theoretical models and in policy formulation and implementation: rules (operative under 'normal' circumstances) anchor inflation expectations over the long run, but are designed with enough flexibility to mitigate the short-run effect of shocks, with communicated discretion temporarily overriding these rules in 'exceptional' circumstances. The opposition of 'rules versus discretion' has thus reappeared as the synthesis of 'rules cum discretion', in essence as inflation-forecast targeting. But such a synthesis is not without major theoretical problems, as we argue in this contribution. Furthermore, very recent real-world events have made it obvious that the inflation-targeting strategy of monetary policy, which rests upon the new consensus paradigm in modern macroeconomics, is at best a 'fair weather' model. In today's turbulent economic climate of highly unstable inflation, deep financial crisis and abrupt, world-wide economic slowdown, this approach needs serious rethinking, to say the least, if not abandoning altogether.
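The 'rules cum discretion' regime described here can be caricatured with a policy-rate function that follows a simple Taylor-type rule in normal times and a communicated discretionary override in exceptional circumstances. The Taylor-type form and all coefficients in the Python sketch below are illustrative assumptions, not taken from the paper.

```python
# Stylised sketch of a middle-ground 'rules cum discretion' regime: a simple
# Taylor-type rule sets the policy rate in 'normal' times, while a communicated
# discretionary override applies in 'exceptional' circumstances.
# The rule form and every coefficient are illustrative assumptions.

def policy_rate(inflation, inflation_target, output_gap,
                neutral_rate=2.0, exceptional=False, discretionary_rate=None):
    """Return the policy interest rate (percent)."""
    if exceptional and discretionary_rate is not None:
        # Temporary, communicated override of the rule in a crisis.
        return discretionary_rate
    # Taylor-type rule anchoring expectations in normal circumstances.
    return (neutral_rate + inflation
            + 0.5 * (inflation - inflation_target)
            + 0.5 * output_gap)

# Normal times: inflation 3% against a 2% target, small positive output gap.
print(policy_rate(3.0, 2.0, 0.5))                                             # 5.75
# Exceptional times (e.g. deep financial crisis): rule temporarily overridden.
print(policy_rate(1.0, 2.0, -4.0, exceptional=True, discretionary_rate=0.5))  # 0.5
```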