Abstract:
This paper sets out the findings of a group of research and development projects carried out at the Department of Real Estate & Planning at the University of Reading and at Oxford Property Systems over the period 1999–2003. The projects have several aims: to identify the fundamental drivers of the pricing of different lease terms in the UK property sector; to identify current and best market practice and uncover the main variations in lease terms; to identify key issues in pricing lease terms; and to develop a model for the pricing of rent under a variety of lease variations. From the landlord’s perspective, the main factors driving the required ‘compensation’ for a lease term amendment include expected rental volatility, expected probability of tenant vacation, and the expected costs of tenant vacation. These data are used in conjunction with simulation technology, reflecting the options inherent in certain lease types, to explore the required rent adjustment. The resulting cash flows have interesting qualities which illustrate the potential importance of option pricing in a non-complex and practical way.
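To make the option-pricing intuition concrete, the following is a minimal Monte Carlo sketch of how a tenant break option might be priced against the three drivers named above (rental volatility, probability of vacation, and the costs of vacation). It is not the authors' model: the vacation rule, the parameter names (volatility, vacate_prob, void_years, reletting_cost, discount) and all numerical values are assumptions chosen purely for illustration.

import numpy as np

rng = np.random.default_rng(0)

def lease_pv(passing_rent, term=10, break_year=5, rent0=100.0,
             volatility=0.15, vacate_prob=0.5, void_years=1,
             reletting_cost=0.5, discount=0.07, n_paths=20_000):
    """Expected present value of landlord cash flows for a lease with one tenant break."""
    years = np.arange(1, term + 1)
    # Market rent follows a geometric Brownian motion (annual steps, zero drift).
    shocks = rng.normal(-0.5 * volatility ** 2, volatility, (n_paths, term))
    market = rent0 * np.exp(np.cumsum(shocks, axis=1))

    cash = np.full((n_paths, term), float(passing_rent))
    # At the break the tenant may vacate; assume this is far more likely when the
    # market rent has fallen below the passing rent (the option is in the money).
    in_money = market[:, break_year - 1] < passing_rent
    vacates = rng.random(n_paths) < np.where(in_money, vacate_prob, 0.2 * vacate_prob)
    for t in range(break_year, term):
        void = vacates & (t < break_year + void_years)      # rent-free void period
        cash[:, t] = np.where(void, 0.0,
                              np.where(vacates, market[:, t], passing_rent))
    cash[vacates, break_year] -= reletting_cost * passing_rent  # one-off re-letting cost

    return (cash * (1 + discount) ** -years).sum(axis=1).mean()

# Required 'compensation': the rent uplift on a breakable lease that restores the
# value of a straight 10-year lease let at a rent of 100.
straight = lease_pv(100.0, vacate_prob=0.0)
for uplift in np.linspace(0.0, 0.25, 51):
    if lease_pv(100.0 * (1 + uplift)) >= straight:
        print(f"approximate required uplift: {uplift:.1%}")
        break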
Abstract:
We review the proposal of the International Committee for Weights and Measures (Comité International des Poids et Mesures, CIPM), currently being considered by the General Conference on Weights and Measures (Conférence Générale des Poids et Mesures, CGPM), to revise the International System of Units (Le Système International d’Unités, SI). The proposal includes new definitions for four of the seven base units of the SI, and a new form of words to present the definitions of all the units. The objective of the proposed changes is to adopt definitions referenced to constants of nature, taken in the widest sense, so that the definitions may be based on what are believed to be true invariants. In particular, whereas in the current SI the kilogram, ampere, kelvin and mole are linked to exact numerical values of the mass of the international prototype of the kilogram, the magnetic constant (permeability of vacuum), the triple-point temperature of water and the molar mass of carbon-12, respectively, in the new SI these units are linked to exact numerical values of the Planck constant, the elementary charge, the Boltzmann constant and the Avogadro constant, respectively. The new wording expresses the definitions in a simple and unambiguous manner without the need for the distinction between base and derived units. The importance of relations among the fundamental constants to the definitions, and the importance of establishing a mise en pratique for the realization of each definition, are also discussed.
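As a concrete illustration of the structure of the proposed definitions, the short Python sketch below lists the four defining constants and checks one exact relation among them. The numerical values shown are those eventually fixed when the revision was adopted by the 26th CGPM in 2018; at the proposal stage reviewed here the final digits had not yet been settled.

# The four defining constants whose exact numerical values replace the kilogram,
# ampere, kelvin and mole definitions in the revised SI. Values as fixed by the
# 26th CGPM (2018); the proposal discussed in the paper predates these final digits.
h   = 6.626_070_15e-34    # Planck constant, J s        -> defines the kilogram
e   = 1.602_176_634e-19   # elementary charge, C        -> defines the ampere
k   = 1.380_649e-23       # Boltzmann constant, J/K     -> defines the kelvin
N_A = 6.022_140_76e23     # Avogadro constant, 1/mol    -> defines the mole

# One of the "relations among the fundamental constants" mentioned above:
# the molar gas constant is no longer measured but follows exactly.
R = N_A * k
print(f"R = {R:.9f} J/(mol K)")   # 8.314462618 J/(mol K)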
Abstract:
This article presents findings and seeks to establish the theoretical markers that indicate the growing importance of fact-based drama in screen and theatre performance to the wider Anglophone culture. During the final decade of the twentieth century and the opening one of the twenty-first, television docudrama and documentary theatre have grown in visibility and importance in the UK, providing key responses to social, cultural and political change over the millennial period. Actors were the prime focus for the enquiry principally because so little research has been done into the special demands that fact-based performance makes on them. The main emphasis in actor training (in the UK at any rate) is, as it always has been, on preparation for fictional drama. Preparation in acting schools is also heavily geared towards stage performance. Our thesis was that performers called upon to play the roles of real people, in whatever medium, have added responsibilities both towards history and towards real individuals and their families. Actors must engage with ethical questions whether they like it or not, and we found them keenly aware of this. In the course of the research, we conducted 30 interviews with a selection of actors ranging from the experienced to the recently trained. We also interviewed a few industry professionals and actor trainers. Once the interviews started it was clear that actors themselves made little or no distinction between how they set about their work for television and film. The essential disciplines for work in front of the camera, they told us, are the same whether the camera is electronic or photographic. Some adjustments become necessary, of course, in the multi-camera TV studio. But much serious drama for the screen is made on film anyway. We found it was also the case that young actors now tend to get their first paid employment before a camera rather than on a stage. The screen-before-stage tendency, along with the fundamental re-shaping that has gone on in the British theatre since at least the early 1980s, had implications for actor training. We have also found that theatre work still tends to be most valued by actors. For all the actors we interviewed, theatre was what they liked doing best because it was there they could practice and develop their skills, there they could work most collectively towards performance, and there they could more directly experience audience feedback in the real time of the stage play. The current world of television has been especially constrained in regard to rehearsal time in comparison to theatre (and, to a lesser extent, film). This has also affected actors’ valuation of their work. Theatre is, and is not, the most important medium in which they find work. Theatre is most important spiritually and intellectually, because theatre is collaborative, intensive, and involving; theatre is not as important in financial and career terms, because it is not as lucrative and not as visible to a large public as acting for the screen. Many actors took the view that, for all the industrial differences that do affect them and inevitably interest the academic, acting for the visible media of theatre, film and television involved fundamentally the same process with slightly different emphases.
Abstract:
As in any technology system, analysis and design issues are among the fundamental challenges in persuasive technology. Currently, the Persuasive Systems Development (PSD) framework is considered to be the most comprehensive framework for the design and evaluation of persuasive systems. However, the framework is limited in terms of providing detailed information that can guide the selection of appropriate techniques depending on the variable nature of users or of use over time. In light of this, we propose the 3D-RAB model, a model intended for analysing and implementing behavioural change in persuasive technology. The 3D-RAB model represents the three-dimensional relationships between attitude towards behaviour, attitude towards change or maintaining a change, and current behaviour, and distinguishes variable levels in a user’s cognitive state. As such, it provides a framework which could be used to select appropriate techniques for persuasive technology.
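The sketch below shows one way the three dimensions named above could be represented as a user state in code. Treating each dimension as a simple positive/negative level (giving 2^3 = 8 combined states) and the mapping to a persuasive emphasis are assumptions made here for illustration; they are not a complete statement of the 3D-RAB model.

from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class UserState:
    current_behaviour: bool        # is the target behaviour currently performed?
    attitude_to_behaviour: bool    # favourable attitude towards the behaviour?
    attitude_to_change: bool       # favourable attitude towards changing/maintaining it?

def suggested_emphasis(s: UserState) -> str:
    """Pick a broad persuasive emphasis for a state (illustrative mapping only)."""
    if s.current_behaviour and s.attitude_to_behaviour and s.attitude_to_change:
        return "reinforce and maintain"
    if not s.current_behaviour and s.attitude_to_behaviour:
        return "reduce barriers to acting"
    if not s.attitude_to_behaviour:
        return "address attitude before prompting action"
    return "prompt and remind"

# Enumerate the eight combined states and the emphasis each would suggest.
for flags in product([True, False], repeat=3):
    state = UserState(*flags)
    print(state, "->", suggested_emphasis(state))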
Abstract:
A number of recent articles emphasize the fundamental importance of taphonomy and formation processes to interpretation of plant remains assemblages, as well as the value of interdisciplinary approaches to studies of environmental change and ecological and social practices. This paper examines ways in which micromorphology can contribute to integrating geoarchaeology and archaeobotany in analysis of the taphonomy and context of plant remains and ecological and social practices. Micromorphology enables simultaneous in situ study of diverse plant materials and thereby traces of a range of depositional pathways and histories. In addition to charred plant remains, also often preserved in semi-arid environments are plant impressions, phytoliths and calcitic ashes. These diverse plant remains are often routinely separated and extracted from their depositional context or lost using other analytical techniques, thereby losing crucial evidence on taphonomy, formation processes and contextual associations, which are fundamental to all subsequent interpretations. Although micromorphological samples are small in comparison to bulk flotation samples of charred plant remains, their size is similar to phytolith and pollen samples. In this paper, key taphonomic issues are examined in the study of: fuel; animal dung, animal management and penning; building materials; and specific activities, including food storage and preparation and ritual, using selected case-studies from early urban settlements in the Ancient Near East. Microarchaeological residues and experimental archaeology are also briefly examined.
Abstract:
Forest canopies are important components of the terrestrial carbon budget, which has motivated a worldwide effort, FLUXNET, to measure CO2 exchange between forests and the atmosphere. These measurements are difficult to interpret and to scale up to estimate exchange across a landscape. Here we review the effects of complex terrain on the mean flow, turbulence, and scalar exchange in canopy flows, as exemplified by adjustment to forest edges and hills, including the effects of stable stratification. We focus on the fundamental fluid mechanics, in which developments in theory, measurements, and modeling, particularly through large-eddy simulation, are identifying important processes and providing scaling arguments. These developments set the stage for the development of predictive models that can be used in combination with measurements to estimate exchange at the landscape scale.
Abstract:
This paper reviews the impact of the global financial crisis on financial system reform in China. Scholars and practitioners have critically questioned the efficiencies of the Anglo-American principal-agent model of corporate governance, which promotes shareholder-value maximisation. Should China continue to follow the U.K.-U.S. path in relation to financial reform? This conceptual paper provides an insightful review of the corporate governance literature, regulatory reports and news articles from the financial press. After examining the fundamental limitations of the laissez-faire philosophy that underpins the neo-liberal model of capitalism, the paper considers the risks in opening up China’s financial markets and relaxing monetary and fiscal policies. The paper outlines a critique of shareholder capitalism in relation to the German team-production model of corporate governance, promoting a “social market economy”-style capitalism. Through such analysis, the paper explores numerous implications for China to consider in terms of developing a new and sustainable corporate governance model. China needs to pursue its own financial reform through understanding its particular economy. The global financial crisis might help China rethink the nature of corporate governance, identify its weaknesses and assess the current reform agenda.
Abstract:
Acrylamide is formed from reducing sugars and asparagine during the preparation of French fries. The commercial preparation of French fries is a multi-stage process involving the preparation of frozen, par-fried potato strips for distribution to catering outlets where they are finish fried. The initial blanching, treatment in glucose solution and par-frying steps are crucial since they determine the levels of precursors present at the beginning of the finish frying process. In order to minimize the quantities of acrylamide in cooked fries, it is important to understand the impact of each stage on the formation of acrylamide. Acrylamide, amino acids, sugars, moisture, fat and color were monitored at time intervals during the frying of potato strips which had been dipped in varying concentrations of glucose and fructose during a typical pretreatment. A mathematical model of the finish-frying was developed based on the fundamental chemical reaction pathways, incorporating moisture and temperature gradients in the fries. This showed the contribution of both glucose and fructose to the generation of acrylamide, and accurately predicted the acrylamide content of the final fries.
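To give a flavour of the kind of reaction-kinetics model described, the Python sketch below integrates a deliberately simplified scheme (reducing sugar + asparagine -> Maillard intermediate -> acrylamide, with first-order acrylamide loss) at a fixed frying temperature. The scheme, rate constants, activation energies and initial concentrations are placeholder assumptions for illustration only; the model developed in the paper is fitted to measured data and additionally resolves moisture and temperature gradients within the fries.

import numpy as np
from scipy.integrate import solve_ivp

R = 8.314  # gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    # Temperature dependence of each rate constant.
    return A * np.exp(-Ea / (R * T))

def rhs(t, y, T):
    sugar, asn, inter, acr = y
    k1 = arrhenius(A=5e6, Ea=9e4, T=T)    # sugar + asparagine -> intermediate
    k2 = arrhenius(A=1e7, Ea=1e5, T=T)    # intermediate -> acrylamide
    k3 = arrhenius(A=2e6, Ea=1e5, T=T)    # acrylamide degradation
    r1 = k1 * sugar * asn
    r2 = k2 * inter
    r3 = k3 * acr
    return [-r1, -r1, r1 - r2, r2 - r3]

T_fry = 175 + 273.15                       # assumed isothermal frying temperature, K
y0 = [20.0, 10.0, 0.0, 0.0]                # mmol/kg: sugar, asparagine, intermediate, acrylamide
sol = solve_ivp(rhs, (0, 180), y0, args=(T_fry,), dense_output=True)

for t in (30, 60, 120, 180):               # seconds of finish frying
    acr = sol.sol(t)[3]
    print(f"t = {t:3d} s  acrylamide ~ {acr:.3f} mmol/kg (illustrative)")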
Abstract:
This article discusses the sources of competitive advantage in the interwar British radio industry. Specifically, it examines why sections of the industry that reaped substantial monopoly rents from the downstream value chain failed to dominate the industry. During the 1920s Marconi (which controlled the fundamental UK patents) had a key cost advantage, as had other members of the ‘Big Six’ electrical engineering firms which formed the BBC and were granted preferential royalties. Meanwhile the valve manufacturers' cartel was also able to extract high rents from set manufacturers. The vertical integration literature suggests that input monopolists have incentives to control downstream production. Yet—in contrast to the gramophone industry, which became concentrated into two huge companies following market saturation in the 1930s—radio retained a much more competitive structure. The Big Six failed to capitalize fully on their initial cost advantages owing to logistical weaknesses in supplying markets subject to rapid technical and design obsolescence. Subsequently, during the 1930s, marketing innovations are shown to have played a key role in allowing several independents to establish successful brands. This gave them sufficient scale to provide strong bargaining positions with input suppliers, negating most of their initial cost disadvantage.
Abstract:
Acrylamide is formed from reducing sugars and asparagine during the preparation of French fries. The commercial preparation of French fries is a multistage process involving the preparation of frozen, par-fried potato strips for distribution to catering outlets, where they are finish-fried. The initial blanching, treatment in glucose solution, and par-frying steps are crucial because they determine the levels of precursors present at the beginning of the finish-frying process. To minimize the quantities of acrylamide in cooked fries, it is important to understand the impact of each stage on the formation of acrylamide. Acrylamide, amino acids, sugars, moisture, fat, and color were monitored at time intervals during the frying of potato strips that had been dipped in various concentrations of glucose and fructose during a typical pretreatment. A mathematical model based on the fundamental chemical reaction pathways of the finish-frying was developed, incorporating moisture and temperature gradients in the fries. This showed the contribution of both glucose and fructose to the generation of acrylamide and accurately predicted the acrylamide content of the final fries.
Abstract:
Social housing policy in the UK mirrors wider processes associated with shifts in broad welfare regimes. Social housing has moved from dominance by state housing provision, to the funding of new investment through voluntary sector housing associations, to what is now a greater focus on the regulation and private financing of these not-for-profit bodies. If these trends run their course, we are likely to see a range of not-for-profit bodies providing non-market housing in a highly regulated quasi-market. This paper examines these issues through the lens of new institutional economics, which it is believed can provide important insights into the fundamental contractual and regulatory relationships that are coming to dominate social housing from the perspective of the key actors in the sector (not-for-profit housing organisations, their tenants, private lenders and the regulatory state). The paper draws on evidence recently collected from a study evaluating more than 100 stock transfer organisations that inherited ex-public housing in Scotland, including 12 detailed case studies. The paper concludes that social housing stakeholders need to be aware of the risks (and their management) faced across the sector and that the state needs to have clear objectives for social housing and coherent policy instruments to achieve those ends.
Abstract:
Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24 h forecasts for the 500 mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. the primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500–1000 km and a vertical extension of a few hundred millibars up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5°, 15 vertical levels and coverage of the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat and release of latent heat in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre’s model.
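For reference, the drastic simplification referred to above is the filtered barotropic model of the early 1950s, which advances a single mid-tropospheric level by conserving absolute vorticity following the nondivergent (geostrophic) flow; in standard notation,

\[
  \frac{\partial \zeta}{\partial t} + \mathbf{v}_{\psi}\cdot\nabla\,(\zeta + f) = 0,
  \qquad \zeta = \nabla^{2}\psi,
  \qquad \mathbf{v}_{\psi} = \hat{\mathbf{k}}\times\nabla\psi,
\]

where \(\psi\) is the streamfunction of the nondivergent wind (related to the 500 mb geopotential through the geostrophic approximation, \(\psi \approx \Phi/f_{0}\)) and \(f\) is the Coriolis parameter.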
Abstract:
The chemical specificity of terahertz spectroscopy, when combined with techniques for sub-wavelength sensing, is giving new understanding of processes occurring at the nanometre scale in biological systems and offers the potential for single molecule detection of chemical and biological agents and explosives. In addition, terahertz techniques are enabling the exploration of the fundamental behaviour of light when it interacts with nanoscale optical structures, and are being used to measure ultrafast carrier dynamics, transport and localisation in nanostructures. This chapter will explain how terahertz scale modelling can be used to explore the fundamental physics of nano-optics, it will discuss the terahertz spectroscopy of nanomaterials, terahertz near-field microscopy and other sub-wavelength techniques, and summarise recent developments in the terahertz spectroscopy and imaging of biological systems at the nanoscale. The potential of using these techniques for security applications will be considered.
Abstract:
Studying the pathogenesis of an infectious disease like colibacillosis requires an understanding of the responses of target hosts to the organism both as a pathogen and as a commensal. The mucosal immune system constitutes the primary line of defence against luminal micro-organisms. The immunoglobulin-superfamily-based adaptive immune system evolved in the earliest jawed vertebrates, and the adaptive and innate immune system of humans, mice, pigs and ruminants co-evolved in common ancestors for approximately 300 million years. The divergence occurred only 100 mya and, as a consequence, most of the fundamental immunological mechanisms are very similar. However, since pressure on the immune system comes from rapidly evolving pathogens, immune systems must also evolve rapidly to maintain the ability of the host to survive and reproduce. As a consequence, there are a number of areas of detail where mammalian immune systems have diverged markedly from each other, such that results obtained in one species are not always immediately transferable to another. Thus, animal models of specific diseases need to be selected carefully, and the results interpreted with caution. Selection is made simpler where specific host species like cattle and pigs can be both target species and reservoirs for human disease, as in infections with Escherichia coli.