Abstract:
Investments in direct real estate are inherently difficult to segment compared to other asset classes due to the complex and heterogeneous nature of the asset. The most common segmentation in real estate investment analysis relies on property sector and geographical region. In this paper, we compare the predictive power of existing industry classifications with a new type of segmentation using cluster analysis on a number of relevant property attributes, including the equivalent yield and size of the property as well as information on lease terms, number of tenants and tenant concentration. The new segments are shown to be distinct and relatively stable over time. In a second stage of the analysis, we test whether the newly generated segments are able to better predict the resulting financial performance of the assets than the old dichotomous segments. Applying both discriminant and neural network analysis, we find mixed evidence for this hypothesis. Overall, we conclude from our analysis that each of the two approaches to segmenting the market has its strengths and weaknesses, so that both might be applied gainfully in real estate investment analysis and fund management.
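The sketch below is not the authors' code; it is a minimal illustration of the general approach the abstract describes: clustering properties on attributes such as equivalent yield, size, lease term, tenant count and tenant concentration to form data-driven market segments. The attribute distributions, cluster count and library choice are assumptions for illustration only.

```python
# Illustrative sketch (not the authors' method): data-driven property
# segmentation via k-means on standardized property attributes.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n = 500
properties = np.column_stack([
    rng.normal(6.5, 1.5, n),    # equivalent yield (%), synthetic
    rng.lognormal(8, 1, n),     # property size / capital value proxy
    rng.uniform(1, 25, n),      # unexpired lease term (years)
    rng.integers(1, 40, n),     # number of tenants
    rng.uniform(0, 1, n),       # tenant concentration (e.g. an HHI-style score)
])

X = StandardScaler().fit_transform(properties)   # put attributes on a common scale
segments = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(np.bincount(segments))                     # size of each data-driven segment
```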
Abstract:
There have been various techniques published for optimizing the net present value of tenders by use of discounted cash flow theory and linear programming. These approaches to tendering appear to have been largely ignored by the industry. This paper utilises six case studies of tendering practice in order to establish the reasons for this apparent disregard. Tendering is demonstrated to be a market orientated function with many subjective judgements being made regarding a firm's environment. Detailed consideration of 'internal' factors such as cash flow is therefore judged to be unjustified. Systems theory is then drawn upon and applied to the separate processes of estimating and tendering. Estimating is seen as taking place in a relatively sheltered environment and as such operates as a relatively closed system. Tendering, however, takes place in a changing and dynamic environment and as such must operate as a relatively open system. The use of sophisticated methods to optimize the value of tenders is then identified as being dependent upon the assumption of rationality, which is justified in the case of a relatively closed system (i.e. estimating), but not for a relatively open system (i.e. tendering).
Abstract:
Research in the late 1980s showed that many corporate real estate users were not fully aware of the full extent of their property holdings. In many cases, not only was the value of the holdings unknown, but there was uncertainty over the actual extent of ownership within the portfolio. This resulted in a large number of corporate occupiers reviewing their property holdings during the 1990s, initially to create a definitive asset register, but also to benefit from a more efficient use of space. Good management of corporately owned property assets is as important as the management of the other principal resources within the company. A comprehensive asset register can be seen as the first step towards a rational property audit. For the effective, efficient and economic delivery of services, it is vital that all property holdings are utilised to the best advantage. This requires that the property provider and the property user are both fully conversant with the value of the property holding and that an asset/internal rent/charge is made accordingly. The advantages of internal rent charging are twofold. Firstly, it requires the occupying department to “contribute” an amount to the business equivalent to the open market rental value of the space that it occupies. This prevents space being treated as a free good and, as individual profit centres, each department will then rationalise its holdings to minimise its costs. The second advantage is strategic. By charging an asset rent, the holding department can identify the performance of its real estate holdings. This can then be compared with an internal or external benchmark to help determine whether the company has adopted the most efficient tenure pattern for its properties. This paper investigates the use of internal rents by UK-based corporate businesses and explains internal rents as a form of transfer pricing in the context of management and responsibility accounting. The research finds that the majority of charging organisations introduced internal rents primarily to help calculate true profits at the business unit level. However, less than 10% of the charging organisations introduced internal rents primarily to capture the return on assets within the business. There was also a sizeable element of the market that had no plans to introduce internal rents. Here it appears that, despite academic and professional views that internal rents are beneficial in improving the efficient use of property, opinion at the business and operational level has not universally accepted this proposition.
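As a toy illustration of the internal-rent mechanism the abstract describes (all figures hypothetical): the occupying department is charged the open-market rental value of its space, and the holding department compares the implied return on the asset with a benchmark.

```python
# Worked toy example (hypothetical figures, not drawn from the paper) of an
# internal rent charge and the resulting return-on-asset comparison.
floor_area_m2 = 1_000          # space occupied by the business unit
market_rent_per_m2 = 250.0     # open-market rental value, GBP per m2 per annum
asset_value = 4_000_000.0      # capital value of the property, GBP

internal_rent = floor_area_m2 * market_rent_per_m2      # annual charge to the unit
implied_yield = internal_rent / asset_value              # return earned by the holding dept.
benchmark_yield = 0.07                                    # e.g. an external property benchmark

print(f"internal rent charge: GBP {internal_rent:,.0f} p.a.")
print(f"implied return {implied_yield:.1%} vs benchmark {benchmark_yield:.1%}")
if implied_yield < benchmark_yield:
    print("holding underperforms the benchmark; review the tenure pattern")
```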
Abstract:
Unhealthy diets can lead to various diseases, which in turn can translate into a bigger burden for the state in the form of health services and lost production. Obesity alone has enormous costs and claims thousands of lives every year. Although diet quality in the European Union has improved across countries, it still falls well short of conformity with the World Health Organization dietary guidelines. In this review, we classify types of policy interventions addressing healthy eating and identify through a literature review what specific policy interventions are better suited to improve diets. Policy interventions are classified into two broad categories: information measures and measures targeting the market environment. Using this classification, we summarize a number of previous systematic reviews, academic papers, and institutional reports and draw some conclusions about their effectiveness. Of the information measures, policy interventions aimed at reducing or banning unhealthy food advertisements generally have had a weak positive effect on improving diets, while public information campaigns have been successful in raising awareness of unhealthy eating but have failed to translate the message into action. Nutritional labeling allows for informed choice. However, informed choice is not necessarily healthier; knowing or being able to read and interpret nutritional labeling on food purchased does not necessarily result in consumption of healthier foods. Interventions targeting the market environment, such as fiscal measures and nutrient, food, and diet standards, are rarer and generally more effective, though more intrusive. Overall, we conclude that measures to support informed choice have a mixed and limited record of success. On the other hand, measures to target the market environment are more intrusive but may be more effective.
Abstract:
The aim of this article is to identify the key factors that are associated with the adoption of a commercial robot in the home. This article is based on the development of the robot product Cybot by the University of Reading in conjunction with a publisher (Eaglemoss International Ltd.). The robots were distributed through a new part-work magazine series (Ultimate Real Robots) that had long-term customer usage and retention. A part-work is a serial publication that is issued periodically (e.g., every two weeks), usually in magazine format, and builds into a complete collection. This magazine focused on robotics and was accompanied by cover-mounted component parts that could be assembled, with instructions, by the user to build a working robot over the series. In total, the product contributed over half a million operational domestic robots to the world market, selling over 20 million robot part-work magazines across 18 countries, thereby providing a unique breadth of insight. Gaining a better understanding of the overall attitudes that customers of this product had toward robots in the home, their perception of what such devices could deliver and how they would wish to interact with them should provide results applicable to the domestic appliance, assistance/care, entertainment, and educational markets.
Abstract:
The British system of development control is time-consuming and uncertain in outcome. Moreover, it is becoming increasingly overloaded as it has gradually switched away from being centred on a traditional ‘is it an appropriate land-use?’ type of approach to one based on multi-faceted inspections of projects and negotiations over the distribution of the potential financial gains arising from them. Recent policy developments have centred on improving the operation of development control. This paper argues that more fundamental issues may be at stake as well. Important market changes have increased workloads. Furthermore, the UK planning system's institutional framework encourages change to move in specific directions, which is not always helpful. If expectations of increased long-term housing supply are to be met, more substantial changes to development control may be essential, but hard to achieve.
Abstract:
There is growing international interest in the impact of regulatory controls on the supply of housing. The UK has a particularly restrictive planning regime and a detailed and uncertain process of development control linked to it. This paper presents the findings of empirical research on the time taken to gain planning permission for selected recent major housing projects from a sample of local authorities in southern England. The scale of delay found was far greater than is indicated by the official averages measuring the extent to which local authorities meet planning delay targets. Hedonic analysis indicated considerable variation in the time it takes local authorities to process planning applications, with the worst being four times slower than the best. Developments by smaller builders and housing associations are processed more quickly than those of large developers, and small sites appear to be particularly time-intensive. These results suggest that delays in development control may be a significant contributory factor in the low responsiveness of UK housing supply to upturns in market activity.
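The following sketch is not the authors' specification; it only illustrates, on synthetic data, the general shape of a hedonic-style regression of planning-decision time on application characteristics such as site size and developer type.

```python
# Illustrative sketch (synthetic data, assumed specification): regressing log
# decision time on application characteristics, in the spirit of the hedonic
# analysis mentioned in the abstract.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 300
site_size = rng.lognormal(3, 1, n)            # dwellings per site (synthetic)
small_builder = rng.integers(0, 2, n)         # 1 = small builder / housing association
log_weeks = (2.5 + 0.004 * site_size - 0.3 * small_builder
             + rng.normal(0, 0.4, n))         # synthetic log decision time

X = sm.add_constant(np.column_stack([site_size, small_builder]))
fit = sm.OLS(log_weeks, X).fit()
print(fit.params)    # constant, effect of site size, effect of small builder
```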
Abstract:
It has been asserted that business reorganisation and new working practices are transforming the nature of demand for business space. Downsizing, delayering, business process reengineering and associated initiatives alter the amount, type and location of space required by firms. The literature has neglected the impact of real estate market structures on the ability of organisations to successfully implement these new organisational forms or contemporary working practices. Drawing from UK research, the paper demonstrates that, while new working practices are widespread, their impact on the corporate real estate portfolio is less dramatic than often supposed. In part, this is attributed to inflexibility in market structures which constrains the supply of appropriate space.
Abstract:
This paper examines some broad issues concerning the role that conservation policy plays in statutory planning in Britain. It argues that planning contains a number of different, often conflicting, objectives. Conservation, in contributing to one of these objectives, exacerbates this conflict. The paper further argues that since different objectives are accorded different priorities depending upon the prevailing political ideology, conservation policy is not only operating within the context of possibly opposing planning objectives, but also within a particular political environment which will separately determine the degree of importance attached to it. The British example is used to explore these themes, particularly in examining the ideological basis for the redefinition of preservation and protection away from their welfarist traditions towards issues of private rights and market supremacy. The paper concludes that rather than contributing to social welfare, planning and conservation policy is now contributing to the increasing division between rich and poor in society.
Abstract:
1. Species-based indices are frequently employed as surrogates for wider biodiversity health and as measures of environmental condition. Species selection is crucial in determining an indicator's metric value, and hence the validity of the interpretation of ecosystem condition and function it provides, yet an objective process to identify appropriate indicator species is frequently lacking. 2. An effective indicator needs to (i) be representative, reflecting the status of wider biodiversity; (ii) be reactive, acting as an early-warning system for detrimental changes in environmental conditions; and (iii) respond to change in a predictable way. We present an objective, niche-based approach for species selection, founded on a coarse categorisation of species' niche space and key resource requirements, which ensures the resultant indicator has these key attributes. 3. We use UK farmland birds as a case study to demonstrate this approach, identifying an optimal indicator set containing 12 species. In contrast to the 19 species included in the farmland bird index (FBI), a key UK biodiversity indicator that contributes to one of the UK Government's headline indicators of sustainability, the niche space occupied by these species fully encompasses that occupied by the wider community of 62 species. 4. We demonstrate that the response of these 12 species to land-use change correlates strongly with that of the wider farmland bird community. Furthermore, the temporal dynamics of the index based on their population trends closely match the population dynamics of the wider community. However, in both analyses the magnitude of the change in our indicator was significantly greater, allowing it to act as an early-warning system. 5. Ecological indicators are embedded in environmental management, sustainable development and biodiversity conservation policy and practice, where they act as metrics against which progress towards national, regional and global targets can be measured. Adopting this niche-based approach to the objective selection of indicator species will facilitate the development of sensitive and representative indices for a range of taxonomic groups, habitats and spatial scales.
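A minimal sketch of the kind of comparison the abstract reports, assuming the common convention that a composite bird index is a geometric mean of species-level population indices; the data are synthetic and the species selection here is random rather than niche-based, so this is only a template for checking how well an indicator subset tracks the wider community.

```python
# Minimal sketch (synthetic data, assumed geometric-mean index form): comparing
# a 12-species indicator index against a 62-species community index.
import numpy as np

rng = np.random.default_rng(1)
n_species, n_years = 62, 30
# synthetic species-level population indices built from random multiplicative trends
trends = rng.normal(-0.01, 0.02, n_species)
indices = np.exp(np.cumsum(trends[:, None] + rng.normal(0, 0.05, (n_species, n_years)),
                           axis=1))

def composite(idx):
    """Geometric mean across species for each year."""
    return np.exp(np.log(idx).mean(axis=0))

indicator_set = rng.choice(n_species, size=12, replace=False)   # hypothetical indicator species
wider = composite(indices)
indicator = composite(indices[indicator_set])
print(np.corrcoef(indicator, wider)[0, 1])   # how closely the subset tracks the community
```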
Abstract:
This paper explores the sensitivity of Atmospheric General Circulation Model (AGCM) simulations to changes in the meridional distribution of sea surface temperature (SST). The simulations are for an aqua-planet, a water-covered Earth with no land, orography or sea-ice and with specified zonally symmetric SST. Simulations from 14 AGCMs developed for Numerical Weather Prediction and climate applications are compared. Four experiments are performed to study the sensitivity to the meridional SST profile. These profiles range from one in which the SST gradient continues to the equator to one which is flat approaching the equator, all with the same maximum SST at the equator. The zonal mean circulation of all models shows strong sensitivity to the latitudinal distribution of SST. The Hadley circulation weakens and shifts poleward as the SST profile flattens in the tropics. One question of interest is the formation of a double versus a single ITCZ. There is large variation between models in the strength of the ITCZ and in where in the SST experiment sequence they transition from a single to a double ITCZ. The SST profiles are defined such that, as the equatorial SST gradient flattens, the maximum gradient increases and moves poleward. This leads to a weakening of the mid-latitude jet accompanied by a poleward shift of the jet core. Also considered are tropical wave activity and tropical precipitation frequency distributions. The details of each vary greatly between models, both for a given SST and in the response to the change in SST. One additional experiment examines the sensitivity to an off-equatorial SST maximum. The upward branch of the Hadley circulation follows the SST maximum off the equator. The models that form a single precipitation maximum when the maximum SST is on the equator shift the precipitation maximum off the equator and keep it centred over the SST maximum. Those that form a double structure, with a minimum over the equatorial SST maximum, shift the double structure off the equator, keeping the minimum over the SST maximum. In both situations only modest changes appear in the shifted profile of zonal average precipitation. When the upward branch of the Hadley circulation moves into the hemisphere containing the SST maximum, the zonal average zonal, meridional and vertical winds all indicate that the Hadley cell in the other hemisphere dominates.
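The sketch below uses an assumed family of idealized zonally symmetric SST profiles, not necessarily the paper's exact specifications, to illustrate the property the abstract describes: flattening the profile near the equator while holding the equatorial maximum fixed moves the strongest meridional gradient poleward.

```python
# Illustrative sketch (assumed functional forms): zonally symmetric aqua-planet
# SST profiles with a fixed 27 C equatorial maximum and progressively flatter
# equatorial gradients; prints the latitude of the strongest NH gradient.
import numpy as np

lat = np.deg2rad(np.linspace(-90, 90, 181))

def sst_profile(lat, p):
    """27 C at the equator, 0 C poleward of 60 degrees; larger p flattens the
    profile near the equator and pushes the strongest gradient poleward."""
    return np.where(np.abs(lat) < np.pi / 3,
                    27.0 * (1.0 - np.sin(1.5 * np.abs(lat)) ** p),
                    0.0)

nh = lat >= 0
for p in (1, 2, 4):   # gradient reaching the equator -> progressively flatter near it
    grad = np.gradient(sst_profile(lat, p), lat)
    i = np.argmin(grad[nh])                     # strongest poleward SST decline in the NH
    print(p, round(np.rad2deg(lat[nh][i]), 1))  # latitude of the strongest gradient
```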
Abstract:
It is shown here that the angular relation equations between direct and reciprocal vectors are very similar to the angular relation equations in Euler's theorem. These two sets of equations are usually treated separately, as unrelated equations in different fields. In this careful study, the connection between the two sets of angular equations is revealed by considering the cosine rule for the spherical triangle. It is found that understanding of the correlation is hindered by the fact that the same variables are defined differently and represented by different symbols in the two fields. Understanding the connection between different concepts is not only stimulating and beneficial, but also a fundamental tool in innovation and research, and has historical significance. The background of the work presented here contains elements of many scientific disciplines. This work illustrates the common ground of two theories usually considered separately and is therefore of benefit not only for its own sake but also as an illustration of a general principle: a theory relevant to one discipline can often be used in another. The paper works with chemistry-related concepts using mathematical methodologies unfamiliar to the usual audience of mainstream experimental and theoretical chemists.
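For orientation, the two standard relations involved have the same algebraic structure; the equations below are textbook results (the spherical cosine rule and the angular relation between a direct unit cell with angles α, β, γ and its reciprocal cell angle α*), quoted here for context rather than reproduced from the paper.

```latex
% Standard results quoted for orientation (not taken from the paper).
\begin{align}
  \cos a &= \cos b \cos c + \sin b \sin c \cos A
          && \text{(cosine rule for a spherical triangle)} \\
  \cos \alpha^{*} &= \frac{\cos \beta \cos \gamma - \cos \alpha}
                         {\sin \beta \sin \gamma}
          && \text{(direct--reciprocal angular relation)}
\end{align}
```

Identifying the cell angles with the sides of a spherical triangle (a = α, b = β, c = γ) makes the two expressions differ only by a sign, i.e. the reciprocal angle is the supplement of the corresponding spherical-triangle angle.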
Abstract:
The crystal structure of the ruthenium DNA ‘light-switch’ complex [Ru(TAP)2(11-Cl-dppz)]2+ (TAP = tetraazaphenanthrene, dppz = dipyrido[3,2-a:2',3'-c]phenazine) bound to the oligonucleotide duplex d(TCGGCGCCGA)2 is reported. The synthesis of the racemic ruthenium complex is described for the first time, and the racemate was used in this study. The crystal structure, at atomic resolution (1.0 Å), shows one ligand as a wedge in the minor groove, resulting in 51° kinking of the double helix, as with the parent lambda-[Ru(TAP)2(dppz)]2+. Each complex binds to one duplex by intercalation of the dppz ligand and also by semi-intercalation of one of the orthogonal TAP ligands into a second, symmetrically equivalent duplex. The major component (66%) is oriented with the 11-chloro substituent on the purine side of the terminal step of the duplex.
Abstract:
The agility of an inter-organizational process represents the ability of a virtual enterprise to respond rapidly to a changing market environment. Many theories and methodologies concerning inter-organizational processes have been developed, but dynamic agility has seldom been addressed. A virtual enterprise whose process has high dynamic agility is able to adjust to a changing environment in a short time and at low cost. This paper analyzes the agility of inter-organizational processes from a dynamic perspective. Two indexes are proposed to evaluate dynamic agility: time and cost. Furthermore, a method to measure dynamic agility using simulation is studied. Finally, a case study is given to illustrate the measurement method.
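The toy simulation below is not the paper's model; it only sketches how the two proposed agility indexes, adjustment time and adjustment cost, could be estimated by simulating a process reconfiguration. The task names, durations and costs are hypothetical.

```python
# Toy sketch (hypothetical process model): Monte Carlo estimation of the two
# dynamic-agility indexes named in the abstract -- time and cost -- for an
# inter-organizational process reacting to a market change.
import random

def simulate_adjustment():
    """One simulated reconfiguration of the process after a market change."""
    tasks = [("renegotiate contracts", 5, 2000),   # (task, base days, base cost)
             ("reconfigure workflow",  3, 1200),
             ("requalify partner",     7, 3500)]
    time = cost = 0.0
    for _, base_days, base_cost in tasks:
        factor = random.uniform(0.7, 1.5)           # uncertainty in each task
        time += base_days * factor
        cost += base_cost * factor
    return time, cost

random.seed(0)
runs = [simulate_adjustment() for _ in range(10_000)]
avg_time = sum(t for t, _ in runs) / len(runs)
avg_cost = sum(c for _, c in runs) / len(runs)
print(f"expected adjustment time: {avg_time:.1f} days, expected cost: {avg_cost:.0f}")
```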
Abstract:
Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles for initialized and uninitialized predictions. At lead times of less than 2 years, the initialized ensembles tend to be under-dispersed and hence produce overconfident, unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
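The following is a minimal sketch of one common definition of the spread-error ratio, not necessarily the exact DePreSys diagnostic: the ratio of the typical ensemble spread to the root-mean-square error of the ensemble mean, with a value near one as a necessary condition for reliability (below one indicating under-dispersion, above one over-dispersion). The data here are synthetic.

```python
# Minimal sketch (one common definition, synthetic data): spread-error ratio
# for a set of retrospective ensemble forecasts at a single lead time.
import numpy as np

def spread_error_ratio(forecasts, obs):
    """forecasts: (n_starts, n_members) ensemble values; obs: (n_starts,) observations."""
    ens_mean = forecasts.mean(axis=1)
    spread = forecasts.std(axis=1, ddof=1)            # per-start ensemble spread
    rmse = np.sqrt(np.mean((ens_mean - obs) ** 2))    # error of the ensemble mean
    return np.sqrt(np.mean(spread ** 2)) / rmse       # ~1 suggests adequate dispersion

rng = np.random.default_rng(2)
truth = rng.normal(0, 1, 40)
members = truth[:, None] + rng.normal(0, 0.6, (40, 9))   # synthetic 9-member hindcasts
obs = truth + rng.normal(0, 0.6, 40)                      # synthetic verifying observations
print(spread_error_ratio(members, obs))
```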