892 results for Development Models, territory


Relevance: 30.00%

Abstract:

Purpose – Expectations of future market conditions are acknowledged to be crucial for the development decision and hence for shaping the built environment. The purpose of this paper is to study the central London office market from 1987 to 2009 and test for evidence of rational, adaptive and naive expectations. Design/methodology/approach – Two parallel approaches are applied to test for either rational or adaptive/naive expectations: a vector auto-regressive (VAR) approach with Granger causality tests, and recursive OLS regression with one-step forecasts. Findings – The authors find no evidence of adaptive or naive expectations among developers. Although the magnitude of the errors and the length of time lags between market signal and construction starts vary over time and development cycles, the results confirm that developer decisions are explained, to a large extent, by contemporaneous and historic conditions in both the City and the West End; this is more likely to stem from the lengthy design, financing and planning permission processes than from adaptive or naive expectations. Research limitations/implications – More generally, the results of this study suggest that real estate cycles are largely generated endogenously rather than being the result of large demand shocks and/or irrational behaviour. Practical implications – Developers may be able to generate excess profits by exploiting market inefficiencies, but this may be hindered in practice by the long periods necessary for planning and construction of the asset. Originality/value – This paper focuses the scholarly debate about real estate cycles on the role of expectations. It is also one of very few spatially disaggregated studies of the subject matter.
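The second of the two approaches can be sketched compactly. The code below runs a recursive OLS with an expanding window and one-step-ahead forecasts on synthetic data; the single-regressor form, the window size and the series themselves are illustrative assumptions, not the paper's specification.

```python
# Hedged sketch: recursive OLS with one-step-ahead forecasts, re-fitting the
# model on an expanding window as a real-time forecaster would. Synthetic data.
import random

def ols(xs, ys):
    """Single-regressor OLS via the normal equations: (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return my - b * mx, b

def recursive_one_step_forecasts(signal, starts, min_window=10):
    """Fit on observations [0, t) and forecast observation t, for each t."""
    errors = []
    for t in range(min_window, len(starts)):
        a, b = ols(signal[:t], starts[:t])
        errors.append(starts[t] - (a + b * signal[t]))
    return errors

random.seed(42)
# Hypothetical series: a rent "signal" and construction starts that track it.
rent = [random.gauss(100, 10) for _ in range(40)]
starts = [0.5 * r + random.gauss(0, 2) for r in rent]
errs = recursive_one_step_forecasts(rent, starts)
print(len(errs))  # 30 forecast errors
```

Systematic patterns in these forecast errors (rather than noise around zero) are the kind of evidence that would point towards adaptive or naive expectations.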

Relevance: 30.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.

Relevance: 30.00%

Abstract:

Using a literature review, we argue that new models of peatland development are needed. Many existing models do not account for potentially important ecohydrological feedbacks, and/or ignore spatial structure and heterogeneity. Existing models, including those that simulate a near total loss of the northern peatland carbon store under a warming climate, may produce misleading results because they rely upon oversimplified representations of ecological and hydrological processes. In this, the first of a pair of papers, we present the conceptual framework for a model of peatland development, DigiBog, which considers peatlands as complex adaptive systems. DigiBog accounts for the interactions between the processes which govern litter production and peat decay, peat soil hydraulic properties, and peatland water-table behaviour, in a novel and genuinely ecohydrological manner. DigiBog consists of a number of interacting submodels, each representing a different aspect of peatland ecohydrology. Here we present in detail the mathematical and computational basis, as well as the implementation and testing, of the hydrological submodel. Remaining submodels are described and analysed in the accompanying paper. Tests of the hydrological submodel against analytical solutions for simple aquifers were highly successful: the greatest deviation between DigiBog and the analytical solutions was 2·83%. We also applied the hydrological submodel to irregularly shaped aquifers with heterogeneous hydraulic properties—situations for which no analytical solutions exist—and found the model's outputs to be plausible.
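The style of verification described, comparing a numerical hydrological solve against an analytical solution for a simple aquifer, can be illustrated with a toy example. The sketch below solves the steady-state Boussinesq equation with uniform recharge by finite differences and checks the result against the Dupuit analytical solution; it is not the DigiBog code, and every parameter value is invented.

```python
# Hedged sketch: steady-state water table in a simple unconfined aquifer.
# Numerical solve (Gauss-Seidel on u = h^2, since u'' = -2R/K) is compared
# against the Dupuit analytical solution, mimicking the kind of benchmark test
# described for the hydrological submodel. All parameters are illustrative.
import math

def dupuit_analytical(x, L, h0, hL, R, K):
    """Dupuit water table between fixed heads h0, hL with uniform recharge R."""
    u = h0**2 + (hL**2 - h0**2) * x / L + (R / K) * x * (L - x)
    return math.sqrt(u)

def solve_numerical(n, L, h0, hL, R, K, iters=20000):
    """Finite-difference relaxation on u = h^2 with Dirichlet boundaries."""
    dx = L / n
    u = [h0**2 + (hL**2 - h0**2) * i / n for i in range(n + 1)]
    src = 2 * R / K * dx * dx          # recharge source term
    for _ in range(iters):
        for i in range(1, n):
            u[i] = 0.5 * (u[i - 1] + u[i + 1] + src)
    return [math.sqrt(v) for v in u]

L_, h0, hL, R, K = 100.0, 2.0, 1.0, 1e-4, 1.0   # metres, m/day, hypothetical
n = 20
num = solve_numerical(n, L_, h0, hL, R, K)
dev = max(abs(num[i] - dupuit_analytical(i * L_ / n, L_, h0, hL, R, K))
          / dupuit_analytical(i * L_ / n, L_, h0, hL, R, K)
          for i in range(n + 1))
print(f"max relative deviation: {100 * dev:.6f}%")
```

For this linear test problem the discretisation is exact at the nodes, so the deviation reflects only solver convergence; real peatland geometries and heterogeneous properties, as in the paper, have no such analytical yardstick.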

Relevance: 30.00%

Abstract:

Novel imaging techniques are playing an increasingly important role in drug development, providing insight into the mechanism of action of new chemical entities. The data sets obtained by these methods can be large, with complex inter-relationships, but the most appropriate statistical analysis for handling these data is often uncertain – precisely because of the exploratory nature of the way the data are collected. We present an example from a clinical trial using magnetic resonance imaging to assess changes in atherosclerotic plaques following treatment with a tool compound with established clinical benefit. We compared two specific approaches to handling the correlations due to physical location and repeated measurements: two-level and four-level multilevel models. The two methods identified similar structural variables, but the higher-level multilevel models had the advantage of explaining a greater proportion of the variation, and their modeling assumptions appeared to be better satisfied.
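The core idea of the multilevel comparison, partitioning variation into levels, can be illustrated with a toy two-level example. The sketch below generates synthetic plaque-like measurements nested within patients and decomposes the variance by a method-of-moments calculation; it is not the trial's four-level model, and all parameters are invented.

```python
# Hedged sketch: two-level variance partitioning (measurements within
# patients) via an ANOVA-style method-of-moments decomposition. Synthetic data.
import random
from statistics import mean

random.seed(7)
n_patients, n_meas = 50, 8
sigma_between, sigma_within = 2.0, 1.0   # assumed, for illustration

data = []
for _ in range(n_patients):
    patient_effect = random.gauss(0, sigma_between)
    data.append([patient_effect + random.gauss(0, sigma_within)
                 for _ in range(n_meas)])

grand = mean(v for row in data for v in row)
# Within-patient variance: average of the per-patient sample variances.
within = mean(sum((v - mean(row)) ** 2 for v in row) / (n_meas - 1)
              for row in data)
# Between-patient variance: variance of patient means, corrected for the
# within-patient noise carried by each mean.
var_means = sum((mean(row) - grand) ** 2 for row in data) / (n_patients - 1)
between = var_means - within / n_meas

icc = between / (between + within)   # share of variation between patients
print(f"within≈{within:.2f}, between≈{between:.2f}, ICC≈{icc:.2f}")
```

A fitted multilevel model estimates the same components by maximum likelihood and extends naturally to further levels (e.g. slices within plaques within arteries within patients), which is where the four-level model's extra explanatory power comes from.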

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to investigate the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Design/methodology/approach – Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. Findings – It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models. Evidence is found of equifinality in the outputs of a simple, aggregated model of development viability relative to more complex, disaggregated models. Originality/value – Development viability appraisal has become increasingly important in the planning system. Consequently, the theory, application and outputs from development appraisal are under intense scrutiny from a wide range of users. However, there has been very little published evaluation of viability models. This paper contributes to the limited literature in this area.
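The Monte Carlo procedure can be sketched as a residual viability appraisal in which uncertain inputs are drawn from assumed distributions and the resulting distribution of the residual is examined. All figures and distributional assumptions below are invented for illustration and are not the scheme modelled in the paper.

```python
# Hedged sketch: Monte Carlo residual appraisal of a hypothetical scheme.
# Residual = gross development value - build costs - developer's profit,
# with rent, yield and cost drawn from assumed normal distributions.
import random
from statistics import mean, stdev

random.seed(1)

def residual_value():
    area = 5000                                  # m2, fixed for illustration
    rent = random.gauss(300, 30)                 # £/m2, assumed distribution
    cap_yield = random.gauss(0.06, 0.005)        # capitalisation yield
    build_cost = random.gauss(1800, 150) * area  # £, assumed distribution
    gdv = rent * area / cap_yield                # gross development value
    profit = 0.20 * gdv                          # assumed profit margin
    return gdv - build_cost - profit

draws = [residual_value() for _ in range(10_000)]
print(f"mean residual £{mean(draws):,.0f}, sd £{stdev(draws):,.0f}")
```

Comparing the output variance of an aggregated model like this against a disaggregated version (e.g. unit-by-unit revenues and phased costs) is the paper's core experiment.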

Relevance: 30.00%

Abstract:

Countries throughout the sub-Saharan African (SSA) region have a complex linguistic heritage that has its origins in the opportunistic boundary changes effected by Western colonial powers at the Berlin Conference of 1884-85. Postcolonial language-in-education policies valorizing ex-colonial languages have contributed, at least in part, to underachievement in education and thus to the underdevelopment of human resources in SSA countries. This situation is not likely to improve whilst unresolved questions concerning the choice of language(s) that would best support social and economic development remain. Whilst policy attempts to develop local languages have been discussed within the framework of the African Union, and some countries have experimented with models of multilingual education during the past decade, the goalposts have already changed as a result of migration and trade. This paper argues that language policy makers need to be cognizant of changing language ecologies and their relationship with emerging linguistic and economic markets. The concept of language, within such a framework, has to be viewed in relation to the multiplicity of language markets within the shifting landscapes of people, culture, economics and the geo-politics of the 21st century. Whilst, on the one hand, this refers to the hegemony of dominant powerful languages and the social relations of disempowerment, on the other hand, it also refers to existing and evolving social spaces and local language capabilities and choices. Within this framework the article argues that socially constructed dominant macro language markets need to be viewed also in relation to other, self-defined, community meso- and individual micro-language markets and their possibilities for social, economic and political development.
It is through pursuing this argument that this article assesses the validity of Omoniyi’s argument, in this volume, for the need to focus on the concept of language capital within multilingual contexts in the SSA region, as compared with Bourdieu’s concept of linguistic capital.

Relevance: 30.00%

Abstract:

Molecular orientation parameters have been measured for the non-crystalline component of crosslinked natural rubber samples deformed in uniaxial tension as a function of the extension ratio and of temperature. The orientation parameters 〈P2(cosα)〉 and 〈P4(cosα)〉 were obtained by an analysis of the anisotropy of the wide-angle X-ray scattering functions. For the measurements made at high temperatures the level of crystallinity detected was negligible and the orientation-strain behaviour could be compared directly with the predictions of molecular models of rubber elasticity. The molecular orientation behaviour with strain was found to be at variance with the estimates of the affine model, particularly at low and moderate strains. Extension of the crosslinked rubber at room temperature led to strain-crystallization, and measurements of both the molecular orientation of the non-crystalline chains and the degree of crystallinity during extension and relaxation enabled the role of the crystallites in the deformation process to be considered in detail. The intrinsic birefringence of the non-crystalline component was estimated, through the use of the 〈P2(cosα)〉 values obtained from X-ray scattering measurements, to be 0.20±0.02.
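The orientation averages quoted here are Legendre-polynomial moments of the chain-axis angle distribution. A minimal sketch of how 〈P2(cosα)〉 and 〈P4(cosα)〉 are computed from a sample of angles follows; the angle samples are synthetic, not the X-ray-derived distributions of the paper.

```python
# Hedged sketch: Legendre orientation moments <P2(cos a)> and <P4(cos a)>
# from a sample of chain-axis angles. Angle samples are synthetic.
import math
import random

def p2(c):
    return (3 * c**2 - 1) / 2

def p4(c):
    return (35 * c**4 - 30 * c**2 + 3) / 8

def order_parameters(alphas):
    """Sample averages of P2 and P4 over cos(alpha) for angles in radians."""
    cs = [math.cos(a) for a in alphas]
    n = len(cs)
    return sum(map(p2, cs)) / n, sum(map(p4, cs)) / n

# Perfect alignment (alpha = 0 everywhere) gives <P2> = <P4> = 1.
print(order_parameters([0.0] * 5))

# Isotropic orientations: cos(alpha) uniform on [-1, 1], so both moments
# tend to zero as the sample grows.
random.seed(3)
iso = [math.acos(random.uniform(-1, 1)) for _ in range(100_000)]
P2m, P4m = order_parameters(iso)
print(round(P2m, 2), round(P4m, 2))
```

Between these two extremes, the measured 〈P2〉 and 〈P4〉 quantify partial alignment, which is what the strain-dependent X-ray analysis tracks.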

Relevance: 30.00%

Abstract:

This paper investigates the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models.

Relevance: 30.00%

Abstract:

The presence of resident Langerhans cells (LCs) in the epidermis makes the skin an attractive target for DNA vaccination. However, reliable animal models for cutaneous vaccination studies are limited. We demonstrate an ex vivo human skin model for cutaneous DNA vaccination which can potentially bridge the gap between pre-clinical in vivo animal models and clinical studies. Cutaneous transgene expression was utilised to demonstrate epidermal tissue viability in culture. LC response to the culture environment was monitored by immunohistochemistry. Full-thickness and split-thickness skin remained genetically viable in culture for at least 72 h in both phosphate-buffered saline (PBS) and full organ culture medium (OCM). The epidermis of explants cultured in OCM remained morphologically intact throughout the culture duration. LCs in full-thickness skin exhibited a delayed response (reduction in cell number and increase in cell size) to the culture conditions compared with split-thickness skin, whose response was immediate. In conclusion, excised human skin can be cultured for a minimum of 72 h for analysis of gene expression and immune cell activation. However, the use of split-thickness skin for vaccine formulation studies may not be appropriate because of the nature of the activation. Full-thickness skin explants are a more suitable model to assess cutaneous vaccination ex vivo.

Relevance: 30.00%

Abstract:

Corneal tissue engineering has improved dramatically over recent years. It is now possible to apply these technological advancements to the development of superior in vitro ocular surface models to reduce animal testing. We aim to show the effect different substrates can have on the viability of expanded corneal epithelial cells, and that those which more accurately mimic the stromal surface provide the most protection against toxic assault. Compressed collagen gel as a substrate for the expansion of a human epithelial cell line was compared against two well-known substrates for modeling the ocular surface (polycarbonate membrane and conventional collagen gel). Cells were expanded over 10 days, at which point cell stratification, cell number and expression of junctional proteins were assessed by electron microscopy, immunohistochemistry and RT-PCR. The effect of increasing concentrations of sodium lauryl sulphate on epithelial cell viability was quantified by MTT assay. Results showed improvement in terms of stratification, cell number and tight junction expression in human epithelial cells expanded upon either the polycarbonate membrane or compressed collagen gel when compared to a conventional collagen gel. However, cell viability was significantly higher in cells expanded upon the compressed collagen gel. We conclude that the more naturalistic composition and mechanical properties of compressed collagen gels produce a more robust corneal model.
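The MTT viability read-out reduces to simple normalisation arithmetic: the blank-corrected absorbance of treated wells is divided by the blank-corrected untreated control. The sketch below uses invented absorbance and dose values, not the paper's data.

```python
# Hedged sketch of MTT viability normalisation. All readings are invented.
def viability(a_treated, a_control, a_blank):
    """Percentage viability from background-corrected MTT absorbance."""
    return 100 * (a_treated - a_blank) / (a_control - a_blank)

a_blank, a_control = 0.05, 1.25
doses = [0.001, 0.01, 0.1]     # % w/v sodium lauryl sulphate, hypothetical
reads = [1.15, 0.80, 0.20]     # hypothetical absorbances at those doses
curve = [round(viability(a, a_control, a_blank), 1) for a in reads]
print(curve)  # [91.7, 62.5, 12.5]
```

Comparing such dose-response curves across substrates is how the protective effect of the compressed collagen gel would be quantified.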

Relevance: 30.00%

Abstract:

Models of root system growth emerged in the early 1970s, and were based on mathematical representations of root length distribution in soil. The last decade has seen the development of more complex architectural models and the use of computer-intensive approaches to study developmental and environmental processes in greater detail. There is a pressing need for predictive technologies that can integrate root system knowledge, scaling from molecular to ensembles of plants. This paper makes the case for more widespread use of simpler models of root systems based on continuous descriptions of their structure. A new theoretical framework is presented that describes the dynamics of root density distributions as a function of individual root developmental parameters such as rates of lateral root initiation, elongation, mortality, and gravitropism. The simulations resulting from such equations can be performed most efficiently in discretized domains that deform as a result of growth, and that can be used to model the growth of many interacting root systems. The modelling principles described help to bridge the gap between continuum and architectural approaches, and enhance our understanding of the spatial development of root systems. Our simulations suggest that root systems develop in travelling wave patterns of meristems, revealing order in otherwise spatially complex and heterogeneous systems. Such knowledge should assist physiologists and geneticists to appreciate how meristem dynamics contribute to the pattern of growth and functioning of root systems in the field.
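A minimal sketch of the continuous-density idea, under assumed parameter values: meristem density is transported downward at the elongation rate and amplified by branching minus mortality, and an upwind finite-difference scheme exhibits the advancing front. This illustrates the modelling principle only; it is not the authors' equations or discretisation.

```python
# Hedged sketch: 1-D meristem density n(x, t) obeying
#   dn/dt + e * dn/dx = (b - m) * n
# solved with a first-order upwind scheme (CFL = e*dt/dx = 0.5, stable).
# All parameter values are illustrative.
e, b, m = 1.0, 0.3, 0.1      # elongation (cm/day), branching, mortality (1/day)
dx, dt, nx = 0.5, 0.25, 80   # grid spacing (cm), time step (day), cells
n = [0.0] * nx
n[0] = 1.0                   # meristem supply at the soil surface

def step(n):
    new = n[:]
    for i in range(1, len(n)):
        adv = e * (n[i] - n[i - 1]) / dx        # upwind advection
        new[i] = n[i] + dt * (-adv + (b - m) * n[i])
    new[0] = n[0]            # constant-supply boundary at the surface
    return new

def front(n, thresh=0.01):
    """Deepest cell where meristem density exceeds a threshold."""
    return max((i for i, v in enumerate(n) if v > thresh), default=0)

for _ in range(10 * 4):      # 10 simulated days
    n = step(n)
f10 = front(n)
for _ in range(10 * 4):      # 10 more days
    n = step(n)
f20 = front(n)
print(f10, f20)              # the front advances roughly linearly in time
```

The steadily advancing front is a crude analogue of the travelling-wave patterns of meristems the paper reports.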

Relevance: 30.00%

Abstract:

The estimation of the long-term wind resource at a prospective site based on a relatively short on-site measurement campaign is an indispensable task in the development of a commercial wind farm. The typical industry approach is based on the measure-correlate-predict (MCP) method, where a relational model between the site wind velocity data and the data obtained from a suitable reference site is built from concurrent records. In a subsequent step, a long-term prediction for the prospective site is obtained from a combination of the relational model and the historic reference data. In the present paper, a systematic study is presented where three new MCP models, together with two published reference models (a simple linear regression and the variance ratio method), have been evaluated based on concurrent synthetic wind speed time series for two sites, simulating the prospective and the reference site. The synthetic method has the advantage of generating time series with the desired statistical properties, including Weibull scale and shape factors, required to evaluate the five methods under all plausible conditions. In this work, first a systematic discussion of the statistical fundamentals behind MCP methods is provided and three new models, one based on a nonlinear regression and two (termed kernel methods) derived from the use of conditional probability density functions, are proposed. All models are evaluated by using five metrics under a wide range of values of the correlation coefficient, the Weibull scale, and the Weibull shape factor. Only one of all models, a kernel method based on bivariate Weibull probability functions, is capable of accurately predicting all performance metrics studied.
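The two published reference models are straightforward to sketch. Below, simple linear regression and the variance-ratio method are applied to synthetic concurrent site/reference wind speeds; the data and parameters are invented, and the point illustrated is that regression damps the predicted variance by the correlation coefficient, which the variance-ratio method corrects.

```python
# Hedged sketch of two reference MCP models on synthetic concurrent records.
import random
from statistics import mean, stdev

random.seed(5)
ref = [abs(random.gauss(7.0, 2.0)) for _ in range(2000)]    # reference mast
site = [abs(0.9 * v + random.gauss(0, 1.0)) for v in ref]   # correlated site

mx, my = mean(ref), mean(site)
sx, sy = stdev(ref), stdev(site)
r = (sum((x - mx) * (y - my) for x, y in zip(ref, site))
     / ((len(ref) - 1) * sx * sy))                          # correlation

def predict_lr(x):
    """Simple linear regression: predicted variance shrinks by r."""
    return my + r * (sy / sx) * (x - mx)

def predict_vr(x):
    """Variance-ratio method: preserves the site wind-speed variance."""
    return my + (sy / sx) * (x - mx)

lr = [predict_lr(x) for x in ref]
vr = [predict_vr(x) for x in ref]
print(f"site sd {sy:.2f}  LR sd {stdev(lr):.2f}  VR sd {stdev(vr):.2f}")
```

Because energy yield depends strongly on the spread of wind speeds, not just the mean, this variance damping is exactly the kind of deficiency the paper's five evaluation metrics are designed to expose.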

Relevance: 30.00%

Abstract:

Microporous carbons are important in a wide variety of applications, ranging from pollution control to supercapacitors, yet their structure at the molecular level is poorly understood. Over the years, many structural models have been put forward, but none have been entirely satisfactory in explaining the properties of the carbons. The discovery of fullerenes and fullerene-related structures such as carbon nanotubes gave us a new perspective on the structure of solid carbon, and in 1997 it was suggested that microporous carbon may have a structure related to that of the fullerenes. Recently, evidence in support of such a structure has been obtained using aberration-corrected transmission electron microscopy, electron energy loss spectroscopy and other techniques. This article describes the development of ideas about the structure of microporous carbon, and reviews the experimental evidence for a fullerene-related structure. Theoretical models of the structural evolution of microporous carbon are summarised, and the use of fullerene-like models to predict the adsorptive properties of microporous carbons are reviewed.

Relevance: 30.00%

Abstract:

We review and structure some of the mathematical and statistical models that have been developed over the past half century to grapple with theoretical and experimental questions about the stochastic development of aging over the life course. We suggest that the mathematical models are in large part addressing the problem of partitioning the randomness in aging: How does aging vary between individuals, and within an individual over the life course? How much of the variation is inherently related to some qualities of the individual, and how much is entirely random? How much of the randomness is cumulative, and how much is merely short-term flutter? We propose that recent lines of statistical inquiry in survival analysis could usefully grapple with these questions, all the more so if they were more explicitly linked to the relevant mathematical and biological models of aging. To this end, we describe points of contact among the various lines of mathematical and statistical research. We suggest some directions for future work, including the exploration of information-theoretic measures for evaluating components of stochastic models as the basis for analyzing experiments and anchoring theoretical discussions of aging.
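One way to make the "partitioning the randomness" question concrete is a small simulation, sketched below under assumed parameters: lifespans drawn from a Gompertz hazard with and without a between-individual lognormal frailty term, so the extra variance attributable to individual qualities can be read off. This is an illustration only, not any specific model from the review.

```python
# Hedged sketch: partitioning lifespan variance into purely stochastic
# (within-individual) and frailty-driven (between-individual) components.
# Gompertz parameters and the frailty distribution are illustrative.
import math
import random
from statistics import mean, variance

random.seed(11)
a, b = 1e-4, 0.1   # Gompertz baseline hazard and rate (per year), assumed

def lifespan(frailty):
    """Inverse-transform sample: hazard h(t) = frailty * a * exp(b*t),
    so S(t) = exp(-frailty*(a/b)*(exp(b*t)-1)); solve S(t) = u."""
    u = 1.0 - random.random()          # u in (0, 1], avoids log(0)
    return math.log(1 - b * math.log(u) / (a * frailty)) / b

homog = [lifespan(1.0) for _ in range(20_000)]                     # no frailty
frail = [lifespan(random.lognormvariate(0, 0.5)) for _ in range(20_000)]
print(f"mean {mean(homog):.1f} vs {mean(frail):.1f}, "
      f"var {variance(homog):.1f} vs {variance(frail):.1f}")
```

The variance gap between the two populations is the between-individual share of the randomness; the variance of the homogeneous population is irreducible stochastic flutter even when every individual is identical.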