935 results for "restriction of parameter space"


Relevance: 100.00%

Abstract:

With the increasing pace of change, organisations have sought new real estate solutions which provide greater flexibility. What appears to be required is not flexibility for all uses but appropriate flexibility for the volatile, risky and temporary part of a business. This is the essence of the idea behind the split between the core and periphery portfolio. The serviced office has emerged to fill the need for absolute flexibility. This market is very diverse in terms of product, services and target market. It has grown and gained credibility with occupiers and, more recently, with the property investment market. Occupiers likewise use this space in a variety of ways: some occupy serviced space exclusively, while others use it to complement their more permanent space. It therefore appears that the market is fulfilling the role of providing periphery space for at least some occupiers. In all instances the key to this space is a focus on financial and tenurial flexibility that other types of business space do not provide.

Relevance: 100.00%

Abstract:

This paper discusses the implications of the shifting cultural significance of public open space in urban areas. In particular, it focuses on the increasing dysfunction between people's expectations of that space and its actual provision and management. In doing so, the paper applies Lefebvre's ideas of spatiality to the evident paradigm shift from 'public' to 'private' culture, with its associated commodification of previously public space. While developing the construct of paradigm shift, the paper recognises that the former political notions inherent in the provision of public space remain in evidence. So whereas public parks were formerly seen as spaces of confrontation between the 'rationality' of public order and the 'irrationality' of individual leisure pursuits, they are now increasingly seen, particularly 'out of hours', as the domain of the dispossessed, to be defined and policed as 'dangerous'. Where once people were welcomed into public open spaces as a means of 'educating' them in good, acceptable, leisure practices, therefore, they are now increasingly excluded, but for the same ostensible reasons. Building on survey work undertaken in Reading, Berkshire, the paper illustrates how communities can become separated from 'their' space, leaving them with the overriding impression that they have been 'short-changed' in terms of both the provision and the management of urban open space. Rather than the intimacy of local space for local people, therefore, the paper argues that parks have become externalised places, increasingly responding to commercial definitions of culture and what is 'public'. Central urban open spaces are therefore increasingly becoming sites of stratification, signification of a consumer-constructed citizenship and valorisation of public life as a legitimate element of the market surface of town and city centres.

Relevance: 100.00%

Abstract:

This paper examines the growing trend in the UK towards the effective privatisation of formerly public open space and the relationship of this trend to the recent shifts in public sector management. A case study of Reading, England, illustrates the growing cultural and spatial dysfunction, particularly in terms of the declining knowledge and use of the town's urban gardens by the local population. Where once the gardens were a focus of social activity, therefore, they are now a largely irrelevant site of urban decline. In contrast to central urban space, it is clear that other types of open space in other areas can still assume a significance in people's lives. In many cases the use of these areas illustrates a counter-cultural position in which the consumerism of the city management is actively being resisted. The paper concludes that while there appear to be ways in which local space could be reclaimed for local people, the power to achieve this lies predominantly in the same hands as those responsible for appropriating central space to the imperative of the market in the first instance.

Relevance: 100.00%

Abstract:

We embark upon a systematic investigation of the operator space structure of JC*-triples via a study of the TROs (ternary rings of operators) they generate. Our approach is to introduce and develop a variety of universal objects, including universal TROs, by which means we are able to describe all possible operator space structures of a JC*-triple. Via the concept of reversibility we obtain characterisations of universal TROs over a wide range of examples. We apply our results to obtain explicit descriptions of operator space structures of Cartan factors regardless of dimension.

Relevance: 100.00%

Abstract:

Statistical methods of inference typically require the likelihood function to be computable in a reasonable amount of time. The class of "likelihood-free" methods termed Approximate Bayesian Computation (ABC) is able to eliminate this requirement, replacing the evaluation of the likelihood with simulation from it. Likelihood-free methods have gained in efficiency and popularity in the past few years, following their integration with Markov Chain Monte Carlo (MCMC) and Sequential Monte Carlo (SMC) in order to better explore the parameter space. They have been applied primarily to estimating the parameters of a given model, but can also be used to compare models. Here we present novel likelihood-free approaches to model comparison, based upon the independent estimation of the evidence of each model under study. Key advantages of these approaches over previous techniques are that they allow the exploitation of MCMC or SMC algorithms for exploring the parameter space, and that they do not require a sampler able to mix between models. We validate the proposed methods using a simple exponential family problem before applying them to a realistic problem from human population genetics: the comparison of different demographic models based upon genetic data from the Y chromosome.
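The evidence-by-simulation idea can be illustrated with a minimal rejection-ABC sketch (simpler than the MCMC/SMC samplers discussed above): each model's evidence is approximated by the proportion of prior-predictive simulations whose summary statistic falls within a tolerance of the observed summary. The two count models, priors and tolerance below are illustrative assumptions, not the paper's genetic models.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" counts; in the paper's setting these would be real data.
observed = rng.poisson(3.0, size=100)
s_obs = observed.mean()  # summary statistic

def abc_evidence(simulate, prior_draw, n_sims=5000, eps=0.1):
    """Rejection-ABC estimate of a model's evidence: the proportion of
    prior-predictive simulations whose summary lands within eps of the
    observed summary."""
    accepted = 0
    for _ in range(n_sims):
        theta = prior_draw()
        if abs(simulate(theta).mean() - s_obs) < eps:
            accepted += 1
    return accepted / n_sims

# Model 1: Poisson(lam), lam ~ Uniform(0, 10).
ev1 = abc_evidence(lambda lam: rng.poisson(lam, size=100),
                   lambda: rng.uniform(0.0, 10.0))
# Model 2: geometric on {0, 1, ...}, success probability p ~ Uniform.
ev2 = abc_evidence(lambda p: rng.geometric(p, size=100) - 1,
                   lambda: rng.uniform(0.05, 1.0))

bayes_factor = ev1 / max(ev2, 1e-12)  # evidence ratio, model 1 vs model 2
```

Because both evidences are estimated independently under the same summary and tolerance, their ratio approximates a Bayes factor without any between-model sampler.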

Relevance: 100.00%

Abstract:

Undirected graphical models are widely used in statistics, physics and machine vision. However, Bayesian parameter estimation for undirected models is extremely challenging, since evaluation of the posterior typically involves the calculation of an intractable normalising constant. This problem has received much attention, but very little of this has focussed on the important practical case where the data consist of noisy or incomplete observations of the underlying hidden structure. This paper specifically addresses this problem, comparing two alternative methodologies. In the first of these approaches particle Markov chain Monte Carlo (Andrieu et al., 2010) is used to efficiently explore the parameter space, combined with the exchange algorithm (Murray et al., 2006) for avoiding the calculation of the intractable normalising constant (a proof showing that this combination targets the correct distribution is found in a supplementary appendix online). This approach is compared with approximate Bayesian computation (Pritchard et al., 1999). Applications to estimating the parameters of Ising models and exponential random graphs from noisy data are presented. Each algorithm used in the paper targets an approximation to the true posterior, due to the use of MCMC to simulate from the latent graphical model in lieu of being able to do this exactly in general. The supplementary appendix also describes the nature of the resulting approximation.
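A minimal sketch of the exchange algorithm for a small Ising model may help fix ideas. The lattice size, flat prior, proposal scale and the use of short Gibbs runs in place of exact simulation (the very source of the approximation the abstract mentions) are all illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

def gibbs_ising(theta, shape=(8, 8), sweeps=30):
    """Approximate draw from an Ising model with coupling theta via Gibbs
    sampling on a periodic lattice (exact simulation assumed unavailable)."""
    x = rng.choice([-1, 1], size=shape)
    n, m = shape
    for _ in range(sweeps):
        for i in range(n):
            for j in range(m):
                nb = (x[(i - 1) % n, j] + x[(i + 1) % n, j]
                      + x[i, (j - 1) % m] + x[i, (j + 1) % m])
                p_up = 1.0 / (1.0 + np.exp(-2.0 * theta * nb))
                x[i, j] = 1 if rng.random() < p_up else -1
    return x

def suff_stat(x):
    """Sum of nearest-neighbour products (the Ising sufficient statistic)."""
    return (x * np.roll(x, 1, axis=0)).sum() + (x * np.roll(x, 1, axis=1)).sum()

# "Observed" lattice generated at a known coupling.
y = gibbs_ising(0.25)
s_y = suff_stat(y)

# Exchange algorithm: simulating an auxiliary lattice at the proposed
# coupling makes the intractable normalising constants cancel.
theta, chain = 0.1, []
for _ in range(100):
    theta_prop = theta + rng.normal(0.0, 0.05)
    if 0.0 < theta_prop < 1.0:  # flat prior on (0, 1)
        y_aux = gibbs_ising(theta_prop)
        log_alpha = (theta_prop - theta) * (s_y - suff_stat(y_aux))
        if np.log(rng.random()) < log_alpha:
            theta = theta_prop
    chain.append(theta)
```

For the Ising model the unnormalised likelihood is exp(θ·s(y)), so the exchange acceptance ratio reduces to the (θ' − θ)(s(y) − s(y_aux)) term in the log, with no normalising constant anywhere.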

Relevance: 100.00%

Abstract:

We model the thermal evolution of a subsurface ocean of aqueous ammonium sulfate inside Titan using a parameterized convection scheme. The cooling and crystallization of such an ocean depends on its heat flux balance, and is governed by the pressure-dependent melting temperatures at the top and bottom of the ocean. Using recent observations and previous experimental data, we present a nominal model which predicts the thickness of the ocean throughout the evolution of Titan; after 4.5 Ga we expect an aqueous ammonium sulfate ocean 56 km thick, overlain by a thick (176 km) heterogeneous crust of methane clathrate, ice I and ammonium sulfate. Underplating of the crust by ice I will give rise to compositional diapirs that are capable of rising through the crust and providing a mechanism for cryovolcanism at the surface. We have conducted a parameter space survey to account for possible variations in the nominal model, and find that for a wide range of plausible conditions, an ocean of aqueous ammonium sulfate can survive to the present day, which is consistent with the recent observations of Titan's spin state from Cassini radar data [Lorenz, R.D., Stiles, B.W., Kirk, R.L., Allison, M.D., del Marmo, P.P., Iess, L., Lunine, J.I., Ostro, S.J., Hensley, S., 2008. Science 319, 1649–1651].

Relevance: 100.00%

Abstract:

Many physical systems exhibit dynamics with vastly different time scales. Often the different motions interact only weakly and the slow dynamics is naturally constrained to a subspace of phase space, in the vicinity of a slow manifold. In geophysical fluid dynamics this reduction in phase space is called balance. Classically, balance is understood by way of the Rossby number R or the Froude number F; either R ≪ 1 or F ≪ 1. We examined the shallow-water equations and Boussinesq equations on an f-plane and determined a dimensionless parameter ε, small values of which imply a time-scale separation. In terms of R and F, ε = RF/√(R² + F²). We then developed a unified theory of (extratropical) balance based on ε that includes all cases of small R and/or small F. The leading-order systems are ensured to be Hamiltonian and turn out to be governed by the quasi-geostrophic potential-vorticity equation. However, the height field is not necessarily in geostrophic balance, so the leading-order dynamics are more general than in quasi-geostrophy. Thus the quasi-geostrophic potential-vorticity equation (as distinct from quasi-geostrophic dynamics) is valid more generally than its traditional derivation would suggest. In the case of the Boussinesq equations, we have found that balanced dynamics generally implies hydrostatic balance without any assumption on the aspect ratio; only when the Froude number is not small, and it is the Rossby number that guarantees a time-scale separation, must we impose the requirement of a small aspect ratio to ensure hydrostatic balance.
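The claim that the single parameter covers all cases of small R and/or small F follows directly from its definition; writing ε = RF/√(R² + F²) for the dimensionless parameter above, a short bounding argument gives

```latex
\epsilon = \frac{RF}{\sqrt{R^2 + F^2}}
\;\le\; \frac{RF}{\max(R, F)} = \min(R, F),
\qquad
\epsilon \;\ge\; \frac{RF}{\sqrt{2}\,\max(R, F)} = \frac{\min(R, F)}{\sqrt{2}},
```

so ε is small precisely when at least one of R and F is small, recovering both classical limits (ε → F as R grows, ε → R as F grows).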

Relevance: 100.00%

Abstract:

A recently proposed mean-field theory of mammalian cortex rhythmogenesis describes the salient features of electrical activity in the cerebral macrocolumn, with the use of inhibitory and excitatory neuronal populations (Liley et al 2002). This model is capable of producing a range of important human EEG (electroencephalogram) features such as the alpha rhythm, the 40 Hz activity thought to be associated with conscious awareness (Bojak & Liley 2007) and the changes in EEG spectral power associated with general anesthetic effect (Bojak & Liley 2005). From the point of view of nonlinear dynamics, the model entails a vast parameter space within which multistability, pseudoperiodic regimes, various routes to chaos, fat fractals and rich bifurcation scenarios occur for physiologically relevant parameter values (van Veen & Liley 2006). The origin and the character of this complex behaviour, and its relevance for EEG activity will be illustrated. The existence of short-lived unstable brain states will also be discussed in terms of the available theoretical and experimental results. A perspective on future analysis will conclude the presentation.

Relevance: 100.00%

Abstract:

FAMOUS fills an important role in the hierarchy of climate models, both explicitly resolving atmospheric and oceanic dynamics yet being sufficiently computationally efficient that either very long simulations or large ensembles are possible. An improved set of carbon cycle parameters for this model has been found using a perturbed physics ensemble technique. This is an important step towards building the "Earth System" modelling capability of FAMOUS, which is a reduced resolution, and hence faster running, version of the Hadley Centre Climate model, HadCM3. Two separate 100 member perturbed parameter ensembles were performed; one for the land surface and one for the ocean. The land surface scheme was tested against present-day and past representations of vegetation and the ocean ensemble was tested against observations of nitrate. An advantage of using a relatively fast climate model is that a large number of simulations can be run and hence the model parameter space (a large source of climate model uncertainty) can be more thoroughly sampled. This has the associated benefit of being able to assess the sensitivity of model results to changes in each parameter. The climatologies of surface and tropospheric air temperature and precipitation are improved relative to previous versions of FAMOUS. The improved representation of upper atmosphere temperatures is driven by improved ozone concentrations near the tropopause and better upper level winds.
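A common way to construct such a perturbed parameter ensemble is Latin hypercube sampling, which gives each parameter one sample per stratum so the ensemble spans parameter space evenly. The abstract does not state the sampling design actually used for FAMOUS, and the parameter names and ranges below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def latin_hypercube(n_members, bounds):
    """Draw one parameter vector per ensemble member: each parameter's
    range is split into n_members strata and each stratum is sampled
    exactly once, with the strata shuffled independently per parameter."""
    n_params = len(bounds)
    strata = np.tile(np.arange(n_members), (n_params, 1))
    u = (rng.permuted(strata, axis=1).T
         + rng.random((n_members, n_params))) / n_members
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical land-surface parameter ranges (illustrative only).
bounds = [(0.1, 10.0),   # e.g. a leaf-turnover rate
          (0.0, 1.0),    # e.g. a soil-moisture stress threshold
          (5.0, 45.0)]   # e.g. a photosynthesis temperature optimum
ensemble = latin_hypercube(100, bounds)  # 100 members x 3 parameters
```

Each of the 100 rows is one model configuration to run; comparing outputs across rows gives the per-parameter sensitivity the abstract describes.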

Relevance: 100.00%

Abstract:

An ability to quantify the reliability of probabilistic flood inundation predictions is a requirement not only for guiding model development but also for their successful application. Probabilistic flood inundation predictions are usually produced by choosing a method of weighting the model parameter space, but previous study suggests that this choice leads to clear differences in inundation probabilities. This study aims to address the evaluation of the reliability of these probabilistic predictions. However, a lack of an adequate number of observations of flood inundation for a catchment limits the application of conventional methods of evaluating predictive reliability. Consequently, attempts have been made to assess the reliability of probabilistic predictions using multiple observations from a single flood event. Here, a LISFLOOD-FP hydraulic model of an extreme (>1 in 1000 years) flood event in Cockermouth, UK, is constructed and calibrated using multiple performance measures from both peak flood wrack mark data and aerial photography captured post-peak. These measures are used in weighting the parameter space to produce multiple probabilistic predictions for the event. Two methods of assessing the reliability of these probabilistic predictions using limited observations are utilized: an existing method assessing the binary pattern of flooding, and a method developed in this paper to assess predictions of water surface elevation. This study finds that the water surface elevation method has both a better diagnostic and discriminatory ability, but this result is likely to be sensitive to the unknown uncertainties in the upstream boundary condition.
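The two kinds of performance measure can be sketched on toy data. The abstract does not give the exact formulae, so the binary measure below is the common flood-extent fit statistic (wet-cell intersection over union) and the elevation measure is a plain RMSE against wrack-mark levels; both are stand-ins, not the paper's definitions.

```python
import numpy as np

def fit_binary(pred_wet, obs_wet):
    """Binary-pattern measure: cells wet in both maps over cells wet in
    either (1 = perfect overlap, 0 = no overlap)."""
    both = np.logical_and(pred_wet, obs_wet).sum()
    either = np.logical_or(pred_wet, obs_wet).sum()
    return both / either if either else 1.0

def fit_wse(pred_wse, obs_wse):
    """Water-surface-elevation measure: RMSE against point observations
    such as wrack marks (lower is better)."""
    return float(np.sqrt(np.mean((pred_wse - obs_wse) ** 2)))

# Toy example: a 4x4 flood-extent grid and five wrack-mark levels (m).
pred = np.array([[1, 1, 0, 0], [1, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]], bool)
obs  = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 1, 1, 1], [0, 0, 0, 0]], bool)
f = fit_binary(pred, obs)  # 6 shared wet cells / 8 wet in either = 0.75
rmse = fit_wse(np.array([10.2, 10.4, 10.1, 9.9, 10.0]),
               np.array([10.3, 10.3, 10.0, 10.1, 10.0]))
```

In a calibration loop, one such score per parameter set becomes the weight used to build the probabilistic inundation map; the elevation-based score varies continuously where the binary score saturates, which is one reason it can discriminate better.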

Relevance: 100.00%

Abstract:

Support vector machines (SVMs) were originally formulated for the solution of binary classification problems. In multiclass problems, a decomposition approach is often employed, in which the multiclass problem is divided into multiple binary subproblems, whose results are combined. Generally, the performance of SVM classifiers is affected by the selection of values for their parameters. This paper investigates the use of genetic algorithms (GAs) to tune the parameters of the binary SVMs in common multiclass decompositions. The developed GA may search for a set of parameter values common to all binary classifiers or for differentiated values for each binary classifier.
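The GA tuning loop can be sketched generically. Here a stand-in fitness function replaces the cross-validation accuracy of an actual trained binary SVM, and the search ranges for log2 C and log2 gamma are conventional grid-search choices, not the paper's settings.

```python
import random

random.seed(0)

# Surrogate for "cross-validation accuracy of a binary SVM" as a function
# of (log2 C, log2 gamma); in practice this would train and score an SVM.
def fitness(log_c, log_gamma):
    return 1.0 / (1.0 + (log_c - 3.0) ** 2 + (log_gamma + 5.0) ** 2)

def evolve(generations=40, pop_size=20, bounds=((-5, 15), (-15, 3))):
    """Simple real-coded GA: truncation selection keeps the best half,
    children are midpoint crossovers of two parents plus Gaussian mutation."""
    pop = [tuple(random.uniform(lo, hi) for lo, hi in bounds)
           for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=lambda ind: fitness(*ind),
                         reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = tuple((x + y) / 2 + random.gauss(0, 0.5)
                          for x, y in zip(a, b))
            child = tuple(min(max(v, lo), hi)           # clip to bounds
                          for v, (lo, hi) in zip(child, bounds))
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda ind: fitness(*ind))

best_log_c, best_log_gamma = evolve()
```

Searching differentiated values per binary classifier, as the paper allows, would simply enlarge the chromosome to one (log C, log gamma) pair per subproblem.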

Relevance: 100.00%

Abstract:

In a 2D parameter space, using nine experimental time series from a Clitia's circuit, we characterized three codimension-1 chaotic fibers parallel to a period-3 window. To show the local preservation of the properties of the chaotic attractors in each fiber, we applied the closed-return technique and two distinct topological methods. With the first topological method we calculated the linking numbers in the sets of unstable periodic orbits, and with the second we obtained the symbolic planes and the topological entropies by applying symbolic dynamics analysis.

Relevance: 100.00%

Abstract:

Models of dynamical dark energy unavoidably possess fluctuations in the energy density and pressure of that new component. In this paper we estimate the impact of dark energy fluctuations on the number of galaxy clusters in the Universe using a generalization of the spherical collapse model and the Press-Schechter formalism. The observations we consider are several hypothetical Sunyaev-Zel'dovich and weak lensing (shear maps) cluster surveys, with limiting masses similar to ongoing (SPT, DES) as well as future (LSST, Euclid) surveys. Our statistical analysis is performed in a 7-dimensional cosmological parameter space using the Fisher matrix method. We find that, in some scenarios, the impact of these fluctuations is large enough that their effect could already be detected by existing instruments such as the South Pole Telescope, when priors from other standard cosmological probes are included. We also show how dark energy fluctuations can be a nuisance for constraining cosmological parameters with cluster counts, and point to a degeneracy between the parameter that describes dark energy pressure on small scales (the effective sound speed) and the parameters describing its equation of state.
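The Fisher-matrix forecasting step can be illustrated in miniature for Poisson-distributed bin counts, where F_ij = Σ_b (∂N_b/∂θ_i)(∂N_b/∂θ_j)/N_b. The two-parameter toy count model below is an invented stand-in for the paper's 7-dimensional cosmological parameter space.

```python
import numpy as np

def counts(theta, z_bins=np.linspace(0.1, 1.0, 10)):
    """Toy prediction of cluster counts per redshift bin from two
    illustrative parameters: an overall amplitude and a redshift tilt."""
    amp, tilt = theta
    return amp * 1e3 * np.exp(-z_bins / 0.4) * (1.0 + z_bins) ** tilt

def fisher(theta, step=1e-4):
    """Poisson Fisher matrix from central-difference derivatives of the
    binned counts: F_ij = sum_b dN_b/dtheta_i * dN_b/dtheta_j / N_b."""
    n_par = len(theta)
    n0 = counts(theta)
    derivs = []
    for i in range(n_par):
        up = np.array(theta, float); up[i] += step
        dn = np.array(theta, float); dn[i] -= step
        derivs.append((counts(up) - counts(dn)) / (2 * step))
    F = np.empty((n_par, n_par))
    for i in range(n_par):
        for j in range(n_par):
            F[i, j] = np.sum(derivs[i] * derivs[j] / n0)
    return F

F = fisher((1.0, 0.5))
cov = np.linalg.inv(F)          # forecast parameter covariance
sigmas = np.sqrt(np.diag(cov))  # marginalised 1-sigma errors
```

Off-diagonal structure in `cov` is exactly the kind of degeneracy the abstract points to between the effective sound speed and the equation-of-state parameters.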

Relevance: 100.00%

Abstract:

We review some issues related to the implications of different missing data mechanisms on statistical inference for contingency tables and consider simulation studies to compare the results obtained under such models to those where the units with missing data are disregarded. We confirm that although, in general, analyses under the correct missing at random and missing completely at random models are more efficient even for small sample sizes, there are exceptions where they may not improve the results obtained by ignoring the partially classified data. We show that under the missing not at random (MNAR) model, estimates on the boundary of the parameter space as well as lack of identifiability of the parameters of saturated models may be associated with undesirable asymptotic properties of maximum likelihood estimators and likelihood ratio tests; even in standard cases the bias of the estimators may be low only for very large samples. We also show that the probability of a boundary solution obtained under the correct MNAR model may be large even for large samples and that, consequently, we may not always conclude that a MNAR model is misspecified because the estimate is on the boundary of the parameter space.