20 results for Space use

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance: 30.00%

Publisher:

Abstract:

We use store-specific data for a major UK supermarket chain to estimate the impact of planning on store output. Using the quasi-natural experiment provided by the variation in policies between England and the other UK countries, we isolate the impact of Town Centre First policies. We find that space contributes directly to store productivity, and that planning policies in England directly reduce output both by reducing store sizes and by forcing stores onto less productive sites. We estimate that since the late 1980s planning policies have imposed a loss of output of at least 18.3 to 24.9% - more than a “lost decade’s” growth. JEL codes: D2, L51, L81, R32.
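
For scale, an illustrative back-of-the-envelope check (not from the paper; it assumes trend productivity growth of roughly 1.5-2% a year): a decade of such growth compounds to between (1.015)^10 - 1 ≈ 16% and (1.02)^10 - 1 ≈ 22%, which is the sense in which a loss of 18.3 to 24.9% amounts to more than a “lost decade’s” growth.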

Relevance: 30.00%

Publisher:

Abstract:

Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests, but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
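
As a pointer to the kind of simplex-based operations the abstract refers to, the following minimal Python sketch (not taken from the paper; the 3-part composition is invented purely for illustration) applies closure, an Aitchison perturbation and the centred log-ratio (clr) transform, the coordinate representation that underlies compositional singular value decompositions:

import numpy as np

def closure(x):
    """Rescale a positive vector so that its parts sum to 1 (unit simplex)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, p):
    """Aitchison perturbation: component-wise product followed by closure."""
    return closure(np.asarray(x, dtype=float) * np.asarray(p, dtype=float))

def clr(x):
    """Centred log-ratio transform: log of the parts minus their mean log."""
    lx = np.log(closure(x))
    return lx - lx.mean()

# Hypothetical 3-part major-oxide composition (illustrative values only).
comp = closure([52.0, 14.0, 34.0])
shifted = perturb(comp, [1.1, 0.9, 1.0])   # a specific perturbational change
print(clr(comp), clr(shifted))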

Relevance: 30.00%

Publisher:

Abstract:

Viruses rapidly evolve, and HIV in particular is known to be one of the fastest evolving human viruses. It is now commonly accepted that viral evolution is the cause of the intriguing dynamics exhibited during HIV infections and the ultimate success of the virus in its struggle with the immune system. To study viral evolution, we use a simple mathematical model of the within-host dynamics of HIV which incorporates random mutations. In this model, we assume a continuous distribution of viral strains in a one-dimensional phenotype space where random mutations are modelled by diffusion. Numerical simulations show that random mutations combined with competition result in evolution towards higher Darwinian fitness: a stable traveling wave of evolution, moving towards higher levels of fitness, is formed in the phenotype space.
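
The abstract does not reproduce the model's equations; a generic mutation-as-diffusion equation of the kind described would, under common assumptions, read

\[ \frac{\partial v(x,t)}{\partial t} = D\,\frac{\partial^{2} v}{\partial x^{2}} + r(x)\,v(x,t) - c(t)\,v(x,t), \]

where v(x,t) is the density of virus with phenotype x, the diffusion coefficient D models random mutation, r(x) is the replication (fitness) rate and c(t) collects immune-mediated and competitive losses; travelling-wave solutions of such equations move towards regions of higher r(x). The symbols here are generic, not the paper's own notation.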

Relevance: 30.00%

Publisher:

Abstract:

In 2000 the European Statistical Office published the guidelines for developing the Harmonized European Time Use Surveys system. Under this unified framework, the first Time Use Survey of national scope was conducted in Spain during 2002–03. The aim of these surveys is to understand human behavior and the lifestyle of people. Time allocation data are compositional in nature, that is, they are subject to non-negativity and constant-sum constraints. Thus, standard multivariate techniques cannot be directly applied to analyze them. The goal of this work is to identify homogeneous Spanish Autonomous Communities with regard to the typical activity pattern of their respective populations. To this end, a fuzzy clustering approach is followed. Rather than the hard partitioning of classical clustering, where each object is allocated to a single group, fuzzy methods identify overlapping groups of objects by allowing them to belong to more than one group. Specifically, the probabilistic fuzzy c-means algorithm is adapted to deal with the Spanish Time Use Survey microdata. As a result, a map distinguishing Autonomous Communities with similar activity patterns is drawn.

Key words: time use data; fuzzy clustering; FCM; simplex space; Aitchison distance
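
As a rough sketch of the adaptation described (not the authors' implementation; the data, number of clusters and fuzzifier below are illustrative), time-use compositions can be mapped to clr coordinates, where Euclidean distance coincides with the Aitchison distance, and then clustered with the probabilistic fuzzy c-means updates:

import numpy as np

def clr(X):
    """Centred log-ratio transform of row-wise compositions (rows sum to 1)."""
    L = np.log(X)
    return L - L.mean(axis=1, keepdims=True)

def fuzzy_cmeans(Z, c=2, m=2.0, iters=100, seed=0):
    """Probabilistic fuzzy c-means on clr coordinates."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(Z))        # initial memberships
    for _ in range(iters):
        W = U ** m
        centers = W.T @ Z / W.sum(axis=0)[:, None]    # weighted centroids
        d = np.linalg.norm(Z[:, None, :] - centers[None], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)             # memberships sum to one
    return U, centers

# Hypothetical daily time-use shares (work, care, leisure) for five regions.
X = np.array([[0.45, 0.20, 0.35], [0.40, 0.25, 0.35], [0.30, 0.30, 0.40],
              [0.50, 0.15, 0.35], [0.35, 0.25, 0.40]])
U, _ = fuzzy_cmeans(clr(X), c=2)
print(U.round(2))   # overlapping (fuzzy) group memberships per region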

Relevance: 30.00%

Publisher:

Abstract:

We focus on full-rate, fast-decodable space–time block codes (STBCs) for 2 x 2 and 4 x 2 multiple-input multiple-output (MIMO) transmission. We first derive conditions and design criteria for reduced-complexity maximum-likelihood (ML) decodable 2 x 2 STBCs, and we apply them to two families of codes that were recently discovered. Next, we derive a novel reduced-complexity 4 x 2 STBC, and show that it outperforms all previously known codes with certain constellations.
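
To make the decoding-complexity issue concrete, a brute-force ML detector for a rate-2 2 x 2 code must search all symbol 4-tuples (M^4 hypotheses for an M-point constellation); fast-decodable designs arrange the code so that, once part of the symbols is fixed, the remaining ones can be detected by simple slicing, shrinking that search substantially. The sketch below (illustrative only; it uses plain spatial multiplexing over two channel uses rather than any code from the paper) shows the exhaustive search that such designs avoid:

import numpy as np
from itertools import product

QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def ml_decode(Y, H, codeword):
    """Exhaustive ML: test every symbol 4-tuple and keep the best codeword."""
    best, best_s = np.inf, None
    for s in product(QPSK, repeat=4):                 # M**4 hypotheses
        metric = np.linalg.norm(Y - H @ codeword(s)) ** 2
        if metric < best:
            best, best_s = metric, s
    return np.array(best_s)

def sm_codeword(s):
    """Rate-2 mapping of 4 symbols to a 2x2 block (antennas x channel uses)."""
    return np.array([[s[0], s[2]],
                     [s[1], s[3]]])

rng = np.random.default_rng(0)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
s_true = QPSK[rng.integers(4, size=4)]
Y = H @ sm_codeword(s_true)                           # noiseless for brevity
print(np.allclose(ml_decode(Y, H, sm_codeword), s_true))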

Relevance: 30.00%

Publisher:

Abstract:

The 2×2 MIMO profiles included in the Mobile WiMAX specifications are Alamouti’s space-time code (STC) for transmit diversity and spatial multiplexing (SM). The former has full diversity and the latter has full rate, but neither of them has both of these desired features. An alternative 2×2 STC, which is both full rate and full diversity, is the Golden code. It is the best known 2×2 STC, but it has a high decoding complexity. Recently, attention has turned to decoder complexity; this issue has been included in the STC design criteria, and different STCs have been proposed. In this paper, we first present a full-rate, full-diversity 2×2 STC design that leads to substantially lower complexity of the optimum detector compared to the Golden code, with only a slight performance loss. We provide the general optimized form of this STC and show that the scheme achieves the diversity-multiplexing frontier for square QAM signal constellations. We then present a variant of the proposed STC, which provides a further decrease in detection complexity with a rate reduction of 25%, and show that it offers an interesting trade-off between the Alamouti scheme and SM.
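
For reference, the reason Alamouti’s STC admits a very simple optimum detector is that its orthogonal structure reduces ML detection to independent symbol-by-symbol decisions after a linear combining step; the textbook-style sketch below (one receive antenna, noise omitted, not code from the paper) shows that step:

import numpy as np

def alamouti_encode(s1, s2):
    """Two symbols over two antennas (rows) and two time slots (columns)."""
    return np.array([[s1, -np.conj(s2)],
                     [s2,  np.conj(s1)]])

def alamouti_combine(r1, r2, h1, h2):
    """Linear combining for one receive antenna; r1, r2 are the two slots."""
    s1_hat = np.conj(h1) * r1 + h2 * np.conj(r2)
    s2_hat = np.conj(h2) * r1 - h1 * np.conj(r2)
    return s1_hat, s2_hat

h1, h2 = 0.8 + 0.3j, -0.2 + 0.9j                 # illustrative channel gains
s1, s2 = (1 + 1j) / np.sqrt(2), (-1 + 1j) / np.sqrt(2)
X = alamouti_encode(s1, s2)
r1 = h1 * X[0, 0] + h2 * X[1, 0]                 # received in slot 1 (no noise)
r2 = h1 * X[0, 1] + h2 * X[1, 1]                 # received in slot 2
gain = abs(h1) ** 2 + abs(h2) ** 2               # full-diversity channel gain
print(np.allclose(alamouti_combine(r1, r2, h1, h2), (gain * s1, gain * s2)))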

Relevance: 30.00%

Publisher:

Abstract:

Multiple-input multiple-output (MIMO) techniques have become an essential part of broadband wireless communications systems. For example, the recently developed IEEE 802.16e specifications for broadband wireless access include three MIMO profiles employing 2×2 space-time codes (STCs), and two of these MIMO schemes are mandatory on the downlink of Mobile WiMAX systems. One of these has full rate and the other has full diversity, but neither of them has both of the desired features. The third profile, namely Matrix C, which is not mandatory, is both a full-rate and a full-diversity code, but it has a high decoder complexity. Recently, attention has turned to the decoder complexity issue and, with this included in the design criteria, several full-rate STCs have been proposed as alternatives to Matrix C. In this paper, we review these different alternatives and compare them to Matrix C in terms of performance and the corresponding receiver complexity.

Relevance: 30.00%

Publisher:

Abstract:

In this article we present a hybrid approach for the automatic summarization of Spanish medical texts. There are many systems for automatic summarization that use statistics or linguistics, but only a few of them combine both techniques. Our idea is that to produce a good summary we need to use the linguistic aspects of texts, but we should also benefit from the advantages of statistical techniques. We have integrated the Cortex (Vector Space Model) and Enertex (statistical physics) systems, coupled with the Yate term extractor, and the Disicosum system (linguistics). We have compared these systems and afterwards integrated them into a hybrid approach. Finally, we have applied this hybrid system to a corpus of medical articles and evaluated its performance, obtaining good results.
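
As a toy illustration of the statistical side of such a pipeline (this is not Cortex, Enertex or Disicosum; it is a generic centroid-based extractive scorer), sentences can be ranked in a TF-IDF vector space and the top ones kept, with a linguistic component then filtering or reordering the candidates:

import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def extractive_summary(sentences, n=2):
    """Score each sentence against the document centroid in TF-IDF space
    and return the n highest-scoring sentences in their original order."""
    X = TfidfVectorizer().fit_transform(sentences).toarray()
    scores = X @ X.mean(axis=0)                  # centroid similarity score
    keep = sorted(np.argsort(scores)[-n:])
    return [sentences[i] for i in keep]

# Miniature illustrative "medical text" (three sentences).
doc = ["El paciente presenta hipertensión arterial.",
       "Se recomienda seguimiento en seis meses.",
       "La hipertensión arterial se controla con tratamiento y dieta."]
print(extractive_summary(doc, n=1))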

Relevance: 30.00%

Publisher:

Abstract:

In this paper we present a Bayesian image reconstruction algorithm with an entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals produced by algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope, and it can also be applied to ground-based images.
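
The abstract does not state the FMAPE functional explicitly; schematically, a maximum a posteriori reconstruction with an entropy prior and a space-variant hyperparameter maximizes an objective of the form (generic notation, not taken from the paper)

\[ \hat{f} = \arg\max_{f \ge 0} \Big[ \ln P(g \mid f) + \sum_{j} \alpha_{r(j)} \big( f_j - m_j - f_j \ln \tfrac{f_j}{m_j} \big) \Big], \]

where g is the observed image, f the reconstruction, m a default model, and \alpha_{r(j)} the hyperparameter assigned to the segmented region r(j) (extended region or star) containing pixel j; letting \alpha vary by region is what permits different degrees of resolution in areas with different statistics.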

Relevance: 30.00%

Publisher:

Abstract:

The use of different kinds of nonlinear filtering in a joint transform correlator is studied and compared. The study is divided into two parts, one corresponding to the object space and the second to the Fourier domain of the joint power spectrum. In the first part, phase and inverse filters are computed; their inverse Fourier transforms are also computed, thereby becoming the reference in the object space. In the Fourier space, binarization of the power spectrum is carried out and compared with a new procedure for removing the spatial envelope. All cases are simulated and experimentally implemented with a compact joint transform correlator.
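
A minimal numerical sketch of the Fourier-plane part of such an experiment (illustrative only; it applies the binarization nonlinearity to the joint power spectrum, one of the cases compared) could look like this:

import numpy as np

def jtc_correlation(scene, reference, binarize=True):
    """Joint transform correlator: place scene and reference side by side,
    form the joint power spectrum, optionally binarize it, and take an
    inverse FFT to obtain the correlation plane."""
    joint = np.concatenate([scene, np.zeros_like(scene), reference], axis=1)
    jps = np.abs(np.fft.fft2(joint)) ** 2
    if binarize:                                  # hard-clip nonlinearity
        jps = np.where(jps > np.median(jps), 1.0, 0.0)
    return np.abs(np.fft.ifft2(jps))              # correlation plane

rng = np.random.default_rng(1)
obj = rng.random((32, 32))                        # illustrative object
corr = jtc_correlation(obj, np.roll(obj, 5, axis=0))
print(corr.shape, float(corr.max()))              # peaks mark the match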

Relevance: 30.00%

Publisher:

Abstract:

The issue of de Sitter invariance for a massless minimally coupled scalar field is examined. Formally, it is possible to construct a de Sitter-invariant state for this case provided that the zero mode of the field is quantized properly. Here we take the point of view that this state is physically acceptable, in the sense that physical observables can be computed and have a reasonable interpretation. In particular, we use this vacuum to derive a new result: that the squared difference between the field at two points along a geodesic observer's spacetime path grows linearly with the observer's proper time for a quantum state that does not break de Sitter invariance. Also, we use the Hadamard formalism to compute the renormalized expectation value of the energy-momentum tensor, both in the O(4)-invariant states introduced by Allen and Folacci, and in the de Sitter-invariant vacuum. We find that the vacuum energy density in the O(4)-invariant case is larger than in the de Sitter-invariant case.
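
Stated schematically, the new result quoted above is a relation of the form (the coefficient is omitted here because the abstract does not give it)

\[ \big\langle\, [\phi(x(\tau)) - \phi(x(0))]^{2} \,\big\rangle \;\propto\; H^{3}\,\tau \qquad \text{for large } \tau, \]

where x(\tau) is the geodesic observer's worldline, \tau the observer's proper time and H the de Sitter Hubble rate; the point is that this linear growth is obtained in a state that does not break de Sitter invariance.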

Relevance: 30.00%

Publisher:

Abstract:

We use the method of Bogolubov transformations to compute the rate of pair production by an electric field in (1+1)-dimensional de Sitter space. The results are in agreement with those obtained previously using the instanton methods. This is true even when the size of the instanton is comparable to the size of the de Sitter horizon.

Relevance: 30.00%

Publisher:

Abstract:

Existing protein sequences span only a very small fraction of sequence space. Natural proteins have overcome a strong negative selective pressure to avoid the formation of insoluble aggregates. Stably folded globular proteins and intrinsically disordered proteins (IDPs) use alternative solutions to the aggregation problem. While in globular proteins folding minimizes access to aggregation-prone regions, IDPs on average display large exposed contact areas. Here, we introduce the concept of the average meta-structure correlation map to analyze sequence space. Using this novel conceptual view we show that representative ensembles of folded and ID proteins display distinct characteristics and respond differently to sequence randomization. By studying the way evolutionary constraints act on IDPs to disable a negative function (aggregation), we might gain insight into the mechanisms by which function-enabling information is encoded in IDPs.

Relevance: 30.00%

Publisher:

Abstract:

The SeDeM Diagram Expert System has been used to study excipients, Captopril and designed formulations for their galenic characterization, and to ascertain the critical points of the formula affecting product quality, in order to obtain suitable formulations of Captopril Direct Compression SR Matrix Tablets. The application of the SeDeM Diagram Expert System enables selecting excipients in order to optimize the formula in the preformulation and formulation studies. The methodology is based on the implementation of ICH Q8, establishing the design space of the formula with the use of experimental design, using the parameters of the SeDeM Diagram Expert System as system responses.

Relevance: 30.00%

Publisher:

Abstract:

This paper explores how wikis may be used to support primary education students’ collaborative interaction and how such an interaction process can be characterised. The overall aim of this study is to analyse the collaborative processes of students working together in a wiki environment, in order to see how primary students can actively create a shared context for learning in the wiki. Educational literature has already reported that wikis may support collaborative knowledge-construction processes, but in our study we claim that a dialogic perspective is needed to accomplish this. Students must develop an intersubjective orientation towards each other’s perspectives, to co-construct knowledge about a topic. For this purpose, our project utilised a ‘Thinking Together’ approach to help students develop an intersubjective orientation towards one another and to support the creation of a ‘dialogic space’ to co-construct new understanding in a wiki science project. The students’ asynchronous interaction process in a primary classroom -- which led to the creation of a science text in the wiki -- was analysed and characterised, using a dialogic approach to the study of CSCL practices. Our results illustrate how the Thinking Together approach became embedded within the wiki environment and in the students’ collaborative processes. We argue that a dialogic approach for examining interaction can be used to help design more effective pedagogic approaches related to the use of wikis in education and to equip learners with the competences they need to participate in the global knowledge-construction era.