980 results for "Norm"
Abstract:
In this paper we study boundedness of the convolution operator in different Lorentz spaces. In particular, we obtain the limit case of the Young-O'Neil inequality in the classical Lorentz spaces. We also investigate the convolution operator in the weighted Lorentz spaces. Finally, norm inequalities for the potential operator are presented.
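For orientation, the classical Young inequality that the Young-O'Neil inequality refines reads
\[
\|f * g\|_{L^r} \le \|f\|_{L^p}\,\|g\|_{L^q}, \qquad 1 + \tfrac{1}{r} = \tfrac{1}{p} + \tfrac{1}{q}, \qquad 1 \le p, q, r \le \infty ;
\]
the Young-O'Neil version studied in the paper replaces the Lebesgue norms with Lorentz norms (this display is only a schematic reminder, not the paper's statement or its limit case).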
Abstract:
The financial crisis, on the one hand, and the recourse to ‘unconventional’ monetary policy, on the other, have given a sharp jolt to perceptions of the role and status of central banks. In this paper we start with a brief ‘contrarian’ history of central banks since the Second World War, which presents the Great Moderation and the restricted focus on inflation targeting as a temporary aberration from the norm. We then discuss how recent developments in fiscal and monetary policy have affected the role and status of central banks, notably their relationships with governments, before considering the environment central banks will face in the near and medium term and how they will have to change to address it.
Abstract:
Diffusion MRI is a well-established imaging modality providing a powerful way to probe the structure of the white matter non-invasively. Despite its potential, the intrinsically long scan times of these sequences have hampered their use in clinical practice. For this reason, a large variety of methods have recently been proposed to shorten the acquisition times. Among them, spherical deconvolution approaches have gained a lot of interest for their ability to reliably recover the intra-voxel fiber configuration with a relatively small number of data samples. To overcome the intrinsic instabilities of deconvolution, these methods use regularization schemes generally based on the assumption that the fiber orientation distribution (FOD) to be recovered in each voxel is sparse. The well-known Constrained Spherical Deconvolution (CSD) approach resorts to Tikhonov regularization, based on an ℓ2-norm prior, which promotes a weak version of sparsity. Also, in the last few years compressed sensing has been advocated to further accelerate the acquisitions, and ℓ1-norm minimization is generally employed as a means to promote sparsity in the recovered FODs. In this paper, we provide evidence that the use of an ℓ1-norm prior to regularize this class of problems is somewhat inconsistent with the fact that the fiber compartments all sum up to unity. To overcome this ℓ1 inconsistency while exploiting sparsity more effectively than through an ℓ2 prior, we reformulate the reconstruction problem as a constrained formulation between a data term and a sparsity prior consisting of an explicit bound on the ℓ0-norm of the FOD, i.e. on the number of fibers. The method has been tested on both synthetic and real data. Experimental results show that the proposed ℓ0 formulation significantly reduces modeling errors compared to the state-of-the-art ℓ2 and ℓ1 regularization approaches.
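A schematic statement of the constrained reconstruction described above (the symbols are illustrative, not taken from the paper: y the sampled diffusion signal, Φ a dictionary of single-fiber responses over a discrete set of orientations, x the discretized FOD, k the assumed maximum number of fibers):
\[
\hat{x} \;=\; \arg\min_{x \,\ge\, 0}\; \|\Phi x - y\|_2^2 \quad \text{subject to} \quad \|x\|_0 \le k .
\]
The ℓ0 constraint bounds how many fiber compartments are nonzero without penalizing their total weight, which is what keeps it compatible with compartments that sum to unity.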
Abstract:
In this paper we assume that, for some commodities, individuals may wish to adjust their levels of consumption from their normal Marshallian levels so as to match the consumption levels of a group of other individuals, in order to signal that they conform to the consumption norms of that group. Unlike Veblen’s concept of conspicuous consumption, this can mean that some individuals may reduce their consumption of the relevant commodities. We model this as a three-stage game in which individuals first decide whether or not they wish to adhere to a norm, then decide which norm they wish to adhere to, and finally decide their actual consumption. We present a number of examples of the resulting equilibria, and then discuss the potential policy implications of this model.
Abstract:
The relationship between the operator norms of fractional integral operators acting on weighted Lebesgue spaces and the constant of the weights is investigated. Sharp bounds are obtained for both the fractional integral operators and the associated fractional maximal functions. As an application improved Sobolev inequalities are obtained. Some of the techniques used include a sharp off-diagonal version of the extrapolation theorem of Rubio de Francia and characterizations of two-weight norm inequalities.
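For reference, with the usual conventions (the paper's normalization may differ) the fractional integral operator and the associated fractional maximal function on R^n are
\[
I_\alpha f(x) = \int_{\mathbb{R}^n} \frac{f(y)}{|x-y|^{\,n-\alpha}} \, dy, \qquad
M_\alpha f(x) = \sup_{Q \ni x} |Q|^{\alpha/n - 1} \int_Q |f(y)| \, dy, \qquad 0 < \alpha < n .
\]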
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the ℓ2 or ℓ1-TV norm with prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of an extremely fast computation time. The L-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
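For context, the single-orientation methods compared here are instances of a generic regularized dipole inversion (notation is illustrative, not the paper's: φ the measured field map, F the Fourier transform, D the dipole kernel in k-space, W a data-consistency weighting, G a spatial gradient weighted by the edge prior):
\[
\hat{\chi} = \arg\min_{\chi} \; \big\| W \big( F^{-1} D F \chi - \varphi \big) \big\|_2^2 + \lambda\, R(\chi),
\qquad R(\chi) = \| G\chi \|_2^2 \;\;(\ell_2) \quad\text{or}\quad \| G\chi \|_1 \;\;(\ell_1\text{-TV}).
\]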
Abstract:
Research project carried out by a secondary-school student and awarded a CIRIT Prize for fostering the scientific spirit among young people in 2009. The aim of this work is to set out the guidelines needed to obtain biodiesel and to determine its acceptability. To make this possible, biodiesels were synthesized from refined sunflower oil, from refined olive oil, and from the same oils after they had been used for frying. Once synthesized, they were subjected to a series of physicochemical analyses to determine their acceptability by comparison with the results obtained for a commercial diesel, as well as with the limits set by the European Union in the final draft of the prEN 14214 standard.
Abstract:
When dealing with sustainability we are concerned with the biophysical as well as the monetary aspects of economic and ecological interactions. This multidimensional approach requires that special attention be given to dimensional issues in relation to curve-fitting practice in economics. Unfortunately, many empirical and theoretical studies in economics, as well as in ecological economics, apply dimensional numbers in exponential or logarithmic functions. We show that it is an analytical error to put a dimensional quantity x into exponential functions (a^x) and logarithmic functions (log_a x). Secondly, we investigate the conditions on data sets under which a particular logarithmic specification is superior to the usual regression specification. This analysis shows that the superiority of the logarithmic specification in terms of the least-squares norm depends heavily on the available data set. The last section deals with economists’ “curve fitting fetishism”. We propose that a distinction be made between curve fitting over past observations and the development of a theoretical or empirical law capable of maintaining its fitting power for any future observations. Finally, we conclude this paper with several epistemological issues in relation to dimensions and curve-fitting practice in economics.
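One standard way to make the dimensional argument explicit (not taken verbatim from the paper): since a^x = e^{x ln a}, expanding the exponential,
\[
a^{x} = \sum_{k=0}^{\infty} \frac{(\ln a)^{k}}{k!}\, x^{k},
\]
adds terms of incompatible dimensions [x], [x]^2, … whenever x carries a unit, and under a mere change of units x → λx a logarithm shifts by a constant,
\[
\log_a(\lambda x) = \log_a \lambda + \log_a x,
\]
so its numerical value has no physical meaning. Only dimensionless ratios such as x/x_0, with x_0 a reference quantity in the same units, can legitimately enter exponents and logarithms.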
Abstract:
The completion of the human genome sequence is the prerequisite to fully understanding the biology of human beings. With this project achieved, scientists had to face another challenging task: understanding the meaning of the 3 billion letters composing this genome. As a logical continuation of the human genome project, the ENCODE (ENCyclopedia Of DNA Elements) consortium was formed with the aim of annotating all of its functional elements. These elements include transcribed regions, transcription factor binding sites, DNase I hypersensitive sites and histone modification marks. In the frame of my PhD thesis, I was involved in two sub-projects of ENCODE. Firstly, I developed and optimized a high-throughput method to validate gene models, which allowed me to assess the quality of the most recent manually curated annotation. This novel experimental validation pipeline, based on RT-PCR, is extremely effective, far more so than transcriptome profiling through RNA sequencing, which is becoming the norm. This targeted RT-PCR-seq approach is likewise particularly efficient in identifying novel exons, as we discovered unannotated exons in about 10% of the loci interrogated. Secondly, I participated in a study aiming to identify the boundaries of all genes on human chromosomes 21 and 22. This study led to the identification of chimeric transcripts composed of sequences coming from two distinct genes that can map far away from each other.
Abstract:
AIMS: In patients with alcohol dependence, health-related quality of life (QOL) is reduced compared with that of a normal healthy population. The objective of the current analysis was to describe the evolution of health-related QOL in adults with alcohol dependence during a 24-month period after initial assessment for alcohol-related treatment in a routine practice setting, and its relation to drinking pattern, which was evaluated across clusters based on the predominant pattern of alcohol use, set against the influence of baseline variables. METHODS: The Medical Outcomes Study 36-Item Short-Form Survey (MOS-SF-36) was used to measure QOL at baseline and quarterly for 2 years among participants in CONTROL, a prospective observational study of patients initiating treatment for alcohol dependence. The sample consisted of 160 adults with alcohol dependence (65.6% males) with a mean (SD) age of 45.6 (12.0) years. Alcohol use data were collected using TimeLine Follow-Back. Based on the participants' reported alcohol use, three clusters were identified: 52 (32.5%) mostly abstainers, 64 (40.0%) mostly moderate drinkers and 44 (27.5%) mostly heavy drinkers. Mixed-effect linear regression analysis was used to identify factors that were potentially associated with the mental and physical summary MOS-SF-36 scores at each time point. RESULTS: The mean (SD) MOS-SF-36 mental component summary score (range 0-100, norm 50) was 35.7 (13.6) at baseline [mostly abstainers: 40.4 (14.6); mostly moderate drinkers: 35.6 (12.4); mostly heavy drinkers: 30.1 (12.1)]. The score improved to 43.1 (13.4) at 3 months [mostly abstainers: 47.4 (12.3); mostly moderate drinkers: 44.2 (12.7); mostly heavy drinkers: 35.1 (12.9)], to 47.3 (11.4) at 12 months [mostly abstainers: 51.7 (9.7); mostly moderate drinkers: 44.8 (11.9); mostly heavy drinkers: 44.1 (11.3)], and to 46.6 (11.1) at 24 months [mostly abstainers: 49.2 (11.6); mostly moderate drinkers: 45.7 (11.9); mostly heavy drinkers: 43.7 (8.8)]. Mixed-effect linear regression multivariate analyses indicated that there was a significant association between a lower 2-year follow-up MOS-SF-36 mental score and being a mostly heavy drinker (-6.97, P < 0.001) or a mostly moderate drinker (-3.34 points, P = 0.018) [compared to mostly abstainers], being female (-3.73, P = 0.004), and having a baseline Beck Inventory scale score ≥8 (-6.54, P < 0.001). The mean (SD) MOS-SF-36 physical component summary score was 48.8 (10.6) at baseline, remained stable over the follow-up and did not differ across the three clusters. Mixed-effect linear regression univariate analyses found that the average 2-year follow-up MOS-SF-36 physical score was increased (compared with mostly abstainers) in mostly heavy drinkers (+4.44, P = 0.007); no other variables tested influenced the MOS-SF-36 physical score. CONCLUSION: Among individuals with alcohol dependence, a rapid improvement was seen in the mental dimension of QOL following treatment initiation, and it was maintained over the 24 months. Improvement was associated with the pattern of alcohol use, becoming close to the general population norm in patients classified as mostly abstainers, improving substantially in mostly moderate drinkers and improving only slightly in mostly heavy drinkers. The physical dimension of QOL was generally in the normal range but was not associated with drinking patterns.
Abstract:
The aim of this study is to analyse the treatment that some of the general monolingual dictionaries of Spanish published in the last ten years have given to the phenomenon of lexical collocations, that is, those word combinations which, from the point of view of the norm, show certain combinatorial restrictions, essentially of a semantic nature, imposed by usage (Corpas: 1996). The dictionaries analysed were: the "Diccionario Salamanca de la lengua española", edited by Juan Gutiérrez (1996); the "Diccionario del español actual", by Manuel Seco, Olimpia Andrés and Gabino Ramos (1999); the twenty-second edition of the RAE dictionary (2001); and the "Gran diccionario de uso del español actual. Basado en el corpus Cumbre", edited by Aquilino Sánchez (2001). Our study is based on a corpus of 52 lexical collocations compiled from an analysis of the sub-entries listed under the letter "b" in each of the selected dictionaries. We then examined the entries corresponding to each of the elements that make up the collocation (base and collocate) in order to see whether the dictionaries under study record these same combinations in other parts of the dictionary article, such as the definitions or the examples. In analysing the lexicographic information we focused on four aspects: a) the information contained in the preliminary pages of each work; b) the placement of collocations within the dictionary article; c) the assignment of the collocation to a particular article; and d) the grammatical labelling.
Abstract:
Our aim is the study of extensions of General Relativity and, in particular, we are interested in theories containing additional vector fields. In this type of theory it is necessary to impose that the vector has a fixed norm in order to avoid the presence of a ghost, a degree of freedom with a negative kinetic term, which implies that Lorentz symmetry is spontaneously broken. The aether field interacts only gravitationally and its presence is difficult to detect; however, during inflation the small-scale vacuum fluctuations of a light field can leave an imprint on observables such as the anisotropies of the cosmic microwave background. The Einstein-aether fluctuations behave like massless fields, so inflation generates long-wavelength modes in the scalar and vector sectors. We have studied the signature of the Einstein-aether in the spectrum of primordial perturbations away from the de Sitter limit of inflation. These scalar and vector modes can leave a significant imprint on the cosmic microwave background depending on the parameters of the model. Observations of the cosmic microwave background impose phenomenological constraints that tighten the existing bounds on this class of theories. With this study of the aether we also hope to improve our understanding of a broader class of theories exhibiting the same type of symmetry breaking.
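For context, the fixed-norm condition mentioned above is usually imposed with a Lagrange multiplier in the Einstein-aether action, shown here in a common convention (signs and the form of the kinetic term vary between references and may differ from the thesis):
\[
S = \frac{1}{16\pi G}\int d^{4}x\,\sqrt{-g}\,\Big[ R - K^{\mu\nu}{}_{\alpha\beta}\,\nabla_\mu u^{\alpha}\,\nabla_\nu u^{\beta} + \lambda\,\big(g_{\mu\nu}u^{\mu}u^{\nu} + 1\big) \Big],
\]
where K^{\mu\nu}{}_{\alpha\beta} is built from the metric and u^\mu with four coupling constants c_1, …, c_4; varying with respect to λ enforces g_{\mu\nu}u^{\mu}u^{\nu} = -1, the fixed timelike norm whose nonzero vacuum value spontaneously breaks Lorentz invariance.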
Abstract:
Method validation is one of the cornerstones of quality assurance in analytical laboratories, as reflected in the ISO/IEC 17025 standard. It is therefore an aspect that must be addressed in the curricula of current and future degrees in Chemistry. There is a great deal of literature on method validation, but it is often underused, owing to the evident difficulty of processing all the available information and applying it to the laboratory and to specific problems. Another limitation in this field is the lack of software adapted to the needs of the laboratory. Many of the statistical routines used in method validation are adaptations built in Microsoft Excel, or come bundled in large statistical packages with a high degree of complexity. For this reason, the aim of the project was to produce a piece of software for method validation and for quality assurance of analytical results that incorporates only the routines that are actually needed. Specifically, the software includes the statistical functions required to verify the accuracy and evaluate the precision of an analytical method. The programming language chosen was Java, version 6. The software development consisted of the following stages: requirements gathering, requirements analysis, design of the software in modules, programming of the program's functions and of the graphical interface, creation of integration tests and testing with real users, and, finally, deployment of the software (creation of the installer and distribution).
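As an illustration of the kind of routine the abstract describes (a minimal sketch in Java, not the project's actual code; the class, method and data names are invented), accuracy can be checked by comparing the mean of replicate results against a reference value with a t statistic, and precision summarized by the repeatability standard deviation and relative standard deviation:

/** Minimal sketch of accuracy and precision routines for method validation. */
public final class ValidationStats {

    /** Arithmetic mean of replicate results. */
    public static double mean(double[] x) {
        double sum = 0.0;
        for (double v : x) sum += v;
        return sum / x.length;
    }

    /** Sample standard deviation: repeatability s_r. */
    public static double stdDev(double[] x) {
        double m = mean(x), ss = 0.0;
        for (double v : x) ss += (v - m) * (v - m);
        return Math.sqrt(ss / (x.length - 1));
    }

    /** Relative standard deviation in percent (precision). */
    public static double rsdPercent(double[] x) {
        return 100.0 * stdDev(x) / mean(x);
    }

    /**
     * t statistic for accuracy: compares the mean of the replicates with a
     * certified reference value; to be compared with the tabulated two-tailed
     * Student t value for n - 1 degrees of freedom.
     */
    public static double accuracyT(double[] x, double reference) {
        return Math.abs(mean(x) - reference) * Math.sqrt(x.length) / stdDev(x);
    }

    public static void main(String[] args) {
        double[] replicates = {10.12, 10.08, 10.15, 10.05, 10.11}; // hypothetical data
        System.out.printf("mean = %.3f, s_r = %.3f, RSD = %.2f %%, t = %.2f%n",
                mean(replicates), stdDev(replicates), rsdPercent(replicates),
                accuracyT(replicates, 10.00));
    }
}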
Abstract:
The optimal size-to-age at maturity depends on growth and mortality rates, which vary with the environment. Therefore, organisms in spatially or temporally changing environments should develop adaptive phenotypic plasticity for this trait. Experimental work by Alm (1959) on several fish species shows a dome-shaped norm of reaction for size-to-age at maturity: size at maturity is smaller in both fast-growing and slow-growing fishes than it is in fish with a medium growth rate. Using computer simulations, we show that such a dome-shaped norm of reaction is optimal when assuming a finite life span and a negative relationship between production and survival rates. This latter assumption is supported by empirical data, as well as by physiological and demographic arguments.
Abstract:
Soil and groundwater contamination is one of the most widespread environmental problems affecting much of the industrial land in Catalonia. This project analyses the contamination management process: characterization, remediation and monitoring of the decontamination of soils and groundwater, for a case of contamination by organochlorines (perchloroethylene) and other pollutants (hydrocarbons, selenium and chromium) at an industrial site located in an agroforestry area (surface area of 81,462 m²). Following the company's implementation of the ISO 14001 management system in 1996, several management projects were opened for the possible contaminants. Around the same time, selenium contamination of the groundwater was also detected, unrelated to the company under study. To date, the only contaminant that has required a decontamination process has been perchloroethylene. In soils the "soil vapor extraction" method is used, and in groundwater the "air stripping" method. Finally, a comparison was made between the actual costs of the perchloroethylene decontamination process and the costs that would have resulted from implementing pollution-prevention measures. The result of the assessment indicates that decontamination of this compound requires a substantial economic investment, some 10 times higher than the costs of the prevention measures.