954 results for Laplace transform


Relevance:

10.00%

Publisher:

Abstract:

Given a compact Riemannian manifold $M$ of dimension $m \geq 2$, we study the subspace of $L^2(M)$ generated by the eigenfunctions of the Laplace-Beltrami operator on $M$ associated with eigenvalues less than $L \geq 1$. On these spaces we give a characterization of the Carleson measures and of the Logvinenko-Sereda sets.
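For orientation, a minimal sketch of the objects involved, written in standard notation that is assumed here rather than taken from the paper (the precise normalizations may differ):

```latex
% Spectral subspace spanned by low-frequency eigenfunctions (notation assumed):
% let \Delta_M be the Laplace-Beltrami operator on M, with eigenvalues
% 0 = \lambda_0 \le \lambda_1 \le \dots and eigenfunctions \phi_j.
\[
  E_L \;=\; \operatorname{span}\bigl\{ \phi_j \in L^2(M) : \Delta_M \phi_j = \lambda_j \phi_j,\ \lambda_j \le L \bigr\},
  \qquad L \ge 1 .
\]
% A measure \mu on M is a Carleson measure for these spaces when, with C
% independent of L,
\[
  \int_M |f|^2 \, d\mu \;\le\; C \int_M |f|^2 \, dV_g
  \qquad \text{for all } f \in E_L ,
\]
% and a set A \subset M is a Logvinenko-Sereda set when the reverse-type bound
\[
  \int_M |f|^2 \, dV_g \;\le\; C' \int_A |f|^2 \, dV_g
  \qquad \text{for all } f \in E_L
\]
% holds with C' independent of L.
```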

Relevance:

10.00%

Publisher:

Abstract:

It is very well known that the first successful valuation of a stock option was done by solving a deterministic partial differential equation (PDE) of the parabolic type with some complementary conditions specific to the option. In this approach, the randomness in the option value process is eliminated through a no-arbitrage argument. An alternative approach is to construct a replicating portfolio for the option. From this viewpoint, the payoff function of the option is a random process which, under a new probability measure, turns out to be of a special type: a martingale. Accordingly, the value of the replicating portfolio (equivalently, of the option) is calculated as an expectation, with respect to this new measure, of the discounted value of the payoff function. Since the expectation is, by definition, an integral, its calculation can be made simpler by resorting to powerful methods already available in the theory of analytic functions. In this paper we use precisely two of those techniques to find the well-known value of a European call option.
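For reference, the martingale-pricing identity alluded to above and the resulting closed form, written in generic Black-Scholes notation; the specific analytic-function techniques used in the paper are not reproduced here, so this is only a standard sketch:

```latex
% European call under the risk-neutral measure \mathbb{Q}, with spot S_0,
% strike K, rate r, volatility \sigma and maturity T (generic notation):
\[
  C \;=\; e^{-rT}\,\mathbb{E}^{\mathbb{Q}}\!\left[(S_T - K)^{+}\right],
  \qquad
  S_T \;=\; S_0 \exp\!\left(\left(r - \tfrac{1}{2}\sigma^{2}\right)T + \sigma W_T^{\mathbb{Q}}\right),
\]
% which evaluates to the familiar Black-Scholes formula
\[
  C \;=\; S_0\,\Phi(d_1) - K e^{-rT}\,\Phi(d_2),
  \qquad
  d_{1,2} \;=\; \frac{\ln(S_0/K) + \left(r \pm \tfrac{1}{2}\sigma^{2}\right)T}{\sigma\sqrt{T}},
\]
% where \Phi is the standard normal cumulative distribution function.
```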

Relevance:

10.00%

Publisher:

Abstract:

In a compound Poisson model, we define a threshold proportional reinsurance strategy: a retention level k1 is applied whenever the reserves are below a given threshold b, and a retention level k2 otherwise. We obtain the integro-differential equation for the Gerber-Shiu function, defined in Gerber and Shiu (1998), in this model, which allows us to derive expressions for the ruin probability and for the Laplace transform of the time of ruin for several distributions of the individual claim amount. Finally, we present some numerical results.
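As background, the standard Gerber-Shiu function referred to above, in generic notation; the threshold reinsurance strategy of the paper modifies the underlying surplus process and is not reproduced here:

```latex
% Gerber-Shiu expected discounted penalty function (standard form):
\[
  \phi(u) \;=\; \mathbb{E}\!\left[ e^{-\delta T}\, w\bigl(U_{T^-}, |U_T|\bigr)\,
                \mathbf{1}_{\{T < \infty\}} \,\middle|\, U_0 = u \right],
\]
% where T is the time of ruin, U_{T^-} the surplus immediately before ruin and
% |U_T| the deficit at ruin. Taking w \equiv 1 gives the Laplace transform of
% the time of ruin, \mathbb{E}\bigl[e^{-\delta T}\mathbf{1}_{\{T<\infty\}} \mid U_0 = u\bigr],
% and setting in addition \delta = 0 gives the ruin probability \psi(u).
```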

Relevance:

10.00%

Publisher:

Abstract:

Nuclear microautophagy is a process discovered in the yeast S. cerevisiae that targets portions of the nucleus for degradation in the vacuolar lumen. This process, called PMN (Piecemeal Microautophagy of the Nucleus), is induced under conditions of cellular stress such as nutrient starvation, but also by the drug rapamycin. PMN results from the direct interaction between Nvj1p, a protein of the outer membrane of the nuclear envelope, and Vac8p, a protein of the vacuolar membrane. The interaction of these two proteins forms the nucleus-vacuole junction. This junction guides the formation of an invagination that engulfs part of the nucleus and draws it into the vacuolar lumen in the form of a sac, followed by the release of a vesicle that is degraded by hydrolases. The molecular mechanisms acting at the different steps of this process are unknown. The aim of my thesis was to identify new players involved in PMN. In the first part of this study, we present a screening procedure designed to find candidates with a role in PMN. This screen was carried out on the Euroscarf commercial mutant collection. The procedure relied on the observation that the nucleolus (marked by Nop1p) is the preferential substrate of PMN in microscopy experiments performed after induction of PMN with rapamycin. We therefore transformed the mutant collection with a plasmid carrying the nucleolar marker Nop1p, and then screened by microscopy for mutants unable to transfer Nop1p from the nucleus to the vacuole. We found 318 genes showing a defect in Nop1p transfer by PMN. These genes were classified into broad functional families and by the severity of their PMN defect. In this part of the study we also describe mutants affecting the process at different steps. In the second part of the study, we examined the involvement and role in PMN of the V-ATPase (a proton pump of the vacuolar membrane), selected from among the candidates. Inhibitors of this complex, such as concanamycin A, block PMN activity and appear to affect the process at two different steps. In addition, the nucleus-vacuole junctions form a diffusion barrier in the vacuolar membrane from which Vph1p, a protein of the V-ATPase, is excluded.

Relevance:

10.00%

Publisher:

Abstract:

During my PhD, my aim was to provide new tools to increase our capacity to analyse gene expression patterns, and to study the evolution of gene expression in animals on a large scale. Gene expression patterns (when and where a gene is expressed) are a key feature in understanding gene function, notably in development. It now appears clear that the evolution of developmental processes and of phenotypes is shaped both by evolution at the coding-sequence level and by evolution at the gene-expression level. Studying gene expression evolution in animals, with complex expression patterns over tissues and developmental time, is still challenging. No tools are available to routinely compare expression patterns between different species, with precision and on a large scale. Studies on gene expression evolution are therefore performed only on small gene datasets, or using imprecise descriptions of expression patterns. The aim of my PhD was thus to develop and use novel bioinformatics resources to study the evolution of gene expression. To this end, I developed the database Bgee (Base for Gene Expression Evolution). The approach of Bgee is to transform heterogeneous expression data (ESTs, microarrays, and in-situ hybridizations) into present/absent calls, and to annotate them to standard representations of the anatomy and development of different species (anatomical ontologies). An extensive mapping between the anatomies of species is then developed based on hypotheses of homology. These precise annotations to anatomies, and this extensive mapping between species, are the major assets of Bgee, and have required the involvement of many co-workers over the years. My main personal contribution is the development and management of both the Bgee database and the web application. Bgee is now in its ninth release, and includes an important gene expression dataset for 5 species (human, mouse, drosophila, zebrafish, Xenopus), with the most data from mouse, human and zebrafish. Using these three species, I conducted an analysis of gene expression evolution after duplication in vertebrates. Gene duplication is thought to be a major source of novelty in evolution, and to participate in speciation. It has been suggested that the evolution of gene expression patterns might contribute to the retention of duplicate genes. I performed a large-scale comparison of the expression patterns of hundreds of duplicated genes to their singleton ortholog in an outgroup, including both small- and large-scale duplicates, in three vertebrate species (human, mouse and zebrafish), using highly accurate descriptions of expression patterns. My results showed unexpectedly high rates of de novo acquisition of expression domains after duplication (neofunctionalization), at least as high as or higher than the rates of partitioning of expression domains (subfunctionalization). I found differences in the evolution of expression of small- and large-scale duplicates, with small-scale duplicates more prone to neofunctionalization. Duplicates with neofunctionalization seemed to evolve under more relaxed selective pressure on the coding sequence. Finally, even with abundant and precise expression data, the majority fate I recovered was neither neo- nor subfunctionalization of expression domains, suggesting a major role for other mechanisms in duplicate gene retention.
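Purely as an illustration of the kind of comparison described above, a toy Python sketch that classifies a duplicate pair against a singleton outgroup ortholog from present/absent anatomical expression domains; the function name and classification rules are simplified assumptions for illustration, not the Bgee pipeline:

```python
# Illustrative sketch only: toy classification of duplicate-gene expression
# fate from present/absent anatomical expression domains (simplified rules,
# hypothetical function name; NOT the actual Bgee analysis).

def classify_fate(dup1: set[str], dup2: set[str], outgroup: set[str]) -> str:
    """Classify a duplicate pair relative to a singleton outgroup ortholog,
    given the sets of anatomical structures where each gene is expressed."""
    gained = (dup1 | dup2) - outgroup          # domains absent from the ancestral proxy
    lost_by_1 = outgroup - dup1                # ancestral domains missing in copy 1
    lost_by_2 = outgroup - dup2                # ancestral domains missing in copy 2

    if gained:
        return "neofunctionalization"          # de novo expression domain acquired
    if lost_by_1 and lost_by_2 and (dup1 | dup2) >= outgroup:
        return "subfunctionalization"          # ancestral domains partitioned
    return "other"                             # e.g. redundancy or loss of expression


# Example: the ancestral domains {brain, liver} are split between the copies,
# and copy 2 also gains expression in "heart".
print(classify_fate({"brain"}, {"liver", "heart"}, {"brain", "liver"}))
# -> "neofunctionalization" (a gained domain takes precedence in this toy rule)
```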

Relevance:

10.00%

Publisher:

Abstract:

The identity [r]evolution is happening. Who are you, who am I in the information society? In recent years, the convergence of several factors - technological, political, economic - has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, to store, to exchange and to process. The belief that more information brings more security has been a strong political driver to promote information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one's behaviour, needs, or preferences. It can lead to categorizations according to specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are not necessarily related to our names anymore. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles... Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or to describe this person as belonging to a community or a category. How far have we moved through this process? The identity [r]evolution is already becoming part of our daily lives. People are eager to share information with their "friends" in social networks like Facebook, in chat rooms, or in Second Life. Customers take advantage of the numerous bonus cards that are made available. Video surveillance is becoming the rule. In several countries, traditional ID documents are being replaced by biometric passports with RFID technology. This raises several privacy issues and might even change the perception of the concept of privacy itself, in particular among the younger generation. In the information society, our (partial) identities become the illusory masks that we choose - or that we are assigned - to interact and communicate with each other. Rights, obligations, responsibilities, even reputation are increasingly associated with these masks. On the one hand, these masks become the key to accessing restricted information and to using services. On the other hand, in case of fraud or negative reputation, the owner of such a mask can be penalized: doors remain closed, access to services is denied. Hence the current worrying growth of impersonation, identity theft and other identity-related crimes. Where is the path of the identity [r]evolution leading us? The booklet gives a glimpse of possible scenarios in the field of identity.

Relevance:

10.00%

Publisher:

Abstract:

The emergence of chirality in enantioselective autocatalysis for compounds unable to transform according to the Frank-like reaction network is discussed with respect to the controversial limited enantioselectivity (LES) model, composed of coupled enantioselective and non-enantioselective autocatalyses. The LES model cannot lead to spontaneous mirror symmetry breaking (SMSB) either in closed systems with a homogeneous temperature distribution or in closed systems with a stationary non-uniform temperature distribution. However, simulations of chemical kinetics in a two-compartment model demonstrate that SMSB may occur if the two autocatalytic reactions are spatially separated at different temperatures in different compartments but coupled under the action of a continuous internal flow. In such conditions, the system can evolve, for certain reaction and system parameters, toward a chiral stationary state; that is, the system is able to reach a bifurcation point leading to SMSB. Numerical simulations using reasonable chemical parameters suggest that an adequate scenario for such SMSB would be that of abyssal hydrothermal vents, by virtue of the typical temperature gradients found there and of inorganic solids mediating chemical reactions in an enzyme-like role. Key Words: Homochirality; Prebiotic chemistry.
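For context, one common formulation of the LES reaction scheme discussed above; this is an assumption about the standard scheme, and the paper's exact rate constants, reversibility conventions and any uncatalyzed production step may differ:

```latex
% One common formulation of the LES (limited enantioselectivity) scheme:
\begin{align*}
  &\text{enantioselective autocatalysis:}
    && \mathrm{A} + \mathrm{R} \rightleftharpoons 2\,\mathrm{R},
       \qquad \mathrm{A} + \mathrm{S} \rightleftharpoons 2\,\mathrm{S},\\
  &\text{non-enantioselective autocatalysis:}
    && \mathrm{A} + \mathrm{R} \rightleftharpoons \mathrm{R} + \mathrm{S},
       \qquad \mathrm{A} + \mathrm{S} \rightleftharpoons \mathrm{S} + \mathrm{R}.
\end{align*}
% In the two-compartment setting, each autocatalytic channel is assigned to a
% compartment held at a different temperature, with a continuous internal flow
% exchanging material between the compartments.
```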

Relevance:

10.00%

Publisher:

Abstract:

After the extensive general exposition that precedes our study, we will not dwell further on describing, as a whole, the materials of each stratum. We have limited ourselves to a thorough review of each and every piece recovered from the excavation; these had previously been inventoried and marked with a number corresponding to the collection of the Museo Arqueológico Provincial de Gerona, to which we now refer. In some cases, each of these numbers covers more than one tool, as it refers to a set of pieces from a given area, so we have proceeded to number, starting from 1 in each case, the tools included under each of the Gerona figures. In our review we have separated out what has been considered Palaeolithic tools. The poverty of these materials has already been made sufficiently clear, both as regards the toolkit and the raw material. Indeed, the total number of tools analysed, almost all belonging to strata III, IV and V, is 143. The method used for describing the lithic toolkit is that of analytic typology (Laplace, 1972). We are fully aware that its application to an industry regarded as Middle Palaeolithic is something new and perhaps risky, but we must say that the mere description of the tools fits perfectly within this new methodology, from the point of view of an objective account of the retouches that shape each piece. Our decision to opt for analytic typology in our study is simply to present the pieces, avoiding as far as possible the ambiguities arising from earlier type-lists.

Relevance:

10.00%

Publisher:

Abstract:

The aim of this article is to present, as far as possible, the results of the revision and new typological study of the lithic materials from the Solutrean levels of the Parpalló cave. It is no easy dilemma for a student of Prehistory, and more specifically of the Palaeolithic, to choose a particular branch of study; in our case we address the problem of the typology of the Palaeolithic lithic industry, which is indeed an extensive subject worked on by many researchers. The two approaches most in vogue among Palaeolithic specialists today are the one advocated by Mme D. Sonneville-Bordes and J. Perrot (D. Sonneville-Bordes and J. Perrot, 1954-55-56) and the one advocated by G. Laplace (Laplace, 1966, 1968, 1972 and 1974a), which we will not discuss here. In our opinion, the search for the greatest possible objectivity should by now lead students of the Palaeolithic, especially the Upper Palaeolithic, to adopt the analytic typological system.

Relevance:

10.00%

Publisher:

Abstract:

Although Knowledge Management (KM) is a common function in organizations, many do not have a clear vision of how to incorporate it and turn it into a competitive advantage. The scarcity of studies showing that KM makes a difference in organizational performance, together with culture, may be the most influential factors in promoting or inhibiting KM practices. Some companies use Information Technology (IT) tools as a competitive factor, confusing them with KM. Others believe that IT alone can serve to manage knowledge, which is a mistake. The reason may lie in the emergence of IT before KM, or in the scarcity of literature addressing the role of IT in KM; hence the lack of a clear distinction between IT and KM aimed at an adequate interaction between the two. The main role of IT is to support KM, extending the reach and accelerating the speed of knowledge transfer: identifying, developing and deploying technologies that support corporate communication and the sharing and management of knowledge assets. IT plays an infrastructure role, while KM involves human and managerial aspects. This article discusses the interaction between IT and KM as instruments of strategic management and organizational performance.

Relevance:

10.00%

Publisher:

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but it did not prove to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase came along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partition was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top method complexity approach was adopted and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Among the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), by contrast, performed well under balanced category conditions.
In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient indoor radon decision-making.
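As a purely illustrative sketch of one of the exploratory methods named above, a distance-weighted K Nearest Neighbors interpolation with cross-validation on synthetic data; the coordinates, values and parameters below are invented, not the thesis's dataset or configuration:

```python
# Illustrative sketch only: distance-weighted KNN as a baseline spatial
# interpolator, with cross-validation used to assess the neighborhood choice.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
xy = rng.uniform(0, 100, size=(500, 2))           # synthetic sampling locations (km)
radon = np.exp(rng.normal(4.0, 0.8, size=500))    # synthetic, lognormal-like Bq/m^3 values

# Nearer neighbors contribute more to the estimate ("distance" weighting).
knn = KNeighborsRegressor(n_neighbors=10, weights="distance")

# Cross-validation is a natural way to compare neighborhood definitions,
# in the spirit of the data splitting emphasized in the abstract.
scores = cross_val_score(knn, xy, radon, cv=5, scoring="neg_mean_absolute_error")
print("CV mean absolute error:", -scores.mean())

knn.fit(xy, radon)
grid = np.array([[25.0, 40.0], [60.0, 75.0]])     # two prediction locations
print("predicted indoor radon:", knn.predict(grid))
```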

Relevance:

10.00%

Publisher:

Abstract:

This article describes a method for determining the polydispersity index Ip2 = Mz/Mw of the molecular weight distribution (MWD) of linear polymeric materials from linear viscoelastic data. The method uses the Mellin transform of the relaxation modulus of a simple molecular rheological model. One of the main features of this technique is that it enables interesting MWD information to be obtained directly from dynamic shear experiments. It is not necessary to compute the relaxation spectrum, so the ill-posed problem is avoided. Furthermore, a particular shape of the continuous MWD does not have to be assumed in order to obtain the polydispersity index. The technique has been developed to deal with entangled linear polymers, whatever the form of the MWD. The rheological information required to obtain the polydispersity index is the storage G′(ω) and loss G″(ω) moduli, extending from the terminal zone to the plateau region. The method shows good agreement between the proposed theoretical approach and the experimental polydispersity indices of several linear polymers over a wide range of average molecular weights and polydispersity indices. It is also applicable to binary blends.
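For reference, the definitions assumed above, in generic notation; the specific molecular model that links the relaxation modulus to the MWD in the paper is not reproduced here:

```latex
% Mellin transform of the relaxation modulus G(t), and the polydispersity
% index Ip2 = Mz/Mw expressed through the moments of the MWD:
\[
  \widehat{G}(s) \;=\; \int_0^{\infty} G(t)\, t^{\,s-1}\, dt ,
  \qquad
  I_{p2} \;=\; \frac{M_z}{M_w}
          \;=\; \frac{\displaystyle\sum_i n_i M_i^{3} \Big/ \sum_i n_i M_i^{2}}
                     {\displaystyle\sum_i n_i M_i^{2} \Big/ \sum_i n_i M_i} ,
\]
% where n_i is the number of chains of molecular weight M_i. The idea of the
% approach, as stated in the abstract, is that suitable Mellin moments of the
% modulus give access to MWD moment ratios without reconstructing the full
% relaxation spectrum.
```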

Relevance:

10.00%

Publisher:

Abstract:

Cell-wall mechanical properties play a key role in the growth and the protection of plants. However, little is known about genuine wall mechanical properties and their growth-related dynamics at subcellular resolution and in living cells. Here, we used atomic force microscopy (AFM) stiffness tomography to explore stiffness distribution in the cell wall of suspension-cultured Arabidopsis thaliana as a model of primary, growing cell wall. For the first time that we know of, this new imaging technique was performed on living single cells of a higher plant, permitting monitoring of the stiffness distribution in cell-wall layers as a function of the depth and its evolution during the different growth phases. The mechanical measurements were correlated with changes in the composition of the cell wall, which were revealed by Fourier-transform infrared (FTIR) spectroscopy. In the beginning and end of cell growth, the average stiffness of the cell wall was low and the wall was mechanically homogenous, whereas in the exponential growth phase, the average wall stiffness increased, with increasing heterogeneity. In this phase, the difference between the superficial and deep wall stiffness was highest. FTIR spectra revealed a relative increase in the polysaccharide/lignin content.

Relevance:

10.00%

Publisher:

Abstract:

An understanding of the details of the interaction mechanisms of bacterial endotoxins (lipopolysaccharide, LPS) with the oxygen transport protein hemoglobin is still lacking, despite its high biological relevance. Here, a biophysical investigation into the endotoxin:hemoglobin interaction is presented which comprises the use of various rough mutant LPS as well as free lipid A; in addition to the complete hemoglobin molecule from fetal sheep extract, the partial structure (the alpha-chain) and the heme-free sample are also studied. The investigations comprise the determination of the gel-to-liquid crystalline phase behaviour of the acyl chains of LPS, the ultrastructure (type of aggregate structure and morphology) of the endotoxins, and the incorporation of the hemoglobins into artificial immune cell membranes and into LPS. Our data suggest a model for the interaction between Hb and LPS in which hemoglobins do not react strongly with the hydrophilic or the hydrophobic moiety of LPS alone, but with the complete endotoxin aggregate. Hb is able to incorporate into LPS with its longitudinal direction parallel to the lipid A double layer. Although this does not lead to a strong disturbance of the LPS acyl chain packing, the change in curvature leads to a slightly conical molecular shape, with a change of the three-dimensional arrangement from unilamellar into cubic LPS aggregates. Our previous results show that cubic LPS structures exhibit strong endotoxic activity. The effect of Hb on the physical state of LPS described here may explain the observation of an increase in LPS-mediated endotoxicity due to the action of Hb.

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVES: The objective of this study was to compare cost data by diagnosis related group (DRG) between Belgium and Switzerland. Our hypotheses were that differences between countries can probably be explained by methodological differences in cost calculations, by differences in medical practices and by differences in cost structures within the two countries. METHODS: The DRG classifications used in the two countries differ (AP-DRGs version 1.7 in Switzerland and APR-DRGs version 15.0 in Belgium). The first step of this study was therefore to transform Belgian summaries into Swiss AP-DRGs. Belgian and Swiss data were calculated with a clinical costing methodology (full costing). Belgian and Swiss costs were converted into US$ PPP (purchasing power parity) in order to neutralize differences in purchasing power between countries. RESULTS: The results of this study showed higher costs in Switzerland despite standardization of cost data according to PPP. The difference is not explained by the case-mix index, because this was similar for inliers between the two countries. The length of stay (LOS) was also quite similar for inliers between the two countries. The case-mix index was, however, higher for high outliers in Belgium, as reflected in a higher LOS for these patients. Higher costs in Switzerland are thus probably explained mainly by the higher number of agency staff per service in this country, or by differences in medical practices. CONCLUSIONS: International comparisons are possible, but only if the case-mix is standardized between countries and comparable accounting methodologies are used. Harmonization of DRG groups, nomenclature and accounting is thus required.
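As a minimal illustration of the PPP standardization step described above, a short Python sketch; the PPP factors below are hypothetical placeholders, not the values used in the study:

```python
# Illustrative sketch only: expressing national cost data in US$ PPP to
# neutralize purchasing-power differences. PPP factors are placeholders.
PPP_TO_USD = {"BE": 0.85, "CH": 1.70}   # hypothetical local-currency-per-US$ PPP factors

def to_usd_ppp(cost_local: float, country: str) -> float:
    """Convert a cost recorded in local currency into US$ PPP."""
    return cost_local / PPP_TO_USD[country]

# Example: the same nominal figure corresponds to different US$ PPP amounts.
print(round(to_usd_ppp(5_000.0, "BE")), "US$ PPP")   # Belgian stay (EUR)
print(round(to_usd_ppp(5_000.0, "CH")), "US$ PPP")   # Swiss stay (CHF)
```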