955 results for HILBERT TRANSFORM
Abstract:
The study of medicine at the Faculty of the University of Lausanne is a six-year curriculum. Since the general curriculum reform of October 1995, the second-year program, devoted to the study of the healthy human being, has been transformed: integrated teaching by system or organ was introduced, replacing teaching by discipline. In parallel, a system of student evaluation of the teaching was introduced. It has been improved over the years, and since the 1998-99 academic year the evaluation has been systematic and regular. Our study presents and compares the results of the evaluations of the teaching and the teachers for nine integrated courses given in the second year over two academic years (1998-99 and 1999-2000). A strong correlation between the results of the two consecutive years was observed, together with a considerable disparity of the ratings within each of the two years. This demonstrates the students' serious engagement in the evaluation process, and reveals the pertinence of their analysis and their good capacity for discernment. The analysis of our results shows that evaluations carried out by students can constitute a reliable source of information and contribute to improving the teaching process.
Abstract:
Iowa's infrastructure is at a crossroads. A stalwart collection of Iowans dared to consider Iowa's future economy, the way ahead for future generations, and what infrastructure will be required – and what will not be required – for Iowa to excel. The findings are full of opportunity and challenge. The Infrastructure Plan for Iowa's Future Economy: A Strategic Direction tells the story and points the way to a strong economy and quality of life for our children and our children's children. This plan is different from most in that the motivation for its development came not from a requirement to comply or achieve a particular milestone, but rather from a recognition that infrastructure, in order to ensure a globally competitive future economy, must transform from that of past generations. It is not news that all infrastructure – from our rich soil to our bridges – is a challenge to maintain. Even before the natural disasters of 2008 and the national economic crisis, Iowa's capacity not only to sustain its infrastructure but to anticipate future needs was being tested. It is imperative that wise investments and planning guide Iowa's infrastructure development. This plan reflects Iowa's collective assessment of its infrastructure – buildings, energy, natural resources, telecommunications, and transportation – as, literally, the interdependent building blocks of our future. Over the months of planning, more than 200 Iowans participated in committees, a task force, or community meetings. The plan is for all of Iowa, as reflected in the private, nonprofit, and public interests involved throughout the process. Iowa's success depends on all of Iowa, across all sectors and interests, engaging in its implementation. The Infrastructure Plan for Iowa's Future Economy: A Strategic Direction sets a clear and bold direction for all stakeholders, making it clear that all have a responsibility and an opportunity to contribute to Iowa's success.
Abstract:
Nanotechnology has been heralded as a "revolution" in science for two reasons: first, because of its revolutionary view of the way in which chemicals and elements, such as gold and silver, behave, compared with the traditional scientific understanding of their properties; second, because the impact of these new discoveries, as applied to commerce, can transform daily life through consumer products ranging from sun tan lotions and cosmetics, food packaging, and paints and coatings for cars, housing, and fabrics, to medicine and thousands of industrial processes.9 Beneficial consumer uses of nanotechnologies, already in the stream of commerce, improve coatings, inks, and paints on everything from food packaging to cars. Additionally, "nanomedicine" offers the promise of diagnosis and treatment at the molecular level, in order to detect and treat presymptomatic disease,10 or to rebuild neurons in Alzheimer's and Parkinson's disease. Severe complications such as stroke or heart attack may be avoided by means of prophylactic treatment of people at risk, and bone regeneration may keep many people active who never expected rehabilitation. Miniaturisation of diagnostic equipment can also reduce the amount of sampling material required for testing and medical surveillance. Miraculous developments that sound like science fiction to those who eagerly anticipate these medical products, combined with the emerging commercial impact of nanotechnology applications in consumer products, will reshape civil society permanently. Thus, everyone within the jurisdiction of the Council of Europe is an end-user of nanotechnology, even without realising that nanotechnology has touched daily life.
Abstract:
In this paper we find the quantities that are adiabatic invariants of any desired order for a general slowly time-dependent Hamiltonian. In a preceding paper, we chose a quantity that was initially an adiabatic invariant to first order, and sought the conditions to be imposed upon the Hamiltonian so that the quantum mechanical adiabatic theorem would be valid to mth order. [We found that this occurs when the first (m - 1) time derivatives of the Hamiltonian at the initial and final time instants are equal to zero.] Here we look for a quantity that is an adiabatic invariant to mth order for any Hamiltonian that changes slowly in time, and that does not fulfill any special condition (its first time derivatives are not zero initially and finally).
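For reference, the endpoint condition recalled in brackets above can be written compactly. This is a sketch in generic notation; the symbols H(t) and the interval endpoints t_i, t_f are assumed names, not necessarily the paper's:

```latex
% Condition under which the quantum adiabatic theorem holds to m-th order,
% as stated above: the first (m-1) time derivatives of the Hamiltonian
% vanish at the initial and final instants.
\left.\frac{d^{k}H}{dt^{k}}\right|_{t=t_i}
  = \left.\frac{d^{k}H}{dt^{k}}\right|_{t=t_f} = 0,
\qquad k = 1,\dots,m-1 .
```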
Abstract:
A model of anisotropic fluid with three perfect fluid components in interaction is studied. Each fluid component obeys the stiff matter equation of state and is irrotational. The interaction is chosen to reproduce an integrable system of equations similar to the one associated with self-dual SU(2) gauge fields. An extension of the Belinsky-Zakharov version of the inverse scattering transform is presented and used to find soliton solutions of the coupled Einstein equations. A particular class of solutions that can be interpreted as lumps of matter propagating in empty space-time is examined.
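For reference, the stiff matter equation of state invoked here sets the pressure equal to the energy density (a standard fact, written in units with c = 1):

```latex
% Stiff (Zel'dovich) equation of state: the speed of sound equals the
% speed of light (units with c = 1).
p = \rho
```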
Abstract:
Increasing attention has been given, over the past decades, to the production of exopolysaccharides (EPS) from rhizobia, due to their various biotechnological applications. Overall characterization of biopolymers involves evaluation of their chemical, physical, and biological properties; this evaluation is a key factor in understanding their behavior in different environments, which enables researchers to foresee their potential applications. Our focus was to study the EPS produced by Mesorhizobium huakuii LMG14107, M. loti LMG6125, M. plurifarium LMG11892, Rhizobium giardini bv. giardini H152T, R. mongolense LMG19141, and Sinorhizobium (= Ensifer) kostiense LMG19227 in RDM medium with glycerol as a carbon source. These biopolymers were isolated and characterized by reversed-phase high-performance liquid chromatography (RP-HPLC), Fourier transform infrared (FTIR), and nuclear magnetic resonance (NMR) spectroscopies. Maximum exopolysaccharide production was 3.10, 2.72, and 2.50 g L⁻¹ for the strains LMG6125, LMG19227, and LMG19141, respectively. The purified EPS revealed prominent functional reactive groups, such as hydroxyl and carboxyl, which correspond to a typical heteropolysaccharide. The EPS are composed primarily of galactose and glucose. Minor components found were rhamnose, glucuronic acid, and galacturonic acid. Indeed, from the results of the techniques applied in this study, it can be noted that the EPS are species-specific heteropolysaccharide polymers composed of common sugars that are substituted by non-carbohydrate moieties. In addition, analysis of these results indicates that rhizobial EPS can be classified into five groups based on ester type, as determined from the ¹³C NMR spectra. Knowledge of the EPS composition now facilitates further investigations relating polysaccharide structure and dynamics to rheological properties.
Abstract:
In this paper we analyze the time of ruin in a risk process with Erlang(n) distributed interclaim times and a constant dividend barrier. We obtain an integro-differential equation for the Laplace transform of the time of ruin. Explicit solutions for the moments of the time of ruin are presented when the individual claim amounts have a distribution with rational Laplace transform. Finally, some numerical results and a comparison with the classical risk model, with interclaim times following an exponential distribution, are given.
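Since the paper derives explicit moments of the time of ruin, direct simulation is a natural cross-check. Below is a minimal Monte Carlo sketch of such a risk process with Erlang(n) interclaim times and a constant dividend barrier; all parameter values and the exponential claim-size distribution are illustrative assumptions, not taken from the paper.

```python
# Monte Carlo sketch: time of ruin in a Sparre Andersen risk model with
# Erlang(n) interclaim times and a constant dividend barrier b.
# All parameters (u, b, c, n, rate, claim_mean) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def ruin_time(u=5.0, b=10.0, c=1.5, n=2, rate=2.0, claim_mean=1.0,
              max_claims=100_000):
    """One simulated path: returns the time of ruin.

    Interclaim times ~ Erlang(n, rate); claims ~ exponential(claim_mean).
    Between claims the surplus grows at premium rate c but is capped at the
    barrier b (the excess premium is paid out as dividends). With a barrier,
    ruin occurs almost surely, so the loop terminates in practice.
    """
    t, surplus = 0.0, u
    for _ in range(max_claims):
        w = rng.gamma(shape=n, scale=1.0 / rate)  # Erlang(n) waiting time
        t += w
        surplus = min(surplus + c * w, b)         # premium income, capped at b
        surplus -= rng.exponential(claim_mean)    # claim arrives
        if surplus < 0:
            return t
    raise RuntimeError("no ruin within max_claims claims")

times = [ruin_time() for _ in range(10_000)]
print("estimated E[time of ruin]:", np.mean(times))
```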
Abstract:
Manuscript decorated with pen drawings and bearing the arms of François de Rochechouart, governor of Genoa under Louis XII. Formerly in the collections of Mc Carthy, Georges Hilbert, Robert Hoe, and Count Paul Durrieu.
Abstract:
This paper introduces a nonlinear measure of dependence between random variables in the context of remote sensing data analysis. The Hilbert-Schmidt Independence Criterion (HSIC) is a kernel method for evaluating statistical dependence. HSIC is based on computing the Hilbert-Schmidt norm of the cross-covariance operator of mapped samples in the corresponding Hilbert spaces. The HSIC empirical estimator is very easy to compute and has good theoretical and practical properties. We exploit the capabilities of HSIC to explain nonlinear dependences in two remote sensing problems: temperature estimation and chlorophyll concentration prediction from spectra. Results show that, when the relationship between random variables is nonlinear or when few data are available, the HSIC criterion outperforms other standard methods, such as the linear correlation or mutual information.
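Since the abstract notes that the empirical HSIC estimator is very easy to compute, a minimal sketch may be useful. The version below uses the standard biased estimator trace(KHLH)/(n-1)² with Gaussian kernels; the fixed bandwidths and the toy data are assumptions for illustration (the paper's experiments use remote sensing variables).

```python
# Minimal sketch of the empirical Hilbert-Schmidt Independence Criterion:
# biased estimator HSIC = trace(K H L H) / (n - 1)^2 with Gaussian kernels.
import numpy as np

def _gaussian_gram(x, sigma):
    """Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def hsic(x, y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC between samples x (n, dx) and y (n, dy)."""
    n = x.shape[0]
    K = _gaussian_gram(x, sigma_x)
    L = _gaussian_gram(y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 1))
y_dep = x ** 2 + 0.1 * rng.normal(size=(500, 1))  # nonlinear dependence
y_ind = rng.normal(size=(500, 1))                 # independent noise
print("HSIC(x, x^2 + noise):", hsic(x, y_dep))    # noticeably larger
print("HSIC(x, independent):", hsic(x, y_ind))    # close to zero
```

Note that the dependence in `y_dep` is purely nonlinear (the linear correlation with `x` is near zero), which is exactly the regime where the abstract reports HSIC outperforming linear correlation.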
Abstract:
Microautophagy of the nucleus is a process discovered in the yeast S. cerevisiae in which portions of the nucleus are degraded in the vacuolar lumen. This process, called PMN (Piecemeal Microautophagy of the Nucleus), is induced under conditions of cellular stress such as nutrient starvation, but also by a drug, rapamycin. PMN results from the direct interaction between Nvj1p, a protein of the outer membrane of the nuclear envelope, and Vac8p, a protein of the vacuolar membrane. The interaction of these two proteins forms the nucleus-vacuole junction. This junction guides the formation of an invagination that engulfs part of the nucleus and stretches it toward the vacuolar lumen in the form of a sac. A vesicle is then released and degraded by the hydrolases. The molecular mechanisms acting at the different steps of this process are unknown. The aim of my thesis was to identify new players involved in PMN. In the first part of this study, we present a screening procedure for candidates playing a role in PMN. The screen was performed on the mutant collection marketed by Euroscarf. The procedure relied on the observation that the nucleolus (marked by Nop1p) is the preferential substrate of PMN in microscopy experiments performed after induction of PMN with rapamycin. We therefore transformed the mutant collection with a plasmid carrying the nucleolar marker Nop1p, and then screened by microscopy for mutants unable to transfer Nop1p from the nucleus to the vacuole. We found 318 genes whose mutation causes a defect in the PMN-mediated transfer of Nop1p. These genes were classified into broad functional families and by the severity of their PMN defect. In this part of the study we also describe mutants that affect the process at different steps. In the second part of the study, we examined the involvement and role in PMN of the V-ATPase (a proton pump of the vacuolar membrane), selected from among the candidates. Inhibitors of this complex, such as concanamycin A, block PMN activity and appear to affect the process at two different steps. Furthermore, the nucleus-vacuole junctions form a diffusion barrier in the vacuolar membrane, from which Vph1p, a protein of the V-ATPase, is excluded.
Abstract:
During my PhD, my aim was to provide new tools to increase our capacity to analyse gene expression patterns, and to study on a large-scale basis the evolution of gene expression in animals. Gene expression patterns (when and where a gene is expressed) are a key feature in understanding gene function, notably in development. It appears clear now that the evolution of developmental processes and of phenotypes is shaped both by evolution at the coding sequence level and at the gene expression level. Studying gene expression evolution in animals, with complex expression patterns over tissues and developmental time, is still challenging: no tools are available to routinely compare expression patterns between different species, with precision, and on a large-scale basis. Studies on gene expression evolution are therefore performed only on small gene datasets, or using imprecise descriptions of expression patterns. The aim of my PhD was thus to develop and use novel bioinformatics resources to study the evolution of gene expression. To this end, I developed the database Bgee (Base for Gene Expression Evolution). The approach of Bgee is to transform heterogeneous expression data (ESTs, microarrays, and in-situ hybridizations) into present/absent calls, and to annotate them to standard representations of the anatomy and development of different species (anatomical ontologies). An extensive mapping between the anatomies of species is then developed based on hypotheses of homology. These precise annotations to anatomies, and this extensive mapping between species, are the major assets of Bgee, and have required the involvement of many co-workers over the years. My main personal contribution is the development and management of both the Bgee database and the web application. Bgee is now in its ninth release and includes an important gene expression dataset for five species (human, mouse, Drosophila, zebrafish, Xenopus), with the most data from mouse, human, and zebrafish. Using these three species, I have conducted an analysis of gene expression evolution after duplication in vertebrates. Gene duplication is thought to be a major source of novelty in evolution, and to participate in speciation. It has been suggested that the evolution of gene expression patterns might participate in the retention of duplicate genes. I performed a large-scale comparison of the expression patterns of hundreds of duplicated genes to those of their singleton ortholog in an outgroup, including both small- and large-scale duplicates, in three vertebrate species (human, mouse, and zebrafish), using highly accurate descriptions of expression patterns. My results showed unexpectedly high rates of de novo acquisition of expression domains after duplication (neofunctionalization), at least as high as or higher than the rates of partitioning of expression domains (subfunctionalization). I found differences in the evolution of expression of small- and large-scale duplicates, with small-scale duplicates more prone to neofunctionalization. Duplicates with neofunctionalization seemed to evolve under more relaxed selective pressure on the coding sequence. Finally, even with abundant and precise expression data, the majority fate I recovered was neither neo- nor subfunctionalization of expression domains, suggesting a major role for other mechanisms in duplicate gene retention.
Abstract:
The identity [r]evolution is happening. Who are you, who am I in the information society? In recent years, the convergence of several factors - technological, political, economic - has accelerated a fundamental change in our networked world. On a technological level, information becomes easier to gather, to store, to exchange and to process. The belief that more information brings more security has been a strong political driver to promote information gathering since September 11. Profiling intends to transform information into knowledge in order to anticipate one's behaviour, or needs, or preferences. It can lead to categorizations according to some specific risk criteria, for example, or to direct and personalized marketing. As a consequence, new forms of identities appear. They are not necessarily related to our names anymore. They are based on information, on traces that we leave when we act or interact, when we go somewhere or just stay in one place, or even sometimes when we make a choice. They are related to the SIM cards of our mobile phones, to our credit card numbers, to the pseudonyms that we use on the Internet, to our email addresses, to the IP addresses of our computers, to our profiles... Like traditional identities, these new forms of identities can allow us to distinguish an individual within a group of people, or describe this person as belonging to a community or a category. How far have we moved through this process? The identity [r]evolution is already becoming part of our daily lives. People are eager to share information with their "friends" in social networks like Facebook, in chat rooms, or in Second Life. Customers take advantage of the numerous bonus cards that are made available. Video surveillance is becoming the rule. In several countries, traditional ID documents are being replaced by biometric passports with RFID technologies. This raises several privacy issues and might actually even result in changing the perception of the concept of privacy itself, in particular by the younger generation. In the information society, our (partial) identities become the illusory masks that we choose - or that we are assigned - to interact and communicate with each other. Rights, obligations, responsibilities, even reputation are increasingly associated with these masks. On the one hand, these masks become the key to access restricted information and to use services. On the other hand, in case of fraud or negative reputation, the owner of such a mask can be penalized: doors remain closed, access to services is denied. Hence the current preoccupying growth of impersonation, identity theft, and other identity-related crimes. Where is the path of the identity [r]evolution leading us? The booklet gives a glimpse of possible scenarios in the field of identity.
Abstract:
The emergence of chirality in enantioselective autocatalysis for compounds unable to transform according to the Frank-like reaction network is discussed with respect to the controversial limited enantioselectivity (LES) model composed of coupled enantioselective and non-enantioselective autocatalyses. The LES model cannot lead to spontaneous mirror symmetry breaking (SMSB) either in closed systems with a homogeneous temperature distribution or in closed systems with a stationary non-uniform temperature distribution. However, simulations of chemical kinetics in a two-compartment model demonstrate that SMSB may occur if both autocatalytic reactions are spatially separated at different temperatures in different compartments but coupled under the action of a continuous internal flow. In such conditions, the system can evolve, for certain reaction and system parameters, toward a chiral stationary state; that is, the system is able to reach a bifurcation point leading to SMSB. Numerical simulations in which reasonable chemical parameters have been used suggest that an adequate scenario for such SMSB would be that of abyssal hydrothermal vents, by virtue of the typical temperature gradients found there and the role of inorganic solids mediating chemical reactions in an enzyme-like role. Key Words: Homochirality; Prebiotic chemistry.
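To make the two-compartment idea concrete, here is a minimal kinetic integration sketch: enantioselective autocatalysis in one compartment, non-enantioselective cross-catalysis in the other, coupled by a flow term. The rate scheme, the fixed substrate concentration, and all parameter values are illustrative assumptions, not the authors' model; whether the enantiomeric excess actually amplifies depends on the full reversible scheme and the parameter regime explored in the paper.

```python
# Sketch of a two-compartment kinetic simulation in the spirit of the LES
# model: compartment 1 hosts enantioselective autocatalysis (A + R -> 2R,
# A + S -> 2S, rate ke); compartment 2 hosts non-enantioselective
# cross-catalysis (A + R -> R + S, A + S -> S + R, rate kn); a continuous
# internal flow exchanges material between them. All rate constants and the
# fixed achiral substrate concentration `a` are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

ke, kn, kd, flow, a = 1.0, 0.5, 0.9, 0.1, 1.0  # kd: decay back to substrate

def rhs(t, y):
    r1, s1, r2, s2 = y
    dr1 = ke * a * r1 - kd * r1 + flow * (r2 - r1)
    ds1 = ke * a * s1 - kd * s1 + flow * (s2 - s1)
    dr2 = kn * a * s2 - kd * r2 + flow * (r1 - r2)  # R produced by S-catalysis
    ds2 = kn * a * r2 - kd * s2 + flow * (s1 - s2)  # S produced by R-catalysis
    return [dr1, ds1, dr2, ds2]

# Tiny initial imbalance standing in for statistical fluctuations.
y0 = [1.0 + 1e-6, 1.0, 1.0, 1.0]
sol = solve_ivp(rhs, (0.0, 100.0), y0, rtol=1e-10, atol=1e-12)

r = sol.y[0, -1] + sol.y[2, -1]
s = sol.y[1, -1] + sol.y[3, -1]
print("final enantiomeric excess:", (r - s) / (r + s))
```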
Abstract:
Although Knowledge Management (KM) is a common function in organizations, many lack a clear view of how to incorporate it and turn it into competitive advantage. The scarcity of studies proving that KM makes a difference in organizational performance, together with culture, are perhaps the most influential factors promoting or inhibiting KM practices. Some companies use Information Technology (IT) tools as a competitiveness factor, confusing them with KM. Others believe that IT alone can serve to manage knowledge, which is a mistake. The reason for this may lie in IT having emerged before KM, or in the scarcity of literature addressing the role of IT in KM. Hence the lack of a clear distinction between IT and KM that would allow an adequate interaction between the two. The main role of IT is to support KM, extending the reach and accelerating the speed of knowledge transfer. It is to identify, develop, and deploy technologies that support corporate communication and the sharing and management of knowledge assets. IT plays an infrastructure role; KM involves human and managerial aspects. This article discusses the interaction between IT and KM as instruments of strategic management and organizational performance.
Abstract:
The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, as it is closely associated with indoor radon. This association was indeed observed for the Swiss data, but was not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering, and moving windows methods. The use of the Quantité Morisita Index (QMI) as a procedure to evaluate data clustering as a function of the radon level was proposed. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography, and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top approach in method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for the hardening of the data classification. Within the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation. Support vector machines (SVM), by contrast, performed well under balanced category conditions.
In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making about indoor radon.
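As an illustration of the simplest of the exploratory tools named above, here is a minimal sketch of distance-weighted KNN regression for spatial interpolation of indoor radon values; the synthetic coordinates, the lognormal-like values, and the choice k = 10 are assumptions for illustration only (the thesis tunes such parameters per scale and region).

```python
# Minimal sketch: K Nearest Neighbors (KNN) regression for spatial
# interpolation of indoor radon levels, on synthetic data.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(500, 2))      # measurement locations (km)
radon = np.exp(rng.normal(4.0, 0.8, size=500))   # lognormal-like values (Bq/m^3)

# Distance-weighted KNN: nearer measurements influence the estimate more.
knn = KNeighborsRegressor(n_neighbors=10, weights="distance")
knn.fit(coords, radon)

# Estimate radon on a regular grid of prediction nodes.
grid = np.array([[x, y] for x in range(0, 101, 10) for y in range(0, 101, 10)])
estimates = knn.predict(grid)
print("estimated radon at first grid nodes:", estimates[:5].round(1))
```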