963 results for COMPLEX STRUCTURE
Abstract:
The Austroalpine nappe systems in SE Switzerland and N Italy preserve remnants of the Adriatic rifted margin. Based on new maps and cross-sections, we suggest that the complex structure of the Campo, Grosina/Languard, and Bernina nappes is inherited largely from Jurassic rifting. We propose a new classification of the Austroalpine domain into Upper, Middle and Lower Austroalpine nappes, based primarily on the rift-related Jurassic structure and paleogeography of these nappes. Based on the Alpine structures and the pre-Alpine, rift-related geometry of the Lower (Bernina) and Middle (Campo, Grosina/Languard) Austroalpine nappes, we restore these nappes to their original positions along the former margin, as a means of understanding the formation and emplacement of the nappes during initial reactivation of the Alpine Tethyan margin. The Campo and Grosina/Languard nappes can be interpreted as remnants of a former necking zone that comprised pre-rift upper and middle crust. These nappes were juxtaposed with the Mesozoic cover of the Bernina nappe during Jurassic rifting. We find evidence for low-angle detachment faults and extensional allochthons in the Bernina nappe similar to those previously described in the Err nappe and explain their role during subsequent reactivation. Our observations reveal a strong control of rift-related structures on the subsequent Alpine reactivation at all scales of the former distal margin. Two zones of intense deformation, referred to as the Albula-Zebru and Lunghin-Mortirolo movement zones, were reactivated during Alpine deformation and cannot be described as simple monophase faults or shear zones. We propose a tectonic model for the Austroalpine nappe systems that links inherited, rift-related structures with present-day Alpine structures.
In conclusion, we believe that, apart from the direct regional implications, the results of this paper are of general interest for understanding the control of rift structures during reactivation of distal rifted margins.
Abstract:
Recent experiments on H2 adsorption on Pd(111) [T. Mitsui et al., Nature (London) 422, 705 (2003)] have questioned the classical Langmuir picture of second-order adsorption kinetics at high surface coverage, which requires pairs of empty sites for dissociative chemisorption. The experiments find that at least three empty sites are needed. Through density functional theory, we find that H2 dissociation is favored on ensembles of sites that involve a Pd atom with no direct interaction with adsorbed hydrogen. Such active sites are formed by aggregation of at least three H-free sites, revealing the complex structure of the "active sites."
Abstract:
We investigated the effect of benthic substratum type (sand and rocks) and nutrient supply (N and P) on biofilm structure and heterotrophic metabolism in a field experiment in a forested Mediterranean stream (Fuirosos). Rock and sand colonization and biofilm formation were studied intensively for 44 d at two stream reaches: control and experimental (continuous addition of phosphate, ammonia, and nitrate). Structural (C, N, and polysaccharide content and bacterial and chlorophyll density) and metabolic biofilm parameters (β-glucosidase, peptidase, and phosphatase enzyme activities) were analyzed throughout the colonization process. The epilithic biofilm (grown on rocks) had a higher peptidase activity at the impacted reach, together with a higher algal and bacterial biomass. The positive relationship between the peptidase activity per cell and the N content of the epilithic biofilm suggested that heterotrophic utilization of proteinaceous compounds from within the biofilm was occurring. In contrast, nutrient addition caused the epipsammic biofilm (grown on sand) to exhibit lower β-glucosidase and phosphatase activities, without a significant increase in bacterial and algal biomass. The differential response to nutrient addition was related to different structural characteristics within each biofilm. The epipsammic biofilm had a constant and high C:N ratio (22.7) throughout the colonization. The epilithic biofilm had a higher C:N ratio at the beginning of the colonization (43.2) and evolved toward a more complex structure (high polysaccharide content and low C:N ratio) during later stages. The epipsammic biofilm was a site for the accumulation and degradation of organic matter: polysaccharides and organic phosphorus compounds had higher degradation activities.
Abstract:
During a possible loss-of-coolant accident in a BWR, a large amount of steam would be released from the reactor pressure vessel to the suppression pool. The steam condenses in the suppression pool, causing dynamic and structural loads on the pool. The formation and break-up of bubbles can be measured by visual observation using a suitable pattern recognition algorithm. The aim of this study was to improve the preliminary pattern recognition algorithm, developed by Vesa Tanskanen in his doctoral dissertation, using MATLAB. Video material from the PPOOLEX test facility, recorded during thermal stratification and mixing experiments, was used as a reference in the development of the algorithm. The developed algorithm consists of two parts: the pattern recognition of the bubbles and the analysis of the recognized bubble images. The bubble recognition works well, but some errors appear due to the complex structure of the pool. The results of the image analysis were reasonable. The volume and the surface area of the bubbles were not evaluated. Chugging frequencies calculated using the FFT fitted well to the oscillation frequencies measured in the experiments. The pattern recognition algorithm works in the conditions it was designed for; if the measurement configuration is changed, some modifications will be needed. Numerous improvements are proposed for future 3D equipment.
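The abstract does not give implementation details of the frequency analysis (the thesis itself used MATLAB); as an illustration only, the core FFT step for extracting a dominant chugging frequency from a bubble-interface time series could be sketched in Python like this. The function name and sampling parameters are assumptions, not from the thesis:

```python
import numpy as np

def dominant_frequency(signal, sample_rate_hz):
    """Estimate the dominant oscillation frequency of a time series
    from the peak of its real-input FFT magnitude spectrum."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                 # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))          # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum)]

# Synthetic check: a 7 Hz oscillation sampled at 100 Hz for 10 s
t = np.arange(0, 10, 0.01)
series = np.sin(2 * np.pi * 7 * t)
print(dominant_frequency(series, 100.0))  # -> 7.0
```

In a chugging analysis the input series would instead be, for example, the recognized bubble area per video frame, with the frame rate as the sample rate.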
Abstract:
The focus of the work reported in this thesis was to study and to clarify the effect of polyelectrolyte multilayer surface treatment on inkjet ink spreading, absorption and print quality. Surface sizing with a size press, film press with a pilot-scale coater, and spray coating were used to surface-treat uncoated wood-free, experimental wood-free and pigment-coated substrates. The role of the deposited cationic (poly(diallyldimethylammonium chloride), PDADMAC) and anionic (sodium carboxymethyl cellulose, NaCMC) polyelectrolyte layers, with and without nanosilica, in liquid absorption and spreading was studied in terms of their interaction with water-based pigmented and dye-based inkjet inks. Contact angle measurements were made in an attempt to explain the ink spreading and wetting behavior on the substrate. First, it was noticed that multilayer surface treatment decreased the contact angle of water, giving a hydrophilic character to the surface. The results showed that the number of cationic-anionic polyelectrolyte layers and the order of deposition of the polyelectrolytes had a significant effect on the print quality. This was seen, for example, as a higher print density on coatings with a cationic polyelectrolyte in the outermost layer. The number of layers had an influence on the print quality; the print density increased with increasing number of layers, although the increase was strongly dependent on ink formulation and chemistry. The use of nanosilica clearly affected the rate of absorption of polar liquids, which was also seen as a higher density of the black dye-based print. Slightly unexpectedly, the use of nanosilica increased the tendency for lateral spreading of both the pigmented and dye-based inks. It was shown that the wetting behavior and wicking of the inks on the polyelectrolyte coatings were strongly affected by the hydrophobicity of the substrate, as well as by the composition and structure of the polyelectrolyte layers.
Coating only with a cationic polyelectrolyte was not sufficient to improve dye fixation, but it was demonstrated that a cationic-anionic complex structure led to good water fastness. A three-layered structure gave the same water fastness values as a five-layered structure. Interestingly, the water fastness values were strongly dependent not only on the formed cation-anion polyelectrolyte complexes but also on the tendency of the coating to dissolve during immersion in water. The results showed that by optimizing the chemistry of the layers, the ink-substrate interaction can be optimized.
Abstract:
The main purpose of this paper is to enrich the current theoretical scenario of the "Mental Health and Work" field, drawing on Henri Bergson's philosophy and his concepts of perception, cognition, duration, psychic life, time and subjectivity. This theoretical-philosophical article aims to shed new light on the relations between philosophy of mind and present-day efforts toward a scientific theory of cognition, with its complex structure of theories, hypotheses and disciplines. The paper offers a new approach to understanding the contemporary cognitive sciences through a kind of phenomenological investigation initiated by Husserl's phenomenology. The methods employed were the systematic review and adaptation of Bergson's concepts, and their naturalization within the current epistemological and ontological principles of the cognitive sciences, for a phenomenological analysis of the links between work and mental health. Current contributions of Husserl's phenomenology were used to understand the relations between mental health and work. The paper also refers to philosophy as applied in the contemporary cognitive sciences, based on Bergson's theoretical-philosophical proposal.
Abstract:
The blow-up is a transformation that plays an important role in geometry, since it makes it possible to resolve singularities, to relate birationally equivalent varieties, and to construct varieties with novel properties. This thesis first presents the blow-up as developed in classical algebraic geometry. We study it for affine and (quasi-)projective varieties, at a point, and along an ideal and a subvariety. We continue by studying the extension of this construction to the differentiable category, over the real and complex fields, at a point and along a submanifold. We conclude this section by working through an example of resolution of a singularity. We then pass to the symplectic category, where we do the same as in the complex differentiable case, paying particular attention to the symplectic form defined on the manifold. We finish by studying a theorem due to François Lalonde, in whose proof the blow-up plays a key role. This theorem states that any 4-manifold fibered by 2-spheres over a Riemann surface, and different from the Cartesian product of two 2-spheres, can be equipped with a 2-form giving it a symplectic structure ruled by curves that are holomorphic with respect to its almost-complex structure, and such that the symplectic area of the base is smaller than the capacity of the manifold. The proof relies on the symplectic blow-up. Indeed, by symplectically blowing up a ball contained in the 4-manifold, one obtains a fibration containing two distinct spheres of self-intersection -1: the preimage of the point where the usual complex blow-up is performed, and the proper transform of the fiber. These spheres are called exceptional, so it is possible to perform the inverse of the blow-up, the blow-down, on each of them.
By carrying out the blow-down on the second sphere, we obtain a minimal manifold, and by combining the information on the symplectic areas of its homology classes with that of the original manifold, we obtain the result.
Abstract:
We review the prerequisites from differential geometry needed for a first approach to the theory of geometric quantization, namely basic notions of symplectic geometry, Lie groups and Lie algebras, Lie group actions, principal G-bundles, connections, associated bundles and almost-complex structures. This leads to a more detailed study of Hermitian line bundles, including a condition for the existence of a prequantum bundle on a symplectic manifold. With these tools in hand, we then begin the study of geometric quantization, step by step. We introduce the theory of prequantization, i.e. the construction of operators associated with classical observables and the construction of a Hilbert space. Major problems surface in the concrete application of prequantization: the operators are not those expected from first quantization, and the Hilbert space obtained is too big. A first correction, polarization, eliminates some of these problems, but greatly restricts the set of classical observables that can be quantized. This thesis is not a complete survey of geometric quantization, nor is that its aim. It covers neither the metaplectic correction nor the BKS kernel. It is a companion text for those being introduced to geometric quantization. On the one hand, it introduces concepts of differential geometry taken for granted in (Woodhouse [21]) and (Sniatycki [18]), i.e. principal G-bundles and associated bundles. Finally, it adds details to some brief proofs given in these two references.
Abstract:
This thesis is concerned with the interaction between literature and abstract thought. More specifically, it studies the epistemological charge of the literary, the type of knowledge that is carried by elements proper to fictional narratives into different disciplines. By concentrating on two different theoretical methods, the creation of thought experiments and the framing of possible worlds, methods which were elaborated and are still used today in spheres as varied as modal logics, analytic philosophy and physics, and by following their reinsertion within literary theory, the research develops the theory that both thought experiments and possible worlds are in fact short narrative stories that inform knowledge through literary means. By using two novels, Abbott’s Flatland and Vonnegut’s The Sirens of Titan, that describe extra-dimensional existence in radically different ways, respectively as a phenomenologically unknowable space and as an outward perspective on time, it becomes clear that literature is constitutive of the way in which worlds, fictive, real or otherwise, are constructed and understood. Thus dimensions, established through extensional analogies as either experimental knowledge or modal possibility for a given world, generate new directions for thought, which can then take part in the inductive/deductive process of scientia. By contrasting the dimensions of narrative with the way that dimensions were historically constituted, the research also establishes that the literary opens up an infinite potential of abstract space-time domains, defined by their specific rules and limits, and that these different experimental folds are themselves partaking in a dimensional process responsible for new forms of understanding. 
Over against science fiction literary theories of speculation that posit an equation between the fictive and the real, this thesis examines the complex structure of many overlapping possibilities that can organise themselves around larger compossible wholes, thus offering a theory of reading that is both non-mimetic and non-causal. It consequently examines a dynamic process whereby literature is always reconceived through possibilities actualised by reading while never defining how the reader will ultimately understand the overarching structure. In this context, the thesis argues that a causal story can be construed out of any one interaction with a given narrative—underscoring, for example, the divinatory strength of a particular vision of the future—even as this narrative represents only a fraction of the potential knowledge of any particular literary text. Ultimately, the study concludes by tracing out how novel comprehensions of the literary, framed by the material conditions of their own space and time, endlessly renew themselves through multiple interactions, generating analogies and speculations that facilitate the creation of new knowledge.
Abstract:
Chromatin possesses a complex plasticity that is essential for responding to fundamental cellular mechanisms such as DNA replication, transcription and repair. Histones are the essential constituents of nucleosome formation, which ensures proper cellular function, hence the particular attention this thesis devotes to them. Chromatin dysfunction is often associated with the emergence of cancer. Chapter II of this thesis focuses on the transcriptional repression of histone genes by the HIR complex (HIstone gene Repressor) in response to DNA damage in Saccharomyces cerevisiae. Upon DNA damage in early S phase, the checkpoint kinases Mec1, Tel1 and Rad53 block late replication origins to limit the number of potentially mutagenic or cytotoxic collisions between DNA polymerases and persistent DNA lesions. When total DNA synthesis is suddenly slowed by the checkpoint, the accumulation of an excess of newly synthesized histones is harmful to cells, because free histones bind non-specifically to nucleic acids. One of the mechanisms in place to minimize the amount of free histones is to repress histone gene transcription upon a rapid drop in DNA synthesis, but the molecular basis of this mechanism was very poorly understood. Our study of histone gene repression in response to genotoxic agents allowed us to establish that the checkpoint kinases play a role in histone gene repression. Before the start of my project, it was already known that the HIR complex is required for histone gene repression in G1 and G2/M phases and upon DNA damage in S phase. However, how the HIR complex is regulated in response to DNA damage was not known.
We demonstrated by mass spectrometry (MS) assays that Rad53 regulates the HIR complex by directly phosphorylating one of its subunits, Hpc2, at multiple residues in vivo and in vitro. Phosphorylation of Hpc2 is essential for the recruitment to histone gene promoters of the RSC complex (Remodels the Structure of Chromatin), whose presence at histone gene promoters correlates with their repression. In addition, we uncovered a new mechanism of HIR complex regulation during normal progression through the cell cycle as well as in response to genotoxic agents. Indeed, during the normal cell cycle, the Hpc2 protein is highly unstable during the G1/S transition, allowing histone gene transcription and the production of a pool of newly synthesized histones just before the initiation of DNA replication. However, Hpc2 is unstable for only a brief period during S phase. These results suggest that Hpc2 is a key protein for regulating HIR complex activity and histone gene repression during the normal cell cycle and in response to DNA damage. Pursuing our study of histone regulation, Chapter III of my thesis concerns the global analysis of histone acetylation induced by histone deacetylase inhibitors (HDACi) in normal and cancer cells. Histone deacetylases (HDACs) are the enzymes that remove acetylation from histone lysines. In several cancer types, HDACs contribute to oncogenesis through their aberrant fusion with oncogenic protein complexes. The resulting perturbations often lead to an abnormal silencing of tumor suppressors. HDACs are therefore a prime target in the treatment of cancers driven by these fusion proteins.
Our study of the effect on histone acetylation of two clinically relevant HDAC inhibitors, vorinostat (SAHA) and entinostat (MS-275), demonstrated a marked increase in the global acetylation of histones H3 and H4, in contrast to H2A and H2B, in both normal and cancer cells. Our MS quantification of histone acetylation unexpectedly revealed that the acetylation stoichiometry at lysine 56 of histone H3 (H3K56Ac) is only 0.03% and, surprisingly, this stoichiometry does not increase in cells treated with different HDACi. Several studies of H3K56Ac in humans reported in the literature have yielded irreconcilable results. Moreover, H3K56Ac was considered a potential biomarker for the diagnosis and prognosis of several cancer types. We therefore turned our attention to the specificity of the antibodies used and determined that a large majority of the antibodies used in the literature recognize other acetylation sites on histone H3, notably H3K9Ac, whose in vivo acetylation stoichiometry is much higher than that of H3K56Ac. Chapter IV then follows up on our study of histone acetylation and consists of a special research report describing the function of H3K56Ac in yeast and humans, together with an evaluation of a supposedly H3K56Ac-specific antibody as a cancer diagnostic tool in humans.
Abstract:
The brain, with its highly complex structure made up of simple units, interconnected information pathways and specialized functions, has always been an object of mystery and scientific fascination for physiologists and neuroscientists, and lately for mathematicians and physicists. Biophysicists are engaged in building the bridge between the biological and physical sciences, guided by a conviction that natural scenarios that appear extraordinarily complex may be tackled by applying principles from the realm of the physical sciences. In a similar vein, this report aims to describe how nerve cells transmit signals, how these signals are put together, and how out of this integration higher functions emerge and are reflected in the electrical signals produced in the brain. Viewing the EEG signal through the looking glass of nonlinear theory, the dynamics of the underlying complex system, the brain, are inferred, and significant implications of the findings are explored.
Abstract:
A study of the magneto-optical (MO) spectral response of Co nanoparticles embedded in MgO as a function of their size and concentration in the spectral range from 1.4 to 4.3 eV is presented. The nanoparticle layers were obtained by sputtering at different deposition temperatures. Transmission electron microscopy measurements show that the nanoparticles have a complex structure which consists of a crystalline core having a hexagonal close-packed structure and an amorphous crust. Using an effective-medium approximation we have obtained the MO constants of the Co nanoparticles. These MO constants are different from those of continuous Co layers and depend on the size of the crystalline core. We associate these changes with the size effect of the intraband contribution to the MO constants, related to a reduction of the relaxation time of the electrons into the nanoparticles.
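The abstract does not specify which effective-medium scheme was used; for dilute metal particles in a dielectric host, the Maxwell Garnett relation is the standard starting point, and it illustrates the kind of inversion involved. Here \(\varepsilon_p\), \(\varepsilon_m\) and \(f\) (particle permittivity, matrix permittivity, particle volume fraction) are generic symbols, not values from the paper:

\[
\frac{\varepsilon_{\mathrm{eff}} - \varepsilon_m}{\varepsilon_{\mathrm{eff}} + 2\varepsilon_m}
= f\,\frac{\varepsilon_p - \varepsilon_m}{\varepsilon_p + 2\varepsilon_m}
\]

For magneto-optical measurements the permittivities become tensors with off-diagonal components; solving such a relation for the particle response, given the measured effective response of the composite layer, is how nanoparticle MO constants are typically extracted.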
Abstract:
Science is the search for the laws underlying natural phenomena. Engineering constructs nature as we wish. Interestingly, huge engineering infrastructures like the World Wide Web have grown into such complex structures that we need to look at the fundamental science behind the structure and behaviour of these networks. This talk covers the science behind complex networks such as the web and biological and social networks. The talk aims to discuss the basic theories that govern the statics as well as the dynamics of such interesting networks.
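The abstract does not name specific models, but a canonical theory behind such networks is preferential attachment, which reproduces the heavy-tailed degree distributions observed in the web and in social networks. A minimal sketch in the style of the Barabási-Albert model (function name and parameters are illustrative assumptions, not from the talk):

```python
import random
from collections import Counter

def preferential_attachment(n, m, seed=42):
    """Grow a graph of n nodes: each new node attaches up to m edges,
    choosing targets with probability proportional to their degree
    (Barabasi-Albert style sketch)."""
    random.seed(seed)
    edges = []
    repeated = []                      # node ids, one copy per unit of degree
    targets = list(range(m))           # the first new node links to the m seed nodes
    for new_node in range(m, n):
        for t in set(targets):         # duplicate targets collapse to one edge
            edges.append((new_node, t))
        repeated.extend(targets)       # targets gained degree
        repeated.extend([new_node] * m)
        targets = [random.choice(repeated) for _ in range(m)]
    return edges

edges = preferential_attachment(200, 2)
degree = Counter()
for u, v in edges:
    degree[u] += 1
    degree[v] += 1
# a few early nodes become hubs with far more links than a typical node
```

The "rich get richer" choice of targets is what separates this from a random graph and produces the hub-dominated structure common to web, biological and social networks.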
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use change has a long tradition. In particular on the regional scale, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grids, grid cells, attributes, etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended with language features specific to land-use modeling, these data structures can be accessed and manipulated by modeling applications. The scripting-language interpreter is embedded in SITE.
The integration of submodels can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model testing and analysis of simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. In parallel, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function typically is a map comparison algorithm capable of comparing a simulation result to a reference map.
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as objective function. The calibration period ranged from 1981 to 2002, and respective reference land-use maps were compiled for this period. It could be shown that an efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
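The abstract does not spell out the figure-of-merit measure; in the land-use-change literature it is usually defined as hits divided by the sum of misses, hits, wrong-class hits and false alarms, evaluated over cells that changed between the initial and final maps. A hedged sketch of that objective function (the flat cell lists, class encoding and function name are assumptions for illustration):

```python
def figure_of_merit(initial, reference, simulated):
    """Figure of merit for land-use map comparison (sketch):
    hits / (misses + hits + wrong-class hits + false alarms),
    where 'change' means a cell differs from the initial map."""
    misses = hits = wrong = false_alarms = 0
    for ini, ref, sim in zip(initial, reference, simulated):
        ref_changed = ref != ini
        sim_changed = sim != ini
        if ref_changed and not sim_changed:
            misses += 1                   # observed change missed by the model
        elif ref_changed and sim_changed:
            if ref == sim:
                hits += 1                 # change simulated correctly
            else:
                wrong += 1                # change simulated to the wrong class
        elif sim_changed:
            false_alarms += 1             # change simulated where none occurred
    denom = misses + hits + wrong + false_alarms
    return hits / denom if denom else 1.0

# Tiny example: 5 cells, class 0 = forest, class 1 = agriculture (assumed encoding)
initial   = [0, 0, 0, 0, 0]
reference = [1, 0, 1, 0, 0]    # observed map at the end of the period
simulated = [1, 1, 0, 0, 0]    # model output
print(figure_of_merit(initial, reference, simulated))  # -> 0.3333333333333333
```

A genetic algorithm, as used for the STORMA calibration, would then search the model-parameter space to maximize this score against the reference maps.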
Abstract:
Auger-electron emission from foil-excited Ne ions (6 to 10 MeV beam energy) has been measured. The beam-foil time-of-flight technique has been applied to study electronic transitions of metastable states (delayed spectra) and to determine their lifetimes. To achieve a line identification for the complex structure observed in the prompt spectrum, the spectrum is separated into its isoelectronic parts by an Auger-electron/ion coincidence correlating the emitted electrons and the emitting projectiles of well-defined final charge states q_f. Well-resolved spectra were obtained, and the lines could be identified using intermediate-coupling Dirac-Fock multiconfiguration calculations. From the total KLL Auger-electron transition probabilities observed in the electron-ion coincidence experiment for Ne (10 MeV), the amount of projectiles with one K-hole just behind a C target can be estimated. For foil-excited Ne projectiles, in contrast to single-collision results, the comparison of transition intensities for individual lines with calculated transition probabilities yields a statistical population of Li- and Be-like configurations.