Abstract:
Some naturally occurring strains of fungi cease growing after successive subculturing, i.e., they senesce. In Neurospora, senescing strains usually contain intramitochondrial linear or circular plasmids. An entire plasmid, or part of one, integrates into the mtDNA, causing insertional mutagenesis. The functionally defective mitochondria replicate faster than wild-type mitochondria and spread through interconnected hyphal cells. Senescence can also be due to spontaneous lethal nuclear gene mutations arising in the multinucleate mycelium; their phenotypic effects remain masked, however, until the nuclei segregate into a homokaryotic spore, and the spore germinates to form a mycelium incapable of extended culturing. Ultimately, the growth of a fungal colony ceases because of dysfunctional oxidative phosphorylation. Results with senescing nuclear mutants and growth-impaired cytoplasmic mutants suggest that mtDNA is inherently unstable and requires protection by as-yet-unidentified nuclear-gene-encoded factors for normal functioning. Interestingly, these results accord with the endosymbiotic theory of the origin of eukaryotic cells.
Abstract:
New stars in galaxies form in dense molecular clouds of the interstellar medium. Measuring how mass is distributed in these clouds is of crucial importance for current theories of star formation, because several open issues in them, such as the strength of the different mechanisms regulating star formation and the origin of stellar masses, can be addressed with detailed information on cloud structure. Unfortunately, quantifying the mass distribution in molecular clouds accurately over a wide spatial and dynamical range is a fundamental problem in modern astrophysics. This thesis presents studies examining the structure of dense molecular clouds and the distribution of mass in them, with emphasis on nearby clouds that are sites of low-mass star formation. In particular, the thesis concentrates on investigating mass distributions using the near-infrared dust extinction mapping technique, in which the gas column densities towards molecular clouds are determined by examining radiation from stars that shine through the clouds. In addition, the thesis examines the feasibility of using a similar technique to derive the masses of molecular clouds in nearby external galaxies. The papers presented in this thesis demonstrate how near-infrared dust extinction mapping can be used to extract detailed information on the mass distribution in nearby molecular clouds, and such information is used to examine characteristics crucial for star formation in the clouds. Regarding the use of extinction mapping in nearby galaxies, the papers show that deriving the masses of molecular clouds with the technique suffers from strong biases; however, some structural properties can still be examined with it.
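The core of the extinction mapping technique described above can be illustrated with a minimal sketch: the reddening of background stars gives a per-star extinction estimate, which converts to a gas column density. The intrinsic colour, the extinction-law coefficient (~15.9 mag of A_V per magnitude of E(H-K)), and the gas-to-extinction ratio used below are illustrative assumed values, not numbers from the thesis.

```python
def extinction_from_colors(observed_hk, intrinsic_hk=0.15, coeff=15.9):
    """Estimate visual extinction A_V [mag] for each background star from
    its (H-K) colour excess, E(H-K) = observed - intrinsic.
    The coefficient ~15.9 assumes a standard near-infrared extinction law."""
    return [coeff * (c - intrinsic_hk) for c in observed_hk]

def column_density(a_v, n_per_mag=9.4e20):
    """Convert A_V [mag] to an H2 column density [cm^-2], using an assumed
    gas-to-extinction conversion of ~9.4e20 cm^-2 mag^-1."""
    return n_per_mag * a_v

# Stars seen through a cloud: redder observed colours imply more dust
# along the line of sight, hence higher extinction and column density.
a_v = extinction_from_colors([0.15, 0.25, 0.55])
columns = [column_density(av) for av in a_v]
```

A real extinction map would average many such per-star estimates within each map pixel to beat down photometric noise and intrinsic colour scatter, which is where the biases discussed in the thesis enter.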
Abstract:
Relay selection for cooperative communications promises significant performance improvements and is therefore attracting considerable attention. While several criteria have been proposed for selecting one or more relays, distributed mechanisms that perform the selection have received relatively less attention. In this paper, we develop a novel yet simple asymptotic analysis of a splitting-based multiple access selection algorithm for finding the single best relay. The analysis leads to simpler, alternative expressions for the average number of slots required to find the best user. By introducing a new 'contention load' parameter, the analysis shows that the parameter settings used in the existing literature can be improved upon. New and simple bounds are also derived. Furthermore, we propose a new algorithm that addresses the general problem of selecting the best Q >= 1 relays, and we analyze and optimize it. Even for a large number of relays, the scalable algorithm selects the best two relays within 4.406 slots and the best three within 6.491 slots, on average. We also propose a new and simple scheme for the practically relevant case of discrete metrics. Altogether, our results develop a unifying perspective on the general problem of distributed selection in cooperative systems and several other multi-node systems.
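The idea behind splitting-based selection can be sketched as follows. This is a simplified binary-splitting contention, not the exact algorithm or parameter settings analyzed in the paper: each node transmits if its metric (assumed mapped through its CDF onto [0, 1)) falls in the current probe interval; the interval is narrowed after a collision and moved down after an idle slot, until exactly one node, the best, transmits alone.

```python
def select_best(metrics):
    """Find the index of the largest metric via binary-splitting contention.
    Metrics are assumed distinct and in [0, 1). Returns (best_index, slots)."""
    n = len(metrics)
    lo, hi = 1.0 - 1.0 / n, 1.0   # probe the top-1/n fraction first
    floor = 0.0                    # best metric is known to lie in [floor, hi)
    slots = 0
    while True:
        slots += 1
        active = [i for i, m in enumerate(metrics) if lo <= m < hi]
        if len(active) == 1:       # success: exactly one node transmitted
            return active[0], slots
        if len(active) >= 2:       # collision: best lies in [lo, hi)
            floor = lo
            lo = (lo + hi) / 2.0   # split: probe the upper half next
        else:                      # idle: best lies in [floor, lo)
            hi = lo
            lo = (floor + hi) / 2.0
```

Each loop iteration models one contention slot with a ternary (idle / success / collision) feedback outcome; with continuous-valued metrics the probe interval shrinks geometrically, so the best node is isolated in a small number of slots regardless of n.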
Abstract:
In this short essay I offer some “business researcher” advice on how to leverage a strong background in psychology when attempting to contribute to the maturing field of “entrepreneurship research”. Psychologists can draw on within-discipline research on, e.g., emergence, small groups, fit, and expertise, as well as methodological strengths in, e.g., experimentation, operationalisation of constructs, and multi-level modelling. However, achieving full leverage of these strengths requires a clear conceptualisation of “entrepreneurship” as well as insight into the challenges posed by the nature of this class of phenomena.
Abstract:
Thermotropic liquid crystals are known to display rich phase behavior on temperature variation. Whereas the nematic phase is orientationally ordered but translationally disordered, a smectic phase is characterized by the appearance of partial translational order in addition to a further increase in orientational order. In an attempt to understand the interplay between orientational and translational order in the mesophases that thermotropic liquid crystals typically exhibit upon cooling from the high-temperature isotropic phase, we investigate the potential energy landscapes of a family of model liquid crystalline systems. The configurations of the system corresponding to local potential energy minima, known as the inherent structures, are determined from computer simulations across the mesophases. We find that the depth of the potential energy minima explored by the system along an isochor grows through the nematic phase as temperature drops, in contrast to its insensitivity to temperature in the isotropic and smectic phases. The onset of the growth of orientational order in the parent phase is found to induce translational order, resulting in a smectic-like layer in the underlying inherent structures; surprisingly, the inherent structures never seem to sustain orientational order alone when the parent nematic phase is sandwiched between the high-temperature isotropic phase and the low-temperature smectic phase. The Arrhenius temperature dependence of the orientational relaxation time breaks down near the isotropic-nematic transition, and we find that this breakdown occurs at a temperature below which the system explores increasingly deeper potential energy minima.
Abstract:
The Industry Foundation Classes (IFC) file format is one of the most complex and ambitious IT standardization projects currently being undertaken in any industry, focusing on the development of an open and neutral standard for exchanging building model data. Scientific literature related to the IFC standard has so far been predominantly technical; research looking at the IFC standard from an industry standardization perspective could offer valuable new knowledge for both theory and practice. This paper proposes the use of IT standardization and IT adoption theories, supported by studies done within construction IT, to lay a theoretical foundation for further empirical analysis of the standardization process of the IFC file format.
Abstract:
There has been demand for uniform CAD standards in the construction industry ever since the large-scale introduction of computer-aided design systems in the late 1980s. While some standards have been widely adopted without much formal effort, others have failed to gain support even though considerable resources were allocated for the purpose. Establishing a standard for building information modeling has been a particularly active area of industry development and scientific interest in recent years. In this paper, four different standards are discussed as cases: the IGES and DXF/DWG standards for representing the graphics in 2D drawings, the ISO 13567 standard for structuring building information on layers, and the IFC standard for building product models. Based on a literature study combined with two qualitative interview studies with domain experts, a process model is proposed to describe and interpret the contrasting histories of past CAD standardisation processes.
Abstract:
Aerosol particles deteriorate air quality, atmospheric visibility and our health. They affect the Earth's climate by absorbing and scattering sunlight, forming clouds, and via several feedback mechanisms. The net effect on the radiative balance is negative, i.e. cooling, which means that particles counteract the effect of greenhouse gases. However, particles remain one of the poorly known pieces of the climate puzzle. Some airborne particles are natural, some anthropogenic; some enter the atmosphere in particle form, while others form by gas-to-particle conversion. Unless the sources and the dynamical processes shaping the particle population are quantified, they cannot be incorporated into climate models. Molecular-level understanding of new particle formation is still inadequate, mainly owing to the lack of measurement techniques suitable for detecting the smallest particles and their precursors. This thesis has contributed to our ability to measure newly formed particles. Three new condensation particle counter applications for measuring the concentration of nanoparticles were developed. The suitability of the methods for detecting both charged and electrically neutral particles and molecular clusters as small as 1 nm in diameter was thoroughly tested in both laboratory and field conditions. It was shown that condensation particle counting has reached the size scale of individual molecules and that, besides measuring concentration, the counters can be used to obtain size information. In addition to atmospheric research, the particle counters could have various applications in other fields, especially in nanotechnology. Using the new instruments, the first continuous time series of neutral sub-3 nm particle concentrations were measured at two field sites representing two different kinds of environments: the boreal forest and the Atlantic coastline, both known to be hot spots for new particle formation.
The contribution of ions to the total concentrations in this size range was estimated, and the fraction of ions was usually found to be minor, especially in boreal forest conditions. Since the ionization rate is connected to the amount of cosmic rays entering the atmosphere, the relative contribution of neutral and charged nucleation mechanisms extends beyond academic interest and links the research directly to the current climate debate.
Abstract:
Lactobacillus rhamnosus GG is a probiotic bacterium that is known worldwide. Since its discovery in 1985, the health effects and biology of this health-promoting strain have been researched at an increasing rate. However, knowledge of the molecular biology underlying these health effects is limited, even though research in this area has continued to grow since the publication of the whole genome sequence of L. rhamnosus GG in 2009. In this thesis, the molecular biology of L. rhamnosus GG was explored by mapping the changes in protein levels in response to diverse stress factors and environmental conditions. The proteomics data were supplemented with transcriptome-level mapping of gene expression. The harsh conditions of the gastrointestinal tract, which involve acidic conditions and detergent-like bile acids, pose a notable challenge to the survival of probiotic bacteria. To simulate these conditions, L. rhamnosus GG was exposed to sudden bile stress, which revealed several stress response mechanisms, among them various changes in cell envelope properties. L. rhamnosus GG also responded in various ways to mild acid stress, which probiotic bacteria may face in dairy fermentations and product formulations; its acid stress response included changes in central metabolism and specific responses related to the control of intracellular pH. Altogether, L. rhamnosus GG was shown to possess a large repertoire of mechanisms for responding to stress conditions, a beneficial characteristic for a probiotic organism. Adaptation to different growth conditions was studied by comparing the proteome-level responses of L. rhamnosus GG to divergent growth media and to different phases of growth. Comparing growth phases revealed that the metabolism of L. rhamnosus GG is modified markedly during the shift from the exponential to the stationary phase of growth.
These changes were seen at both the proteome and transcriptome levels and in various cellular functions. When the growth of L. rhamnosus GG in a rich laboratory medium was compared with its growth in an industrial whey-based medium, various differences in metabolism and in factors affecting cell surface properties could be seen. These results led us to recommend that industrial-type media be used in laboratory studies of L. rhamnosus GG and other probiotic bacteria, so that the bacteria reach a physiological state similar to that found in industrial products and thus yield more relevant information. In addition, an interesting phenomenon of protein phosphorylation was observed in L. rhamnosus GG: several proteins were found to be phosphorylated, and there were hints that the degree of phosphorylation may depend on the growth pH.
Abstract:
Indian logic has a long history. It broadly covers the domains of two of the six schools (darsanas) of Indian philosophy, namely Nyaya and Vaisesika. The generally accepted definition of Indian logic over the ages is the science which ascertains valid knowledge either by means of the six senses or by means of the five members of the syllogism; in other words, perception and inference constitute the subject matter of logic. The science of logic evolved in India through three ages, the ancient, the medieval and the modern, spanning almost thirty centuries. Advances in Computer Science, in particular in Artificial Intelligence, have in the past three decades drawn researchers in these areas to the basic problems of language, logic and cognition. In the 1980s, Artificial Intelligence evolved into knowledge-based and intelligent system design, and the knowledge base and inference engine became standard subsystems of an intelligent system. One important issue in the design of such systems is knowledge acquisition from humans who are experts in a branch of learning (such as medicine or law) and the transfer of that knowledge to a computing system. A second important issue is the validation of the system's knowledge base, i.e. ensuring that the knowledge is complete and consistent. It is in this context that a comparative study of Indian logic with recent theories of logic, language and knowledge engineering will help computer scientists understand the deeper implications of the terms and concepts they are currently using and attempting to develop.
Abstract:
The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces f(s) and f(g), and the problem reduces to one of determining, in a distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating f(s) and f(g) is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings lie on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived based on communication-complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm. The extension to the multi-party case is straightforward and is briefly discussed. The average-case CC of the relevant greater-than (GT) function is characterized to within two bits. Under the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as a quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm.
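The second approach, one-bit quantization at each node followed by a fusion rule, can be sketched as follows. The threshold value, the k-out-of-n counting fusion rule, and the Gaussian toy model of local sensing are illustrative assumptions for this sketch, not the paper's exact design.

```python
import random

def sensor_bit(reading, tau=1.0):
    # two-level (one-bit) quantization at each node: a threshold test
    return 1 if reading >= tau else 0

def fused_decision(bits, k=2):
    # counting (k-out-of-n) fusion rule applied at the local fusion center
    return sum(bits) >= k

def simulate(intruder_present, n_sensors=10, n_local=3, seed=0):
    """Toy model: clutter gives zero-mean unit-variance readings everywhere;
    an intruder raises the mean of only the few sensors in its vicinity
    (sensing as a local phenomenon)."""
    rng = random.Random(seed)
    readings = [rng.gauss(0.0, 1.0) for _ in range(n_sensors)]
    if intruder_present:
        for i in range(n_local):   # only nearby sensors are triggered
            readings[i] += 3.0
    bits = [sensor_bit(r) for r in readings]
    return fused_decision(bits)
```

Choosing the threshold tau to maximize detection probability at a fixed false-alarm rate corresponds to a Neyman-Pearson design; the abstract's result is that, under simplifying assumptions, a threshold test of this form is the optimal one-bit quantizer for the subsequent fusion rule.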
Abstract:
This letter presents a new class of variational wavefunctions for Fermi systems in any dimension. These wavefunctions introduce correlations between Cooper pairs in different momentum states, and the relevant correlations can be computed analytically. At half filling we obtain a ground state with critical superconducting correlations at the cost of a negligible increase in kinetic energy. We find large enhancements in a Cooper-pair correlation function caused purely by the interplay between the uncertainty principle, repulsion and the proximity of half filling. This is surprising, since there is no accompanying signature in the usual charge and spin response functions, and it typifies a novel kind of many-body cooperative behaviour.