973 results for Compressed text search


Relevance:

20.00%

Publisher:

Abstract:

White-rot fungi are wood-degrading organisms that are able to decompose all wood polymers: lignin, cellulose and hemicellulose. The selective white-rot fungi, which preferentially decompose wood lignin, are especially promising for biopulping applications. In biopulping, the pretreatment of wood chips with white-rot fungi enhances the subsequent pulping step and substantially reduces the refining energy consumption in mechanical pulping. Because it is not possible to carry out biopulping on an industrial scale as a closed process, it has been necessary to search for new selective strains of white-rot fungi that naturally occur in Finland and cause selective white rot of Finnish wood raw material. In a screening of 300 fungal strains, a rare polypore, Physisporinus rivulosus strain T241i, isolated from a forest burn research site, was found to be a selective lignin degrader and promising for use in biopulping. Since selective lignin degradation is apparently essential for biopulping, knowledge of the lignin-modifying enzymes and the regulation of their production by a biopulping fungus is needed. The white-rot fungal enzymes that participate in lignin degradation are laccase, lignin peroxidase (LiP), manganese peroxidase (MnP), versatile peroxidase (VP) and hydrogen peroxide-forming enzymes. In this study, P. rivulosus was observed to produce MnP, laccase and oxalic acid during growth on wood chips. In liquid cultures, manganese and veratryl alcohol increased the production of acidic MnP isoforms that were also detected in wood chip cultures. Laccase production by P. rivulosus was low unless the cultures were supplemented with sawdust and charred wood, components of the fungus's natural growth environment. In white-rot fungi, the lignin-modifying enzymes are typically present as multiple isoforms. In this study, two MnP-encoding genes, mnpA and mnpB, were cloned and characterized from P. rivulosus T241i. Analysis of the N-terminal amino acid sequences of two purified MnPs and the putative amino acid sequences of the two cloned mnp genes suggested that P. rivulosus possesses at least four mnp genes. The genes mnpA and mnpB differ markedly from each other in length, sequence and intron-exon structure. In addition, their expression is differentially affected by the addition of manganese and veratryl alcohol. P. rivulosus produced laccase as at least two isoforms. The results of this study revealed that the production of MnP and laccase is differentially regulated in P. rivulosus, which ensures efficient lignin degradation under a variety of environmental conditions.

Relevance:

20.00%

Publisher:

Abstract:

Heredity explains a major part of the variation in calcium homeostasis and bone strength, and susceptibility to osteoporosis is polygenically regulated. Bone phenotype results from the interplay between lifestyle and genes, and several nutritional factors modulate bone health throughout life. Nutrigenetics, which examines genetic variation in nutrient intake and homeostatic control, is therefore an important research area in the etiology of osteoporosis. Despite continuing progress in the search for candidate genes for osteoporosis, the results thus far have been inconclusive. The main objective of this thesis was to investigate the associations of lactase, vitamin D receptor (VDR), calcium-sensing receptor (CaSR) and parathyroid hormone (PTH) gene polymorphisms and lifestyle factors, and their interactions, with bone health in Finns at varying stages of the skeletal life span. Markers of calcium homeostasis and bone remodelling were measured from blood and urine samples. Bone strength was measured at peripheral and central bone sites. Lifestyle factors were assessed with questionnaires and interviews. Genetic lactase non-persistence (the C/C-13910 genotype) was associated with lower consumption of milk from childhood, predisposing females in particular to inadequate calcium intake. Consumption of low-lactose milk and milk products was shown to decrease the risk of inadequate calcium intake. In young adulthood, bone loss was more common in males than in females. Males with the lactase C/C-13910 genotype may be more susceptible to bone loss than males with the other lactase genotypes, although calcium intake predicts changes in bone mass more than the lactase genotype does. The BsmI and FokI polymorphisms of the VDR gene were associated with bone mass in growing adolescents, but the associations weakened with age. In young adults, the A986S polymorphism of the calcium-sensing receptor gene was associated with serum ionized calcium concentrations, and the BstBI polymorphism of the PTH gene was related to bone strength. The FokI polymorphism and sodium intake showed an interaction effect on urinary calcium excretion. A novel gene-gene interaction between the VDR FokI and PTH BstBI polymorphisms was found in the regulation of PTH secretion and urinary calcium excretion. Further research should be carried out with a larger number of Finns at varying stages of the skeletal life span and with more detailed measurements of bone strength. It should address the mechanisms by which genetic variants affect calcium homeostasis and bone strength, and the role of diet-gene and gene-gene interactions in the pathogenesis of osteoporosis.

Relevance:

20.00%

Publisher:

Abstract:

The addition of guanosine 5′-monophosphate (5′-GMP) to an aqueous solution of Mn2+ ions results in a decrease in the ESR signal intensity and an increase in the line-width of the Mn2+ ions. This can be interpreted in terms of the stepwise formation of outer-sphere and inner-sphere complexes. When Mg2+ is added to a mixture of Mn2+ and 5′-GMP, the ESR signal intensity increases, presumably due to the replacement of Mn2+ by Mg2+ in the complex. From the variation of the ESR signal intensity as a function of the concentration of Mg2+, the product K1K2 for the magnesium complex is calculated as 125 M−1. This difference in stability constants may indicate that both the phosphate group and the guanine base are involved in the formation of the Mn2+-5′-GMP complex.
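
As a worked illustration of what the reported overall constant implies, the short Python sketch below solves the 1:1 binding quadratic for M + L ⇌ ML with K1K2 = 125 M−1 and prints the fraction of Mg2+ present as the complex. The total concentrations and the function name are hypothetical choices made for this example; they are not taken from the original study.

    import math

    def fraction_bound(beta, metal_total, ligand_total):
        # fraction of the metal present as ML for M + L <-> ML with overall constant beta (M^-1)
        # beta * (M_t - x) * (L_t - x) = x rearranges to a quadratic in the complex concentration x
        a = beta
        b = -(beta * (metal_total + ligand_total) + 1.0)
        c = beta * metal_total * ligand_total
        x = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
        return x / metal_total

    print(fraction_bound(125.0, 1e-3, 1e-2))   # about 0.54 at 1 mM Mg2+ and 10 mM 5'-GMP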

Relevance:

20.00%

Publisher:

Abstract:

This is the first of three books about the history of Geoffrey Lynfield's family. It is about four Lilienfeld brothers--Geoffrey Lynfield's grandfather and his brothers. They were born in the Jewish enclave of Marburg and ended up in South Africa at the time and in the region where the first diamonds were discovered. The manuscript also includes photographs and documents.

Relevance:

20.00%

Publisher:

Abstract:

Objective: Melanoma is on the rise, especially in Caucasian populations exposed to high ultraviolet radiation, such as in Australia. This paper examined the psychological components facilitating change in skin cancer prevention or early detection behaviours following a text message intervention. Methods: The Queensland-based participants were 18 to 42 years old, from the Healthy Text study (N = 546). Overall, 512 (94%) participants completed the 12-month follow-up questionnaires. Following the social cognitive model, potential mediators of skin self-examination (SSE) and sun protection behaviour change were examined using stepwise logistic regression models. Results: At 12-month follow-up, the odds of performing an SSE in the past 12 months were mediated by baseline confidence in finding time to check skin (an outcome expectation), with a change in odds ratio of 11.9% in the SSE group versus the control group when the mediator was included. The odds of a greater-than-average sun protective habits index at 12-month follow-up were mediated by (a) an attempt to get a suntan at baseline (an outcome expectation) and (b) the baseline sun protective habits index, with changes in odds ratio of 10.0% and 11.8%, respectively, in the SSE group versus the control group. Conclusions: Few of the suspected mediation pathways were confirmed, with the exception of outcome expectations and past behaviours. Future intervention programmes could use alternative theoretical models to elucidate how improvements in health behaviours can optimally be facilitated.
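
The mediation logic described above (quantifying the change in the intervention odds ratio when a baseline mediator is added to the logistic model) can be sketched as follows. The data are simulated and all variable names are hypothetical, so this only illustrates the general procedure, not the Healthy Text analysis itself.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # simulate a toy trial: a binary intervention, a baseline mediator and a binary outcome
    rng = np.random.default_rng(0)
    n = 500
    group = rng.integers(0, 2, n)                  # 1 = text-message arm, 0 = control
    confidence = 0.5 * group + rng.normal(size=n)  # hypothetical baseline mediator
    p = 1 / (1 + np.exp(-(-0.5 + 0.4 * group + 0.6 * confidence)))
    outcome = rng.binomial(1, p)                   # e.g. performed an SSE at follow-up

    without_m = sm.add_constant(pd.DataFrame({"group": group}))
    with_m = sm.add_constant(pd.DataFrame({"group": group, "confidence": confidence}))
    or_total = np.exp(sm.Logit(outcome, without_m).fit(disp=0).params["group"])
    or_adjusted = np.exp(sm.Logit(outcome, with_m).fit(disp=0).params["group"])
    print(f"change in intervention OR after adding the mediator: "
          f"{100 * (or_total - or_adjusted) / or_total:.1f}%")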

Relevance:

20.00%

Publisher:

Abstract:

Comprehensive two-dimensional gas chromatography (GC×GC) offers enhanced separation efficiency, reliability in qualitative and quantitative analysis, the capability to detect low quantities, and information on the whole sample and its components. These features are essential in the analysis of complex samples, in which the number of compounds may be large or the analytes of interest are present at trace level. This study involved the development of instrumentation, data analysis programs and methodologies for GC×GC and their application in studies on qualitative and quantitative aspects of GC×GC analysis. Environmental samples were used as model samples.

Instrumental development comprised the construction of three versions of a semi-rotating cryogenic modulator in which modulation was based on two-step cryogenic trapping with continuously flowing carbon dioxide as coolant. Two-step trapping was achieved by rotating the nozzle spraying the carbon dioxide with a motor. The fastest rotation and highest modulation frequency were achieved with a permanent magnetic motor, and modulation was most accurate when the motor was controlled with a microcontroller containing a quartz crystal. Heated wire resistors were unnecessary for the desorption step when liquid carbon dioxide was used as coolant. With the modulators developed in this study, the narrowest peaks were 75 ms at base.

Three data analysis programs were developed, allowing basic, comparison and identification operations. The basic operations enabled the visualisation of two-dimensional plots and the determination of retention times, peak heights and volumes. The overlaying feature in the comparison program allowed easy comparison of 2D plots. An automated identification procedure based on mass spectra and retention parameters allowed the qualitative analysis of data obtained by GC×GC and time-of-flight mass spectrometry.

In the methodological development, sample preparation (extraction and clean-up) and GC×GC methods were developed for the analysis of atmospheric aerosol and sediment samples. Dynamic sonication-assisted extraction was well suited for atmospheric aerosols collected on a filter. A clean-up procedure utilising normal-phase liquid chromatography with ultraviolet detection worked well in the removal of aliphatic hydrocarbons from a sediment extract. GC×GC with flame ionisation detection or quadrupole mass spectrometry provided good reliability in the qualitative analysis of target analytes. However, GC×GC with time-of-flight mass spectrometry was needed in the analysis of unknowns. The automated identification procedure that was developed was efficient in the analysis of large data files, but manual searching and analyst knowledge are invaluable as well.

Quantitative analysis was examined in terms of calibration procedures and the effect of matrix compounds on GC×GC separation. In addition to calibration in GC×GC with summed peak areas or peak volumes, simplified area calibration based on the normal GC signal can be used to quantify compounds in samples analysed by GC×GC, so long as certain qualitative and quantitative prerequisites are met. In a study of the effect of matrix compounds on GC×GC separation, it was shown that the quality of the separation of PAHs is not significantly disturbed by the amount of matrix, and that quantitativeness suffers only slightly in the presence of matrix when the amount of target compounds is low. The benefits of GC×GC in the analysis of complex samples easily overcome the minor drawbacks of the technique. The developed instrumentation and methodologies performed well for environmental samples, but they could also be applied to other complex samples.
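
As a rough illustration of the summed-area quantification mentioned above, the Python sketch below folds a raw detector trace into a 2D matrix using the modulation period, sums the slice areas of one compound over a first-dimension retention window, and applies a linear calibration. The function names, the trapezoidal integration and the synthetic data are assumptions for illustration only, not the software developed in the thesis.

    import numpy as np

    def fold_to_2d(signal, sample_rate_hz, modulation_period_s):
        # fold a raw GCxGC detector trace into a (modulation index x 2nd-dimension time) matrix
        n_per_mod = int(round(sample_rate_hz * modulation_period_s))
        n_mods = len(signal) // n_per_mod
        return np.asarray(signal[:n_mods * n_per_mod]).reshape(n_mods, n_per_mod)

    def summed_area(signal, t, window):
        # summed area of the modulated slices of one compound inside a 1st-dimension window (s)
        mask = (t >= window[0]) & (t < window[1])
        return np.trapz(signal[mask], t[mask])

    # hypothetical calibration: summed areas of standards versus concentration, fitted linearly
    standard_conc = np.array([0.5, 1.0, 2.0, 5.0])       # e.g. ng injected
    standard_area = np.array([12.1, 24.5, 48.8, 121.0])  # summed slice areas from standard runs
    slope, intercept = np.polyfit(standard_conc, standard_area, 1)

    # synthetic trace just to exercise the functions
    t = np.arange(0, 60.0, 0.01)                         # 100 Hz acquisition, 60 s shown
    trace = np.exp(-0.5 * ((t - 30.0) / 0.4) ** 2)       # one sliced peak centred at 30 s
    print(fold_to_2d(trace, 100, 5.0).shape)             # (12, 500): 12 modulations of 5 s
    print((summed_area(trace, t, (28.0, 32.0)) - intercept) / slope)  # back-calculated amount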

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study is to describe the development of applications of mass spectrometry for the structural analysis of non-coding ribonucleic acids during the past decade. Mass spectrometric methods are compared with traditional gel electrophoretic methods, the performance characteristics of mass spectrometric analyses are studied, and the future trends of mass spectrometry of ribonucleic acids are discussed.

Non-coding ribonucleic acids are short polymeric biomolecules which are not translated to proteins but which may affect gene expression in all organisms. Regulatory ribonucleic acids act through transient interactions with key molecules in signal transduction pathways. The interactions are mediated through specific secondary and tertiary structures. Posttranscriptional modifications in the structures of the molecules may introduce new properties to the organism, such as adaptation to environmental changes or the development of resistance to antibiotics. In the scope of this study, the structural studies include i) determination of the sequence of nucleobases in the polymer chain, ii) characterisation and localisation of posttranscriptional modifications in the nucleobases and in the backbone structure, iii) identification of ribonucleic acid-binding molecules and iv) probing of higher-order structures in the ribonucleic acid molecule. Bacteria, archaea, viruses and HeLa cancer cells have been used as target organisms. Synthesised ribonucleic acids consisting of structural regions of interest have been used frequently.

Electrospray ionisation (ESI) and matrix-assisted laser desorption ionisation (MALDI) have been used for the ionisation of ribonucleic acid analytes. Ammonium acetate and 2-propanol are common solvents for ESI. Trihydroxyacetophenone is the optimal MALDI matrix for the ionisation of ribonucleic acids and peptides. Ammonium salts are used as additives in ESI buffers and MALDI matrices to remove cation adducts. Reverse-phase high-performance liquid chromatography has been used for desalting and fractionation of analytes, either off-line or on-line coupled with the ESI source. Triethylamine and triethylammonium bicarbonate are used as ion pair reagents almost exclusively. A Fourier transform ion cyclotron resonance analyser using ESI coupled with liquid chromatography is the platform of choice for all forms of structural analyses. A time-of-flight (TOF) analyser using MALDI may offer a sensitive, easy-to-use and economical solution for simple sequencing of longer oligonucleotides and for analyses of analyte mixtures without prior fractionation. Special analysis software is used for computer-aided interpretation of mass spectra.

With mass spectrometry, sequences of 20-30 nucleotides in length may be determined unambiguously. Sequencing may be applied to the quality control of short synthetic oligomers for analytical purposes. Sequencing in conjunction with other structural studies enables accurate localisation and characterisation of posttranscriptional modifications and identification of the nucleobases and amino acids at the sites of interaction. High-throughput screening methods for RNA-binding ligands have been developed. Probing of higher-order structures has provided supportive data for computer-generated three-dimensional models of viral pseudoknots.

In conclusion, mass spectrometric methods are well suited for the structural analysis of small species of ribonucleic acids, such as short non-coding ribonucleic acids in the molecular size region of 20-30 nucleotides. Structural information not attainable with other methods of analysis, such as nuclear magnetic resonance and X-ray crystallography, may be obtained with the use of mass spectrometry. Ligand screening may be used in the search for possible new therapeutic agents. Demanding assay design and challenging interpretation of data require multidisciplinary knowledge. The application of mass spectrometry to structural studies of ribonucleic acids is probably most efficiently conducted in specialist groups consisting of researchers from various fields of science.
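
To make the sequencing idea concrete: under a 5′-phosphate/3′-OH terminus convention, successive fragment masses differ by the residue mass of the next nucleotide, so a mass ladder reads out the sequence. The sketch below uses approximate monoisotopic residue masses and hypothetical function names; it is illustrative rather than a reproduction of any software discussed in the reviewed studies.

    # approximate monoisotopic residue masses (Da) for ribonucleotides in a chain; the
    # 5'-phosphate, 3'-OH terminus convention is an assumption made for this illustration
    RESIDUE_MASS = {"A": 329.05252, "C": 305.04129, "G": 345.04744, "U": 306.02530}
    WATER = 18.010565

    def neutral_mass(sequence):
        # neutral monoisotopic mass of a linear 5'-phosphorylated RNA with a free 3'-OH
        return sum(RESIDUE_MASS[base] for base in sequence) + WATER

    def mass_ladder(sequence):
        # successive 5'-fragments: consecutive masses differ by one residue mass
        return [(sequence[:i], round(neutral_mass(sequence[:i]), 4))
                for i in range(1, len(sequence) + 1)]

    for fragment, mass in mass_ladder("GAUC"):
        print(fragment, mass)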

Relevance:

20.00%

Publisher:

Abstract:

Human-wildlife conflicts are today an integral part of the rural development discourse. In this research, the main focus is on the spatial explanation, which is not a very common approach in the reviewed literature. My research hypothesis is based on the assumption that human-wildlife conflicts occur when a wild animal crosses a perceived borderline between nature and culture and enters into the realms of the other. The borderline between nature and culture marks a perceived division of spatial content in our senses of place. The animal subject that crosses this border becomes a subject out of place, meaning that the animal is then spatially located in a space where it should not be or where it does not belong according to tradition, custom, rules, law, public opinion, prevailing discourse or some other criteria set by human beings. The appearance of a wild animal in a domesticated space brings an uncontrolled subject into a space where humans have previously commanded total control of all other natural elements. A wild animal out of place may also threaten the biosecurity of the place in question. I carried out a case study in the Liwale district in south-eastern Tanzania to test my hypothesis during June and July 2002. I also collected documents and carried out interviews in Dar es Salaam in 2003. I studied human-wildlife conflicts in six rural villages, where a total of 183 persons participated in the village meetings. My research methods included semi-structured interviews, participatory mapping, a questionnaire survey and Q-methodology. The rural communities in the Liwale district have a long history of co-existing with wildlife, and they still have traditional knowledge of wildlife management and hunting. Wildlife conservation through the establishment of game reserves during the colonial era has escalated human-wildlife conflicts in the Liwale district. This study shows that the villagers perceive some wild animals differently in their images of the African countryside than the district- and regional-level civil servants do. From the small-scale subsistence farmers' point of view, wild animals continue to challenge the separation of the wild (the forests) and the domestic spaces (the cultivated fields) by moving across the perceived borders in search of food and shelter. As a result, the farmers may lose their crops, livestock or even their own lives in confrontations with wild animals. Human-wildlife conflicts in the Liwale district are manifold and cannot be explained simply on the basis of attitudes or perceived images of landscapes. However, the spatial explanation of these conflicts provides some further understanding of why human-wildlife conflicts are so widely found across the world.

Relevance:

20.00%

Publisher:

Abstract:

Most countries of Europe, as well as many countries in other parts of the world, are experiencing an increased impact of natural hazards. It is often speculated, but not yet proven, that climate change might influence the frequency and magnitude of certain hydro-meteorological natural hazards. What has certainly been observed is a sharp increase in financial losses caused by natural hazards worldwide. Even though Europe appears to be a space that is not affected by natural hazards to such catastrophic extents as other parts of the world, the damages experienced here are certainly increasing too. Natural hazards, climate change and, in particular, risks have therefore recently been put high on the political agenda of the EU. In the search for appropriate instruments for mitigating the impacts of natural hazards and climate change, as well as risks, the integration of these factors into spatial planning practices is receiving ever greater attention. The focus of most approaches lies on single hazards and climate change mitigation strategies. The current paradigm shift from climate change mitigation to adaptation is used as a basis to draw conclusions and recommendations on what concepts could be further incorporated into spatial planning practices. Multi-hazard approaches in particular are discussed as an important approach that should be developed further. One focal point is the definition and applicability of the terms natural hazard, vulnerability and risk in spatial planning practices. Vulnerability and risk concepts especially are so manifold and complicated that their application in spatial planning has to be analysed most carefully. The PhD thesis is based on six published articles that describe the results of European research projects which have elaborated strategies and tools for integrated communication and assessment practices on natural hazards and climate change impacts. The papers describe approaches at the local, regional and European levels, from both theoretical and practical perspectives. Based on these, past, current and potential future spatial planning applications are reviewed and discussed. In conclusion, it is recommended to shift from single-hazard assessments to multi-hazard approaches that integrate potential climate change impacts. Vulnerability concepts should play a stronger role than at present, and adaptation to natural hazards and climate change should be emphasized more in relation to mitigation. It is outlined that the integration of risk concepts into planning is rather complicated and would need very careful assessment to ensure applicability. Future spatial planning practices should also be more interdisciplinary, i.e. integrate as many stakeholders and experts as possible, to ensure the sustainability of investments.

Relevance:

20.00%

Publisher:

Abstract:

Leading Beautifully provides a new dimension to understanding effective leadership. Drawing from lessons in the arts and the humanities, English and Ehrich explore how educational decision-making in schools can be informed by identity, personal competence, and an understanding of the field's intellectual foundations. Based on in-depth interviews of artists and educational leaders, this book provides insight into the inner world of successful leaders who have developed competencies and understandings that extend beyond the standard leadership tool box. This exciting new book explores the theory and practice of leadership connoisseurship as a human-centered endeavor and as an antidote to mechanistic, business-oriented practices. The authors' well-grounded reconsideration of educational leadership will enliven and enhance any educational leader's practice.

Relevance:

20.00%

Publisher:

Abstract:

Advancements in analysis techniques have led to a rapid accumulation of biological data in databases. Such data are often in the form of sequences of observations, examples including DNA sequences and amino acid sequences of proteins. The scale and quality of the data give promise of answering various biologically relevant questions in more detail than has been possible before. For example, one may wish to identify areas in an amino acid sequence which are important for the function of the corresponding protein, or investigate how characteristics at the level of the DNA sequence affect the adaptation of a bacterial species to its environment. Many of the interesting questions are intimately associated with understanding the evolutionary relationships among the items under consideration. The aim of this work is to develop novel statistical models and computational techniques to meet the challenge of deriving meaning from the increasing amounts of data. Our main concern is modeling the evolutionary relationships based on the observed molecular data. We operate within a Bayesian statistical framework, which allows a probabilistic quantification of the uncertainties related to a particular solution. As the basis of our modeling approach we utilize a partition model, which describes the structure of the data by appropriately dividing the data items into clusters of related items. Generalizations and modifications of the partition model are developed and applied to various problems. Large-scale data sets also provide a computational challenge. The models used to describe the data must be realistic enough to capture the essential features of the current modeling task but, at the same time, simple enough to make it possible to carry out the inference in practice. The partition model fulfills these two requirements. The problem-specific features can be taken into account by modifying the prior probability distributions of the model parameters. The computational efficiency stems from the ability to integrate out the parameters of the partition model analytically, which enables the use of efficient stochastic search algorithms.
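
To illustrate why analytic marginalisation makes stochastic search over partitions cheap, the sketch below scores a clustering of aligned DNA sequences with a closed-form Dirichlet-multinomial marginal likelihood (symmetric Dirichlet(1) priors, independent columns) and improves it by random single-item reassignments. The model choices and the simple uphill search are assumptions made for illustration, not the models developed in the thesis.

    import numpy as np
    from scipy.special import gammaln

    ALPHABET = "ACGT"

    def cluster_log_marginal(seqs, alpha=1.0):
        # log marginal likelihood of one cluster of equal-length sequences,
        # with a symmetric Dirichlet(alpha) prior on each column's base frequencies
        if not seqs:
            return 0.0
        n, L, K = len(seqs), len(seqs[0]), len(ALPHABET)
        total = 0.0
        for j in range(L):
            counts = np.array([sum(s[j] == a for s in seqs) for a in ALPHABET])
            total += (gammaln(K * alpha) - gammaln(K * alpha + n)
                      + np.sum(gammaln(alpha + counts)) - K * gammaln(alpha))
        return total

    def search_partition(seqs, n_clusters=2, iters=200, seed=0):
        # stochastic uphill search: propose moving one item to a random cluster
        rng = np.random.default_rng(seed)
        labels = rng.integers(0, n_clusters, len(seqs))
        def score(lab):
            return sum(cluster_log_marginal([s for s, l in zip(seqs, lab) if l == c])
                       for c in range(n_clusters))
        best = score(labels)
        for _ in range(iters):
            proposal = labels.copy()
            proposal[rng.integers(len(seqs))] = rng.integers(n_clusters)
            new = score(proposal)
            if new >= best:
                labels, best = proposal, new
        return labels, best

    seqs = ["ACGTACGT", "ACGTACGA", "TTTTCCCC", "TTTTCCCG"]
    labels, log_marginal = search_partition(seqs)
    print(labels, log_marginal)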

Relevance:

20.00%

Publisher:

Abstract:

A method that yields optical Barker codes of smallest known lengths for given discrimination is described.
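
One way to make the notion of discrimination concrete: for a binary (0/1) optical code it is commonly taken as the ratio of the aperiodic autocorrelation peak to the largest sidelobe. The sketch below computes this for a hypothetical 7-element code; both the definition and the example code are assumptions for illustration, since the abstract does not spell out the construction method.

    import numpy as np

    def discrimination(code):
        # ratio of the autocorrelation peak (the code weight) to the largest sidelobe
        c = np.asarray(code, dtype=float)
        full = np.correlate(c, c, mode="full")    # aperiodic autocorrelation
        peak = full[len(c) - 1]                   # zero-shift value
        sidelobes = np.delete(full, len(c) - 1)
        return peak / sidelobes.max()

    print(discrimination([1, 1, 0, 1, 0, 0, 1]))  # hypothetical 7-element code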

Relevance:

20.00%

Publisher:

Abstract:

The following problem is considered. Given the locations of the Central Processing Unit (CPU) and the terminals which have to communicate with it, determine the number and locations of the concentrators and assign the terminals to the concentrators in such a way that the total cost is minimized. There is also a fixed cost associated with each concentrator, and there is an upper limit to the number of terminals which can be connected to a concentrator. The terminals can also be connected directly to the CPU. In this paper it is assumed that the concentrators can be located anywhere in the area A containing the CPU and the terminals. This then becomes a multimodal optimization problem. In the proposed algorithm a stochastic automaton is used as a search device to locate the minimum of the multimodal cost function. The algorithm proceeds as follows. The area A containing the CPU and the terminals is divided into an arbitrary number of regions (say K). An approximate value for the number of concentrators is assumed (say m); the optimum number is determined later by iteration. The m concentrators can be assigned to the K regions in m^K ways (m > K) or K^m ways (K > m); all possible assignments are feasible, i.e. a region can contain 0, 1, ..., m concentrators. Each possible assignment is taken to represent a state of the variable-structure stochastic automaton. To start with, all the states are assigned equal probabilities. At each stage of the search the automaton visits a state according to the current probability distribution. At each visit the automaton selects a 'point' inside that state with uniform probability; the cost associated with that point is calculated and the average cost of that state is updated. Then the probabilities of all the states are updated, being taken as inversely proportional to the average costs of the states. After a certain number of searches the search probabilities become stationary and the automaton visits a particular state again and again; the automaton is then said to have converged to that state. Finally, a local gradient search within that state determines the exact locations of the concentrators. The algorithm was applied to a set of test problems and the results were compared with those given by Cooper's (1964, 1967) EAC algorithm; on average, the proposed algorithm was found to perform better.
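
A compact sketch of the automaton search described above, run on a toy instance (unit-square layout, Euclidean link costs, no capacity constraint, and no final local gradient refinement). The layout, cost terms and constants are illustrative assumptions, not the test problems of the paper.

    import numpy as np

    rng = np.random.default_rng(1)
    cpu = np.array([0.5, 0.5])
    terminals = rng.random((30, 2))          # terminal coordinates in the unit square
    K, m = 4, 2                              # K regions (the four quadrants), m concentrators
    fixed_cost = 0.5                         # fixed cost per concentrator

    def region_point(r):
        # uniform random point inside quadrant r of the unit square
        origin = np.array([(r % 2) * 0.5, (r // 2) * 0.5])
        return origin + 0.5 * rng.random(2)

    def total_cost(concentrators):
        # each terminal uses its cheapest link: nearest concentrator or the CPU directly
        to_conc = np.linalg.norm(terminals[:, None] - concentrators[None], axis=2).min(axis=1)
        to_cpu = np.linalg.norm(terminals - cpu, axis=1)
        backhaul = np.linalg.norm(concentrators - cpu, axis=1).sum()
        return np.minimum(to_conc, to_cpu).sum() + backhaul + fixed_cost * len(concentrators)

    # states of the automaton: assignments of the m concentrators to the K regions
    states = [(a, b) for a in range(K) for b in range(K)]
    avg_cost = np.zeros(len(states))
    visits = np.ones(len(states))
    for s, regions in enumerate(states):     # one initial sample per state
        avg_cost[s] = total_cost(np.array([region_point(r) for r in regions]))

    for _ in range(2000):
        prob = 1.0 / avg_cost                # probabilities inversely proportional to average cost
        prob /= prob.sum()
        s = rng.choice(len(states), p=prob)
        cost = total_cost(np.array([region_point(r) for r in states[s]]))
        visits[s] += 1
        avg_cost[s] += (cost - avg_cost[s]) / visits[s]

    best = int(np.argmin(avg_cost))
    print("state the automaton settles on (regions of the two concentrators):", states[best])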