Abstract:
Prescribing for older patients is challenging. The prevalence of diseases increases with advancing age and causes extensive drug use. Impairments in cognitive, sensory, social and physical functioning, multimorbidity and comorbidities, as well as age-related changes in pharmacokinetics and pharmacodynamics all add to the complexity of prescribing. This study is a cross-sectional assessment of all long-term residents aged ≥ 65 years in all nursing homes in Helsinki, Finland. The residents' health status was assessed and data on their demographic factors, health and medications were collected from their medical records in February 2003. This study assesses some essential issues in prescribing for older people: psychotropic drugs (Paper I), laxatives (Paper II), vitamin D and calcium supplements (Paper III), potentially inappropriate drugs for older adults (PIDs) and drug-drug interactions (DDIs) (Paper IV), as well as prescribing in public and private nursing homes. A resident was classified as a medication user if his or her medication record indicated a regular dosage schedule for the drug; others were classified as non-users. The Mini Nutritional Assessment (MNA) was used to assess residents' nutritional status, the Beers 2003 criteria to assess the use of PIDs, and the Swedish, Finnish, INteraction X-referencing database (SFINX) to evaluate their exposure to DDIs. Of all nursing home residents in Helsinki, 82% (n=1987) participated in Studies I, II and IV, and 87% (n=2114) participated in Study III. The residents' mean age was 84 years, 81% were female, and 70% were diagnosed with dementia. The mean number of drugs was 7.9 per resident; 40% of the residents used ≥ 9 drugs per day and were thus exposed to polypharmacy. Eighty percent of the residents received psychotropics; 43% received antipsychotics and 45% used antidepressants. Anxiolytics were prescribed to 26%, and hypnotics to 28%, of the residents. Of those residents diagnosed with dementia, 11% received antidementia drugs. Fifty-five percent of the residents used laxatives regularly. In multivariate analysis, the factors associated with regular laxative use were advanced age, immobility, poor nutritional status, chewing problems, Parkinson's disease, and a high number of drugs. Eating snacks between meals was associated with a lower risk of laxative use. Of all participants, 33% received vitamin D supplementation, 28% received calcium supplementation, and 20% received both vitamin D and calcium. The dosage of vitamin D was rather low: 21% received vitamin D 400 IU (10 µg) or more, and only 4% received 800 IU (20 µg) or more. In multivariate analysis, residents who received vitamin D supplementation enjoyed better nutritional status, ate snacks between meals, suffered no constipation, and received regular weight monitoring. Residents receiving PIDs (34% of all residents) more often used psychotropic medication and were more often exposed to polypharmacy than residents receiving no PIDs. Residents receiving PIDs were less often diagnosed with dementia than were residents receiving no PIDs. The three most prevalent PIDs were short-acting benzodiazepines in greater dosages than recommended, hydroxyzine, and nitrofurantoin. These three drugs accounted for nearly 77% of all PID use. Of all residents, less than 5% were susceptible to a clinically significant DDI. The most common DDIs were related to the use of potassium-sparing diuretics, carbamazepine, and codeine.
Residents exposed to potential DDIs were younger, had more often suffered a previous stroke, more often used psychotropics, and were more often exposed to PIDs and polypharmacy than were residents not exposed to DDIs. Residents in private nursing homes were less often exposed to polypharmacy than were residents in public nursing homes. Long-term residents in nursing homes in Helsinki use, on average, nearly eight drugs daily. The use of psychotropic drugs in our study was notably more common than in international studies. The prevalence of laxative use was comparable to that reported in prior international studies. Despite the known benefits of, and recommendations for, vitamin D supplementation for older people who reside mostly indoors, the proportion of nursing home residents receiving vitamin D and calcium was surprisingly low. The use of PIDs was common among nursing home residents, and PIDs increased the likelihood of DDIs. However, DDIs did not seem to be a major concern in this nursing home population. Monitoring PIDs and potential drug interactions could improve the quality of prescribing.
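The screening described above lends itself to simple rule-based checks. The sketch below is a minimal, hypothetical illustration of flagging polypharmacy, PIDs and interacting drug pairs from one resident's medication list; the placeholder sets are illustrative only and are not the Beers 2003 criteria or the SFINX database, whose contents are not reproduced here.

```python
# Illustrative screening of one resident's regular medication list.
# PID_SET and DDI_PAIRS are tiny placeholder sets for demonstration only.
PID_SET = {"hydroxyzine", "nitrofurantoin"}  # two PIDs named in the study above
DDI_PAIRS = {frozenset({"spironolactone", "potassium chloride"})}  # hypothetical interacting pair

def screen_medications(drugs):
    """Flag polypharmacy (>= 9 regular drugs, the cut-off used above),
    PIDs and interacting drug pairs in a medication list."""
    meds = {d.lower() for d in drugs}
    return {
        "polypharmacy": len(meds) >= 9,
        "pids": sorted(meds & PID_SET),
        "ddis": [tuple(sorted(pair)) for pair in DDI_PAIRS if pair <= meds],
    }

# Hypothetical resident record (illustrative drug names only)
print(screen_medications(["Hydroxyzine", "Spironolactone", "Potassium chloride", "Metformin"]))
```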
Abstract:
Objectives To investigate medication changes for older patients admitted to hospital and to explore associations between patient characteristics and polypharmacy. Design Prospective cohort study. Participants and setting Patients aged 70 years or older admitted to general medical units of 11 acute care hospitals in two Australian states between July 2005 and May 2010. All patients were assessed using the interRAI assessment system for acute care. Main outcome measures Measures of physical, cognitive and psychosocial functioning; and number of regular prescribed medications categorised into three groups: non-polypharmacy (0–4 drugs), polypharmacy (5–9 drugs) and hyperpolypharmacy (≥ 10 drugs). Results Of 1220 patients who were recruited for the study, medication records at admission were available for 1216. Mean age was 81.3 years (SD, 6.8 years), and 659 patients (54.2%) were women. For the 1187 patients with complete medication records on admission and discharge, there was a small but statistically significant increase in mean number of regular medications per day between admission and discharge (7.1 v 7.6), while the prevalence of medications such as statins (459 [38.7%] v 457 [38.5%] patients), opioid analgesics (155 [13.1%] v 166 [14.0%] patients), antipsychotics (59 [5.0%] v 65 [5.5%] patients) and benzodiazepines (122 [10.3%] v 135 [11.4%] patients) did not change significantly. Being in a higher polypharmacy category was significantly associated with increase in comorbidities (odds ratio [OR], 1.27; 95% CI, 1.20–1.34), presence of pain (OR, 1.31; 1.05–1.64), dyspnoea (OR, 1.64; 1.30–2.07) and dependence in terms of instrumental activities of daily living (OR, 1.70; 1.20–2.41). Hyperpolypharmacy was observed in 290/1216 patients (23.8%) at admission and 336/1187 patients (28.3%) on discharge, and the proportion of preventive medication in the hyperpolypharmacy category at both points in time remained high (1209/3371 [35.9%] at admission v 1508/4117 [36.6%] at discharge). Conclusions Polypharmacy is common among older people admitted to general medical units of Australian hospitals, with no clinically meaningful change to the number or classification (symptom control, prevention or both) of drugs made by treating physicians.
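The three-way grouping of medication counts used above is easy to reproduce. The sketch below assumes only a plain list of regular-medication counts per patient; the function name and example values are illustrative rather than taken from the study.

```python
def polypharmacy_category(n_regular_drugs: int) -> str:
    """Apply the cut-offs reported above: 0-4 non-polypharmacy,
    5-9 polypharmacy, >= 10 hyperpolypharmacy."""
    if n_regular_drugs >= 10:
        return "hyperpolypharmacy"
    if n_regular_drugs >= 5:
        return "polypharmacy"
    return "non-polypharmacy"

# Illustrative counts only (not study data)
print([polypharmacy_category(n) for n in (3, 7, 12)])
# -> ['non-polypharmacy', 'polypharmacy', 'hyperpolypharmacy']
```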
Abstract:
Glaucoma is the second leading cause of blindness worldwide. It is a group of optic neuropathies characterized by progressive optic nerve degeneration, excavation of the optic disc due to apoptosis of retinal ganglion cells, and corresponding visual field defects. Open angle glaucoma (OAG) is a subtype of glaucoma, classified according to the age of onset into juvenile- and adult-onset forms with a cut-off point of 40 years of age. The prevalence of OAG is 1-2% of the population over 40 years and increases with age. During the last decade, several candidate loci and three candidate genes for OAG have been identified: myocilin (MYOC), optineurin (OPTN) and WD40-repeat 36 (WDR36). Exfoliation syndrome (XFS), age, elevated intraocular pressure and genetic predisposition are known risk factors for OAG. XFS is characterized by accumulation of grayish scales of fibrillogranular extracellular material in the anterior segment of the eye. XFS is overall the most common identifiable cause of glaucoma (exfoliation glaucoma, XFG). In the past year, three single nucleotide polymorphisms (SNPs) in the lysyl oxidase-like 1 (LOXL1) gene have been associated with XFS and XFG in several populations. This thesis describes the first molecular genetic studies of OAG and XFS/XFG in the Finnish population. The role of the MYOC and OPTN genes and fourteen candidate loci was investigated in eight Finnish glaucoma families. Both candidate genes and all candidate loci were excluded in these families, further confirming the heterogeneous nature of OAG. To investigate the genetic basis of glaucoma in a large Finnish family with juvenile- and adult-onset OAG, we analysed the MYOC gene in family members. A glaucoma-associated mutation (Thr377Met) in the MYOC gene was identified and segregated with the disease in the family. This finding has great significance for the family and encourages investigation of the MYOC gene in other Finnish OAG families as well. In order to identify genetic susceptibility loci for XFS, we carried out a genome-wide scan in the extended Finnish XFS family. This scan produced a promising candidate locus on chromosomal region 18q12.1-21.33 and several additional putative susceptibility loci for XFS. The locus on chromosome 18 provides a solid starting point for the fine-scale mapping studies needed to identify variants conferring susceptibility to XFS in the region. A case-control and family-based association study and a family-based linkage study were performed to evaluate whether SNPs in the LOXL1 gene confer risk for XFS, XFG or POAG in Finnish patients. A significant association between LOXL1 gene SNPs and XFS and XFG was confirmed in the Finnish population. However, no association was detected with POAG. Other genetic and environmental factors are probably also involved in the pathogenesis of XFS and XFG.
Abstract:
Schizophrenia is a severe mental disorder affecting 0.4-1% of the population worldwide. It is characterized by impairments in the perception of reality and by significant social or occupational dysfunction. The disorder is one of the major contributors to the global burden of disease. Studies of twins, families, and adopted children point to strong genetic components for schizophrenia, but environmental factors also play a role in the pathogenesis of the disease. Molecular genetic studies have identified several potential positional candidate genes. The strongest evidence for putative schizophrenia susceptibility loci relates to the genes encoding dysbindin (DTNBP1) and neuregulin (NRG1), but studies lack impressive consistency in the precise genetic regions and alleles implicated. We studied the role of three potential candidate genes by genotyping 28 single nucleotide polymorphisms in the DTNBP1, NRG1, and AKT1 genes in a large schizophrenia family sample consisting of 441 families with 865 affected individuals from Finland. Our results do not support a major role for these genes in the pathogenesis of schizophrenia in Finland. We have previously identified a region on chromosome 5q21-34 as a susceptibility locus for schizophrenia in a Finnish family sample. Recently, two studies reported association between the γ-aminobutyric acid type A receptor gene cluster in this region and schizophrenia, and one study showed suggestive evidence for association with another regional gene encoding clathrin interactor 1 (CLINT1, also called Epsin 4 and ENTH). To further address the significance of these genes under the linkage peak in the Finnish families, we genotyped SNPs in these genes and observed a statistically significant association between GABRG2 variants and schizophrenia. Furthermore, these variants also seem to affect the functioning of working memory. Fetal events and obstetric complications are associated with schizophrenia. Rh incompatibility has been implicated as a risk factor for schizophrenia in several epidemiological studies. We conducted a family-based candidate-gene study that assessed the role of maternal-fetal genotype incompatibility at the RhD locus in schizophrenia. There was significant evidence for an RhD maternal-fetal genotype incompatibility, with the risk ratio estimated at 2.3. This is the first candidate-gene study to explicitly test for, and provide evidence of, a maternal-fetal genotype incompatibility mechanism in schizophrenia. In conclusion, in this thesis we found evidence that one GABA receptor subunit gene, GABRG2, is significantly associated with schizophrenia. Furthermore, its variants also seem to affect the functioning of working memory. In addition, RhD maternal-fetal genotype incompatibility increases the risk of schizophrenia approximately two-fold.
Abstract:
The issue of the usefulness of different prosopis species versus their status as weeds is a matter of hot debate around the world. By 2000, the tree Prosopis juliflora had been proclaimed weedy in its native range in South America and elsewhere in the dry tropics. P. juliflora, or mesquite, has a 90-year history in Sudan. During the early 1990s, popular opinion in central Sudan and the Sudanese Government began to consider prosopis a noxious weed and a problematic tree species due to its aggressive ability to invade farmlands and pastures, especially in and around irrigated agricultural lands. As a consequence, prosopis was officially declared an invasive alien species in Sudan as well, and in 1995 a presidential decree for its eradication was issued. Using a total economic valuation (TEV) approach, this study analysed the impacts of prosopis on local livelihoods in two contrasting irrigated agricultural schemes. A problem-based approach was primarily used, in which non-market values were derived using ecological economic tools. In the New Halfa Irrigation Scheme in Kassala State, four separate household surveys were conducted due to the diversity between the respective population groups. The main aim here was to study the magnitude of the environmental economic benefits and costs derived from the invasion of prosopis in a large agricultural irrigation scheme on clay soil. The second study site, the Gandato Irrigation Scheme in River Nile State, represented the impacts of prosopis on an irrigation scheme on sandy soil in the arid and semi-arid ecozones along the main River Nile. The two cases showed distinctly different effects of prosopis, but both indicated that the benefits exceeded the costs. The valuation on clay soil in New Halfa identified a benefit/cost ratio of 2.1, while this indicator equalled 46 on the sandy soils of Gandato. The valuation results were site-specific and based on local market prices. The most important beneficial impacts of prosopis on local livelihoods were derived from free-grazing forage for livestock, environmental conservation of the native vegetation, wood and non-wood forest products, as well as shelterbelt effects. The main social costs of prosopis arose from weeding and clearing it from farmlands and canalsides, from thorn injuries to humans and livestock, and from the expense of repairing vehicle tyre punctures. Of the population groups, the tenants faced most of the detrimental impacts, while the landless population groups (originating from western and eastern Sudan) as well as the nomads were highly dependent on this tree resource. For the Gandato site, the monetized benefit/cost ratio of 46 still excluded several additional beneficial impacts of prosopis in the area that were difficult to quantify and monetize credibly. In River Nile State the beneficial impacts could thus be seen as completely outweighing the costs of prosopis. The results can contribute to the formulation of national and local forest and agricultural policies related to prosopis in Sudan and can also be used in other countries faced with similar impacts caused by this tree.
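For readers unfamiliar with the metric, the benefit/cost ratios of 2.1 and 46 quoted above follow the usual total economic valuation arithmetic: the sum of the monetized benefits divided by the sum of the monetized costs for the same site and period. The notation below is a generic sketch, not the study's own formulation.

```latex
% Generic benefit/cost ratio used in total economic valuation (illustrative notation)
\[
  \mathrm{BCR} \;=\; \frac{\sum_i B_i}{\sum_j C_j}, \qquad
  \mathrm{BCR} > 1 \;\Rightarrow\; \text{monetized benefits exceed monetized costs}
\]
% Reported site values: New Halfa (clay soil) BCR = 2.1; Gandato (sandy soil) BCR = 46
```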
Abstract:
Old trees growing in urban environments are often felled due to symptoms of mechanical defects that could be hazardous to people and property. The decisions concerning these removals are justified by risk assessments carried out by tree care professionals. The major motivation for this study was to determine the most common profiles of potential hazard characteristics for the three most common urban tree genera in Helsinki City: Tilia, Betula and Acer, and in this way improve management practices and the protection of old amenity trees. For this research, material from approximately 250 urban trees was collected in cooperation with the City of Helsinki Public Works Department during 2001-2004. Of the total number of trees sampled, approximately 70% were defined as hazardous. The tree species had characteristic features as potential hazard profiles. For Tilia trees, hollowed heartwood with low fungal activity and advanced decay caused by Ganoderma lipsiense were the two most common profiles. In Betula spp., the primary reason for tree removal was usually lowered amenity value in terms of decline of the crown. Internal cracks, most often due to weak fork formation, were common causes of potential failure in Acer spp. Decay caused by Rigidoporus populinus often increased the risk of stem breakage in these Acer trees. Of the decay fungi observed, G. lipsiense was most often the reason for an increased risk of stem collapse. Other fungi that also caused extensive decay were R. populinus, Inonotus obliquus, Kretzschmaria deusta and Phellinus igniarius. The most common decay fungi in terms of incidence were Pholiota spp., but decay caused by these species had a low potential for causing stem breakage, because it rarely extended to the cambium. The various evaluations used in the study revealed inconsistencies in felling decisions for trees displaying different stages of decay. For the protection of old urban trees, it is crucial to develop monitoring methods so that tree care professionals can better analyse the rate of decay progression towards the sapwood and separate trees with decreasing amounts of sound wood from those with decay restricted to the heartwood area.
Abstract:
Spring barley is the most important crop in Finland based on cultivated land area. Net blotch, a disease caused by Pyrenophora teres Drech., is the most damaging disease of barley in Finland. The pressure to improve the economics and efficiency of agriculture has increased the need for more efficient plant protection methods. Development of durable host-plant resistance to net blotch is a promising possibility. However, deployment of disease-resistant crops could impose selection pressure on the pathogen (P. teres) population. The aim of this study was to understand the population biology of P. teres and to estimate the evolutionary potential of P. teres under the selective pressures following deployment of resistance genes and application of fungicides. The study included mainly Finnish P. teres isolates; population samples from Russia and Australia were also included. Using AFLP markers, substantial genotypic variation in P. teres populations was identified. Differences among isolates were smallest within Finnish fields and significantly higher in Krasnodar, Russia. Genetic differentiation was identified among populations from northern Europe and from Australia, and between the two forms P. teres f. teres (PTT, net form of net blotch) and P. teres f. maculata (PTM, spot form of net blotch) in Australia. Differentiation among populations was also identified based on virulence, between the Finnish and Russian populations, and based on prochloraz (fungicide) tolerance, in the Häme region in Finland. Surprisingly, only PTT was recovered from Finland and Russia, although both forms were previously equally common in Finland. The reason for the shift in the occurrence of the forms in Finland remained uncertain. Both forms were found within several fields in Australia. Sexual reproduction of P. teres was supported by the recovery of both mating types in equal ratios in these areas, although the prevalence of sexual mating appears to be lower in Finland than in Australia. The population from Krasnodar was an exception, since only one mating type was found there. Based on its substantially high genotypic variation, the Krasnodar population was suggested to represent an old P. teres population, whereas the Australian samples were suggested to represent newer populations. In conclusion, P. teres populations are differentiated at several levels. Human assistance in the dispersal of P. teres on infected barley seed is obvious and decreases the differentiation among populations. This can increase the plant protection problems caused by this pathogen. P. teres is capable of sexual reproduction in several areas, but its prevalence varies. Based on these findings, it is apparent that P. teres has the potential to pose more serious problems in barley cultivation if plant protection is neglected. Therefore, good agricultural practices, including crop rotation and the use of healthy seed, are recommended.
Abstract:
My work describes two sectors of the human bacterial environment: 1. The sources of exposure to infectious non-tuberculous mycobacteria. 2. Bacteria in dust, reflecting the airborne bacterial exposure in environments protecting from or predisposing to allergic disorders. Non-tuberculous mycobacteria (NTM) are transmitted to humans and animals from the environment. Infections by NTM in Finland have increased during the past decade and now exceed those caused by Mycobacterium tuberculosis. Among farm animals, porcine mycobacteriosis is the predominant NTM disease in Finland. Symptoms of mycobacteriosis are found in 0.34% of slaughtered pigs. Soil and drinking water are suspected as sources for humans, and bedding materials as sources for pigs. To obtain quantitative data on the sources of human and porcine NTM exposure, methods for quantitation of environmental NTM are needed. We developed a quantitative real-time PCR (qPCR) method utilizing primers targeted at the 16S rRNA gene of the genus Mycobacterium. With this method, I found high contents of mycobacterial DNA, 10⁶ to 10⁷ genome equivalents per gram, in Finnish sphagnum peat, sandy soils and mud. A similar result was obtained by a method based on Mycobacterium-specific hybridization of 16S rRNA. Since rRNA is found mainly in live cells, this result shows that the DNA detected by qPCR mainly represented live mycobacteria. Next, I investigated the occurrence of environmental mycobacteria in bedding materials obtained from 5 pig farms with a high prevalence (>4%) of mycobacteriosis. When I used the same qPCR methods for quantification as for the soils, I found that the piggery samples contained non-mycobacterial DNA that was amplified in spite of several mismatches with the primers. I therefore improved the qPCR assay by designing Mycobacterium-specific detection probes. Using the probe qPCR assay, I found 10⁵ to 10⁷ genome equivalents of mycobacterial DNA in unused bedding materials and up to 1000-fold more in bedding collected after use in the piggery. This result shows that there was a source of mycobacteria in the bedding materials purchased by the piggery and that mycobacteria increased in the bedding materials during use in the piggery. Allergic diseases have reached epidemic proportions in urbanized countries. At the same time, childhood in a rural environment or in simple living conditions appears to protect against allergic disorders. Exposure to immunoreactive microbial components in rural environments seems to prevent allergies. I searched for differences in the bacterial communities of two indoor dusts: an urban house dust shown to possess immunoreactivity of the TH2 type and a farm barn dust with TH1 activity. The immunoreactivities of the dusts were revealed by my collaborators, in vitro in human dendritic cells and in vivo in mice. The dusts had accumulated for over 10 years in the respiratory zone (>1.5 m above the floor), thus reflecting the long-term content of airborne bacteria at the two sites. I investigated these dusts by cloning and sequencing bacterial 16S rRNA genes from the DNA contained in the dust. From the TH2-active urban house dust, I isolated 139 16S rRNA gene clones. The most prevalent genera among the clones were Corynebacterium (5 species, 34 clones), Streptococcus (8 species, 33 clones), Staphylococcus (5 species, 9 clones) and Finegoldia (1 species, 9 clones). Almost all of these species are known as colonizers of the human skin and oral cavity.
Species of Corynebacterium and Streptococcus have been reported to contain anti-inflammatory lipoarabinomannans and immunoreactive beta-glucans, respectively. Streptococcus mitis, found in the urban house dust, is known as an inducer of TH2-polarized immunity, characteristic of allergic disorders. I isolated 152 DNA clones from the TH1-active farm barn dust and found species quite different from those found in the urban house dust. Among others, I found DNA clones representing Bacillus licheniformis, Acinetobacter lwoffii and Lactobacillus, each of which has recently been reported to possess anti-allergy immunoreactivity. Moreover, the farm barn dust contained dramatically higher bacterial diversity than the urban house dust. Exposure to this dust thus stimulated the human dendritic cells with multiple microbial components; such stimulation has been reported to promote TH1 immunity. The biodiversity in dust may thus be connected to its immunoreactivity. Furthermore, the bacterial biomass in the farm barn dust consisted mainly of live, intact bacteria. In the urban house dust, only ~1% of the biomass appeared as intact bacteria, as judged by microscopy. Fragmented microbes may possess bioactivity different from that of intact cells; this was recently shown for moulds. If this is also valid for bacteria, the different immunoreactivities of the two dusts may be explained by the intactness of the dust-borne bacteria. Based on these results, we propose three factors potentially contributing to the polarized immunoreactivities of the two dusts: (i) the species composition, (ii) the biodiversity and (iii) the intactness of the dust-borne bacterial biomass. The risk of childhood atopic diseases is four-fold lower in Russian Karelia than in Finnish Karelia. This difference across the country border is not explainable by different geo-climatic factors or genetic susceptibilities of the two populations; instead, the explanation must be lifestyle-related. It has already been reported that the microbiological quality of drinking water differs on the two sides of the border. In collaboration with allergists, I investigated dusts collected from homes in Russian Karelia and in Finnish Karelia. I found that bacterial 16S rRNA genes cloned from the Russian Karelian dusts (10 homes, 234 clones) predominantly represented Gram-positive taxa (the phyla Actinobacteria and Firmicutes, 67%). The Russian Karelian dusts contained nine-fold more muramic acid (60 to 70 ng mg⁻¹) than the Finnish Karelian dusts (3 to 11 ng mg⁻¹). Among the DNA clones isolated from the Finnish side (n=231), Gram-negative taxa (40%) outnumbered the Gram-positive taxa (34%). Of the 465 DNA clones isolated from the Karelian dusts, 242 were assigned to cultured, validly described bacterial species. In Russian Karelia, animal-associated species, e.g. Staphylococcus and Macrococcus, were numerous (27 clones, 14 unique species). This finding may connect to the difference in the prevalence of allergy, as childhood contact with pets and farm animals has been connected with low allergy risk. Plant-associated bacteria and plant-borne 16S rRNA genes (chloroplast) were frequent among the DNA clones isolated from Finnish Karelia, indicating components originating from plants.
In conclusion, my work revealed three major differences between the bacterial communities in the Russian and the Finnish Karelian homes: (i) the high prevalence of Gram-positive bacteria on the Russian side and of Gram-negative bacteria on the Finnish side, (ii) the rich presence of animal-associated bacteria on the Russian side, and (iii) the prevalence of plant-associated bacteria on the Finnish side. One or several of these factors may connect to the differences in the prevalence of allergy.
Abstract:
African indigenous foods have received limited research attention. Most of these indigenous foods are fermented, and they form part of the rich nutritional culture of many groups in African countries. The industrialisation and commercialisation of these indigenous African fermented foods should be preceded by thorough scientific knowledge of their processing, which can be vital in the elimination of hunger and poverty. This study highlighted emerging developments in, and the microbiology of, cereal-based and cassava-based food products that constitute a major part of the human diet in most African countries. In addition, investigations were carried out on the coagulant of the Calotropis procera plant used in traditional production of Nigerian Wara cheese and on the effects of adding a nisin-producing Lactococcus lactis strain originating from human milk to Nigerian Wara cheese. Fermented cereal-based foods such as ogi utilize popular and readily available African grains (maize, millet or sorghum) as substrates and are popular as weaning foods for infants. In this study, the bulkiness caused by starch gelatinization was overcome by amylase treatments in an investigation of cooked and fermented oat bran porridge; a similar treatment could reduce the viscosity of any cereal porridge. The properties of Sodom apple leaf (Calotropis procera) extract in cheesemaking were also studied. The C. procera coagulant was affected by monovalent (K+ and Na+) and divalent (Mg2+ and Ca2+) cations during coagulation. The rennet strength of this coagulant was found to be 7% of that of animal rennet at 35 °C; increasing the incubation temperature to 70 °C increased the rennet strength 28-fold. The molecular weight of the partially purified protease was determined by SDS-PAGE and confirmed by zymography to be approximately 60 kilodaltons. The high proteolytic activity at 70 °C supports the suitability of the protease as a coagulant in future commercial production of Nigerian Wara cheese. It was also possible to extend the shelf life of Wara cheese with a nisin-producing lactic acid bacterium, Lactococcus lactis LAC309. The levels of nisin in both the whey and curd fractions of Wara were investigated; the results showed a 3-log reduction of toxigenic Bacillus licheniformis spiked onto Wara after 3 days. These studies are the first in Finland to promote the advancement of scientific knowledge of African foods. Recognizing these indigenous food products and efficiently transferring technology from developed countries to industrialize them are necessary steps towards a successful realization of the United Nations Millennium Development Program.
Abstract:
Phase diagrams for the Tm2O3-H2O-CO2, Yb2O3-H2O-CO2 and Lu2O3-H2O-CO2 systems at 650 and 1300 bars have been investigated in the temperature range of 100–800°C. The phase diagrams are far more complex than those for the lighter lanthanides. The stable phases are Ln(OH)3, Ln2(CO3)3·3H2O (tengerite phase), orthorhombic-LnOHCO3, hexagonal-Ln2O2CO3, LnOOH and cubic-Ln2O3. Ln(OH)3 is stable only at very low partial pressures of CO2. Additional phases stabilised are Ln2O(OH)2CO3 and Ln6(OH)4(CO3)7, which are absent in the lighter lanthanide systems. Other phases, isolated in the presence of minor alkali impurities, are Ln6O2(OH)8(CO3)3, Ln4(OH)6(CO3)3 and Ln12O7(OH)10(CO3)6. The chemical equilibria prevailing in these hydrothermal systems may be best explained on the basis of the four-fold classification of the lanthanides.
Abstract:
Rural income diversification has been found to be the norm rather than the exception in developing countries. Smallholder households tend to diversify their income sources because of the need to manage risks, secure a smooth flow of income, allocate surplus labour, respond to various kinds of market failures, and apply coping strategies. The Agricultural Household Model provides a theoretical rationale for income diversification in that rural households aim at maximising their utility. There are several elements involved, such as agricultural production for the household's own consumption and for markets, leisure, and income from non-farm sources. The aim of the present study is to enhance understanding of the processes of rural income generation and diversification in eastern Zambia. Specifically, it explores the relationship between household characteristics, asset endowments and income-generation patterns. According to the sustainable rural livelihoods framework, the assets a household possesses shape its capacity to seize new economic opportunities. The study is based on two surveys conducted among rural smallholder households in four districts of Eastern Province in Zambia in 1985/86 and 2003. Sixty-seven of the interviewed households were present in both surveys, and this panel allows comparison between the two points in time. The initial descriptive analysis is complemented with an econometric analysis of the relationships between household assets and income sources. The results show that, on average, 30 per cent of the households' income originated from sources outside their own agriculture. There was a slight increase in the proportion of non-farm income from 1985/86 to 2003, but total income clearly declined, mainly on account of diminishing crop income. The land area the household was able to cultivate, which is often dependent on the available labour, was the most significant factor affecting both the household income level and the diversification pattern. Diversification was, in most cases, a coping strategy rather than a voluntary choice. Measured as income per capita per day, all households were below the poverty line in 2003. The agricultural reforms in Zambia, combined with other trends such as changes in the rainfall pattern, the worsening livestock situation and the incidence of human disease, had a negative impact on agricultural productivity and income between 1985/86 and 2003. Sources of non-farm income were closely linked to agriculture, either upstream or downstream, and the income they generated was not enough to compensate for the decline in agricultural income. Household assets and characteristics had a smaller impact on diversification patterns than expected, which could reflect the lack of opportunities in the remote rural environment.
Abstract:
Options for the integrated management of white blister (caused by Albugo candida) of Brassica crops include the use of well timed overhead irrigation, resistant cultivars, programs of weekly fungicide sprays or strategic fungicide applications based on the disease risk prediction model, Brassica(spot)(TM). Initial systematic surveys of radish producers near Melbourne, Victoria, indicated that crops irrigated overhead in the morning (0800-1200 h) had a lower incidence of white blister than those irrigated overhead in the evening (2000-2400 h). A field trial was conducted from July to November 2008 on a broccoli crop located west of Melbourne to determine the efficacy and economics of different practices used for white blister control, modifying irrigation timing, growing a resistant cultivar and timing spray applications based on Brassica(spot)(TM). Growing the resistant cultivar, 'Tyson', instead of the susceptible cultivar, 'Ironman', reduced disease incidence on broccoli heads by 99 %. Overhead irrigation at 0400 h instead of 2000 h reduced disease incidence by 58 %. A weekly spray program or a spray regime based on either of two versions of the Brassica(spot)(TM) model provided similar disease control and reduced disease incidence by 72 to 83 %. However, use of the Brassica(spot)(TM) models greatly reduced the number of sprays required for control from 14 to one or two. An economic analysis showed that growing the more resistant cultivar increased farm profit per ha by 12 %, choosing morning irrigation by 3 % and using the disease risk predictive models compared with weekly sprays by 15 %. The disease risk predictive models were 4 % more profitable than the unsprayed control.
Abstract:
Multi- and intralake datasets of fossil midge assemblages in surface sediments of small shallow lakes in Finland were studied to determine the most important environmental factors explaining trends in midge distribution and abundance. The aim was to develop palaeoenvironmental calibration models for the most important environmental variables for the purpose of reconstructing past environmental conditions. The developed models were applied to three high-resolution fossil midge stratigraphies from southern and eastern Finland to interpret environmental variability over the past 2000 years, with special focus on the Medieval Climate Anomaly (MCA), the Little Ice Age (LIA) and recent anthropogenic changes. The midge-based results were compared with physical properties of the sediment, historical evidence and environmental reconstructions based on diatoms (Bacillariophyta), cladocerans (Crustacea: Cladocera) and tree rings. The results showed that the most important environmental factor controlling midge distribution and abundance along a latitudinal gradient in Finland was the mean July air temperature (TJul). However, when the dataset was environmentally screened to include only pristine lakes, water depth at the sampling site became more important. Furthermore, when the dataset was geographically scaled to southern Finland, hypolimnetic oxygen conditions became the dominant environmental factor. The results from an intralake dataset from eastern Finland showed that the most important environmental factors controlling midge distribution within a lake basin were river contribution, water depth and submerged vegetation patterns. In addition, the results of the intralake dataset showed that the fossil midge assemblages represent fauna that lived in close proximity to the sampling sites, thus enabling the exploration of within-lake gradients in midge assemblages. Importantly, this within-lake heterogeneity in midge assemblages may have effects on midge-based temperature estimations, because samples taken from the deepest point of a lake basin may infer considerably colder temperatures than expected, as shown by the present test results. Therefore, it is suggested here that the samples in fossil midge studies involving shallow boreal lakes should be taken from the sublittoral, where the assemblages are most representative of the whole lake fauna. Transfer functions between midge assemblages and the environmental forcing factors that were significantly related with the assemblages, including mean air TJul, water depth, hypolimnetic oxygen, stream flow and distance to littoral vegetation, were developed using weighted averaging (WA) and weighted averaging-partial least squares (WA-PLS) techniques, which outperformed all the other tested numerical approaches. Application of the models in downcore studies showed mostly consistent trends. Based on the present results, which agreed with previous studies and historical evidence, the Medieval Climate Anomaly between ca. 800 and 1300 AD in eastern Finland was characterized by warm temperature conditions and dry summers, but probably humid winters. The Little Ice Age (LIA) prevailed in southern Finland from ca. 1550 to 1850 AD, with the coldest conditions occurring at ca. 1700 AD, whereas in eastern Finland the cold conditions prevailed over a longer time period, from ca. 1300 until 1900 AD. The recent climatic warming was clearly represented in all of the temperature reconstructions. 
In terms of long-term climatology, the present results support the concept that the North Atlantic Oscillation (NAO) index has a positive correlation with winter precipitation and annual temperature and a negative correlation with summer precipitation in eastern Finland. In general, the results indicate a relatively warm climate with dry summers but snowy winters during the MCA, and a cool climate with rainy summers and dry winters during the LIA. The results of the present reconstructions and the forthcoming applications of the models can be used in assessments of long-term environmental dynamics to refine the understanding of past environmental reference conditions and natural variability required by environmental scientists, ecologists and policy makers to make decisions concerning the presently occurring global, regional and local changes. The developed midge-based models for temperature, hypolimnetic oxygen, water depth, littoral vegetation shift and stream flow, presented in this thesis, are open for scientific use on request.
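The weighted-averaging calibration mentioned above has a standard two-step form, sketched below for the July air temperature variable. These are the textbook WA estimators in generic notation, not the specific coefficients of the models developed in this thesis.

```latex
% Weighted-averaging (WA) transfer function, generic form.
% y_ik : abundance of midge taxon k in calibration lake i;  x_i : observed T_Jul of lake i.
% Step 1: taxon optima as abundance-weighted means of the environmental variable.
\[
  \hat{u}_k = \frac{\sum_{i=1}^{n} y_{ik}\, x_i}{\sum_{i=1}^{n} y_{ik}}
\]
% Step 2: reconstruction for a fossil sample (abundances y_0k) as the weighted mean
% of the taxon optima, followed in practice by a deshrinking regression.
\[
  \hat{x}_0 = \frac{\sum_{k=1}^{m} y_{0k}\, \hat{u}_k}{\sum_{k=1}^{m} y_{0k}}
\]
```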
Abstract:
The homogeneous serine hydroxymethyltransferase from monkey liver was optimally active at 60°C, and the Arrhenius plot for the enzyme was nonlinear with a break at 15°C. The monkey liver enzyme showed a high thermal stability (62°C), as monitored by circular dichroism at 222 nm, absorbance at 280 nm and enzyme activity. The enzyme exhibited a sharp, co-operative thermal transition in the range of 50°-70°C (Tm = 65°C), as monitored by circular dichroism. L-Serine protected the enzyme against both thermal inactivation and thermal disruption of its secondary structure. The homotropic interactions of tetrahydrofolate with the enzyme were abolished at high temperatures (at 70°C, the Hill coefficient value was 1.0). A plot of h values versus assay temperature from the tetrahydrofolate saturation experiments showed the presence of an intermediate conformer with an h value of 1.7 in the temperature range of 45°-60°C. Inclusion of a heat denaturation step in the scheme employed for the purification of serine hydroxymethyltransferase resulted in the loss of co-operative interactions with tetrahydrofolate. The temperature effects on serine hydroxymethyltransferase, reported here for the first time, lead to a better understanding of the heat-induced alterations in conformation and activity of this oligomeric protein.
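The Hill coefficients (h) quoted above are the cooperativity parameter of the standard Hill equation fitted to the tetrahydrofolate saturation data; the form below uses generic notation and is not the authors' exact parameterisation.

```latex
% Hill equation for substrate saturation (here [S] = tetrahydrofolate concentration)
% v : initial velocity;  V_max : maximal velocity;  S_0.5 : half-saturating concentration;
% h : Hill coefficient (h > 1 cooperative; h = 1 non-cooperative, as observed at 70 C)
\[
  v = \frac{V_{\max}\,[S]^{\,h}}{S_{0.5}^{\,h} + [S]^{\,h}}
\]
```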