18 results for seedling vigor and speed of germination index

in Helda - Digital Repository of University of Helsinki


Abstract:

Type 2 diabetes is one of the diseases that are largely determined by lifestyle factors. Coffee is one of the most consumed beverages in the world, and recently released data suggest that coffee consumption may affect the risk of type 2 diabetes. The objective of the present study was to evaluate the effects of habitual coffee consumption on various aspects of type 2 diabetes and its most common complications. This study is part of the national FINRISK studies. Baseline surveys were carried out between 1972 and 1997. The surveys covered two eastern regions in 1972 and 1977, but were expanded to include a third region in southwestern Finland in 1982, 1987, 1992, and 1997. The Helsinki capital area was included in the survey in 1992 and 1997, and the Oulu province, in northern Finland, in 1997. Each survey was drawn from an independent random sample of the national register of subjects aged 25-64. In 1997, an additional sample of subjects aged 65-74 was included. The blood pressure, weight, and height of subjects were measured. Data on medical history, socioeconomic factors, physical activity, smoking habits, and alcohol, coffee, and tea consumption were collected by self-administered questionnaires. Higher coffee consumption was associated with higher body mass index, occupational physical activity, and cigarette smoking, and with lower blood pressure, education level, leisure-time physical activity, tea consumption, and alcohol use. Age, body mass index, systolic blood pressure, and current smoking were positively associated with the risk of type 2 diabetes, whereas education and occupational, commuting, and leisure-time physical activity were inversely associated. A significant inverse association between coffee consumption and the risk of type 2 diabetes was found in both sexes, but the association was stronger in women. Coffee consumption was significantly and inversely associated with fasting glucose, 2-hour plasma glucose, fasting insulin, impaired fasting glucose, impaired glucose regulation, and hyperinsulinemia among both men and women, and with isolated impaired glucose tolerance among women. Serum gamma-glutamyltransferase modified the association between coffee consumption and incident diabetes. Among subjects with high serum gamma-glutamyltransferase (>75th percentile), coffee consumption showed an inverse association for women, as well as for men and women combined. An inverse association also occurred between coffee consumption and the risk of total, cardiovascular disease, and coronary heart disease mortality among patients with type 2 diabetes. The results of this study showed that habitual coffee consumption may be associated with a reduced risk of type 2 diabetes. Coffee consumption may have some effects on several markers of glycemia, and may lower the incidence of type 2 diabetes among subjects with high-normal serum gamma-glutamyltransferase levels. Total, cardiovascular disease, and coronary heart disease mortality among subjects with type 2 diabetes may also be reduced by coffee consumption.
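Cohort associations of this kind are typically estimated with Cox proportional hazards regression. The sketch below is a minimal, hypothetical illustration using the lifelines library on synthetic data; the covariate names, effect sizes, and follow-up scheme are invented for demonstration and are not the study's actual model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic cohort: baseline covariates, follow-up time, and diabetes event.
# All numbers are invented for illustration only.
rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "coffee_cups": rng.integers(0, 10, n),   # cups/day
    "age":         rng.uniform(25, 64, n),
    "bmi":         rng.normal(26, 4, n),
    "smoker":      rng.integers(0, 2, n),
})
# Simulate event times with a protective coffee effect (hazard ratio < 1).
hazard = 0.01 * np.exp(0.03 * (df["age"] - 45)
                       + 0.08 * (df["bmi"] - 26)
                       + 0.3 * df["smoker"]
                       - 0.07 * df["coffee_cups"])
event_time = rng.exponential(1 / hazard)
df["followup_years"] = np.minimum(event_time, 12.0)   # censor at 12 years
df["diabetes"] = (event_time <= 12.0).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="diabetes")
cph.print_summary()  # hazard ratio per covariate; coffee HR < 1 = inverse association
```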

Abstract:

Type 2 diabetes is an increasing, serious, and costly public health problem. The increase in the prevalence of the disease can mainly be attributed to changing lifestyles leading to physical inactivity, overweight, and obesity. These lifestyle-related risk factors also offer a possibility for preventive interventions. Until recently, proper evidence regarding the prevention of type 2 diabetes has been virtually missing. To be cost-effective, intensive interventions to prevent type 2 diabetes should be directed at people at increased risk of the disease. The aim of this series of studies was to investigate whether type 2 diabetes can be prevented by lifestyle intervention in high-risk individuals, and to develop a practical method to identify individuals who are at high risk of type 2 diabetes and would benefit from such an intervention. To study the effect of lifestyle intervention on diabetes risk, we recruited 522 volunteer, middle-aged (aged 40-64 at baseline), overweight (body mass index > 25 kg/m²) men (n = 172) and women (n = 350) with impaired glucose tolerance into the Diabetes Prevention Study (DPS). The participants were randomly allocated either to the intensive lifestyle intervention group or to the control group. The control group received general dietary and exercise advice at baseline and had an annual physician's examination. The participants in the intervention group received, in addition, individualised dietary counselling from a nutritionist. They were also offered circuit-type resistance training sessions and were advised to increase their overall physical activity. The intervention goals were to reduce body weight (a 5% or greater reduction from baseline weight), limit dietary fat (< 30% of total energy consumed) and saturated fat (< 10% of total energy consumed), and increase dietary fibre intake (15 g/1000 kcal or more) and physical activity (≥ 30 minutes/day). Diabetes status was assessed annually by repeated 75 g oral glucose tolerance testing. The first analysis of end-points was completed after a mean follow-up of 3.2 years, and the intervention phase was terminated after a mean duration of 3.9 years. After that, the study participants continued to visit the study clinics for the annual examinations, for a mean of 3 years. The intervention group showed significantly greater improvement in each intervention goal. After 1 and 3 years, mean weight reductions were 4.5 and 3.5 kg in the intervention group and 1.0 and 0.9 kg in the control group. Cardiovascular risk factors improved more in the intervention group. After a mean follow-up of 3.2 years, the risk of diabetes was reduced by 58% in the intervention group compared with the control group. The reduction in the incidence of diabetes was directly associated with the achievement of the lifestyle goals. Furthermore, those who consumed a moderate-fat, high-fibre diet achieved the largest weight reduction and, even after adjustment for weight reduction, had the lowest diabetes risk during the intervention period. After discontinuation of the counselling, the differences in lifestyle variables between the groups still remained favourable for the intervention group. During the post-intervention follow-up period of 3 years, the risk of diabetes was still 36% lower among the former intervention group participants compared with the former control group participants.
To develop a simple screening tool to identify individuals at high risk of type 2 diabetes, follow-up data from two population-based cohorts of 35-64-year-old men and women were used. The National FINRISK Study 1987 cohort (model development data) included 4435 subjects, with 182 new drug-treated cases of diabetes identified during ten years, and the FINRISK Study 1992 cohort (model validation data) included 4615 subjects, with 67 new cases of drug-treated diabetes during five years, ascertained using the Social Insurance Institution's Drug Register. Baseline age, body mass index, waist circumference, history of antihypertensive drug treatment and high blood glucose, physical activity, and daily consumption of fruits, berries, or vegetables were selected into the risk score as categorical variables. In the 1987 cohort, the optimal cut-off point of the risk score identified 78% of those who developed diabetes during the follow-up (the sensitivity of the test) and 77% of those who remained free of diabetes (the specificity of the test). In the 1992 cohort, the risk score performed equally well. The final Finnish Diabetes Risk Score (FINDRISC) form includes, in addition to the predictors of the model, a question about family history of diabetes and the age category of over 64 years. When applied to the DPS population, the baseline FINDRISC value was associated with diabetes risk among the control group participants only, indicating that the intensive lifestyle intervention given to the intervention group participants abolished the diabetes risk associated with baseline risk factors. In conclusion, the intensive lifestyle intervention produced long-term beneficial changes in diet, physical activity, body weight, and cardiovascular risk factors, and reduced diabetes risk. Furthermore, the effects of the intervention were sustained after the intervention was discontinued. FINDRISC proved to be a simple, fast, inexpensive, non-invasive, and reliable tool for identifying individuals at high risk of type 2 diabetes. The use of FINDRISC to identify high-risk subjects, followed by lifestyle intervention, provides a feasible scheme for preventing type 2 diabetes that could be implemented in the primary health care system.
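As an illustration of how a categorical risk score of this kind is applied and evaluated, the sketch below scores subjects on FINDRISC-style items (the predictors listed above) and computes sensitivity and specificity at a chosen cut-off. The point values and the cut-off here are invented placeholders, not the published FINDRISC weights.

```python
from dataclasses import dataclass

@dataclass
class Subject:
    age: int
    bmi: float
    waist_cm: float
    on_bp_drugs: bool
    ever_high_glucose: bool
    active_30min_day: bool
    daily_vegetables: bool
    developed_diabetes: bool  # outcome during follow-up

def risk_score(s: Subject) -> int:
    """FINDRISC-style categorical score with illustrative (not official) points."""
    pts = 0
    pts += 0 if s.age < 45 else 2 if s.age < 55 else 3
    pts += 0 if s.bmi < 25 else 1 if s.bmi < 30 else 3
    pts += 0 if s.waist_cm < 94 else 3 if s.waist_cm < 102 else 4
    pts += 2 if s.on_bp_drugs else 0
    pts += 5 if s.ever_high_glucose else 0
    pts += 0 if s.active_30min_day else 2
    pts += 0 if s.daily_vegetables else 1
    return pts

def sensitivity_specificity(cohort: list[Subject], cutoff: int) -> tuple[float, float]:
    """Fraction of cases flagged (sensitivity) and non-cases not flagged (specificity)."""
    cases = [s for s in cohort if s.developed_diabetes]
    noncases = [s for s in cohort if not s.developed_diabetes]
    sens = sum(risk_score(s) >= cutoff for s in cases) / len(cases)
    spec = sum(risk_score(s) < cutoff for s in noncases) / len(noncases)
    return sens, spec
```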

Abstract:

This study addresses three important issues in tree bucking optimization in the context of cut-to-length harvesting. (1) Would the fit between the log demand and log output distributions be better if the price and/or demand matrices controlling the bucking decisions on modern cut-to-length harvesters were adjusted to the unique conditions of each individual stand? (2) In what ways can we generate stand- and product-specific price and demand matrices? (3) What alternatives do we have to measure the fit between the log demand and log output distributions, and what would be an ideal goodness-of-fit measure? Three iterative search systems were developed for seeking stand-specific price and demand matrix sets: (1) a fuzzy logic control system for calibrating the price matrix of one log product for one stand at a time (the stand-level one-product approach); (2) a genetic algorithm system for adjusting the price matrices of one log product in parallel for several stands (the forest-level one-product approach); and (3) a genetic algorithm system for dividing the overall demand matrix of each of several log products into stand-specific sub-demands simultaneously for several stands and products (the forest-level multi-product approach). The stem material used for testing the performance of the stand-specific price and demand matrices against that of the reference matrices comprised 9 155 Norway spruce (Picea abies (L.) Karst.) sawlog stems gathered by harvesters from 15 mature spruce-dominated stands in southern Finland. The reference price and demand matrices were either direct copies or slightly modified versions of those used by two Finnish sawmilling companies. Two types of stand-specific bucking matrices were compiled for each log product: one from the harvester-collected stem profiles and the other from the pre-harvest inventory data. Four goodness-of-fit measures were analyzed for their appropriateness in determining the similarity between the log demand and log output distributions: (1) the apportionment degree (index), (2) the chi-square statistic, (3) the Laspeyres quantity index, and (4) the price-weighted apportionment degree. The study confirmed that any improvement in the fit between the log demand and log output distributions can only be realized at the expense of the log volumes produced. Stand-level pre-control of price matrices was found to be advantageous, provided the control is done with perfect stem data. Forest-level pre-control of price matrices resulted in no improvement in the cumulative apportionment degree. Cutting stands under the control of stand-specific demand matrices yielded a better total fit between the demand and output matrices at the forest level than was obtained by cutting each stand with non-stand-specific reference matrices. The theoretical and experimental analyses suggest that none of the three alternative goodness-of-fit measures clearly outperforms the traditional apportionment degree measure. Keywords: harvesting, tree bucking optimization, simulation, fuzzy control, genetic algorithms, goodness-of-fit
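For readers unfamiliar with the fit measures, the sketch below computes the two most common ones for a demand and an output distribution expressed as percentage shares over log length-diameter classes. The apportionment degree is taken here in its usual form (100 minus half the sum of absolute share differences); the example matrices are invented.

```python
import numpy as np

def apportionment_degree(demand: np.ndarray, output: np.ndarray) -> float:
    """Similarity of two distributions given as percentage shares summing to 100.
    100 = perfect fit, 0 = no overlap. Commonly defined as
    100 - 0.5 * sum(|d_i - o_i|)."""
    return 100.0 - 0.5 * np.abs(demand - output).sum()

def chi_square(demand: np.ndarray, output: np.ndarray) -> float:
    """Chi-square distance of output shares from demand shares (smaller = better).
    Cells with zero demand are skipped to avoid division by zero."""
    mask = demand > 0
    return (((output[mask] - demand[mask]) ** 2) / demand[mask]).sum()

# Invented demand/output shares over length-diameter classes, flattened to 1-D.
demand = np.array([[10, 15, 5], [20, 25, 25]], dtype=float).ravel()
output = np.array([[12, 10, 8], [18, 28, 24]], dtype=float).ravel()
print(apportionment_degree(demand, output))  # 92.0 for these shares
print(chi_square(demand, output))
```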

Abstract:

Standards have been put in place to regulate microbial and preservative contents in order to assure that foods are safe for the consumer. In the case of a food-related disease outbreak, it is crucial to be able to detect and identify the cause of the disease quickly and accurately. In addition, for everyday control of the microbial and preservative contents of food, the detection methods must be easy to perform on numerous food samples. In the present study, quicker alternative methods were studied for the identification of bacteria by DNA fingerprinting. A flow cytometry method was developed as an alternative to pulsed-field gel electrophoresis, the "golden method". DNA fragment sizing by an ultrasensitive flow cytometer was able to discriminate species and strains in a reproducible manner comparable to pulsed-field gel electrophoresis. The new method was hundreds of times faster and 200,000 times more sensitive. Additionally, another DNA fingerprinting identification method was developed based on single-enzyme amplified fragment length polymorphism (SE-AFLP). This method allowed the differentiation of genera, species, and strains of pathogenic bacilli, staphylococci, Yersinia, and Escherichia coli. The fingerprinting patterns obtained by SE-AFLP were simpler and easier to analyze than those obtained by traditional amplified fragment length polymorphism with double-enzyme digestion. Nisin (E234) is added as a preservative to different types of foods, especially dairy products, around the world. Various detection methods exist for nisin, but they lack sensitivity, speed, or specificity. In the present study, a sensitive nisin-induced green fluorescent protein (GFPuv) bioassay was developed using the Lactococcus lactis two-component signal system NisRK and the nisin-inducible nisA promoter. The bioassay was extremely sensitive, with a detection limit of 10 pg/ml in culture supernatant. In addition, it was suitable for quantification from various food matrices, such as milk, salad dressings, processed cheese, liquid eggs, and canned tomatoes. Wine has good antimicrobial properties due to its alcohol concentration, low pH, and organic content, and is therefore often assumed to be microbially safe to consume. Another aim of this thesis was to study the microbiota of wines returned by customers complaining of food-poisoning symptoms. By partial 16S rRNA gene sequence analysis, ribotyping, and a boar spermatozoa motility assay, one of the wines was found to contain Bacillus simplex BAC91, which produced a heat-stable substance toxic to the mitochondria of sperm cells. The antibacterial activity of wine was tested on the vegetative cells and spores of B. simplex BAC91, the B. cereus type strain ATCC 14579, and cereulide-producing B. cereus F4810/72. Although the vegetative cells and spores of B. simplex BAC91 were sensitive to the antimicrobial effects of wine, the spores of B. cereus strains ATCC 14579 and F4810/72 stayed viable for at least 4 months. According to these results, Bacillus spp., and more specifically their spores, can pose a risk to the wine consumer.
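Fingerprint patterns such as SE-AFLP profiles are often compared as presence/absence band sets. The sketch below computes the Dice similarity coefficient, a common choice for such comparisons; the band sizes are invented and this is only a generic illustration, not the analysis pipeline used in the thesis.

```python
def dice_similarity(bands_a: set[int], bands_b: set[int]) -> float:
    """Dice coefficient for two banding patterns given as sets of fragment
    sizes (bp). 1.0 = identical patterns, 0.0 = no shared bands."""
    if not bands_a and not bands_b:
        return 1.0
    shared = len(bands_a & bands_b)
    return 2.0 * shared / (len(bands_a) + len(bands_b))

# Invented SE-AFLP band sizes for two strains.
strain_1 = {120, 180, 240, 310, 450, 600}
strain_2 = {120, 180, 250, 310, 450, 720}
print(f"Dice similarity: {dice_similarity(strain_1, strain_2):.2f}")  # 0.67
```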

Abstract:

Despite improving levels of hygiene, the incidence of registered foodborne disease has remained at the same level for many years: in Finland, there have been 40 to 90 epidemics in which 1000-9000 persons contracted food poisoning through food or drinking water. Until 2004, salmonella and campylobacter were the most common bacterial causes of foodborne disease, but in 2005-2006 Bacillus cereus was the most common. A similar development has been reported, for example, in Germany already in the 1990s. One reason for this can be Bacillus cereus and its emetic toxin, cereulide. Bacillus cereus is a common environmental bacterium that contaminates raw materials of food. Unlike salmonella and campylobacter, Bacillus cereus is a heat-resistant bacterium capable of surviving most cooking procedures due to the production of highly thermoresistant spores. The food involved has usually been heat-treated, and surviving spores are the source of the food poisoning. The heat treatment induces germination of the spores, and the vegetative cells then produce toxins. This doctoral thesis focuses on developing methods for assessing and eliminating risks to food safety posed by cereulide-producing Bacillus cereus. The biochemistry and physiology of cereulide production were investigated, and the results were targeted to offer tools for minimizing the toxin risk in food during production. I developed methods for the extraction and quantitative analysis of cereulide directly from food. A prerequisite for that is knowledge of the chemical and physical properties of the toxin. Because cereulide is practically insoluble in water, I used organic solvents (methanol, ethanol, and pentane) for the extraction. For the extraction of bakery products, I used high temperature (100 °C) and pressure (103.4 bar). An alternative for effective extraction is to flood the plain food with ethanol, followed by stationary equilibration at room temperature. I used this protocol for extracting cereulide from potato puree and penne. Using this extraction method it is also possible to extract cereulide from liquid food, such as milk. These extraction methods are important improvements for the study of Bacillus cereus emetic food poisonings. Prior to my work, cereulide extraction was done using water; as a result, the yield was poor and variable. To investigate suspected food poisonings, it is important to show the actual toxicity of the incriminated food. Many toxins, but not cereulide, are inactivated during food processing such as heating. The next step is to identify the toxin by chemical methods. Together with my colleague Maria Andesson, I developed a rapid assay for the detection of cereulide toxicity within 5 to 15 minutes. By applying this test it is possible to rapidly detect which food caused the food poisoning. The chemical identification of cereulide was achieved using mass spectrometry. I used cereulide-specific molecular ions, m/z (±0.3) 1153.8 (M+H+), 1171.0 (M+NH4+), 1176.0 (M+Na+), and 1191.7 (M+K+), for reliable identification. I investigated foods to find out their amenability to accumulate cereulide. Cereulide was formed in high amounts (0.3 to 5.5 µg/g wet wt) when cereulide-producing B. cereus strains were present in beans, rice, rice pastry, and meat pastry stored at non-refrigerated temperatures (21-23 °C). Rice and meat pastries are frequently consumed under conditions where no cooled storage is available, e.g. picnics and outdoor events.
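The identification step described above amounts to matching observed mass spectrum peaks against the listed cereulide adduct ions within the stated ±0.3 m/z tolerance. The sketch below shows that matching logic; the observed peak list is invented for illustration.

```python
# Cereulide-specific adduct ions from the text, with the stated tolerance.
CEREULIDE_IONS = {
    "M+H+":   1153.8,
    "M+NH4+": 1171.0,
    "M+Na+":  1176.0,
    "M+K+":   1191.7,
}
TOLERANCE = 0.3  # +/- m/z window

def match_cereulide(observed_peaks: list[float]) -> dict[str, float]:
    """Return adducts whose expected m/z lies within tolerance of an observed peak."""
    hits = {}
    for adduct, expected in CEREULIDE_IONS.items():
        for mz in observed_peaks:
            if abs(mz - expected) <= TOLERANCE:
                hits[adduct] = mz
    return hits

# Invented spectrum: three of the four adducts present.
peaks = [512.3, 1153.9, 1170.8, 1191.6]
print(match_cereulide(peaks))  # {'M+H+': 1153.9, 'M+NH4+': 1170.8, 'M+K+': 1191.6}
```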
Bacillus cereus is a ubiquitous spore former and is therefore difficult to eliminate from foods. It is therefore important to know which conditions affect the formation of cereulide in foods. My research showed that the cereulide content was strongly (10- to 1000-fold differences in toxin content) affected by the growth environment of the bacterium. Storage of foods under a nitrogen atmosphere (> 99.5%) prevented the production of cereulide. But when carbon dioxide was also present, minimizing the oxygen content (< 1%) did not protect the food from the formation of cereulide in preliminary experiments. Food supplements also affected cereulide production, at least in the laboratory. Adding the free amino acids leucine and valine stimulated cereulide production 10- to 20-fold. In peptide-bonded form these amino acids are natural constituents of all proteins. Interestingly, adding peptide-bonded leucine and valine had no significant effect on cereulide production. The free amino acids leucine and valine are approved food supplements and widely used as flavour modifiers in food technology. My research showed that these food supplements may increase the risk of food poisoning even though they are not toxic themselves.

Abstract:

Rhizoctonia spp. are ubiquitous soil-inhabiting fungi that enter into pathogenic or symbiotic associations with plants. In general, Rhizoctonia spp. are regarded as plant-pathogenic fungi, and many cause root rot and other plant diseases, which results in considerable economic losses both in agriculture and forestry. Many Rhizoctonia strains enter into symbiotic mycorrhizal associations with orchids, and some hypovirulent strains are promising biocontrol candidates for preventing host plant infection by pathogenic Rhizoctonia strains. This work focuses on uninucleate and binucleate Rhizoctonia (UNR and BNR, respectively) strains belonging to the teleomorphic genus Ceratobasidium, but multinucleate Rhizoctonia (MNR) belonging to the teleomorphic genus Thanatephorus and ectomycorrhizal fungal species, such as Suillus bovinus, were also included in the DNA probe development work. Strain-specific probes were developed to target rDNA ITS (internal transcribed spacer) sequences (ITS1, 5.8S and ITS2) and applied in Southern dot blot and liquid hybridization assays. Liquid hybridization was more sensitive and the size of the hybridized PCR products could be detected simultaneously, but the advantage of Southern hybridization was that sample DNA could be used without additional PCR amplification. The impacts of four Finnish BNR Ceratorhiza sp. strains (251, 266, 268 and 269) on Scots pine (Pinus sylvestris) seedling growth were investigated, and the infection biology and infection levels were examined microscopically following trypan blue staining of infected roots. All BNR strains enhanced early seedling growth and affected the root architecture, while the infection levels remained low. The fungal infection was restricted to the outer cortical regions of long roots, and typical monilioid cells were detected with strain 268. The interactions of the pathogenic UNR Ceratobasidium bicorne strain 1983-111/1N and the endophytic BNR Ceratorhiza sp. strain 268 were studied in singly or dually inoculated Scots pine roots. The fungal infection levels and the host defence-gene activity of nine transcripts [phenylalanine ammonia lyase (pal1), stilbene synthase (STS), chalcone synthase (CHS), short-root-specific peroxidase (Psyp1), antimicrobial peptide gene (Sp-AMP), rapidly elicited defence-related gene (PsACRE), germin-like protein (PsGER1), CuZn superoxide dismutase (SOD), and dehydrin-like protein (dhy-like)] were measured from differentially treated and untreated control roots by quantitative real-time PCR (qRT-PCR). The infection level of the pathogenic UNR was restricted in BNR-pre-inoculated Scots pine roots, while UNR was more competitive in simultaneous dual infection. The STS transcript was highly up-regulated in all treated roots, while the CHS, pal1, and Psyp1 transcripts were more moderately activated. No significant activity of the Sp-AMP, PsACRE, PsGER1, SOD, or dhy-like transcripts was detected compared to control roots. The integrated experiments presented provide tools to assist in the future detection of these fungi in the environment and to understand the host infection biology and defence, and the relationships between these interacting fungi in roots and soils. This study further confirms the complexity of the Rhizoctonia group both phylogenetically and in infection biology and plant host specificity. The knowledge obtained could be applied in integrated forestry nursery management programmes.
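Transcript levels in qRT-PCR experiments like this are commonly expressed as fold changes relative to control roots with the Livak 2^-ddCt method. The sketch below shows that calculation; the Ct values and the reference gene are invented, and the thesis may have used a different quantification model.

```python
def fold_change_ddct(ct_target_treated: float, ct_ref_treated: float,
                     ct_target_control: float, ct_ref_control: float) -> float:
    """Relative expression (treated vs. control) by the Livak 2^-ddCt method.
    Ct = threshold cycle; a lower Ct means more transcript."""
    dct_treated = ct_target_treated - ct_ref_treated    # normalize to reference gene
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_treated - dct_control
    return 2.0 ** (-ddct)

# Invented Ct values: STS in infected vs. control roots, normalized
# to a hypothetical reference gene (e.g. actin).
print(fold_change_ddct(ct_target_treated=22.1, ct_ref_treated=18.0,
                       ct_target_control=27.4, ct_ref_control=18.2))
# ~34-fold up-regulation
```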

Abstract:

Defence against pathogens is a vital need of all living organisms that has led to the evolution of complex immune mechanisms. However, although immunocompetence (the ability to resist pathogens and control infection) has in recent decades become a focus for research in evolutionary ecology, the variation in immune function observed in natural populations is relatively little understood. This thesis examines the sources of this variation (environmental, genetic, and maternal effects) during the nestling stage and its fitness consequences in wild populations of passerines: the blue tit (Cyanistes caeruleus) and the collared flycatcher (Ficedula albicollis). A developing organism may face a dilemma as to whether to allocate limited resources to growth or to immune defences. The optimal level of investment in immunity is shaped inherently by the specific requirements of the environment. If the probability of contracting infection is low, maintaining high growth rates even at the expense of immune function may be advantageous for nestlings, as body mass is usually a good predictor of post-fledging survival. In experiments with blue tits and haematophagous hen fleas (Ceratophyllus gallinae) using two methods, methionine supplementation (to manipulate nestlings' resource allocation to cellular immune function) and food supplementation (to increase resource availability), I confirmed that there is a trade-off between growth and immunity and that the abundance of ectoparasites is an environmental factor affecting the allocation of resources to immune function. A cross-fostering experiment also revealed that environmental heterogeneity in terms of the abundance of ectoparasites may contribute to maintaining additive genetic variation in immunity and other traits. Animal model analysis of extensive data collected from the population of collared flycatchers on Gotland (Sweden) allowed examination of the narrow-sense heritability of PHA-response, the most commonly used index of cellular immunocompetence in avian studies. PHA-response is not heritable in this population, but is subject to a non-heritable origin (presumably maternal) effect. However, experimental manipulation of yolk androgen levels indicates that the mechanism of the maternal effect in PHA-response is not in ovo deposition of androgens. The relationship between PHA-response and recruitment was studied for over 1300 collared flycatcher nestlings. Multivariate selection analysis shows that it is body mass, not PHA-response, that is under direct selection. PHA-response appears to be related to recruitment because of its positive relationship with body mass. These results imply either that PHA-response fails to capture the immune mechanisms that are relevant for defence against pathogens encountered by fledglings or that the selection pressure from parasites is not as strong as commonly assumed.
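The thesis estimates narrow-sense heritability with an animal model (a pedigree-based mixed model). As a simpler illustration of the same quantity, the sketch below estimates h² as the slope of offspring trait values regressed on midparent values; the data are synthetic and this textbook method is a stand-in, not the analysis actually used.

```python
import numpy as np

def heritability_midparent(midparent: np.ndarray, offspring: np.ndarray) -> float:
    """Narrow-sense heritability h^2 estimated as the slope of the
    offspring-on-midparent regression (Falconer's classic relation)."""
    slope = np.cov(midparent, offspring)[0, 1] / np.var(midparent, ddof=1)
    return slope

# Synthetic example: simulate a trait with true h^2 = 0.4.
rng = np.random.default_rng(1)
n = 500
midparent = rng.normal(0.0, 1.0, n)
offspring = 0.4 * midparent + rng.normal(0.0, 1.0, n)  # additive + environment
print(f"h^2 estimate: {heritability_midparent(midparent, offspring):.2f}")
```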

Abstract:

The ongoing rapid fragmentation of tropical forests is a major threat to global biodiversity. This is because many of the tropical forests are so-called biodiversity 'hotspots', areas that host exceptional species richness and concentrations of endemic species. Forest fragmentation has negative ecological and genetic consequences for plant survival. Proposed reasons for the loss of plant species in forest fragments include abiotic edge effects, altered species interactions, increased genetic drift, and inbreeding depression. To be able to conserve plants in forest fragments, the ecological and genetic processes that threaten the species have to be understood. That is possible only after obtaining adequate information on their biology, including taxonomy, life history, reproduction, and the spatial and genetic structure of the populations. In this research, I focused on the African violet (genus Saintpaulia), a little-studied conservation flagship from the Eastern Arc Mountains and Coastal Forests hotspot of Tanzania and Kenya. The main objective of the research was to increase the understanding of the life history, ecology, and population genetics of Saintpaulia that is needed for the design of appropriate conservation measures. A further aim was to provide population-level insights into the difficult taxonomy of Saintpaulia. Ecological field work was conducted in a relatively little fragmented protected forest in the Amani Nature Reserve in the East Usambara Mountains in northeastern Tanzania, complemented by population genetic laboratory work and ecological experiments in Helsinki, Finland. All components of the research were conducted with Saintpaulia ionantha ssp. grotei, which forms a taxonomically controversial population complex in the study area. My results suggest that Saintpaulia has good reproductive performance in forests with low disturbance levels in the East Usambara Mountains. Another important finding was that seed production depends on sufficient pollinator service. The availability of pollinators should thus be considered in the in situ management of threatened populations. Dynamic population stage structures were observed, suggesting that the studied populations are demographically viable. High mortality of seedlings and juveniles was observed during the dry season, but this was compensated by ample recruitment of new seedlings after the rainy season. Reduced tree canopy closure and substrate quality are likely to exacerbate seedling and juvenile mortality, and, therefore, forest fragmentation and disturbance are serious threats to the regeneration of Saintpaulia. Restoration of sufficient shade to enhance seedling establishment is an important conservation measure in populations located in disturbed habitats. Long-term demographic monitoring, which enables the forecasting of a population's future, is also recommended in disturbed habitats. High genetic diversities were observed in the populations, which suggests that they possess the variation needed for evolutionary responses in a changing environment. Thus, genetic management of the studied populations does not seem necessary as long as the habitats remain favourable for Saintpaulia. The observed high levels of inbreeding in some of the populations, and the reduced fitness of the inbred progeny compared to the outbred progeny, as revealed by the hand-pollination experiment, indicate that inbreeding and inbreeding depression are potential mechanisms contributing to the extinction of Saintpaulia populations.
The relatively weak genetic divergence of the three different morphotypes of Saintpaulia ionantha ssp. grotei lends support to the hypothesis that the populations in the Usambara/lowlands region represent a segregating metapopulation (or metapopulations), where subpopulations are adapting to their particular environments. The partial genetic and phenological integrity, and the distinct trailing habit, of the morphotype 'grotei' would, however, justify its placement in a taxonomic rank of its own, perhaps at a subspecific rank.
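The levels of inbreeding referred to above are typically quantified from marker data as the deficit of observed heterozygosity relative to the Hardy-Weinberg expectation. The sketch below computes this inbreeding coefficient for a single biallelic locus; the genotype counts are invented.

```python
def inbreeding_coefficient(n_hom_ref: int, n_het: int, n_hom_alt: int) -> float:
    """F = 1 - Ho/He for a biallelic locus, from genotype counts.
    Positive F indicates a heterozygote deficit, i.e. inbreeding."""
    n = n_hom_ref + n_het + n_hom_alt
    p = (2 * n_hom_ref + n_het) / (2 * n)    # reference allele frequency
    h_obs = n_het / n
    h_exp = 2 * p * (1 - p)                  # Hardy-Weinberg expectation
    return 1.0 - h_obs / h_exp

# Invented genotype counts for one marker locus.
print(f"F = {inbreeding_coefficient(n_hom_ref=60, n_het=30, n_hom_alt=10):.2f}")  # F = 0.20
```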

Abstract:

Breast reconstruction is performed for 10-15% of women operated on for breast cancer. A popular method is the TRAM (transverse rectus abdominis musculocutaneous) flap, formed of the patient's own abdominal tissue: a part of one of the rectus abdominis muscles and a transverse skin-subcutis area over it. The flap can be raised as a pedicled or a free flap. The pedicled TRAM flap, based on its nondominant pedicle, the superior epigastric artery (SEA), is rotated to the chest so that blood flow through the SEA continues. The free TRAM flap, based on its dominant pedicle, the deep inferior epigastric artery (DIEA), is detached from the abdomen and transferred to the chest, and the DIEA and vein are anastomosed to vessels on the chest. Cutaneous necrosis is seen in 5-60% of pedicled TRAM flaps and in 0-15% of free TRAM flaps. This study was the first to show with blood flow measurements that cutaneous blood flow is more generous in free than in pedicled TRAM flaps. After this study, the free TRAM flap exceeded the pedicled flap in popularity as a breast reconstruction method, although the free flap is technically a more demanding procedure than the pedicled TRAM flap. In pedicled flaps, a decrease in cutaneous blood flow was observed when the DIEA was ligated. It seems that the SEA cannot provide sufficient blood flow on the first postoperative days. The postoperative cutaneous blood flow in free TRAM flaps was more stable than in pedicled flaps. The development of cutaneous necrosis in pedicled TRAM flaps could be predicted based on intraoperative laser Doppler flowmetry (LDF) measurements: in flaps that developed cutaneous necrosis during the next week, the LDF value on the contralateral skin of the flap decreased to 43 ± 7% of the initial value after ligation of the DIEA. Endothelin-1 (ET-1) is a powerful vasoconstrictive peptide secreted by vascular endothelial cells. A correlation was found between plasma ET-1 concentrations and the peripheral vasoconstriction developing during and after breast reconstruction with a pedicled TRAM flap. ET-1 was not associated with the development of cutaneous necrosis. Felodipine, a vasodilating calcium channel antagonist, had no effect on plasma ET-1 concentrations, peripheral vasoconstriction, or the development of cutaneous necrosis in free TRAM flaps. Body mass index and the thickness of the abdominal subcutis were not associated with cutaneous necrosis in pedicled TRAM flaps.
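The predictive LDF finding above is a simple percent-of-baseline calculation. The sketch below expresses an intraoperative LDF reading as a percentage of its pre-ligation baseline and flags readings near the necrosis group's mean of 43 ± 7%; the 50% flagging threshold is an illustrative choice, not a validated cut-off from the study.

```python
def ldf_percent_of_baseline(baseline: float, after_ligation: float) -> float:
    """Cutaneous perfusion after DIEA ligation as % of the pre-ligation value."""
    return 100.0 * after_ligation / baseline

def necrosis_risk_flag(percent: float, threshold: float = 50.0) -> bool:
    """Illustrative rule: flag flaps whose perfusion drops below `threshold` %.
    The 50% default is a hypothetical cut-off for demonstration only."""
    return percent < threshold

reading = ldf_percent_of_baseline(baseline=120.0, after_ligation=48.0)
print(reading, necrosis_risk_flag(reading))  # 40.0 True
```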

Abstract:

Septic shock is a common killer in intensive care units (ICUs). The most crucial issue concerning the outcome is the early and aggressive start of treatment aimed at normalization of hemodynamics, together with the early start of antibiotics during the very first hours. The optimal targets of hemodynamic treatment, and the impact of hemodynamic treatment on survival after the first resuscitation period, are less well known. The objective of this study was to evaluate different aspects of the hemodynamic pattern in septic shock, with special attention to the prediction of outcome. In particular, components of early treatment and monitoring in the ICU were assessed. A total of 401 patients, 218 with septic shock and 192 with severe sepsis or septic shock, were included in the study. The patients were treated in 24 Finnish ICUs during 1999-2005; 295 of the patients were included in the Finnish national epidemiologic Finnsepsis study. We found that the most important hemodynamic variables concerning the outcome were the mean arterial pressure (MAP) and lactate during the first six hours in the ICU, and the MAP and a mixed venous oxygen saturation (SvO2) under 70% during the first 48 hours. A MAP under 65 mmHg and an SvO2 below 70% were the best predictive thresholds. A high central venous pressure (CVP) also correlated with adverse outcome. We assessed the correlation and agreement of SvO2 and mean central venous oxygen saturation (ScvO2) in septic shock during the first day in the ICU. The mean SvO2 was below the ScvO2 during early sepsis. The bias of the difference was 4.2% (95% limits of agreement -8.1% to 16.5%) by Bland-Altman analysis. The difference between the saturation values correlated significantly with cardiac index and oxygen delivery. Thus, ScvO2 cannot be used as a substitute for SvO2 in hemodynamic monitoring in the ICU. Several biomarkers have been investigated for their ability to help in diagnosis or outcome prediction in sepsis. We assessed the predictive value of N-terminal pro-brain natriuretic peptide (NT-proBNP) on mortality in severe sepsis or septic shock. The NT-proBNP levels were significantly higher in hospital nonsurvivors. The NT-proBNP level 72 hrs after inclusion was an independent predictor of hospital mortality. Acute cardiac load contributed to NT-proBNP values at admission, but renal failure was the main confounding factor later. The accuracy of NT-proBNP, however, was not sufficient for clinical decision-making concerning outcome prediction. Delays in the start of treatment are associated with a poorer prognosis in sepsis. We assessed how the early treatment guidelines had been adopted, and what the impact of early treatment on mortality in septic shock was in Finland. We found that early treatment was not optimal in Finnish hospitals, and this was reflected in mortality. A delayed initiation of antimicrobial agents in particular was associated with an unfavorable outcome.
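The agreement analysis above follows the standard Bland-Altman construction: the bias is the mean of the paired differences, and the 95% limits of agreement are bias ± 1.96 SD of the differences (here 4.2% and -8.1% to 16.5%, which implies an SD of about 6.3%). A minimal sketch with invented paired saturations:

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray) -> tuple[float, float, float]:
    """Bias and 95% limits of agreement for paired measurements a and b."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Invented paired ScvO2/SvO2 readings (%) from the same patients.
scvo2 = np.array([72.0, 68.5, 75.2, 70.1, 66.8, 74.3])
svo2  = np.array([66.0, 65.9, 69.8, 64.0, 63.1, 68.2])
bias, lo, hi = bland_altman(scvo2, svo2)
print(f"bias {bias:.1f}%, limits of agreement {lo:.1f}% to {hi:.1f}%")
```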

Abstract:

The adequacy of anesthesia has been studied since the introduction of balanced general anesthesia. Commercial monitors based on electroencephalographic (EEG) signal analysis have been available for monitoring the hypnotic component of anesthesia since the beginning of the 1990s. Monitors measuring the depth of anesthesia assess the cortical function of the brain and have gained acceptance during surgical anesthesia with most of the anesthetic agents used. However, due to frequent artifacts, they are considered unsuitable for monitoring consciousness in intensive care patients. The assessment of analgesia is one of the cornerstones of general anesthesia. Prolonged surgical stress may lead to increased morbidity and delayed postoperative recovery. However, no validated monitoring method is currently available for evaluating analgesia during general anesthesia. Awareness during anesthesia is caused by an inadequate level of hypnosis. This rare but severe complication of general anesthesia may lead to marked emotional stress and possibly posttraumatic stress disorder. In the present series of studies, the incidence of awareness and recall during outpatient anesthesia was evaluated and compared with that in inpatient anesthesia. A total of 1500 outpatients and 2343 inpatients underwent a structured interview. Clear intraoperative recollections were rare, the incidence being 0.07% in outpatients and 0.13% in inpatients. No significant differences emerged between outpatients and inpatients. However, significantly smaller doses of sevoflurane were administered to outpatients with awareness than to those without recollections (p<0.05). EEG artifacts in 16 brain-dead organ donors were evaluated during organ harvest surgery in a prospective, open, nonselective study. The source of the frontotemporal biosignals in brain-dead subjects was studied, and the resistance of the bispectral index (BIS) and Entropy to the signal artifacts was compared. The hypothesis was that in brain-dead subjects, most of the biosignals recorded from the forehead would consist of artifacts. The original EEG was recorded, and State Entropy (SE), Response Entropy (RE), and BIS were calculated and monitored during solid organ harvest. SE differed from zero (inactive EEG) in 28%, RE in 29%, and BIS in 68% of the total recording time (p<0.0001 for all). The median values during the operation were SE 0.0, RE 0.0, and BIS 3.0. In four of the 16 organ donors, the EEG was not inactive, and unphysiologically distributed, nonreactive rhythmic theta activity was present in the original EEG signal. After the results from subjects with persistent residual EEG activity were excluded, SE, RE, and BIS differed from zero in 17%, 18%, and 62% of the recorded time, respectively (p<0.0001 for all). Due to various artifacts, the highest readings in all indices were recorded without neuromuscular blockade. The main sources of artifacts were electrocauterization, electromyography (EMG), 50-Hz artifact, handling of the donor, ballistocardiography, and electrocardiography. In a prospective, randomized study of 26 patients, the ability of the Surgical Stress Index (SSI) to differentiate between patients with two clinically different analgesic levels during shoulder surgery was evaluated. SSI values were lower in patients with an interscalene brachial plexus block than in patients without an additional plexus block. In all patients, anesthesia was maintained with desflurane, the concentration of which was targeted to maintain SE at 50.
Increased blood pressure or heart rate (HR), movement, and coughing were considered signs of intraoperative nociception and treated with alfentanil. Photoplethysmographic waveforms were collected from the arm contralateral to the operated side, and SSI was calculated offline. Two minutes after skin incision, SSI was not increased in the brachial plexus block group and was lower (38 ± 13) than in the control group (58 ± 13, p<0.005). Among the controls, one minute prior to alfentanil administration, the SSI value was higher than during periods of adequate antinociception, 59 ± 11 vs. 39 ± 12 (p<0.01). The total cumulative need for alfentanil was higher in controls (2.7 ± 1.2 mg) than in the brachial plexus block group (1.6 ± 0.5 mg, p=0.008). Tetanic stimulation of the ulnar region of the hand increased SSI significantly only among patients with a brachial plexus block not covering the site of stimulation. The prognostic value of EEG-derived indices after cardiac arrest was evaluated and compared with transcranial Doppler ultrasonography (TCD), serum neuron-specific enolase (NSE), and S-100B. Thirty patients resuscitated from out-of-hospital arrest and treated with induced mild hypothermia for 24 h were included. The original EEG signal was recorded, and the burst suppression ratio (BSR), RE, SE, and wavelet subband entropy (WSE) were calculated. Neurological outcome during the six-month period after arrest was assessed with the Glasgow-Pittsburgh Cerebral Performance Categories (CPC). Twenty patients had a CPC of 1-2, one patient had a CPC of 3, and nine patients died (CPC 5). BSR, RE, and SE differed between the good (CPC 1-2) and poor (CPC 3-5) outcome groups (p=0.011, p=0.011, and p=0.008, respectively) during the first 24 h after arrest. WSE was borderline higher in the good outcome group between 24 and 48 h after arrest (p=0.050). All patients with status epilepticus died, and their WSE values were lower (p=0.022). S-100B was lower in the good outcome group upon arrival at the intensive care unit (p=0.010). After hypothermia treatment, NSE and S-100B values were lower (p=0.002 for both) in the good outcome group. The pulsatile index was also lower in the good outcome group (p=0.004). In conclusion, the incidence of awareness in outpatient anesthesia did not differ from that in inpatient anesthesia; outpatients are not at increased risk for intraoperative awareness relative to inpatients undergoing general anesthesia. SE, RE, and BIS showed non-zero values that normally indicate cortical neuronal function but were in these subjects mostly due to artifacts after the clinical diagnosis of brain death. Entropy was more resistant to artifacts than BIS. During general anesthesia and surgery, SSI values were lower in patients with an interscalene brachial plexus block covering the sites of nociceptive stimuli. In detecting nociceptive stimuli, SSI performed better than HR, blood pressure, or RE. BSR, RE, and SE differed between the good and poor neurological outcome groups during the first 24 h after cardiac arrest, and they may aid in differentiating patients with good neurological outcomes from those with poor outcomes after out-of-hospital cardiac arrest.
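The burst suppression ratio used above is conventionally the percentage of an EEG epoch spent in suppression, i.e. in stretches where the signal amplitude stays below a small threshold for a minimum duration. The sketch below computes a BSR in that conventional form; the 5 µV threshold and 0.5 s minimum duration are typical textbook values, not necessarily the monitors' exact settings.

```python
import numpy as np

def burst_suppression_ratio(eeg_uv: np.ndarray, fs: float,
                            amp_threshold: float = 5.0,
                            min_suppression_s: float = 0.5) -> float:
    """Percent of the epoch spent suppressed: |EEG| < amp_threshold (uV)
    continuously for at least min_suppression_s seconds."""
    flat = np.abs(eeg_uv) < amp_threshold
    min_len = int(min_suppression_s * fs)
    suppressed = np.zeros_like(flat)
    run_start = None
    for i, is_flat in enumerate(np.append(flat, False)):  # sentinel closes last run
        if is_flat and run_start is None:
            run_start = i
        elif not is_flat and run_start is not None:
            if i - run_start >= min_len:
                suppressed[run_start:i] = True
            run_start = None
    return 100.0 * suppressed.mean()

# Synthetic 10 s epoch at 100 Hz: 6 s of near-flat EEG, then 4 s of bursts.
fs = 100.0
rng = np.random.default_rng(2)
eeg = np.concatenate([rng.normal(0, 1.0, 600), rng.normal(0, 30.0, 400)])
print(f"BSR: {burst_suppression_ratio(eeg, fs):.0f}%")  # ~60%
```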

Abstract:

Background: The incidence of all forms of congenital heart defects is 0.75%. For patients with congenital heart defects, life expectancy has improved with new treatment modalities. Structural heart defects may require surgical or catheter treatment, which may be corrective or palliative. Even those receiving corrective therapy need regular follow-up due to residual lesions, late sequelae, and possible complications after interventions. Aims: The aim of this thesis was to evaluate cardiac function before and after treatment for volume overload of the right ventricle (RV) caused by atrial septal defect (ASD), volume overload of the left ventricle (LV) caused by patent ductus arteriosus (PDA), and pressure overload of the LV caused by coarctation of the aorta (CoA), and to evaluate cardiac function in patients with Mulibrey nanism. Methods: In Study I, of the 24 children with ASD, 7 underwent surgical correction and 17 percutaneous occlusion of the ASD. Study II included 33 patients with PDA undergoing percutaneous occlusion. In Study III, 28 patients with CoA underwent either surgical correction or percutaneous balloon dilatation of the CoA. Study IV comprised 26 children with Mulibrey nanism. A total of 76 healthy volunteer children were examined as a control group. In each study, controls were matched to patients. All patients and controls underwent clinical cardiovascular examinations, two-dimensional (2D) and three-dimensional (3D) echocardiographic examinations, and blood sampling for the measurement of natriuretic peptides prior to the intervention and twice or three times thereafter. Control children were examined once by 2D and 3D echocardiography. M-mode echocardiography was performed from the parasternal long-axis view directed by 2D echocardiography. The left atrium-to-aorta (LA/Ao) ratio was calculated as an index of LA size. The end-diastolic and end-systolic dimensions of the LV, as well as the end-diastolic thicknesses of the interventricular septum and LV posterior wall, were measured. LV volumes, and the fractional shortening (FS) and ejection fraction (EF) as indices of contractility, were then calculated, and the z scores of the LV dimensions determined. Diastolic function of the LV was estimated from the mitral inflow signal obtained by Doppler echocardiography. In three-dimensional echocardiography, time-volume curves were used to determine end-diastolic and end-systolic volumes, stroke volume, and EF. Diastolic and systolic function of the LV was estimated from the calculated first derivatives of these curves. Results: (I): In all children with ASD, during the one-year follow-up, the z score of the RV end-diastolic diameter decreased and that of the LV increased. However, the dilatation of the RV did not resolve entirely during the follow-up in either treatment group. In addition, the size of the LV increased more slowly in the surgical subgroup but reached control levels in both groups. Concentrations of natriuretic peptides in patients treated percutaneously increased during the first month after ASD closure and normalized thereafter, but in patients treated surgically they remained higher than in controls. (II): In the PDA group at baseline, the end-diastolic diameter of the LV measured over 2 SD in 5 of 33 patients. The median N-terminal pro-brain natriuretic peptide (proBNP) concentration before closure was 72 ng/l in the control group and 141 ng/l in the PDA group (P = 0.001), and 6 months after closure it was 78.5 ng/l (P = NS).
Patients differed from control subjects in indices of LV diastolic and systolic function at baseline, but by the end of follow-up, all these differences had disappeared. Even in the subgroup of patients with a normal-sized LV at baseline, the LV end-diastolic volume decreased significantly during follow-up. (III): Before repair, the size and wall thickness of the LV were higher in patients with CoA than in controls. Systolic blood pressure measured a median 123 mm Hg in patients before repair (P < 0.001), 103 mm Hg one year thereafter, and 101 mm Hg in controls. The diameter of the coarctation segment measured a median 3.0 mm at baseline and 7.9 mm at the 12-month follow-up (P = 0.006). The thicknesses of the interventricular septum and the posterior wall of the LV decreased after repair but increased to the initial level one year thereafter. The velocity time integrals of mitral inflow increased, but no changes were evident in LV dimensions or contractility. During follow-up, serum levels of natriuretic peptides decreased, correlating with diastolic and systolic indices of LV function in 2D and 3D echocardiography. (IV): In 2D echocardiography, the interventricular septum and LV posterior wall were thicker, and the velocity time integrals of mitral inflow shorter, in patients with Mulibrey nanism than in controls. In 3D echocardiography, the LV end-diastolic volume measured a median 51.9 (range 33.3 to 73.4) ml/m² in patients and 59.7 (range 37.6 to 87.6) ml/m² in controls (P = 0.040), and serum levels of ANPN and proBNP measured a median 0.54 (range 0.04 to 4.7) nmol/l and 289 (range 18 to 9170) ng/l in patients, and 0.28 (range 0.09 to 0.72) nmol/l (P < 0.001) and 54 (range 26 to 139) ng/l (P < 0.001) in controls. They correlated with several indices of diastolic LV function. Conclusions: (I): During the one-year follow-up after ASD closure, RV size decreased but did not normalize in all patients. The size of the LV normalized after ASD closure, but the increase in LV size was slower in patients treated surgically than in those treated with the percutaneous technique. Serum levels of ANPN and proBNP were elevated prior to ASD closure but decreased thereafter to control levels in patients treated with the percutaneous technique, but not in those treated surgically. (II): The changes in LV volume and function caused by PDA disappeared by 6 months after percutaneous closure. Even the children with a normal-sized LV benefited from the procedure. (III): After repair of CoA, the RV size and the velocity time integrals of mitral inflow increased, and serum levels of natriuretic peptides decreased. Patients need close follow-up despite the cessation of LV pressure overload, since LV hypertrophy persisted even in normotensive patients with normal growth of the coarctation segment. (IV): In children with Mulibrey nanism, the LV wall was hypertrophied, with myocardial restriction and impairment of LV function. Significant correlations appeared between indices of LV function, the size of the left atrium, and levels of natriuretic peptides, indicating that measurement of serum natriuretic peptide levels can be used in the clinical follow-up of this patient group despite their dependence on loading conditions.
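The contractility indices named in the Methods follow standard echocardiographic formulas: fractional shortening from M-mode LV diameters and ejection fraction from end-diastolic and end-systolic volumes (e.g. from 3D time-volume curves). A small sketch with invented measurements:

```python
def fractional_shortening(lvedd_mm: float, lvesd_mm: float) -> float:
    """FS (%) = (LVEDD - LVESD) / LVEDD * 100, from M-mode diameters."""
    return 100.0 * (lvedd_mm - lvesd_mm) / lvedd_mm

def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """EF (%) = (EDV - ESV) / EDV * 100, e.g. from 3D time-volume curves."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

# Invented measurements for illustration.
print(f"FS = {fractional_shortening(lvedd_mm=40.0, lvesd_mm=26.0):.0f}%")  # 35%
print(f"EF = {ejection_fraction(edv_ml=60.0, esv_ml=24.0):.0f}%")          # 60%
```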

Abstract:

Background. Kidney transplantation (KTX) is considered the best treatment of terminal uremia. Despite improvements in short-term graft survival, a considerable number of kidney allografts are lost due to the premature death of patients with a functioning kidney and to chronic allograft nephropathy (CAN). Aim. To investigate the risk factors involved in the progression of CAN and to analyze diagnostic methods for this entity. Materials and methods. Altogether, 153 implant and 364 protocol biopsies obtained between June 1996 and April 2008 were analyzed. The biopsies were classified according to Banff '97 and the chronic allograft damage index (CADI). Immunohistochemistry for TGF-β1 was performed on 49 biopsies. Kidney function was evaluated by creatinine and/or cystatin C measurement and by various estimates of glomerular filtration rate (GFR). Demographic data of the donors and recipients were recorded after 2 years' follow-up. Results. Most of the 3-month biopsies (73%) were nearly normal. The mean CADI score in the 6-month biopsies decreased significantly after 2001. Diastolic hypertension correlated with ΔCADI. Serum creatinine concentration at hospital discharge and glomerulosclerosis were risk factors for ΔCADI. High total and LDL cholesterol, low HDL, and hypertension correlated with chronic histological changes. The mean age of the donors increased from 41 to 52 years. Older donors were more often women who had died from an underlying disease. The prevalence of delayed graft function (DGF) increased over the years, while acute rejections (AR) decreased significantly. Sub-clinical AR was observed in 4% of cases, and it did not affect long-term allograft function or CADI. The recipients' drug treatment was modified over the course of the studies: mycophenolate mofetil, tacrolimus, statins, and blockers of the renin-angiotensin system were more frequently prescribed after 2001. Patients with a higher ΔCADI had lower GFR during follow-up. A CADI over 2 was best predicted by creatinine, although with modest sensitivity and specificity. Neither cystatin C nor other estimates of GFR were superior to creatinine for CADI prediction. Cyclosporine A toxicity was seldom seen. A low cyclosporine A concentration 2 h after dosing correlated with TGF-β1 expression in interstitial inflammatory cells, and this predicted worse graft function. Conclusions. The progression of CAN was affected by two major factors: the donors' characteristics and the recipients' hypertension. The increased prevalence of DGF might be a consequence of the acceptance of older donors who had died from an underlying disease. Implant biopsies proved to be of prognostic value, and they are essential for comparison with subsequent biopsies. The progression of histological damage was associated with hypertension and dyslipidemia. The significance of the augmented expression of TGF-β1 in inflammatory cells is unclear, but it may be related to low immunosuppression. Serum creatinine is the most suitable tool for monitoring kidney allograft function on an everyday basis. However, protocol biopsies at 6 and 12 months predicted late kidney allograft dysfunction and affected the clinical management of the patients. Protocol biopsies are thus a suitable surrogate to be used in clinical trials and for monitoring kidney allografts.
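The abstract mentions creatinine-based GFR estimates without naming them. One widely used creatinine-based estimate (among several, and not necessarily the one used in these studies) is the Cockcroft-Gault creatinine clearance, sketched below with an invented example patient.

```python
def cockcroft_gault_crcl(age_years: float, weight_kg: float,
                         serum_creatinine_mg_dl: float, female: bool) -> float:
    """Estimated creatinine clearance (ml/min) by the Cockcroft-Gault formula:
    CrCl = (140 - age) * weight / (72 * SCr), multiplied by 0.85 for women."""
    crcl = (140.0 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

# Invented example patient.
print(f"{cockcroft_gault_crcl(52, 70, 1.4, female=True):.0f} ml/min")  # ~52 ml/min
```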

Abstract:

Staphylococcus aureus is the second most common bloodstream isolate in both community- and hospital-acquired bacteremias. The clinical course of S. aureus bacteremia (SAB) is determined by its complications, particularly by the development of deep infections and thromboembolic events. Despite the progress of antimicrobial therapy, SAB is still associated with high mortality. However, injection drug users (IDUs) tend to have fewer complications and a better prognosis than nonaddicts, especially in endocarditis. The present study was undertaken to investigate the epidemiology, treatment, and outcome of S. aureus bacteremia and endocarditis in Finland. In particular, differences in bacterial strains and their virulence factors, and in host immune responses, were compared between IDUs and nonaddicts. In Finland, 5045 SAB cases during 1995-2001 were included using the National Infectious Disease Register maintained by the National Public Health Institute. The annual incidence of SAB increased, especially in the elderly. While the increase in incidence may partly be explained by better reporting, it most likely reflects a growing population at risk, affected by such factors as age and/or severe comorbidity. Nosocomial infections accounted for 51% of cases, with no change in their proportion during the study period. The 28-day mortality was 17% and remained unchanged over time. A total of 381 patients with SAB were randomized to receive either standard antibiotic treatment or levofloxacin added to standard treatment. Levofloxacin combination therapy did not decrease mortality or the incidence of deep infections, nor did it speed up recovery during the 3-month follow-up. However, patients with a deep infection appeared to benefit from combination therapy with rifampicin, as suggested also by experimental data. Deep infections were found in 84% of SAB patients within one week after randomization, and they appeared to be more common than previously reported. Endocarditis was observed in 74 of 430 patients (17%) with SAB, of whom 20 were IDUs and 54 nonaddicts. Right-sided involvement was diagnosed in 60% of the addicts, whereas 93% of the nonaddicts had left-sided endocarditis. Unexpectedly, IDUs showed extracardiac deep infections, thromboembolic events, and severe sepsis with the same frequency as nonaddicts. The prognosis of endocarditis was better among addicts due to their younger age and lack of underlying diseases, in agreement with earlier reports. In total, 44 IDUs with SAB were included, 20 of whom had endocarditis. An equal number of nonaddicts with SAB were chosen as group-matched controls. Serological tests were not helpful in identifying patients with a deep infection. No individual S. aureus strain dominated in endocarditis among addicts. Characterization of the virulence factors of the bacterial strains did not reveal any significant differences between IDUs and nonaddicts.