995 results for Seed survival
Abstract:
Salvage logging is a common practice carried out in burned forests worldwide, intended to mitigate the economic losses caused by wildfires. Logging implies an additional disturbance occurring shortly after fire, although its ecological effects can be somewhat mitigated by leaving wood debris on site. The composition of the bird community and its capacity to provide ecosystem services such as seed dispersal of fleshy-fruited plants have been shown to be affected by postfire logging. We assessed the effects of the habitat structure resulting from different postfire management practices on the bird community in three burned pine forests in Catalonia (western Mediterranean). For this purpose, we focused on the group of species that is responsible for seed dispersal, a process which takes place primarily during the winter in the Mediterranean basin. In addition, we assessed microhabitat selection by seed disperser birds in such environments in relation to management practices. Our results showed a significant, positive relationship between the density of wood debris piles and the abundance of seed disperser birds. Furthermore, such piles were the preferred microhabitat of these species. This reveals an important effect of forest management on seed disperser birds, which is likely to affect the dynamics of bird-dependent seed dispersal. Thus, building wood debris piles can be a useful practice for the conservation of both the species and their ecosystem services, while also being compatible with timber harvesting.
Abstract:
The European Cancer Registry-based project on hematologic malignancies (HAEMACARE), set up to improve the availability and standardization of data on hematologic malignancies in Europe, used the European Cancer Registry-based project on survival and care of cancer patients (EUROCARE-4) database to produce a new grouping of hematologic neoplasms (defined by the International Classification of Diseases for Oncology, Third Edition and the 2001/2008 World Health Organization classifications) for epidemiological and public health purposes. We analyzed survival for lymphoid neoplasms in Europe by disease group, comparing survival between different European regions by age and sex. Design and Methods: Incident neoplasms recorded between 1995 and 2002 in 48 population-based cancer registries in 20 countries participating in EUROCARE-4 were analyzed. The period approach was used to estimate 5-year relative survival rates for patients diagnosed in 2000-2002, who did not have 5 years of follow-up. Results: The 5-year relative survival rate was 57% overall but varied markedly between the defined groups. Variation in survival within the groups was relatively limited across European regions and less than in previous years. Survival differences between men and women were small. The relative survival for patients with all lymphoid neoplasms decreased substantially after the age of 50. The proportion of ‘not otherwise specified’ diagnoses increased with advancing age. Conclusions: This is the first study to analyze survival of patients with lymphoid neoplasms divided into groups characterized by similar epidemiological and clinical characteristics, providing a benchmark for more detailed analyses. This Europe-wide study suggests that previously noted differences in survival between regions have tended to decrease. The survival of patients with all neoplasms decreased markedly with age, while the proportion of ‘not otherwise specified’ diagnoses increased with advancing age.
Thus the quality of diagnostic work-up and care decreased with age, suggesting that older patients may not be receiving optimal treatment.
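The 5-year relative survival rates reported above compare a cohort's observed survival with the survival expected in a matched general population. A minimal sketch of that ratio in Python, using made-up annual survival probabilities (the study itself used the period approach on registry data, which is considerably more involved):

```python
# Sketch: 5-year relative survival as observed / expected cumulative survival.
# All numbers below are illustrative, not the HAEMACARE data.

def cumulative_survival(annual_rates):
    """Multiply annual conditional survival probabilities together."""
    s = 1.0
    for r in annual_rates:
        s *= r
    return s

# Hypothetical annual observed survival in a patient cohort:
observed = [0.90, 0.92, 0.93, 0.94, 0.95]
# Hypothetical annual expected survival in the general population
# (matched by age, sex, and calendar year):
expected = [0.98, 0.98, 0.97, 0.97, 0.96]

relative_5y = cumulative_survival(observed) / cumulative_survival(expected)
print(f"5-year relative survival: {relative_5y:.1%}")
```

A relative survival near 100% would indicate no excess mortality attributable to the disease.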
Abstract:
During the last part of the 1990s the chance of surviving breast cancer increased. Changes in survival functions reflect a mixture of effects: both the introduction of adjuvant treatments and early screening with mammography played a role in the decline in mortality. Evaluating the contribution of these interventions using mathematical models requires survival functions before and after their introduction. Furthermore, the required survival functions may differ by age group and are related to disease stage at diagnosis. Sometimes detailed information is not available, as was the case for the region of Catalonia (Spain); one may then derive the functions using information from other geographical areas. This work presents the methodology used to estimate age- and stage-specific Catalan breast cancer survival functions from scarce Catalan survival data by adapting the age- and stage-specific US functions. Methods: Cubic splines were used to smooth data and obtain continuous hazard rate functions. Afterwards, we fitted a Poisson model to derive hazard ratios; the model included time as a covariate. The hazard ratios were then applied to US survival functions detailed by age and stage to obtain Catalan estimates. Results: We first estimated the hazard ratios for Catalonia versus the USA before and after the introduction of screening. The hazard ratios were then multiplied by the age- and stage-specific breast cancer hazard rates from the USA to obtain the Catalan hazard rates. We also compared breast cancer survival in Catalonia and the USA in two time periods: before cancer control interventions (USA 1975–79, Catalonia 1980–89) and after (USA and Catalonia 1990–2001). Survival in Catalonia in the 1980–89 period was worse than in the USA during 1975–79, but the differences disappeared in 1990–2001. Conclusion: Our results suggest that access to better treatments and quality of care contributed to large improvements in survival in Catalonia.
In addition, we obtained detailed breast cancer survival functions that will be used for modeling the effect of screening and adjuvant treatments in Catalonia.
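Scaling a reference population's hazard rates by a hazard ratio, as described above, has a simple closed form under a proportional-hazards assumption: since S(t) = exp(−H(t)) and the cumulative hazard scales by the ratio, S_target(t) = S_ref(t)^HR. A minimal sketch with hypothetical numbers (the hazard ratio and reference curve below are illustrative, not the study's fitted Catalan-versus-US values):

```python
# Sketch: adapting a reference survival curve with a hazard ratio,
# assuming proportional hazards: S_target(t) = S_ref(t) ** HR.
# Multiplying hazard rates by HR multiplies the cumulative hazard by HR,
# which exponentiates the survival function.

def adapt_survival(s_ref, hazard_ratio):
    """Apply a proportional-hazards ratio to a reference survival curve."""
    return [s ** hazard_ratio for s in s_ref]

# Hypothetical US stage-specific survival at years 1..5:
s_us = [0.97, 0.93, 0.90, 0.87, 0.85]
hr = 1.3  # hypothetical Catalonia-vs-US hazard ratio (> 1 means worse survival)

s_cat = adapt_survival(s_us, hr)
print([round(s, 3) for s in s_cat])
```

With HR > 1, every point on the adapted curve lies below the reference curve, matching the pre-screening Catalonia-versus-US comparison described above.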
Abstract:
The recovery of vegetation in Mediterranean ecosystems after wildfire is mostly a result of direct regeneration, since the same species existing before the fire regenerate on-site by seeding or resprouting. However, the possibility of plant colonization by dispersal of seeds from unburned areas remains poorly studied. We addressed the role of frugivorous, bird-dependent seed dispersal (seed rain) of fleshy-fruited plants in a burned and managed forest in the second winter after a fire, before on-site fruit production had begun. We also assessed the effect on seed rain of different microhabitats resulting from salvage logging (erosion barriers, standing snags, open areas), as well as the microhabitats of unlogged patches and an unburned control forest, taking into account the importance of perches as seed rain sites. We found considerable seed rain by birds in the burned area. Seeds, mostly from olive trees (Olea europaea) and evergreen pistaches (Pistacia lentiscus), belonged to plants fruiting only in surrounding unburned areas. Seed rain was heterogeneous and depended on microhabitat, with the highest seed density in the unburned control forest, closely followed by the wood piles of erosion barriers. In contrast, very low densities were found under perches of standing snags. Furthermore, frugivorous bird richness seemed to be higher in the erosion barriers than elsewhere. Our results highlight the importance of this specific post-fire management in bird-dependent seed rain and may also suggest a consequent heterogeneous distribution of fleshy-fruited plants in burned and managed areas. However, more study of the establishment success of dispersed seeds is needed before an accurate assessment can be made of the role of bird-mediated seed dispersal in post-fire regeneration.
Abstract:
This study evaluated the adsorption capacity of chromium from contaminated aqueous solutions using Moringa oleifera Lam. seeds. Parameters such as solution pH, adsorbent mass, contact time between solution and adsorbent, isotherms, thermodynamics, kinetics, and desorption were evaluated. The maximum adsorption capacity (Qm) was calculated to be 3.191 mg g-1 for the biosorbent. Activated carbon was used for comparison purposes in addition to the biosorbent. The best fit was obtained with the Langmuir model for both adsorbents. The average desorption value indicated that both the biosorbent and activated carbon have a strong interaction with the metal. The results showed that the biosorbent has advantages owing to its low cost and efficiency in Cr3+ removal from contaminated waters.
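A maximum adsorption capacity like the Qm reported above is typically extracted from the Langmuir isotherm, often via its linearized form Ce/qe = Ce/Qm + 1/(KL·Qm), so that a straight-line fit of Ce/qe against Ce yields Qm from the slope. A minimal sketch, using synthetic data points generated from a Langmuir curve with Qm ≈ 3.19 mg/g and KL = 0.05 L/mg (these are not the study's measurements):

```python
# Sketch: estimating the Langmuir maximum adsorption capacity Qm from
# a linearized isotherm, Ce/qe = Ce/Qm + 1/(KL*Qm).
# The equilibrium data below are synthetic, not the study's measurements.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Synthetic equilibrium concentrations Ce (mg/L) and uptakes qe (mg/g);
# qe approaches Qm asymptotically as Ce grows.
ce = [5.0, 10.0, 20.0, 40.0, 80.0]
qe = [0.638, 1.063, 1.595, 2.127, 2.552]

slope, intercept = linear_fit(ce, [c / q for c, q in zip(ce, qe)])
qm = 1.0 / slope              # maximum adsorption capacity (mg/g)
kl = 1.0 / (intercept * qm)   # Langmuir constant (L/mg)
print(f"Qm = {qm:.2f} mg/g, KL = {kl:.3f} L/mg")
```

In practice a nonlinear fit of the untransformed Langmuir equation is often preferred, since the linearization distorts the error structure; the linear form is shown here only because it is compact.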
Abstract:
Rice husk silica (RHS) and NaY were used as supports for potassium (K) prepared from acetate buffer (B) and acetate (A) solutions. K loading did not destroy the NaY structure, but it caused a decrease in the surface area; the K species resided in micropores and on the external surface. In contrast, K loading resulted in the collapse of the RHS structure and a decrease in its surface area. 12K/NaY-B was found to be the most active catalyst for the transesterification of Jatropha seed oil. The minimum K content in K/NaY-B that provided complete conversion of the Jatropha seed oil was 11 wt%, and the biodiesel yield was 77.9%.
Abstract:
Detection, symptoms, and symptomless transmission of Drechslera avenae (teleomorph Pyrenophora avenae) from seed were investigated. The present study found that D. avenae is frequently present in oat (Avena sativa) seed in Argentina. The prevalence of infected seed lots was 54%. The incidence of seeds infected by D. avenae in the various seed lots from different regions ranged from 0 to 52% (overall mean of 7%). Besides conidia on conidiophores and immature pseudothecia, D. avenae produced small, spherical to pear-shaped pycnidia containing tiny conidia on the seed surface. The pathogen was transmitted efficiently, at a rate of 55% from seed to coleoptile tips in symptomless seedlings and at 12.5% to plumules. The importance of infected seed and its epidemiological role are discussed.
Abstract:
A semi-selective agar medium was developed for detection of Xanthomonas axonopodis pv. malvacearum (Xam) in cotton (Gossypium hirsutum) seed. The basic medium was peptone-sucrose-agar (PSA). Criteria for the semi-selective medium were the typical colony characteristics of Xam and its pathogenicity on cotton. Several systemic fungicides and antibiotics in different concentrations were tested alone or in combination with others. The final composition of the semi-selective agar medium was established after several attempts in order to inhibit most of the fungal and bacterial saprophytes and favour the development of Xam. It contained PSA + cycloheximide, cephalexin, pencycuron, triadimenol and tolylfluanid. The bacteria were recovered from naturally infected seeds by direct plating of 2,000 surface-disinfected seeds on the semi-selective medium. The recovery of the pathogen from naturally infected leaf tissues and in dilution plating, on semi-selective medium and on nutrient agar, was comparable. Among the three detection methods tested, the semi-selective medium was found to be the most reliable and quantifiable. The degree of severity of angular leaf spot in the field was not always correlated with the level of infection in the seed. This is the first report of a semi-selective agar medium to detect the presence of Xam in naturally infected cotton seed.
Abstract:
Botrytis blight, caused by Botrytis cinerea, is an important disease of rose (Rosa hybrida) grown in greenhouses in Brazil. As little is known regarding the disease epidemiology under greenhouse conditions, pathogen survival in crop debris and as sclerotia was evaluated. Polyethylene bags with petals, leaves, or stem sections artificially infected with B. cinerea were mixed with crop debris in rose beds in a commercial plastic greenhouse. A high percentage of plant parts with sporulation was detected until 60 days; sporulation then decreased on petals after 120 days and sharply decreased on stems or leaves after 90 days. Sporulation on petals continued for 360 days, but was not observed on stems after 150 days or on leaves after 240 days. Although the fungus survived longer on petals, stems and leaves are also important inoculum sources because high amounts of both are deposited on beds during cultivation. Survival of sclerotia produced on PDA was also quantified. Sclerotia germination was greater than 75% in the initial 210 days and 50% until 360 days. Sclerotia weight gradually declined, but they remained viable for 360 days. Sclerotia were produced on the buried petals, mainly after 90 days of burial, but not on leaves or stems. Germination of these sclerotia gradually decreased after 120 days but lasted until 360 days. Higher weight loss and lower viability were observed for sclerotia produced on petals than for sclerotia produced in vitro.
Abstract:
Lower extremity peripheral arterial disease (PAD) is associated with decreased functional status, diminished quality of life (QoL), amputation, myocardial infarction, stroke, and death. Nevertheless, public awareness of PAD as a morbid and mortal disease is low. The aim of this study was to assess the incidence of major lower extremity amputation (LEA) due to PAD, the extent of reamputations, and survival after major LEA in a population-based PAD patient cohort. A further aim was to assess the functional capacity of patients with LEA, and the QoL after lower extremity revascularization and major amputation. All 210 amputees due to PAD in 1998–2002 and all 519 revascularized patients in 1998–2003 were studied. The 59 amputees alive in 2004 were interviewed using a structured QoL questionnaire. Two age-, gender-, and domicile-matched controls per amputee completed and returned a postal self-administered QoL questionnaire, as did 231 revascularized PAD patients (those who agreed to take part in the study) and one control person per patient. The incidence rate of major LEA was 24.1/100,000 person-years and remained considerably high throughout the years studied. The mortality rate was 21% at one month, 52% at one year, and 80% overall. When comparing the one-year mortality risk of amputees, LEAs were associated with a 7.4-fold annual mortality risk compared with the reference population in Turku. Twenty-two patients (10%) had ipsilateral conversions from below-knee (BK) to above-knee (AK) amputation. Fifty patients (24%) ended up with a contralateral major LEA within two to four amputation operations. Three bilateral amputations were performed at the first major LEA operation. Of the 51 survivors returning home after their first major LEA, 36 (71%) received a prosthesis, and 16 of these 36 (44%) were able to walk both in- and outdoors.
Of the 68 patients who were discharged to institutional care, three (4%) had a prosthesis one year after LEA. Both amputees and revascularized patients had poor physical functioning and significantly more depressive symptoms than their controls. Depressive symptoms were more common in institutionalized amputees than in home-dwelling amputees. The surviving amputees and their controls had similar life satisfaction. The amputees reported feeling satisfied and content, whether they lived in long-term care or at home. PAD patients who had undergone revascularization had poorer QoL than their controls. The revascularized patients' responses regarding their perceived physical functioning suggested that these patients are in a declining life cycle and that revascularizations, even when successful, may not be sufficient to improve overall function. It is possible that addressing rehabilitation issues earlier in care may produce a more positive functional outcome. Depressive symptoms should be recognized and thoroughly considered while patients are recovering from their revascularization operation. Primary care should also develop proper follow-up, and community organizations should offer exercise groups for those who are able to return home, since they very often live alone. Rehabilitation programs should consider not only physical disability assessment but also QoL.
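An incidence rate such as the 24.1 per 100,000 person-years quoted above is simply the number of events divided by the person-time at risk, scaled to a convenient denominator. A minimal sketch (the person-years figure below is hypothetical, chosen only to reproduce the reported rate from the 210 amputations; it is not taken from the study):

```python
# Sketch: incidence rate per 100,000 person-years.
# The person-years denominator is hypothetical, not the Turku cohort's.

def incidence_rate(events, person_years, per=100_000):
    """Events per `per` person-years of follow-up."""
    return events / person_years * per

rate = incidence_rate(210, 871_369)  # 210 amputations, hypothetical denominator
print(f"{rate:.1f} per 100,000 person-years")
```

Person-years sum each individual's time at risk, so a population followed for varying durations still yields a single comparable rate.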