629 results for Wymer, Beth
Abstract:
White-nose syndrome (WNS) is a disease that has caused mass mortality in hibernating bat species. Since its first discovery in the winter of 2006-2007, an estimated five million or more bats have been killed. Although infection with Pseudogymnoascus destructans (Pd, the causative agent of WNS) does not always result in death, bats that survive Pd infection may experience fitness consequences. To understand the physiological consequences of WNS, I measured reproductive rates of free-ranging hibernating bat species of the northeastern United States. In addition, captive little brown myotis (Myotis lucifugus) that were infected by Pd but survived ("WNS survivors") and uninfected bats were studied in order to understand the potential consequences of surviving WNS (e.g., lower reproductive rates, decreased ability to heal wounds, degradation of wing tissue, and altered metabolic rates). No differences in reproductive rates were found between WNS survivors and uninfected bats either in the field or in captivity. In addition, wound healing was not affected by Pd infection. However, wing tissue degradation was worse for little brown myotis 19 days post-hibernation, and mass-specific metabolic rate (MSMR) was significantly higher for bats infected with Pd 22 days post-hibernation. While it is clear that these consequences are a direct result of Pd infection, further research investigating the long-term consequences for both mothers and pups is necessary.
Abstract:
The article addresses the questions: What do children in urban areas do on Saturdays? What types of organizational resources do they have access to? Does this vary by social class? Using diary data on children's activities on Saturdays in the Phoenix-Mesa-Scottsdale metropolitan area, the authors describe the different types of venues (households, businesses, public space, associations, charities, congregations, and government/tribal agencies) that served different types of children. They find that the likelihood of using a charity or business rather than a government or tribal provider increased with family income. Also, the likelihood of using a congregation or a government facility rather than a business, charity, or household increased with being Hispanic. The authors discuss the implications for the urban division of labor on Saturdays and offer research questions that need further investigation.
Abstract:
Two competing models exist for the formation of the Pennsylvania salient, a widely studied area of pronounced curvature in the Appalachian mountain belt. The viability of these models can be tested by compiling and analyzing the patterns of structures within the general hinge zone of the Pennsylvania salient. One end-member model suggests a NW-directed maximum shortening direction and no rotation through time in the culmination. An alternative model requires a two-phase development of the culmination involving NNW-directed maximum shortening overprinted by WNW-directed maximum shortening. Structural analysis at 22 locations throughout the Valley and Ridge and southern Appalachian Plateau Provinces of Pennsylvania is used to constrain orientations of the maximum shortening direction and establish whether these orientations rotated during progressive deformation in the Pennsylvania salient's hinge. Outcrops of Paleozoic sedimentary rocks contain several orders of folds, conjugate faults, steeply dipping strike-slip faults, joints, conjugate en echelon gash vein arrays, spaced cleavage, and grain-scale finite strain indicators. This suite of structures records a complex deformation history similar to the Bear Valley sequence of progressive deformation. The available structural data from the Juniata culmination do not show a consistent temporal rotation of shortening directions and generally indicate uniform,
Abstract:
In this chapter, the impact of the watershed acidification treatments on vegetation at WS3 at the Fernow Experimental Forest (FEF) and at WS9 is presented and summarized in a comprehensive way for the first time. WS7 is used as a vegetative reference basin for WS3, while untreated plots within WS9 are used as a vegetative reference for WS9. Bioindicators of acidification impacts considered here include several measures of tree and stand growth rates, foliar chemistry, bolewood chemistry, and herbaceous species composition and diversity. These studies enhance our understanding of the inter-relationships between changes in soil conditions caused by the acidification treatment and the condition of forest vegetation.
Abstract:
One of the conclusions reached during the Congressionally mandated National Acid Precipitation Program (NAPAP) was that, compared to ozone and other stress factors, the direct effects of acidic deposition on forest health and productivity were likely to be relatively minor. However, the report also concluded “the possibility of long-term (several decades) adverse effects on some soils appears realistic” (Barnard et al. 1990). Possible mechanisms for these long-term effects include: (1) accelerated leaching of base cations from soils and foliage, (2) increased mobilization of aluminum (Al) and other metals such as manganese (Mn), (3) inhibition of soil biological processes, including organic matter decomposition, and (4) increased bioavailability of nitrogen (N).
Abstract:
Additions of acid anions can alter the cycling of other nutrients and elements within an ecosystem. As strong acid ions move through a forest, they may increase the concentrations of nitrogen (N) and sulfur (S) in the soil solution and stream water. Such treatments also may increase or decrease the availability of other anions, cations and metal ions in the soil. A number of studies in Europe and North America have documented increases in base cation concentrations such as calcium (Ca) and magnesium (Mg) with increased N and S deposition (Foster and Nicolson 1988, Feger 1992, Norton et al. 1994, Adams et al. 1997, Currie et al. 1999, Fernandez et al. 2003). Experiments in Europe also have evaluated the response of forested watersheds to decreased deposition (Tietema et al. 1998, Lamersdorf and Borken 2004). In this chapter, we evaluate the effects of the watershed acidification treatment on the cycling of N, S, Ca, Mg and potassium (K) on Fernow WS3.
Abstract:
We studied temporal and spatial patterns of soil nitrogen (N) dynamics from 1993 to 1995 in three watersheds of the Fernow Experimental Forest, W.V.: WS7 (24-year-old, untreated); WS4 (mature, untreated); and WS3 (24-year-old, treated with (NH4)2SO4 since 1989 at the rate of 35 kg N ha–1 year–1). Net nitrification was 141, 114, and 115 kg N ha–1 year–1 for WS3, WS4, and WS7, respectively, essentially 100% of net N mineralization for all watersheds. Temporal (seasonal) patterns of nitrification were significantly related to soil moisture and ambient temperature in untreated watersheds only. Spatial patterns of soil water NO3– in WS4 suggest that microenvironmental variability limits rates of N processing in some areas of this N-saturated watershed, in part by ericaceous species in the herbaceous layer. Spatial patterns of soil water NO3– in treated WS3 suggest that later stages of N saturation may result in higher concentrations with less spatial variability. Spatial variability in soil N variables was lower in treated WS3 than in untreated watersheds. Nitrogen additions have altered the response of N-processing microbes to environmental factors, which have become less sensitive to seasonal changes in soil moisture and temperature. Biotic processes responsible for regulating N dynamics may be compromised in N-saturated forest ecosystems.
Abstract:
Additions of nitrogen (N) have been shown to alter species diversity of plant communities, with most experimental studies having been carried out in communities dominated by herbaceous species. We examined seasonal and inter-annual patterns of change in the herbaceous layer of two watersheds of a central Appalachian hardwood forest that differed in experimental treatment. This study was carried out at the Fernow Experimental Forest, West Virginia, using two adjacent watersheds: WS4 (mature, second-growth hardwood stand, untreated reference), and WS3. Seven circular 0.04-ha sample plots were established in each watershed to represent its full range of elevation and slope aspect. The herbaceous layer was sampled by identifying and visually estimating cover (%) of all vascular plants. Sampling was carried out in mid-July of 1991 and repeated at approximately the same time in 1992. In 1994, these same plots were sampled each month from May to October. Seasonal patterns of herb layer dynamics were assessed for the complete 1994 data set, whereas inter-annual variability was based on plot data from 1991, 1992, and the July sample of 1994. There were no significant differences between watersheds in any sample year for any of the herb layer characteristics measured, including herb layer cover, species richness, evenness, and diversity. Cover on WS4 decreased significantly from 1991 to 1992, followed by no change to 1994. By contrast, herb layer cover did not vary significantly across years on WS3. Cover of the herbaceous layer of both watersheds increased from early in the growing season to the middle of the growing season, decreasing thereafter, with no significant differences between WS3 and WS4 for any of the monthly cover means in 1994. Similar seasonal patterns found for herb layer cover, and the lack of significant differences between watersheds, were also evident for species diversity and richness.
By contrast, there was little seasonal change in herb layer species evenness, which was nearly identical between watersheds for all months except October. Seasonal patterns for individual species/species groups were closely similar between watersheds, especially for Viola rotundifolia and Viola spp. Species richness and species diversity were linearly related to herb layer cover for both WS3 and WS4, suggesting that spatial and temporal increases in cover were more related to recruitment of herb layer species than to growth of existing species. Results of this study indicate that there have been negligible responses of the herb layer to 6 yr of N additions to WS3.
Abstract:
Nitrogen (N) saturation is an environmental concern for forests in the eastern U.S. Although several watersheds of the Fernow Experimental Forest (FEF), West Virginia exhibit symptoms of N saturation, many watersheds display a high degree of spatial variability in soil N processing. This study examined the effects of temperature on net N mineralization and nitrification in N-saturated soils from FEF, and how these effects varied between high N-processing vs. low N-processing soils collected from two watersheds, WS3 (fertilized with [NH4]2SO4) and WS4 (untreated control). Samples of forest floor material (O2 horizon) and mineral soil (to a 5-cm depth) were taken from three subplots within each of four plots that represented the extremes of highest and lowest rates of net N mineralization and nitrification (hereafter, high N and low N, respectively) of untreated WS4 and N-treated WS3: control/low N, control/high N, N-treated/low N, N-treated/high N. Forest floor material was analyzed for carbon (C), lignin, and N. Subsamples of mineral soil were extracted immediately with 1 N KCl and analyzed for NH4+ and NO3– to determine preincubation levels. Extracts were also analyzed for Mg, Ca, Al, and pH. To test the hypothesis that the lack of net nitrification observed in field incubations on the untreated/low N plot was the result of an absence of nitrifier populations, we characterized the bacterial community involved in N cycling by amplification of amoA genes. Remaining soil was incubated for 28 d at three temperatures (10, 20, and 30°C), followed by 1 N KCl extraction and analysis for NH4+ and NO3–. Net nitrification was essentially 100% of net N mineralization for all samples combined. Nitrification rates from lab incubations at all temperatures supported earlier observations based on field incubations. At 30°C, rates from N-treated/high N were three times those of N-treated/low N. Highest rates were found for untreated/high N (two times greater than those of N-treated/high N), whereas untreated/low N exhibited no net nitrification. However, soils exhibiting no net nitrification tested positive for the presence of nitrifying bacteria, causing us to reject our initial hypothesis. We hypothesize that nitrifier populations in such soil are being inhibited by a combination of low Ca:Al ratios in mineral soil and allelopathic interactions with mycorrhizae of ericaceous species in the herbaceous layer.
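The net-rate arithmetic behind these incubation assays can be sketched as a short calculation: net mineralization is the change in total inorganic N (NH4+ + NO3–) over the incubation, and net nitrification is the change in NO3– alone. This is a minimal sketch of that standard bookkeeping; the extract concentrations below are hypothetical, not data from the study:

```python
def net_n_rates(pre_nh4, pre_no3, post_nh4, post_no3, days=28):
    """Net N mineralization and net nitrification from KCl extractions
    taken before and after a laboratory incubation.
    Inputs in mg N/kg soil; returns rates in mg N/kg soil/day."""
    net_mineralization = ((post_nh4 + post_no3) - (pre_nh4 + pre_no3)) / days
    net_nitrification = (post_no3 - pre_no3) / days
    return net_mineralization, net_nitrification

# Hypothetical extract values for a 28-d incubation:
m, n = net_n_rates(pre_nh4=5.0, pre_no3=2.0, post_nh4=5.5, post_no3=15.5)
# When nearly all mineralized N ends up as NO3-, n/m approaches 1,
# which is the "nitrification was essentially 100% of net N
# mineralization" pattern described above.
```
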
Abstract:
Silvicultural treatments represent disturbances to forest ecosystems often resulting in transient increases in net nitrification and leaching of nitrate and base cations from the soil. Response of soil carbon (C) is more complex, decreasing from enhanced soil respiration and increasing from enhanced postharvest inputs of detritus. Because nitrogen (N) saturation can have similar effects on cation mobility, timber harvesting in N-saturated forests may contribute to a decline in both soil C and base cation fertility, decreasing tree growth. Although studies have addressed effects of either forest harvesting or N saturation separately, few data exist on their combined effects. Our study examined the responses of soil C and N to several commercially used silvicultural treatments within the Fernow Experimental Forest, West Virginia, USA, a site with N-saturated soils. Soil analyses included soil organic matter (SOM), C, N, C/N ratios, pH, and net nitrification. We hypothesized the following gradient of disturbance intensity among silvicultural practices (from most to least intense): even-age with intensive harvesting (EA-I), even-age with extensive harvesting, even-age with commercial harvesting, diameter limit, and single-tree harvesting (ST). We anticipated that effects on soil C and N would be greatest for EA-I and least with ST. Tree species exhibited a response to the gradient of disturbance intensity, with early successional species more predominant in high-intensity treatments and late successional species more predominant in low-intensity treatments. Results for soil variables, however, generally did not support our predictions, with few significant differences among treatments and between treatments and their paired controls for any of the measured soil variables. Multiple regression indicated that the best predictors for net nitrification among samples were SOM (positive relationship) and pH (negative relationship). 
This finding confirms the challenge of sustainable management of N-saturated forests.
Abstract:
Research in autophagy continues to accelerate (1), and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose (2,3). There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.
Abstract:
Attentional focus and practice schedules are important components in learning a new skill. For attention, this includes focusing inward or outward; for practice, this includes interference between tasks. Little is known about how the two interact. Four groups, blocked/extraneous (BE), blocked/skill-focused (BS), random/extraneous (RE), and random/skill-focused (RS), practiced 100 trials of golf putting and 64 trials of a key-pressing task while also responding to a random tone that directed attention toward or away from the skill movement. Participants performed immediate and delayed retention tests. Results demonstrated that the BE group had decreased RTE scores compared to the BS group. Immediate retention demonstrated superior scores for blocked practice. Delayed retention demonstrated superior CEVE scores for the extraneous focus. For golf putting, both attention conditions with blocked practice learned faster than the random groups. Posttest scores demonstrated that the random/skill-focused group improved in all putting conditions.
Abstract:
One of the biggest challenges facing researchers trying to empirically test structural or institutional anomie theories is the operationalization of the key concept of anomie. This challenge is heightened by the data constraints involved in cross-national research. As a result, researchers have been forced to rely on surrogate or proxy measures of anomie and indirect tests of the theories. The purpose of this study is to examine an innovative and more theoretically sound measure of anomie and to test its ability to make cross-national predictions of serious crime. Our results are supportive of the efficacy of this construct to explain cross-national variations in crime rates. Nations with the highest rates of structural anomie also have the highest predicted rates of homicide.
Abstract:
A study was designed to collect a database of Iowa feedlot rations to determine effective neutral detergent fiber (NDF) in complete diets from fiber analysis and particle size determination of individual feed ingredients, and to compare this with particle size determination of mixed wet rations. Seventy-one beef finishing total mixed rations were collected by ISU Extension Beef Field Specialists across Iowa. Producers were asked to complete a form assessing the acidosis risk associated with each ration. The average NDF of these diets was 25.9%. Of the total mixed rations, 1.33% remained in the top tray (>0.75 in.), 47.27% remained in the middle tray (>0.31 in.), and 50.88% was smaller than the 0.31 in. screen. The effective NDF (eNDF) calculated from the eNDF of the ingredients averaged 10.56%. Estimated eNDF from total diet NDF and the percentage of the total diet in the top and middle trays averaged 12.47%. The calculated eNDF from non-grain sources alone averaged 3.6%. The percentage of digestive deaths was weakly related to the percentage of the ration in the bottom tray (r = .19), the percentage in the top tray (r = -.46), and the effective NDF of the ration (r = -.23). The percentage of bloat was related to the total NDF of the diet (r = .28) and the effective fiber from non-grain sources (r = -.23). The number of off-feed incidences was related to the dry matter of the ration (r = .38), the apparent eNDF (r = -.28), and the percentage of ration in the bottom tray (r = .24). This study confirms that there is some relationship between effective NDF of the diet, effective NDF from non-grain sources, or diet particle size and acidosis indicators. These relationships are weak, however, indicating that other factors such as feedbunk management, feed processing, feed presentation, and feed mixing likely also play a role in the incidence of acidosis in feedlot cattle.
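The particle-size-based estimate described above can be illustrated with a small calculation: eNDF is approximated as total diet NDF multiplied by the fraction of the ration retained on the top two separator trays. This is a minimal sketch under that assumption, not necessarily the exact formula the study used:

```python
def estimated_endf(total_ndf_pct, top_tray_pct, middle_tray_pct):
    """Approximate effective NDF (%) as total diet NDF times the
    fraction of the ration retained on the top and middle trays
    (the physically effective fraction)."""
    physically_effective_fraction = (top_tray_pct + middle_tray_pct) / 100.0
    return total_ndf_pct * physically_effective_fraction

# Averages reported in the abstract: 25.9% NDF, 1.33% top tray, 47.27% middle tray
print(round(estimated_endf(25.9, 1.33, 47.27), 2))  # prints 12.59, near the 12.47% average reported
```

The small difference from the reported 12.47% average presumably reflects averaging ration-by-ration rather than applying the formula to the overall means.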