946 results for Field study
Abstract:
A cross-sectional field study on the prevalence of Dicrocoelium dendriticum was performed in the Emmental. The study included 211 bovines, 170 equines, 20 ovines, 46 caprines and 23 rabbits (from 119 farms). In addition, routine laboratory diagnostic data from 2,840 animals, all originating from the same area of investigation, were assessed in the same way. Infection rates in the different animal species were as follows: bovines 46%, equines 12%, ovines 30%, caprines 48% and rabbits 9%. Univariate analyses of baseline epidemiological data identified no significant risk factors except the type of stable used: bovines kept in modern free-range stables had a significantly lower chance of infection with D. dendriticum than cattle in conventional tie stalls. The epidemiological data characterizing the area of investigation suggest the following procedure to reduce the problem of dicrocoeliosis: pastured animals of all ages should be regularly dewormed (e.g. every six weeks during the pasture season) with a compound effective against D. dendriticum. Treatment is especially indicated after the pasture season in autumn or before housing the animals for winter. In spring, only animals that were pastured the previous year need to be treated before being turned out again. However, before initiating a strategic deworming campaign it is recommended to perform an economic analysis comparing the costs of treatment with the putative costs of damage, and animal welfare aspects have to be considered. The routine laboratory diagnostic data showed infection rates similar to those of the cross-sectional study: bovines 60%, equines 24%, ovines 26%, caprines 31%, rabbits 32%. Atypical hosts such as dogs and cats exhibited low infection rates (3% and 1%, respectively), probably reflecting gastro-intestinal passage of parasite eggs ingested with infected livers or through coprophagy of ruminant faeces.
Abstract:
A double-blinded, randomised, placebo-controlled field study of the influence of prostaglandin E2 (PGE2) on cattle at parturition was carried out. The extent of cervical opening and the intensity of labour were scored before administration of the compound and 10 minutes later; routine birth assistance was then continued by the veterinarian. Successful birth occurred more quickly in the cows treated with PGE2. The extent of cervical opening before the administration of the drug had a significant effect on the time to delivery, but the intensity of labour and a concomitant infusion of calcium did not have significant effects on this period. The less open the cervix before administration of the drug, the more the duration of parturition differed between the two groups, with the placebo group taking longer. A telephone follow-up inquiry found no significant differences between the cows postpartum; there were cases of mastitis and hypocalcaemia in both groups. The incidence of retained fetal membranes and the mortality of the calves were higher in the placebo group, but in neither case was the difference significant.
Abstract:
Switchgrass (Panicum virgatum L.) is a perennial grass holding great promise as a biofuel resource. While Michigan's Upper Peninsula has an appropriate land base and climatic conditions, there is little research exploring the possibilities of switchgrass production there. The overall objectives of this research were to investigate switchgrass establishment at the northern edge of its distribution by: (1) examining the effects of competition on the germination and establishment of switchgrass through the developmental and competitive characteristics of Cave-in-Rock switchgrass and large crabgrass (Digitaria sanguinalis L.) in Michigan's Upper Peninsula; and (2) determining the optimum planting depths and timing for switchgrass in Michigan's Upper Peninsula. For the competition study, a randomized complete block design was installed in June 2009 at two locations in Michigan's Upper Peninsula. Four treatments (0, 1, 4, and 8 plants/m2) of crabgrass were planted with one switchgrass plant. Switchgrass biomass produced in year one differed significantly as a function of crabgrass weed pressure, whereas biomass produced in year two showed no significant relationship with previous crabgrass weed pressure. Switchgrass biomass also differed significantly between years one and two. For the depth and timing study, a completely randomized design was installed at two locations in Michigan's Upper Peninsula on seven planting dates (three in fall 2009 and four in spring 2010); 25 seeds were planted 2 cm apart along 0.5 m rows at depths of 0.6 cm, 1.3 cm, and 1.9 cm. Emergence and biomass yields were compared by planting date and depth. A greenhouse seeding experiment was established using the same planting depths and parameters as the field study, and the number of seedlings was tallied daily for 30 days. There was a significant difference in survivorship between the fall and spring planting dates, with spring plantings being more successful. Among the four spring planting dates, there was a significant difference between May and June in emergence and biomass yield; June planting dates had the highest percent emergence and total survivorship. There was no significant difference among planting depths of 0.6 cm, 1.3 cm, and 1.9 cm. In conclusion, switchgrass showed no sign of a legacy effect of year-one competition on biomass production, but an antagonistic effect of increasing weed pressure on switchgrass biomass yield during the establishment period was observed. Switchgrass in Michigan's Upper Peninsula should therefore be planted in spring, within the first two weeks of June, at any depth from 0.6 cm to 1.9 cm.
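As a generic illustration of how biomass differences under a randomized complete block design can be tested, not the thesis's actual analysis, the sketch below uses hypothetical file and column names.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical long-format data: one row per switchgrass plant.
# Assumed columns: location, block, crabgrass_density (0, 1, 4 or 8 plants/m2),
# biomass_g (year-one switchgrass biomass). None of these come from the study.
df = pd.read_csv("switchgrass_competition.csv")

# Randomized complete block ANOVA: crabgrass density as the treatment factor,
# with location and block-within-location as additive blocking terms.
model = smf.ols(
    "biomass_g ~ C(crabgrass_density) + C(location) + C(block):C(location)",
    data=df,
).fit()
print(anova_lm(model, typ=2))
```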
Abstract:
The subject covered by this paper is based upon a field study made during a six-week stay at Jardine. The work began on June 19, 1937 and ended on July 31 of the same year.
Evolutionary demography of long-lived monocarpic perennials: a time-lagged integral projection model
Abstract:
1. The evolution of flowering strategies (when and at what size to flower) in monocarpic perennials is determined by balancing current reproduction against expected future reproduction, both of which are largely determined by size-specific patterns of growth and survival. However, because of the difficulty of following long-lived individuals throughout their lives, this theory has largely been tested on short-lived species (< 5 years). 2. Here, we tested this theory using the long-lived monocarpic perennial Campanula thyrsoides, which can live up to 16 years. We used a novel approach that combined permanent-plot and herb-chronology data from a 3-year field study to parameterize and validate integral projection models (IPMs). 3. As in other monocarpic species, the rosette leaves of C. thyrsoides wither over winter, so size cannot be measured in the year of flowering. We therefore extended the existing IPM framework to incorporate an additional time delay that arises because flowering demography must be predicted from rosette size in the year before flowering. 4. We found that all main demographic functions (growth, survival probability, flowering probability and fecundity) were strongly size-dependent and there was a pronounced threshold size for flowering. There was good agreement between the distribution of flowering ages predicted by the IPMs and that estimated in the field, and mostly good agreement between the IPM predictions and direct quantitative field measurements of the demographic parameters λ (population growth rate), R0 (net reproductive rate) and T (generation time). We therefore conclude that the model captures the main demographic features of the field populations. 5. Elasticity analysis indicated that changes in the survival and growth function had the largest effect (c. 80%) on λ, considerably larger than in short-lived monocarps. We found only weak selection pressure operating on the observed flowering strategy, which was close to the predicted evolutionarily stable strategy. 6. Synthesis. The extended IPM accurately described the demography of a long-lived monocarpic perennial using data collected over a relatively short period. We show that the evolution of flowering strategies in short- and long-lived monocarps seems to follow the same general rules, but with a longevity-related emphasis on survival over fecundity.
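The abstract names an integral projection model extended with a one-year time lag. The following is a hedged sketch in generic IPM notation, not the authors' exact formulation: n(x, t) is the density of rosettes of size x in year t, s is survival, g the growth kernel, p_f the flowering probability, f fecundity and c_0 the offspring size distribution; in the lagged variant, flowering and fecundity are evaluated on size in the year before flowering.

```latex
% Standard (unlagged) monocarp IPM: plants either survive without flowering and
% grow, or flower, set seed and die.
n(y, t+1) = \int_{\Omega} \Big\{ s(x)\,[1 - p_f(x)]\, g(y \mid x)
          + p_f(x)\, f(x)\, c_0(y) \Big\}\, n(x, t)\, \mathrm{d}x

% Time-lagged variant sketched from the abstract: rosette leaves wither before
% flowering, so flowering demography is driven by size x recorded in the year
% before flowering, giving the reproduction term a one-year delay.
n(y, t+1) = \int_{\Omega} s(x)\,[1 - p_f(x)]\, g(y \mid x)\, n(x, t)\, \mathrm{d}x
          + \int_{\Omega} s(x)\, p_f(x)\, f(x)\, c_0(y)\, n(x, t-1)\, \mathrm{d}x
```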
Abstract:
Proliferative kidney disease (PKD) is a temperature-dependent disease caused by the myxozoan Tetracapsuloides bryosalmonae. It is an emerging threat to wild brown trout Salmo trutta fario populations in Switzerland. Here we examined (1) how PKD prevalence and pathology in young-of-the-year (YOY) brown trout relate to water temperature, (2) whether wild brown trout can completely recover from T. bryosalmonae-induced renal lesions and eliminate T. bryosalmonae over the winter months, and (3) whether the rate and/or extent of this recovery is influenced by concurrent infection. A longitudinal field study on a wild brown trout cohort was conducted over 16 mo. YOY and age 1+ fish were sampled from 7 different field sites with various temperature regimes and monitored for infection with T. bryosalmonae and the nematode Raphidascaris acus. T. bryosalmonae was detectable in brown trout YOY from all sampling sites, with similar renal pathology, independent of water temperature. During the winter months, recovery was mainly influenced by the presence or absence of concurrent infection with R. acus larvae. While brown trout without R. acus regenerated completely, concurrently infected brown trout showed incomplete recovery, with chronic renal lesions and incomplete translocation of T. bryosalmonae from the renal interstitium into the tubular lumen. Water temperature seemed to influence complete excretion of T. bryosalmonae, with spores remaining in trout from summer-warm rivers but absent in trout from summer-cool rivers. In the following summer months, we found PKD infections in 1+ brown trout from all investigated river sites. The pathological lesions indicated reinfection rather than proliferation of remaining T. bryosalmonae. However, disease prevalence in 1+ trout was lower than in YOY.
Abstract:
Target difficulty is often argued to increase performance. While this association is well established in experimental research, the empirical evidence from field research is mixed. We attempt to explain this inconsistency by analyzing the importance of intra-year target revisions, which are especially prevalent in real-world field settings. Using survey and archival data from 97 firms, we find that firms with more challenging business unit targets revise targets more often, consistent with asymmetric, downward target revisions. Results further show that the degree to which targets are revised during a period has negative effects on firm performance, as the anticipation of revision weakens business unit management's performance incentives. Additionally, we find that using targets predominantly for either decision-making or control influences the overall performance effects of target revisions. Our findings may partially explain the mixed field-study evidence regarding the effects of target difficulty.
Abstract:
Background: We previously found good psychometric properties of the Inventory for the assessment of stress management skills (German: Inventar zur Erfassung von Stressbewältigungsfertigkeiten; ISBF), a short questionnaire for the combined assessment of different perceived stress management skills in the general population. Here, we investigate whether stress management skills as measured by the ISBF relate to cortisol stress reactivity in two independent studies, a laboratory study (study 1) and a field study (study 2). Methods: 35 healthy non-smoking and medication-free men in study 1 (age mean ± SEM: 38.0 ± 1.6 years) and 35 male and female employees in study 2 (age mean ± SEM: 32.9 ± 1.2 years) underwent an acute standardized psychosocial stress task combining public speaking and mental arithmetic in front of an audience. We assessed stress management skills (ISBF) and measured salivary cortisol before and after stress and several times up to 60 min (study 2) or 120 min (study 1) thereafter. Potential confounders were controlled. Results: General linear models controlling for potential confounders revealed that, in both studies, higher stress management skills (ISBF total score) were independently associated with lower cortisol levels before and after stress (main effects of ISBF: ps < .055) and lower cortisol stress reactivity (ISBF-by-stress interactions: ps < .029). Post-hoc testing of ISBF subscales suggests lower cortisol stress reactivity with higher "relaxation abilities" (both studies) and higher scores on the "cognitive strategies and problem solving" scale (study 2). Conclusions: Our findings suggest blunted cortisol increases following stress with increasing stress management skills as measured by the ISBF. The ISBF thus relates not only to subjective psychological but also to objective physiological stress indicators, which may further underscore the validity of the questionnaire.
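The type of model described, a general linear model with an ISBF-by-stress interaction on repeated cortisol samples, can be written schematically as below; this is an illustrative specification, not the authors' exact one, and the covariate vector z_i stands in for the unspecified confounders.

```latex
% Schematic repeated-measures model; i indexes participants, t indexes saliva
% sampling times around the stress task. The interaction coefficient beta_3
% corresponds to the "ISBF-by-stress" effect reported in the abstract.
\mathrm{Cortisol}_{it} = \beta_0
  + \beta_1\,\mathrm{ISBF}_i
  + \beta_2\,\mathrm{Time}_t
  + \beta_3\,(\mathrm{ISBF}_i \times \mathrm{Time}_t)
  + \boldsymbol{\gamma}^{\top}\mathbf{z}_i
  + \varepsilon_{it}
```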
Abstract:
An understanding of interruptions in healthcare is important for the design, implementation, and evaluation of health information systems and for the management of clinical workflow and medical errors. The purpose of this study is to identify and classify the types of interruptions experienced by Emergency Department (ED) nurses working in a Level One Trauma Center. This was an observational field study of Registered Nurses (RNs) employed in a Level One Trauma Center, using the shadowing method. Results of the study indicate that nurses were both recipients and initiators of interruptions. Telephones, pagers, and face-to-face conversations were the most common sources of interruptions. Unlike other industries, the healthcare community has not systematically studied interruptions in clinical settings to determine and weigh the necessity of an interruption against its sometimes negative results, such as medical errors, decreased efficiency, and increased costs. The study presented here is an initial step toward understanding the nature, causes, and effects of interruptions, and thereby improving both the quality of healthcare and patient safety. We developed an ethnographic data collection technique and a data coding method for capturing and analyzing interruptions. The interruption data we collected are systematic, comprehensive, and close to exhaustive. They confirmed the findings of earlier studies by other researchers that interruptions are frequent events in critical care and other healthcare settings. We are currently using these data to analyze the workflow dynamics of ED clinicians, to identify bottlenecks in information flow, and to develop interventions to improve the efficiency of emergency care through the management of interruptions.
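A data coding method for interruptions typically records, for each observed event, who was interrupted, by what, and during which task. The record layout below is purely hypothetical and is offered only to illustrate what such a coded event might contain; it does not reproduce the study's actual coding scheme.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical coding record for one observed interruption event.
@dataclass
class InterruptionEvent:
    timestamp: datetime        # when the interruption began
    role: str                  # role of the observed clinician, e.g. "RN"
    source: str                # e.g. "telephone", "pager", "face-to-face"
    initiated_by_nurse: bool   # True if the nurse initiated the interruption
    primary_task: str          # task in progress when the interruption occurred
    duration_s: float          # length of the interruption in seconds
```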
Abstract:
An understanding of interruptions in healthcare is important for the design, implementation, and evaluation of health information systems and for the management of clinical workflow and medical errors. The purpose of this study is to identify and classify the types of interruptions experienced by ED nurses working in a Level One Trauma Center. This was an observational field study of Registered Nurses employed in a Level One Trauma Center, using the shadowing method. Results of the study indicate that nurses were both recipients and initiators of interruptions. Telephones, pagers, and face-to-face conversations were the most common sources of interruptions. Unlike other industries, healthcare has not systematically studied the outcomes of interruptions, such as medical errors, decreased efficiency, and increased costs. The study presented here is an initial step toward understanding the nature, causes, and effects of interruptions and toward developing interventions to manage interruptions and thereby improve healthcare quality and patient safety. We developed an ethnographic data collection technique and a data coding method for capturing and analyzing interruptions. The interruption data we collected are systematic, comprehensive, and close to exhaustive. They confirmed the findings of earlier studies by other researchers that interruptions are frequent events in critical care and other healthcare settings. We are currently using these data to analyze the workflow dynamics of ED clinicians, identify bottlenecks in information flow, and develop interventions to improve the efficiency of emergency care through the management of interruptions.
Abstract:
The purpose of this prospective observational field study was to present a model for measuring energy expenditure among nurses and to determine whether the energy expenditure of nurses providing direct care to adult patients on general medical-surgical units in two major metropolitan hospitals differed from a recommended energy expenditure of 3.0 kcal/minute over 8 hours. One-third of the predicted cycle-ergometer VO2max for the study population was used to calculate the recommended energy expenditure. Two methods were used to measure energy expenditure among participants during an 8-hour day shift. First, the Energy Expenditure Prediction Program (EEPP) developed by the University of Michigan Center for Ergonomics was used to calculate energy expenditure from observational activity recordings (OEE; n = 39). The second method used ambulatory electrocardiography and the heart rate-oxygen consumption relationship to measure energy expenditure (HREE; n = 20). It was concluded that energy expenditure among nurses can be estimated using the EEPP. Using classification systems from previous research, the workload of the study population was categorized as "moderate" but was significantly less (p = 0.021) than 3.0 kcal/minute over 8 hours, i.e. one-third of the predicted VO2max. In addition, the relationships between OEE, body-part discomfort (BPCDS) and mental workload (MWI) were evaluated. The relationships between OEE/BPCDS and OEE/MWI were not significant (p = 0.062 and 0.091, respectively). Among the study population, body-part discomfort increased significantly for the upper arms, mid-back, lower back, legs and feet by mid-shift; by the end of the shift, the increase was also significant for the neck and thighs. The study also documented a comprehensive list of nursing activities. Among the most important findings were that the study population spent 23% of the workday in a bent posture, walked an average of 3.14 miles, and spent two-thirds of the shift on activities other than direct patient care, such as paperwork and communicating with other departments. A discussion of the ergonomic implications of these findings is provided.
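For orientation only: a benchmark expressed as one-third of VO2max can be converted to kcal/minute using the common approximation of roughly 5 kcal per litre of oxygen consumed. The VO2max value below is purely hypothetical and is not taken from the study.

```latex
% Illustrative conversion, assuming ~5 kcal per litre of O2 (a standard
% approximation) and a hypothetical predicted VO2max of 1.8 L O2/min:
\text{recommended rate}
  \approx \frac{\dot{V}\mathrm{O}_{2\max}}{3}\times 5\ \frac{\text{kcal}}{\text{L O}_2}
  = \frac{1.8}{3}\times 5
  \approx 3.0\ \text{kcal/min}
```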
Abstract:
Evidence-based decisions on indicated prevention in early psychosis require large-scale studies of the pathways to care in high-risk subjects. EPOS (the European Prediction of Psychosis Study), a prospective, multi-center, naturalistic field study in four European countries (Finland, Germany, the Netherlands and England), was designed to acquire accurate knowledge about pathways to care and the delay in obtaining specialized high-risk care. Our high-risk sample (n = 233) reported on average 2.9 help-seeking contacts, with an average delay of 72.6 weeks between the onset of relevant problems and the initial help-seeking contact, and of 110.9 weeks between the initial help-seeking contact and reaching specialized high-risk care. This resulted in a total estimated duration of unrecognized risk for psychosis of about 3½ years. Across the EPOS EU regions, about 90% of care-pathway contacts were within professional health care sectors. Between EPOS regions, differences in pathway parameters, including early detection and health-care systems, were often very pronounced. High-risk participants who later made the transition to a full psychotic disorder had significantly longer delays between initial help-seeking and receiving appropriate interventions. Our study underlines the need for regionally adapted implementation of early detection and intervention programs within the respective mental health and health care networks, including enhancing public awareness of early psychosis.
Abstract:
Ethiopia has long been one of the world's most food-insecure countries. Efforts by the government and a multitude of sponsors, including NGOs, have developed an array of institutions and instruments to mitigate the negative impact of production and supply disruptions. Public stockpiles are one such tool, the use of which is rapidly increasing worldwide. This brief field study examines Ethiopian policies and practice in context, including the various instruments operated by farmers, processors and traders. The study finds that the multiple objectives assigned to food reserves, as well as the present management structure, may not be well suited to a time of high world market prices and dwindling international food aid, while the international regulatory trade and investment environment remains, from a global food security perspective, a matter of unfinished business. A comprehensive study of the various options for improvement would lay out policy alternatives for public authorities and stakeholders.
Abstract:
Seed production, seed dispersal, and seedling recruitment are integral to forest dynamics, especially in masting species. They are often studied separately, yet scarcely ever for species with ballistic dispersal, even though this mode of dispersal is common in legume trees of tropical African rain forests. Here, we studied two dominant main-canopy tree species, Microberlinia bisulcata and Tetraberlinia bifoliolata (Caesalpinioideae), in 25 ha of primary rain forest at Korup, Cameroon, during two successive masting events (2007/2010). In the vicinity of c. 100 and 130 trees of each species, 476/580 traps caught dispersed seeds, and beneath the tree crowns c. 57,000 pod valves per species were inspected to estimate tree-level fecundity. Seed production increased non-linearly and asymptotically with increasing stem diameter. It was unequal within the two species' populations and differed strongly between years, fostering both spatial and temporal patchiness in seed rain. M. bisulcata trees could begin seeding at 42–44 cm diameter: at a much larger size than T. bifoliolata (25 cm). Nevertheless, per capita lifetime reproductive capacity was c. five times greater in M. bisulcata than in T. bifoliolata, owing to the former's larger adult stature, lower mortality rate (despite a shorter lifetime) and smaller seed mass. The two species displayed strong differences in their dispersal capabilities. Inverse modelling (IM) revealed that dispersal of M. bisulcata was best described by a lognormal kernel. Most seeds landed at 10–15 m from stems, with 1% of them going beyond 80 m (< 100 m). Direct estimates of fecundity significantly improved the fitted models. The lognormal kernel also described well the seedling recruitment distribution of this species in 121 ground plots. By contrast, the lower intensity of masting and the more limited dispersal of the heavier-seeded T. bifoliolata prevented reliable IM; for this species, seed density as a function of distance to traps suggested a maximum dispersal distance of 40–50 m, and a correspondingly more aggregated seedling recruitment pattern ensued than for M. bisulcata. From this integrated field study, we conclude that the reproductive traits of M. bisulcata give it a considerable advantage over T. bifoliolata by dispersing more seeds per capita to more suitable establishment sites; combined with other key traits, they explain its local dominance in the forest. Understanding the linkages between size at onset of maturity, individual fecundity, and dispersal capability can better inform the life-history strategies, and hence management, of co-occurring tree species in tropical forests.
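For readers unfamiliar with inverse modelling of seed dispersal, the following is a generic sketch rather than the authors' exact parameterization: the expected seed count in a trap is the summed contribution of all mapped adults, each weighted by its fecundity and by a distance-dependent kernel; one common two-dimensional lognormal kernel is shown.

```latex
% Expected seed count in trap j (area A) from adult trees i at distances r_{ij},
% with per-tree fecundities Q_i; f is a two-dimensional dispersal kernel.
\mathbb{E}[S_j] = A \sum_i Q_i \, f(r_{ij})

% One common lognormal parameterization (location a, shape b), normalized so that
% \int_0^\infty 2\pi r\, f(r)\, dr = 1:
f(r) = \frac{1}{(2\pi)^{3/2}\, b\, r^{2}}
       \exp\!\left(-\frac{\left[\ln(r/a)\right]^{2}}{2 b^{2}}\right)
```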
Abstract:
Background. In the field of information technology (IT), time pressure is common. Working to tight deadlines together on the same task increases the risk of social stressors, i.e. tensions and conflicts at work. Purpose. This field study tested the associations of both time pressure and social stressors with blood pressure during work. Method. Seven employees, the staff of a small IT enterprise, participated in repeated ambulatory blood pressure measurements over the course of one week. Time pressure and social stressors at work were assessed by questionnaire at the beginning of the study. Results. Multilevel regression analyses of 138 samples revealed that higher levels of time pressure were related to marginally significant increases in mean arterial blood pressure at noon and in the afternoon. In addition, higher levels of social stressors at work were significantly associated with elevated mean arterial pressure in the afternoon. Conclusion. The findings support the view that threats to the social self play an important role in occupational health.
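As a minimal sketch of a multilevel (mixed-effects) regression of repeated ambulatory blood pressure readings on person-level stressor scores, assuming a hypothetical data file and column names that are not from the study:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per ambulatory reading.
# Assumed columns: employee, map_mmHg (mean arterial pressure),
# time_pressure, social_stressors, daytime ("morning"/"noon"/"afternoon").
df = pd.read_csv("ambulatory_bp.csv")

# Random intercept per employee accounts for readings nested within persons;
# person-level stressor scores enter as fixed effects, interacted with time of day.
model = smf.mixedlm(
    "map_mmHg ~ (time_pressure + social_stressors) * C(daytime)",
    data=df,
    groups=df["employee"],
)
result = model.fit()
print(result.summary())
```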