933 results for History of medicine, 19th Century
Abstract:
Internal colonization in Switzerland is often associated with the "battle for cultivation" of the Second World War, but its history is more complex. The food crisis of the First World War formed the horizon of experience that led various actors from industry, consumer protection, the urban population and agriculture to consider practical strategies for managing agricultural production. In the process, traditional spaces, such as rural and urban areas, and economic roles, such as food producer, consumer and trader, overlapped and were to some extent reconceived: people began thinking about utopias and about how a modern society could be designed to be harmonious and resistant to crisis. The aim of this article is to trace key points in this process during the interwar years in neutral Switzerland. The focus is on the mentalities of people in the past, although the relationships between the actors of internal colonization and the state are also considered. Internal colonization in twentieth-century Switzerland can be understood as an open process: in principle the project was driven by private actors, but in times of crisis it was claimed by the state as a possible tool for social and economic intervention. In addition, the planned dissolution of urban and rural spaces shows that interwar societies were on an existential search for ways to overcome the problems of the modern age. Internal colonization can therefore be seen as an attempt to find a third way between a world characterized by an agrarian society and a modern industrial nation.
Abstract:
by H.V. Hilprecht, with the cooperation of [Immanuel] Benzinger [et al.]
Abstract:
BACKGROUND: Several parameters of heart rate variability (HRV) have been shown to predict the risk of sudden cardiac death (SCD) in cardiac patients. There is consensus that risk prediction improves when HRV is measured during specific provocations such as an orthostatic challenge. For the first time, we provide data on the reproducibility of such a test in patients with a history of acute coronary syndrome. METHODS: Sixty male patients (65 ± 8 years) with a history of acute coronary syndrome on stable medication were included. HRV was measured in the supine (5 min) and standing (5 min) positions on 2 occasions separated by two weeks. The time-domain parameters relevant for risk assessment [standard deviation of all R-R intervals (SDNN) and root mean square of successive differences between adjacent R-R intervals (RMSSD)], frequency-domain parameters [low-frequency power (LF), high-frequency power (HF) and the LF/HF power ratio] and the short-term fractal scaling component (DF1) were computed. Absolute reproducibility was assessed with the standard errors of the mean (SEM) and 95% limits of random variation, and relative reproducibility with the intraclass correlation coefficient (ICC). RESULTS: We found comparable SEMs and ICCs in the supine position and after an orthostatic challenge test. All ICCs were good to excellent (ICCs between 0.636 and 0.869). CONCLUSIONS: Reproducibility of HRV parameters during orthostatic challenge is good and comparable with the supine position.
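The time-domain HRV measures named in this abstract have simple definitions. As a minimal illustrative sketch (not the study's actual analysis code, and using an invented sample series of R-R intervals), SDNN and RMSSD can be computed from R-R intervals in milliseconds as follows:

```python
import numpy as np

def sdnn(rr_ms):
    """Standard deviation of all R-R intervals (SDNN), in ms."""
    rr = np.asarray(rr_ms, dtype=float)
    return float(np.std(rr, ddof=1))  # sample standard deviation

def rmssd(rr_ms):
    """Root mean square of successive differences between adjacent R-R intervals (RMSSD), in ms."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)  # successive differences
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical 5-min excerpt of R-R intervals (ms), for illustration only
rr = [800, 810, 790, 805, 795, 820, 800]
print(round(sdnn(rr), 2))   # → 9.94
print(round(rmssd(rr), 2))  # → 17.56
```

In practice these statistics would be computed over the full 5-minute recordings described in the methods, after artifact and ectopic-beat correction.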
Abstract:
Vegetation history for the study region is reconstructed on the basis of pollen, charcoal and AMS 14C investigations of lake sediments from Lago del Segrino (calcareous bedrock) and Lago di Muzzano (siliceous bedrock). Late-glacial forests were characterised by Betula and Pinus sylvestris. At the beginning of the Holocene they were replaced by temperate continental forest and shrub communities. A special type of temperate lowland forest, with Abies alba as the most important tree, was present in the period 8300 to 4500 B.P. Subsequently, Fagus, Quercus and Alnus glutinosa were the main forest components and A. alba ceased to be of importance. Castanea sativa and Juglans regia were probably introduced after forest clearance by fire during the first century A.D. On soils derived from siliceous bedrock, C. sativa was already dominant at ca. A.D. 200 (A.D. dates are in calendar years). In limestone areas, however, C. sativa failed to achieve a dominant role. After the introduction of C. sativa, the main trees were initially oak (Quercus spp.) and later walnut (Juglans regia). Ostrya carpinifolia became the dominant tree around Lago del Segrino only in the last 100–200 years, though it had spread into the area at ca. 5000 cal. B.C. This recent expansion of Ostrya is confirmed at other sites and appears to be controlled by human disturbance, especially clearance. It is argued that these forests should not be regarded as climax communities. It is suggested that under undisturbed succession they would develop into mixed deciduous forests consisting of Fraxinus excelsior, Tilia, Ulmus, Quercus and Acer.
Abstract:
Little is known about the vegetation and fire history of Sardinia, and especially the long-term history of the thermo-Mediterranean belt that encompasses its entire coastal lowlands. A new sedimentary record from a coastal lake, based on pollen, spores, macrofossils and microscopic charcoal analysis, is used to reconstruct the vegetation and fire history of north-eastern Sardinia. During the mid-Holocene (c. 8,100–5,300 cal bp), the vegetation around Stagno di Sa Curcurica was characterised by dense Erica scoparia and E. arborea stands, which were favoured by high fire activity. Fire incidence declined and evergreen broadleaved forests of Quercus ilex expanded at the beginning of the late Holocene. We relate the observed vegetation and fire dynamics to climatic change, specifically moister and cooler summers and drier and milder winters after 5,300 cal bp. Agricultural activities occurred from the Neolithic onwards and intensified after c. 7,000 cal bp. Around 2,750 cal bp, a further decline of fire incidence and Erica communities occurred, while Quercus ilex expanded and open-land communities became more abundant. This vegetation shift coincided with the historically documented beginning of the Phoenician period, which was followed by the Punic and Roman civilizations in Sardinia. The vegetational change at around 2,750 cal bp was possibly favoured by a further shift to moister and cooler summers and drier and milder winters. Triggers for the climate changes at 5,300 and 2,750 cal bp may have been gradual, orbitally induced changes in summer and winter insolation, as well as centennial-scale atmospheric reorganizations. Open evergreen broadleaved forests persisted until the twentieth century, when they were partly replaced by widespread artificial pine plantations. Our results imply that highly flammable Erica vegetation, as reconstructed for the mid-Holocene, could re-emerge as a dominant vegetation type under the increasing drought and fire anticipated under global change conditions.
Abstract:
BACKGROUND/AIMS Controversies still exist regarding the evaluation of growth hormone deficiency (GHD) in childhood at the end of growth. The aim of this study was to describe the natural history of GHD in a pediatric cohort. METHODS This is a retrospective study of a cohort of pediatric patients with GHD. Cases of acquired GHD were excluded. Univariate logistic regression was used to identify predictors of GHD persisting into adulthood. RESULTS Among 63 identified patients, 47 (75%) had partial GHD at diagnosis, while 16 (25%) had complete GHD, including 5 with multiple pituitary hormone deficiencies. At final height, 50 patients underwent repeat stimulation testing; 28 (56%) recovered and 22 (44%) remained growth hormone (GH) deficient. Predictors of persisting GHD were: complete GHD at diagnosis (OR 10.1, 95% CI 2.4-42.1), pituitary stalk defect or ectopic pituitary gland on magnetic resonance imaging (OR 6.5, 95% CI 1.1-37.1), greater height gain during GH treatment (OR 1.8, 95% CI 1.0-3.3), and IGF-1 level < -2 standard deviation scores (SDS) following treatment cessation (OR 19.3, 95% CI 3.6-103.1). In the multivariate analysis, only IGF-1 level < -2 SDS (OR 13.3, 95% CI 2.3-77.3) and complete GHD (OR 6.3, 95% CI 1.2-32.8) were associated with the outcome. CONCLUSION At final height, 56% of adolescents with GHD had recovered. Complete GHD at diagnosis, low IGF-1 levels at retesting, and pituitary malformation were strong predictors of persistence of GHD.
Abstract:
PURPOSE The Geographic Atrophy Progression (GAP) study was designed to assess the rate of geographic atrophy (GA) progression and to identify prognostic factors by measuring the enlargement of the atrophic lesions using fundus autofluorescence (FAF) and color fundus photography (CFP). DESIGN Prospective, multicenter, noninterventional natural history study. PARTICIPANTS A total of 603 participants were enrolled in the study; 413 of those had gradable lesion data from FAF or CFP, and 321 had gradable lesion data from both FAF and CFP. METHODS Atrophic lesion areas were measured by FAF and CFP to assess lesion progression over time. Lesion size assessments and best-corrected visual acuity (BCVA) were conducted at screening/baseline (day 0) and at 3 follow-up visits: month 6, month 12, and month 18 (or early exit). MAIN OUTCOME MEASURES The GA lesion progression rate in disease subgroups and mean change from baseline visual acuity. RESULTS Mean (standard error) lesion size changes from baseline, determined by FAF and CFP, respectively, were 0.88 (0.1) and 0.78 (0.1) mm² at 6 months, 1.85 (0.1) and 1.57 (0.1) mm² at 12 months, and 3.14 (0.4) and 3.17 (0.5) mm² at 18 months. The mean change in lesion size from baseline to month 12 was significantly greater in participants who had eyes with multifocal atrophic spots compared with those with unifocal spots (P < 0.001) and in those with extrafoveal lesions compared with those with foveal lesions (P = 0.001). The mean (standard deviation) decrease in visual acuity was 6.2 ± 15.6 letters for patients with image data available. Atrophic lesions with a diffuse (mean, 0.95 mm²) or banded (mean, 1.01 mm²) FAF pattern grew more rapidly by month 6 than those with the "none" (mean, 0.13 mm²) and focal (mean, 0.36 mm²) FAF patterns. CONCLUSIONS Although differences were observed between mean lesion size measurements obtained with FAF imaging and with CFP, the measurements were highly correlated with one another. Significant differences were found in lesion progression rates in participants stratified by hyperfluorescence pattern subtype. This large GA natural history study provides a strong foundation for future clinical trials.
Abstract:
This talk consists of two parts. The first part deals with what Jeffrey Bolster called "The Changing Nature of Maritime Insurrection" and ties that into research I did on a New York clipper ship called the Contest. The second part looks at the rise of seafarers' missions, tying Roald Kverndal's magnum opus Seamen's Missions: Their Origin and Early Growth into research I did on the earliest work among seafarers in New York, which pre-dated the American Seamen's Friend Society.
Abstract:
Background. The association between a prior history of atopy or other autoimmune diseases and the risk of alopecia areata is not well established. Objective. The purpose of this study was to use the National Alopecia Areata Registry database to further investigate the association between a history of atopy or other autoimmune diseases and the risk of alopecia areata. Methods. A total of 2,613 self-registered sporadic cases (n = 2,055) and controls (n = 558) were included in the present analysis. Results. A history of any atopy (OR = 2.00; 95% CI 1.50-2.54) or autoimmune disease (OR = 1.73; 95% CI 1.10-2.72) was associated with an increased risk of alopecia areata. There was no trend between a history of more than one atopy or autoimmune disease and increasing risk of alopecia areata. Limitations. Recall, reporting, and recruiting bias are potential limitations of this analysis. Conclusion. This analysis revealed that a prior history of atopy and autoimmune disease was associated with an increased risk of alopecia areata, and that the results were consistent for both the severe subtype of alopecia areata (i.e., alopecia totalis and alopecia universalis) and the localized subtype (i.e., persistent alopecia areata).
Abstract:
The natural history of placebo-treated travelers' diarrhea and the prognostic factors of recovery from diarrhea were evaluated using 9 groups of placebo-treated subjects from 9 clinical trials conducted since 1975, for use as a historical control in future clinical trials of antidiarrheal agents. All of these studies were done by the same group of investigators at one site (Guadalajara, Mexico). The studies are similar in terms of population, measured parameters, microbiologic identification of enteropathogens, and definitions of parameters. The studies had two different durations of follow-up: in some, subjects were followed for two days, and in others for five days. Using definitions established by the Infectious Diseases Society of America and the Food and Drug Administration, the following efficacy parameters were evaluated: time to last unformed stool (TLUS), number of unformed stools over the five consecutive days of follow-up after initiation of placebo treatment, microbiologic cure, and improvement of diarrhea. Among the groups that were followed for five days, the mean TLUS ranged from 59.1 to 83.5 hours. Fifty percent to 78% had diarrhea lasting more than 48 hours, and 25% had diarrhea for more than five days. The mean number of unformed stools passed on the first day after initiation of therapy ranged from 3.6 to 5.8, and on the fifth day from 0.5 to 1.5. By the end of follow-up, diarrhea had improved in 82.6% to 90% of the subjects. Subjects with enterotoxigenic E. coli had microbiologic cure rates of 21.6% to 90.0%, and subjects with Shigella species had microbiologic cure rates of 14.3% to 60.0%. To evaluate the prognostic factors of recovery from diarrhea (the primary efficacy parameter in evaluating the efficacy of antidiarrheal agents against travelers' diarrhea), the subjects from five studies were pooled and the Cox proportional hazards model was used to evaluate predictors of prolonged diarrhea. After adjusting for the design characteristics of each trial, fever (rate ratio (RR) 0.40), presence of invasive pathogens (RR 0.41), presence of severe abdominal pain and cramps (RR 0.50), more than five watery stools (RR 0.60), and presence of non-invasive pathogens (RR 0.84) predicted a longer duration of diarrhea, while severe vomiting (RR 2.53) predicted a shorter duration. The number of soft stools, presence of fecal leukocytes, presence of nausea, and duration of diarrhea before enrollment were not associated with duration of diarrhea.
Abstract:
This talk will outline the history of the doctor-patient relationship in the West. It will touch briefly on medicine in Greek and Roman antiquity, using key texts from Hippocrates and Galen. It will also sketch the changing balance of the religious and the secular in medieval medicine. Finally, it will outline the rise of the modern personal doctor-patient relationship in the 18th century and analyze the chronic dissatisfaction that settled over relations between doctors and patients in the last quarter of the 20th century.