892 results for Medicine, Greek and Roman.
Abstract:
Background: The Institute of Medicine estimates that at most 25% of clinical research findings are incorporated into practice by physicians. To improve clinical practice, efforts have been made to promote evidence-based medicine and the use of clinical guidelines. Despite these efforts, the gap between research and clinical practice remains wide. Objective: To systematically review the literature describing the factors that influence the use of clinical research recommendations by American physicians. Hypothesis: Barriers to the application of clinical research in clinical practice exist and are multifactorial. The establishment of the Clinical and Translational Science Awards (CTSA; special federal grants awarded to selected institutions to support clinical and translational research) has reduced the effect of these barriers and improved the translation of clinical research into practice among American physicians. Aims: To identify barriers to and facilitators of the use of research findings in clinical practice by American physicians, and to contrast studies published six years before and after the creation of the CTSA. Methods: Data sources included published literature from Medline, PubMed and PsycINFO. Selected studies had to be qualitative surveys of American clinicians concerning evidence-based medicine practice, clinical guidelines or treatment pathways. Systematic reviews and reports were excluded, as were studies with fewer than 100 respondents. Results: In total, 1036 abstracts were reviewed; 115 potentially relevant full-text articles were identified and reviewed, and 31 studies met all criteria for inclusion in the final review. Conclusions: The barriers to the application of clinical research findings, in the form of clinical guidelines, evidence-based medicine guides and clinical pathways, can be divided broadly into physician barriers, practice/system barriers and patient barriers. Physician barriers are the most common, especially lack of familiarity with guidelines and lack of time. Of the factors that improve the use of research-based guidelines, physician factors such as younger age, shorter duration of clinical practice, specialty training, and practice in large group Health Maintenance Organization (HMO) settings with fewer patients seen were the most commonly cited.
Abstract:
The present study analyzed the differences in throwing distance with heavy and light medicine balls, and in throwing velocity, between handball players of different competitive and professional levels. Likewise, the relationships among three throwing tests of progressive specificity were analyzed: the heavy medicine ball throw (TH), the light medicine ball throw (TL) and throwing velocity (TV). For this purpose, sixty-five professional (P), semiprofessional (S) and non-professional (N) players were evaluated.
Abstract:
Nanotechnology is a recently developed research area that deals with the manipulation and control of matter with dimensions ranging from 1 to 100 nanometers. At the nanoscale, materials exhibit singular physical, chemical and biological phenomena, very different from those manifested at the conventional scale. In medicine, nanosized compounds and nanostructured materials offer improved drug targeting and efficacy with respect to traditional formulations, and reveal novel diagnostic and therapeutic properties. Nevertheless, the complexity of information at the nano level is much higher than at the conventional biological levels (from populations down to the cell); thus, any nanomedical research workflow inherently demands advanced information management. Unfortunately, Biomedical Informatics (BMI) has not yet provided the necessary framework to deal with such information challenges, nor adapted its methods and tools to the new research field.
In this context, the novel area of nanoinformatics aims to build new bridges between medicine, nanotechnology and informatics, allowing the application of computational methods to solve informational issues at the wide intersection between biomedicine and nanotechnology. The above observations determine the context of this doctoral dissertation, which is focused on analyzing the nanomedical domain in depth and on developing nanoinformatics strategies and tools to map across disciplines, data sources, computational resources, and information extraction and text mining techniques, in order to leverage available nanomedical data. The author analyzes, through real-life case studies, some research tasks in nanomedicine that would require or could benefit from the use of nanoinformatics methods and tools, illustrating present drawbacks and limitations of BMI approaches when dealing with data belonging to the nanomedical domain. Three different scenarios, comparing the biomedical and nanomedical contexts, are discussed as examples of activities that researchers would perform while conducting their research: i) searching the Web for data sources and computational resources supporting their research; ii) searching the literature for experimental results and publications related to their research; and iii) searching clinical trial registries for clinical results related to their research. These activities depend on informatics tools and services such as web browsers, databases of citations and abstracts indexing the biomedical literature, and web-based clinical trial registries, respectively.

For each scenario, this document provides a detailed analysis of the potential information barriers that could hamper the successful development of the different research tasks in both fields (biomedicine and nanomedicine), emphasizing the existing challenges for nanomedical research, where the major barriers have been found. The author illustrates how the application of BMI methodologies to these scenarios proves successful in the biomedical domain, whilst the same methodologies present severe limitations when applied to the nanomedical context. To address such limitations, the author proposes an original nanoinformatics approach specifically designed to deal with the special characteristics of information at the nano level. This approach consists of an in-depth analysis of the scientific literature and available clinical trial registries to extract relevant information about experiments and results in nanomedicine (textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.), followed by the development of mechanisms to structure and analyze this information automatically. The analysis resulted in the generation of a gold standard (a manually annotated training or reference set), which was applied to the automatic classification of clinical trial summaries, distinguishing studies focused on nanodrugs and nanodevices from those aimed at testing traditional pharmaceuticals. The present work aims to provide the necessary methods for organizing, curating and validating existing nanomedical data on a scale suitable for decision-making. Similar analyses for other nanomedical research tasks would help to detect which nanoinformatics resources are required to meet current goals in the field, as well as to generate densely populated, machine-interpretable reference datasets from the literature and other unstructured sources for testing novel algorithms and inferring new valuable information for nanomedicine.
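The classification step described in this abstract lends itself to a compact illustration. The sketch below, in Python with scikit-learn, trains a text classifier on a manually annotated gold standard and applies it to an unseen trial summary; the tiny inline corpus, the label convention and the TF-IDF-plus-logistic-regression pipeline are illustrative assumptions, not the dissertation's actual data or method.

```python
# Minimal sketch: separating nano-focused clinical trial summaries from
# traditional ones with a classifier trained on a manually annotated gold
# standard. Corpus, labels and model choice are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Hypothetical gold standard: (trial summary, label) pairs, 1 = nano, 0 = traditional.
gold_standard = [
    ("Phase I study of a liposomal nanoparticle formulation of doxorubicin", 1),
    ("Trial of a dendrimer-based contrast agent for tumour imaging", 1),
    ("Randomized trial of metformin versus placebo in type 2 diabetes", 0),
    ("Open-label study of amoxicillin dosing in paediatric otitis media", 0),
]
texts = [text for text, _ in gold_standard]
labels = [label for _, label in gold_standard]

# Word and bigram TF-IDF features stand in for the textual patterns and
# common vocabulary extracted from the literature and registries.
classifier = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True)),
    ("clf", LogisticRegression(max_iter=1000)),
])
classifier.fit(texts, labels)

# Classify an unseen registry record.
print(classifier.predict(["Safety study of gold nanoshell photothermal therapy"]))
```

With a realistically sized gold standard, held-out evaluation (for example, cross-validation) would quantify how reliably nanodrug and nanodevice studies are separated from traditional pharmaceuticals.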
Abstract:
The terahertz region of the electromagnetic spectrum (100 GHz-10 THz) hosts a wide range of applications such as radio-astronomy, molecular spectroscopy, medicine, security and radar, among others. The main obstacles to the development of these applications are the high production cost of systems working at these frequencies, their high maintenance cost, large volume and low reliability. Among the different THz technologies, Schottky technology plays an important role due to its maturity and the inherent simplicity of these devices. Besides, Schottky diodes can operate at both room and cryogenic temperatures, with high efficiency in multipliers and moderate noise temperature in mixers. This PhD thesis is mainly concerned with the analysis of the physical processes responsible for the electrical response and noise of Schottky diodes, as well as the analysis and design of frequency multipliers and mixers at millimeter and submillimeter wavelengths.

The first part of the thesis deals with the physical phenomena limiting the electrical and noise performance of GaAs and GaN Schottky diodes. To carry out this analysis, a Monte Carlo model of the diode has been used as a reference due to its high accuracy and reliability at millimeter and submillimeter wavelengths. Besides, the Monte Carlo model provides a direct description of the noise spectra of the devices without the need for any additional analytical or empirical model. Physical phenomena like velocity saturation, carrier inertia, the dependence of the electron mobility on the epilayer length, plasma resonance and nonlocal effects in time and space have been analysed. Also, a complete analysis of the current noise spectra of GaAs and GaN Schottky diodes operating under static and time-varying conditions is presented in this part of the thesis. The obtained results provide a better understanding of the electrical and noise responses of Schottky diodes under high-frequency and/or high-electric-field conditions. These results have also helped to determine the limitations of the numerical and analytical models used in the analysis of the electrical and noise responses of these devices.

The second part of the thesis is devoted to the analysis of frequency multipliers and mixers by means of an in-house circuit simulation tool based on the harmonic balance technique. Different lumped-equivalent-circuit, drift-diffusion and Monte Carlo models have been considered in this analysis. The Monte Carlo model coupled to the harmonic balance technique has been used as a reference to evaluate the limitations and range of validity of lumped-equivalent-circuit and drift-diffusion models for the design of frequency multipliers and mixers. A remarkable feature of this simulation tool is that it enables the design of Schottky circuits from both electrical and noise considerations. The simulation results presented in this part of the thesis, for both multipliers and mixers, have been compared with measured results available in the literature. In addition, the Monte Carlo simulation tool allows the analysis and design of circuits above 1 THz.
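As context for the lumped equivalent-circuit models evaluated against the Monte Carlo reference, a Schottky diode's conduction and charge storage are usually described by the textbook thermionic-emission current and junction-capacitance relations below, where I_s is the saturation current, η the ideality factor, q the electron charge, k_B Boltzmann's constant, T the temperature, C_j0 the zero-bias junction capacitance and V_bi the built-in potential. These are standard forms, not formulas quoted from the thesis.

```latex
I(V) \;=\; I_s\!\left[\exp\!\left(\frac{qV}{\eta k_B T}\right) - 1\right],
\qquad
C_j(V) \;=\; \frac{C_{j0}}{\sqrt{1 - V/V_{bi}}}
```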
Abstract:
Nanotechnology is a field of study most often framed as a technological goal: it helps researchers manipulate and precisely control matter at dimensions between 1 and 100 nanometers. The prefix nano comes from the Greek νᾶνος, meaning dwarf, and denotes a factor of 10^-9, which applied to units of length corresponds to one billionth of a meter. We now know that this science makes it possible to work with molecular structures and their atoms, obtaining materials that exhibit physical, chemical and biological phenomena very different from those shown by the same materials at larger scales. In medicine, for example, nanometric compounds and nanostructured materials often offer greater efficacy than traditional chemical formulations; combining old compounds with these new ones has created new therapies, and in some cases has replaced them, revealing new diagnostic and therapeutic properties. At the same time, the complexity of information at the nano level is much greater than at conventional biological levels, and any workflow in nanomedicine therefore inherently requires advanced information management strategies.

Many nanotechnology researchers are looking for ways to obtain information about these nanometric materials in order to improve their studies, which often means testing such methods or creating new compounds to help modern medicine fight powerful diseases such as cancer. Yet it is currently very difficult to find a tool that provides the specific information they seek among the thousands of clinical trials uploaded to the web every day. Biomedical informatics is trying to provide a framework for dealing with these information challenges at the nano level; in this context, the new area of nanoinformatics aims to detect and establish the links between medicine, nanotechnology and informatics, fostering the application of computational methods to solve the questions and problems that arise from the information in the broad intersection between biomedicine and nanotechnology.

A further current need is that many biomedical researchers want to compare the information in nanotechnology-related clinical trial records across registries worldwide, distinguishing trials registered in North America from trials registered in Europe, in order to know whether this field is really being exploited on both continents. The problem is that no tool has existed to estimate what share of all the trials registered on these websites involve nanotechnology. In this master's thesis, the author uses improved text pre-processing together with an algorithm identified as the best text-processing method in a previous doctoral thesis (which benchmarked many such algorithms) to obtain a close estimate that helps differentiate when a clinical trial record contains information about nanotechnology and when it does not. In other words, the work applies an analysis of the scientific literature and of the clinical trial registries available on the two continents to extract relevant information about experiments and results in nanomedicine (textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.), followed by a processing mechanism to structure and analyze that information automatically. The analysis concludes with the aforementioned estimate, which is needed to compare the amount of nanotechnology research on the two continents. A gold standard (a manually annotated training set) is used, and the test set is the entire database of clinical trial records, allowing studies centered on nanodrugs, nanodevices and nanomethods to be distinguished automatically from those aimed at testing traditional pharmaceutical products.
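Once records can be classified, the cross-registry comparison the thesis targets reduces to counting. The sketch below is a minimal, self-contained illustration in Python: a trivial keyword classifier stands in for the trained model, and the registry record lists are hypothetical placeholders.

```python
# Minimal sketch: estimating the percentage of nanotechnology-related trials
# in each registry. The keyword classifier is a trivial stand-in for the
# trained model; record lists are hypothetical placeholders.

class KeywordClassifier:
    """Flags records containing obvious nano vocabulary (stand-in model)."""
    NANO_TERMS = ("nanoparticle", "nanoshell", "nanotube", "liposomal", "dendrimer")

    def predict(self, records):
        return [int(any(term in record.lower() for term in self.NANO_TERMS))
                for record in records]

def nano_share(records, classifier):
    """Return the estimated percentage of nano-focused records (0-100)."""
    if not records:
        return 0.0
    predictions = classifier.predict(records)  # 1 = nano, 0 = traditional
    return 100.0 * sum(predictions) / len(records)

# Hypothetical summaries harvested from, e.g., ClinicalTrials.gov (North
# America) and the EU Clinical Trials Register (Europe).
registries = {
    "North America": ["Phase II trial of a nanoparticle influenza vaccine",
                      "Randomized study of standard chemotherapy regimens"],
    "Europe": ["Observational study of statin adherence in older adults",
               "First-in-human study of an iron-oxide nanoparticle tracer"],
}

model = KeywordClassifier()
for region, records in registries.items():
    print(f"{region}: {nano_share(records, model):.1f}% nano-related")
```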
Abstract:
We thank Donna Wallace and the animal house staff for their help with the animal studies. We thank Pat Bain for help in preparing the figures. This work was supported by the Biotechnology and Biological Sciences Research Council (BBSRC) grant number BB/K001043/1 (G.H., A.W.R., P.N.S., P.J.Mc. and P.J.M.) and the Scottish Government (A.W.R., L.M.T., C.D.M. and P.J.M.).
Abstract:
FUNDING: This work was supported by the Biotechnology and Biological Sciences Research Council [BB/I003746/1 to S.H., BB/M001695/1 to S.H. and Y.N.].
Abstract:
This dissertation examines ancient historiographic citation methodologies in light of Mikhail Bakhtin’s dichotomy between polyphony and monologization. In particular, this dissertation argues that Eusebius of Caesarea’s Historia ecclesiastica (HE) abandons the monologic citation methodology typical of previous Greek and Hellenistic historiography and introduces a polyphonic citation methodology that influences subsequent late-ancient Christian historiography to varying degrees. Whereas pre-Eusebian Greek and Hellenistic historiographers typically use citations to support the single authorial consciousness of the historiographer, Eusebius uses citations to counterbalance his own shortcomings as a witness to past events. Eusebius allows his citations to retain their own voice, even when they conflict with his. The result is a narrative that transcends the point of view of any single individual and makes multiple witnesses, including the narrator, available to the reader. Post-Eusebian late-ancient Christian historiographers exhibit the influence of Eusebius’ innovation, but they are not as intentional as Eusebius in their use of citation methodologies. Many subsequent Christian historiographers use both monologic and polyphonic citation methodologies. Their tendency to follow Eusebius’ practice of citing numerous lengthy citations sometimes emphasizes points of view that oppose the author’s point of view. When an opposing viewpoint surfaces in enough citations, a polyphonic citation methodology emerges. The reader holds the two different narrative strands in tension as the author continues to give voice to opposing viewpoints. After illustrating the citation methodologies with passages from numerous Greek, Hellenistic, and late-ancient Christian historiographers, this dissertation concludes with a short computational analysis that uses natural language processing to reveal some broad trends that highlight the previous findings and suggest a possibility for future research.
Abstract:
Contains notes taken by Harvard student Lyman Spalding from lectures delivered by Hersey Professor of the Theory and Practice of Physic Benjamin Waterhouse (1754-1846) in 1795. The notes cover the history of medicine, theories of contemporary physicians like Herman Boerhaave, William Cullen, and John Brown, and topics like fetal growth, digestion, and circulation. The volume also contains six pages of patient case notes from Spalding’s medical practice in Walpole, New Hampshire, in 1799, which detail the patients’ symptoms and the course of treatment he pursued. In the case of a young man who complained of pain in his breast following a wrestling match, Spalding bled him and prescribed a cathartic of soap and aloes. Spalding also operated on a man who cut off part of his ankle with an ax.
Abstract:
Contains notes taken by Harvard student Lyman Spalding (1775-1821) from lectures on anatomy and surgery delivered by Harvard Professor John Warren (1753-1815) in 1795, as well as a section entitled “Medical Observations,” which includes entries on “Vernal Debility,” or diseases occurring in the spring, and lung function. It is unclear if these are Spalding’s own writings or transcriptions from a published work. There is also text transcribed from “Elementa Medicinae,” published in 1780 by Scottish physician John Brown.
Abstract:
This layer is a digitized geo-referenced raster image of a 1797 map of Maryland and Delaware drawn by D.F. Sotzmann. These Sotzmann maps (10 maps of New England and Mid-Atlantic states) typically portray both natural and manmade features. They are highly detailed with symbols for churches, roads, court houses, distilleries, iron works, mills, academies, county lines, town lines, and more. Relief is usually indicated by hachures and country boundaries have also been drawn. Place names are shown in both German and English and each map usually includes an index to land grants. Prime meridians used for this series are Greenwich and Washington, D.C.
Abstract:
This paper describes the aggregate rural capital markets of the EU and the main differences between the markets of its member countries. The results of our study suggest that the agricultural credit markets are still quite segmented, and the segments are country-specific rather than currency- or region-specific. Financial instability in Europe is also penetrating the agricultural sector, and the variation of interest rates for agricultural credit is increasing across countries. Perhaps the most dramatic signal of growing financial instability is that the financial leverage (gearing rate) of European farms rose in 2008 by almost 4 percentage points, from 14% to 18%. This 4 percentage-point annual rise was twice the 2 percentage-point rise observed during the economic recession of the late 1980s and early 1990s. The distribution of the financial leverage of agriculture across countries does not, however, reflect the distribution of country-specific risk premiums in the manner observed in government bond yields. Therefore, in the countries with the weakest public-sector finances, whose bond markets are encumbered with high country-specific risk premiums, the agricultural sector is not directly exposed to a very large risk of increasing interest rates, since it is not highly leveraged. For example, in Greek and Spanish agriculture the financial leverage (gearing) rate is only 0.6% and 2.2%, respectively, while the highest gearing rates are found elsewhere (in Denmark), reaching 50%.
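For readers outside agricultural finance, the leverage figures can be read with the standard gearing definition below; this is the usual liabilities-to-assets ratio, assumed here rather than quoted from the paper, and it makes clear that the 2008 change is 4 percentage points but roughly a 29% relative increase.

```latex
\text{gearing} \;=\; \frac{\text{total liabilities}}{\text{total assets}} \times 100\%,
\qquad
\Delta_{2008} \;=\; 18\% - 14\% \;=\; 4\ \text{pp}
\quad\Bigl(\tfrac{4}{14} \approx 29\%\ \text{relative increase}\Bigr)
```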
Abstract:
When lung development is not interrupted by premature birth and unaffected by genetic or environmental disturbances, all components develop with complex control to form a functional organ with a predictable timeline during fetal development. In this chapter we describe the relationship between morphological development and function in both physiological and pathological conditions in human lung development. Tree-like growth of the lung begins during the first few weeks postconception, with the embryonic stage characterized by branching morphogenesis in both the airways and blood vessels, separately in the left and right lung buds, which appear near day 26 postcoitus (p.c.). Branching continues through the embryonic stage, with proliferation of mesenchymal and epithelial cells and apoptosis near branch points and in the areas of new formation. The pseudoglandular stage (weeks 5–17 p.c.) is characterized by accelerated cellular proliferation and airway and vascular branching, with epithelial differentiation in proximal and distal airways. Further epithelial differentiation, angiogenesis of the parenchymal capillary network, and the first formation of the air–blood barrier characterize the canalicular stage (16–26 weeks p.c.), just before the completion of branching morphogenesis (saccular stage, weeks 24–38 p.c.) and the start of alveolarization (week 36 through adolescence).
Abstract:
Magmatic fluids, heat fluxes, and fluid/rock interactions associated with hydrothermal systems along spreading centers and convergent margins have a significant impact on the genesis of major sulfide deposits and biological communities. Circulation of hydrothermal fluids is one of the most fundamental processes associated with localized mineralization and is controlled by inherent porous and permeable properties of the ocean crust. Heat from magmatic intrusions drives circulation of seawater through permeable portions of the oceanic crust and upper mantle, discharging at the seafloor as both focused high-temperature (250°-400°C) fluids and diffuse lower-temperature (<250°C) fluids. This complex interaction between the circulating hydrothermal fluids and the oceanic basement greatly influences the physical properties and the composition of the crust (Thompson, 1983; Jacobson, 1992, doi:10.1029/91RG02811; Johnson and Semyan, 1994, doi:10.1029/93JB00717). During Ocean Drilling Program (ODP) Leg 193, 13 holes were drilled in the PACMANUS hydrothermal system (Binns, Barriga, Miller, et al., 2002, doi:10.2973/odp.proc.ir.193.2002). The hydrothermal system consists of isolated hydrothermal deposits lined along the main crest of the Pual Ridge, a 500- to 700-m-high felsic neovolcanic ridge in the eastern Manus Basin. The principal drilling targets were the Snowcap (Site 1188) and Roman Ruins (Site 1189) active hydrothermal fields. Samples from these two sites were used for a series of permeability, electrical resistivity, and X-ray computed tomography measurements.
Abstract:
"Biology and Medicine ; Distributed according to TID-4500(15th Ed.)."