861 results for 370106 Sociological Methodology and Research Methods


Relevance: 100.00%

Abstract:

Blowing and drifting snow is a major concern for transportation efficiency and road safety in regions where these phenomena are common. One common way to mitigate snow drift on roadways is to install plastic snow fences. Correct design of snow fences is critical in the US Midwest and other regions affected by large winter snow events: it keeps roads safe and open during winter and minimizes the costs associated with snow accumulation on roads and subsequent road repair. Protection against snow drifting is especially important for road safety in regions with narrow rights of way, where standard fences cannot be deployed at the recommended distance from the road. Designing snow fences requires sound engineering judgment and a thorough evaluation of the potential for snow blowing and drifting at the construction site. The evaluation includes site-specific design parameters typically obtained with semi-empirical relations characterizing the local transport conditions. Among the critical parameters involved in fence design and in assessing post-construction efficiency is the quantification of snow accumulation at fence sites. The present study proposes a joint experimental and numerical approach to monitor snow deposits around snow fences, quantitatively estimate snow deposits in the field, assess the efficiency of snow fences, and improve their design. Snow deposit profiles were mapped using GPS-based real-time kinematic (RTK) surveys conducted at the monitored field site during and after snow storms. The monitored site allowed testing of different snow fence designs under close-to-identical conditions over four winter seasons. The study also describes the detailed monitoring system and analyzes the weather forecasts and meteorological conditions at the monitored sites. 
A main goal of the present study was to assess the performance of lightweight plastic snow fences with a lower porosity than the typical 50% porosity used in standard designs of such fences. The field data collected during the first winter were used to identify the best design for snow fences with a porosity of 50%. Flow fields obtained from numerical simulations showed that the fence design that performed best during the first winter induced the formation of an elongated area of low velocity magnitude close to the ground. This information was used to identify other candidates for the optimum design of fences with a lower porosity. Two designs with a fence porosity of 30% that were found to perform well in numerical simulations were tested in the field during the second winter, along with the best-performing design for fences with a porosity of 50%. Field data showed that the length of the snow deposit away from the fence was reduced by about 30% for the two proposed lower-porosity (30%) fence designs compared to the best design identified for fences with a porosity of 50%. Moreover, one of the lower-porosity designs tested in the field showed no significant snow deposition within the bottom gap region beneath the fence. Thus, a major outcome of this study is the recommendation to use plastic snow fences with a porosity of 30%. It is expected that this lower-porosity design will continue to work well for even more severe snow events or for successive snow events occurring during the same winter. The approach advocated in the present study allowed general recommendations to be made for optimizing the design of lower-porosity plastic snow fences. This approach can be extended to improve the design of other types of snow fences. Some preliminary work on living snow fences is also discussed. 
Another major contribution of this study is to propose, develop protocols for, and test a novel technique based on close-range photogrammetry (CRP) to quantify the snow deposits trapped by snow fences. As image data can be acquired continuously, the time evolution of the volume of snow retained by a snow fence during a storm, or during a whole winter season, can in principle be obtained. Moreover, CRP is a non-intrusive method that eliminates the need to perform manual measurements during storms, which are difficult and sometimes dangerous. Presently, there is considerable empiricism in the design of snow fences, due to the lack of data on fence storage capacity, on how snow deposits change with fence design and snow storm characteristics, and on the estimation of the main parameters used by state DOTs to design snow fences at a given site. The availability of such information from CRP measurements should provide critical data for evaluating the performance of a given snow fence design tested by IDOT. As part of the present study, the novel CRP method was tested at several sites. The present study also discusses some attempts and preliminary work to determine the snow relocation coefficient, one of the main variables that has to be estimated by IDOT engineers when using the standard snow fence design software (Snow Drift Profiler, Tabler, 2006). Our analysis showed that standard empirical formulas did not produce reasonable values when applied at the Iowa test sites monitored as part of the present study, and that simple methods to estimate this variable are not reliable. The present study makes recommendations for the development of a new methodology based on Large-Scale Particle Image Velocimetry that can directly measure snow drift fluxes and the amount of snow relocated by the fence.
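The RTK surveys above yield profiles from which deposit size is estimated. As a minimal illustrative sketch (not the study's actual processing chain), once a surveyed profile has been reduced to distance-from-fence and snow-depth pairs, the cross-sectional deposit area per metre of fence can be obtained by trapezoidal integration; all numbers below are hypothetical.

```python
# Illustrative sketch: estimate the cross-sectional area of a snow deposit
# from an RTK-surveyed profile. Assumes the survey has already been reduced
# to (distance-from-fence, snow-depth) pairs; all numbers are hypothetical.

def deposit_area(distance_m, depth_m):
    """Trapezoidal integration of snow depth over distance (m^2 per metre of fence)."""
    area = 0.0
    for i in range(1, len(distance_m)):
        dx = distance_m[i] - distance_m[i - 1]
        area += 0.5 * (depth_m[i] + depth_m[i - 1]) * dx
    return area

# Hypothetical deposit profile downwind of a fence
x = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]   # distance from fence, m
h = [0.0, 0.4, 0.7, 0.5, 0.2, 0.0]    # snow depth, m
print(round(deposit_area(x, h), 2))   # -> 3.6
```

Multiplying such cross-sections by fence length (or integrating them along the fence) gives an estimate of the trapped snow volume.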

Relevance: 100.00%

Abstract:

OBJECTIVE: The aim of this study was to assess the implementation process and economic impact of a new pharmaceutical care service provided since 2002 by pharmacists in Swiss nursing homes. SETTING: 42 nursing homes located in the canton of Fribourg, Switzerland, under the responsibility of 22 pharmacists. METHOD: We developed different facilitators, such as a monitoring system, a coaching program, and a research project, to help pharmacists change their practice and to improve implementation of this new service. We evaluated the implementation rate of the service delivered in nursing homes. We assessed the economic impact of the service since its start in 2002 by statistical evaluation (Chow test) with retrospective analysis of the annual drug costs per resident over an 8-year period (1998-2005). MAIN OUTCOME MEASURES: The description of the facilitators and their role in the implementation of the service; the economic impact of the service since its start in 2002. RESULTS: In 2005, after a 4-year implementation period supported by the introduction of facilitators of practice change, all 42 nursing homes (2,214 residents) had implemented the pharmaceutical care service. The annual drug costs per resident decreased by about 16.4% between 2002 and 2005; this change proved to be highly significant. The performance of the pharmacists improved continuously through a specific coaching program including an annual expert comparative report, working groups, interdisciplinary continuing education symposia, and individual feedback. This research project also determined priorities for developing practice guidelines to prevent drug-related problems in nursing homes, especially in relation to the use of psychotropic drugs. CONCLUSION: The pharmaceutical care service was fully and successfully implemented in Fribourg's nursing homes within a period of 4 years. 
These findings highlight the importance of facilitators designed to assist pharmacists in the implementation of practice changes. The economic impact was confirmed on a large scale, and priorities for clinical and pharmacoeconomic research were identified in order to continue to improve the quality of integrated care for the elderly.
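The Chow test mentioned above checks whether a regression of annual cost on time has a structural break at the 2002 service start. A minimal sketch with numpy, using hypothetical cost figures (the study's actual data are not reproduced here):

```python
import numpy as np

def rss(t, y):
    """Residual sum of squares of an ordinary least-squares line y ~ a + b*t."""
    X = np.column_stack([np.ones_like(t), t])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def chow_f(t, y, break_idx, k=2):
    """Chow F statistic for a structural break before index break_idx."""
    pooled = rss(t, y)
    split = rss(t[:break_idx], y[:break_idx]) + rss(t[break_idx:], y[break_idx:])
    n = len(t)
    return ((pooled - split) / k) / (split / (n - 2 * k))

# Hypothetical annual drug costs per resident (1998-2005): rising trend
# before the 2002 service start, falling trend afterwards.
years = np.arange(1998, 2006, dtype=float)
costs = np.array([100.0, 102.0, 104.0, 106.0, 98.0, 92.0, 88.0, 84.0])
F = chow_f(years, costs, break_idx=4)
print(F > 10)  # clear break at 2002 -> True
```

With the split at 1998-2001 vs 2002-2005 and k = 2 regression parameters, the statistic is compared against an F(k, n - 2k) distribution to judge significance.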

Relevance: 100.00%

Abstract:

ABSTRACT: BACKGROUND: The psychiatric arm of the population-based CoLaus study (PsyCoLaus) is designed to: 1) establish the prevalence of threshold and subthreshold psychiatric syndromes in the 35 to 66 year-old population of the city of Lausanne (Switzerland); 2) test the validity of postulated definitions for subthreshold mood and anxiety syndromes; 3) determine the associations between psychiatric disorders, personality traits and cardiovascular diseases (CVD); and 4) identify genetic variants that can modify the risk for psychiatric disorders and determine whether genetic risk factors are shared between psychiatric disorders and CVD. This paper presents the methods as well as the somatic and sociodemographic characteristics of the sample. METHODS: All 35 to 66 year-old persons previously selected for the population-based CoLaus survey on risk factors for CVD were asked to participate in a substudy assessing psychiatric conditions. This investigation included the Diagnostic Interview for Genetic Studies to elicit diagnostic criteria for threshold disorders according to DSM-IV and algorithmically defined subthreshold syndromes. Complementary information was gathered on potential risk and protective factors for psychiatric disorders and migraine, and on the morbidity of first-degree family members, whereas the collection of DNA and plasma samples was part of the original somatic study (CoLaus). RESULTS: A total of 3,691 individuals completed the psychiatric evaluation (67% participation). The gender distribution of the sample did not differ significantly from that of the general population in the same age range. Although the youngest 5-year band of the cohort was underrepresented and the oldest 5-year band overrepresented, participants of PsyCoLaus and individuals who refused to participate revealed comparable scores on the General Health Questionnaire, a self-rating instrument completed at the somatic exam. 
CONCLUSIONS: Despite limitations resulting from the relatively low participation in the context of a comprehensive and time-consuming investigation, the PsyCoLaus study should significantly contribute to the current understanding of psychiatric disorders and comorbid somatic conditions by: 1) establishing the clinical relevance of specific psychiatric syndromes below the DSM-IV threshold; 2) determining comorbidity between risk factors for CVD and psychiatric disorders; 3) assessing genetic variants associated with common psychiatric disorders and 4) identifying DNA markers shared between CVD and psychiatric disorders.

Relevance: 100.00%

Abstract:

Water is often considered to be an ordinary substance since it is transparent, odourless, tasteless and it is very common in nature. As a matter of fact it can be argued that it is the most remarkable of all substances. Without water life on Earth would not exist. 
Water is the major component of cells, typically forming 70 to 95% of cellular mass, and it provides an environment for innumerable organisms to live in, since it covers 75% of the Earth's surface. Water is a simple molecule made of two hydrogen atoms and one oxygen atom, H2O. The small size of the molecule stands in contrast with its unique physical and chemical properties. Among those, the fact that, at the triple point, liquid water is denser than ice is especially remarkable. Despite its special importance in life science, water is systematically removed from biological specimens investigated by electron microscopy. This is because the high vacuum of the electron microscope requires that the biological specimen be observed in dry conditions. For 50 years the science of electron microscopy has addressed this problem, resulting in numerous preparation techniques presently in routine use. Typically these techniques consist in fixing the sample (chemically or by freezing) and replacing its water by a plastic, which is transformed into a rigid block by polymerisation. The block is then cut into thin sections (c. 50 nm) with an ultramicrotome at room temperature. Usually, these techniques introduce several artefacts, most of them due to water removal. In order to avoid these artefacts, the specimen can be frozen, cut and observed at low temperature. However, liquid water crystallizes into ice upon freezing, thus causing severe damage. Ideally, liquid water is solidified into a vitreous state. Vitrification consists in solidifying water so rapidly that ice crystals have no time to form. A breakthrough took place when vitrification of pure water was discovered. Since this discovery, the thin-film vitrification method has been used with success for the observation of biological suspensions of small particles. Our work was to extend the method to bulk biological samples, which have to be vitrified, cryosectioned into vitreous sections and observed in a cryo-electron microscope. 
This technique is called cryo-electron microscopy of vitreous sections (CEMOVIS). It is now believed to be the best way to preserve the ultrastructure of biological tissues and cells very close to the native state for electron microscopic observation. Recently, CEMOVIS has become a practical method achieving excellent results. It has, however, some severe limitations, the most important of them certainly being due to cutting artefacts. These are the consequence of the nature of the vitreous material and the fact that vitreous sections cannot be floated on a liquid, as is the case for plastic sections cut at room temperature. The aim of the present work has been to improve our understanding of the cutting process and of cutting artefacts, thus finding optimal conditions to minimise or prevent these artefacts. An improved model of the cutting process and redefinitions of cutting artefacts are proposed. Results obtained with CEMOVIS under these conditions are presented and compared with results obtained with conventional methods.

Relevance: 100.00%

Abstract:

There is a broad consensus among economists that technological change has been a major contributor to productivity growth and, hence, to the growth of material welfare in western industrialized countries, at least over the last century. Paradoxically, this issue has not been the focal point of theoretical economics. At the same time, we have witnessed the rise of the importance of technological issues at the strategic management level of business firms. Interestingly, research has not accurately responded to this challenge either. The tension between the overwhelming empirical evidence of the importance of technology and its relative omission in research offers a challenging target for a methodological endeavor. This study deals with the question of how different theories cope with technology and explain technological change. The focus is at the firm level and the analysis concentrates on metatheoretical issues, except for the last two chapters, which examine the problems of strategic management of technology. Here the aim is to build a new evolutionary-based theoretical framework to analyze innovation processes at the firm level. The study consists of ten chapters. Chapter 1 poses the research problem and contrasts the two basic approaches, neoclassical and evolutionary, to be analyzed. Chapter 2 introduces the methodological framework, which is based on the methodology of isolation. Methodological and ontological commitments of the rival approaches are revealed and basic questions concerning their ways of theorizing are elaborated. Chapters 3-6 deal with the so-called substantive isolative criteria. 
The aim is to examine how different approaches cope with such critical issues as the inherent uncertainty and complexity of innovative activities (cognitive isolations, chapter 3), the boundedness of rationality of innovating agents (behavioral isolations, chapter 4), the multidimensional nature of technology (chapter 5), and governance costs related to technology (chapter 6). Chapters 7 and 8 put all these things together and look at the explanatory structures used by the neoclassical and evolutionary approaches in the light of substantive isolations. The last two chapters of the study utilize the methodological framework and tools to appraise different economics-based candidates in the context of strategic management of technology. The aim is to analyze how different approaches answer the fundamental question: how can firms gain competitive advantages through innovations, and how can the rents appropriated from successful innovations be sustained? The last chapter introduces a new evolutionary-based technology management framework. The largely omitted issues of entrepreneurship are also examined.

Relevance: 100.00%

Abstract:

Although the majority of English language teachers worldwide are non-native English speakers, no research was conducted on these teachers until recently. After the pioneering work of Robert Phillipson in 1992 and Peter Medgyes in 1994, nearly a decade had to elapse for more research to emerge on the issues relating to non-native English teachers. The publication in 1999 of George Braine's book Nonnative educators in English language teaching appears to have encouraged a number of graduate students and scholars to research this issue, with topics ranging from teachers' perceptions of their own identity to students' views and aspects of teacher education. This article compiles, classifies, and examines research conducted in the last two decades on this topic, placing a special emphasis on World Englishes concerns, methods of investigation, and areas in need of further attention.

Relevance: 100.00%

Abstract:

This paper describes an evaluation framework that allows a standardized and quantitative comparison of IVUS lumen and media segmentation algorithms. This framework was introduced at the MICCAI 2011 Computing and Visualization for (Intra)Vascular Imaging (CVII) workshop, comparing the results of the eight teams that participated. We describe the available database, comprising multi-center, multi-vendor and multi-frequency IVUS datasets, their acquisition, the creation of the reference standard, and the evaluation measures. The approaches address segmentation of the lumen, the media, or both borders; semi- or fully-automatic operation; and 2-D vs. 3-D methodology. Three performance measures for quantitative analysis have been proposed. The results of the evaluation indicate that segmentation of the vessel lumen and media is possible with an accuracy comparable to manual annotation when semi-automatic methods are used, and that encouraging results can also be obtained with fully-automatic segmentation. The analysis performed in this paper also highlights the challenges in IVUS segmentation that remain to be solved.
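The abstract does not spell out the three performance measures; a common choice when comparing lumen/media segmentations against a reference standard is region overlap, e.g. the Jaccard and Dice indices between an automatic and a manually annotated mask. A hedged sketch on hypothetical binary masks:

```python
import numpy as np

def jaccard(a, b):
    """Jaccard index |A n B| / |A u B| between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    union = np.logical_or(a, b).sum()
    return float(np.logical_and(a, b).sum() / union) if union else 1.0

def dice(a, b):
    """Dice index 2|A n B| / (|A| + |B|) between two binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    s = a.sum() + b.sum()
    return float(2.0 * np.logical_and(a, b).sum() / s) if s else 1.0

# Hypothetical 8x8 masks: a 4x4 "lumen" segmented one row off the reference
auto = np.zeros((8, 8), dtype=int); auto[2:6, 2:6] = 1
ref = np.zeros((8, 8), dtype=int);  ref[3:7, 2:6] = 1
print(jaccard(auto, ref), dice(auto, ref))  # -> 0.6 0.75
```

Contour-distance measures (e.g. mean or Hausdorff distance between borders) are the other typical family of measures in such comparisons.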

Relevance: 100.00%

Abstract:

Throughout history, indigo was derived from various plants, for example Dyer's Woad (Isatis tinctoria L.) in Europe. In the 19th century, synthetic dyes were developed, and nowadays indigo is mainly synthesized from by-products of fossil fuels. Indigo is a so-called vat dye, which means that it needs to be reduced to its water-soluble leuco form before dyeing. Nowadays, most industrial reduction is performed chemically with sodium dithionite. However, this is considered environmentally unfavourable because degradation products contaminate the waste waters. There has therefore been interest in finding new ways to reduce indigo. Possible alternatives to dithionite as the reducing agent are biologically induced reduction and electrochemical reduction. Glucose and other reducing sugars have recently been suggested as environmentally friendly alternative reducing agents for sulphur dyes, and there has also been interest in using glucose to reduce indigo. In spite of the development of several types of processes, very little is known about the mechanism and kinetics of indigo reduction. This study investigates the reduction and electrochemical analysis methods of indigo and gives insight into the reduction mechanism. Anthraquinone, as well as its derivative 1,8-dihydroxyanthraquinone, was found to act as a catalyst for the glucose-induced reduction of indigo. Anthraquinone introduces a strong catalytic effect, which is explained by invoking a molecular "wedge effect" during co-intercalation of Na+ and anthraquinone into the layered indigo crystal. The study also includes research on the extraction of plant-derived indigo from woad and examines the effect of this method on the yield and purity of indigo. Purity has conventionally been studied spectrophotometrically, and a new hydrodynamic electrode system is introduced in this study. A vibrating probe is used to follow electrochemically the formation of leuco-indigo with glucose as the reducing agent.
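As background to the electrochemical monitoring (an illustration, not a result of the study): for a reversible n-electron couple such as indigo/leuco-indigo (n = 2), the Nernst equation relates the measured equilibrium potential to the oxidized-to-reduced ratio. The standard potential used below is hypothetical.

```python
import math

R = 8.314      # gas constant, J/(mol K)
F = 96485.0    # Faraday constant, C/mol

def nernst(e0_volts, n, ratio_ox_red, temp_k=298.15):
    """Equilibrium potential E = E0 + (RT/nF) ln([ox]/[red]) of an n-electron couple."""
    return e0_volts + (R * temp_k / (n * F)) * math.log(ratio_ox_red)

# Hypothetical standard potential for a two-electron indigo/leuco-indigo couple
e0 = -0.60
shift = nernst(e0, 2, 10.0) - e0   # shift for a 10:1 oxidized:reduced ratio
print(round(shift, 4))             # -> 0.0296 (about 29.6 mV per decade for n = 2)
```

Tracking such potential (or limiting-current) changes over time is what allows leuco-indigo formation to be followed during reduction.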

Relevance: 100.00%

Abstract:

Stratospheric ozone can be measured accurately using a limb scatter remote sensing technique in the UV-visible spectral region of solar light. The advantages of this technique include a good vertical resolution and a good daytime coverage of the measurements. In addition to ozone, UV-visible limb scatter measurements contain information about NO2, NO3, OClO, BrO and aerosols. There are currently several satellite instruments continuously scanning the atmosphere and measuring the UV-visible region of the spectrum, e.g., the Optical Spectrograph and Infrared Imager System (OSIRIS) launched on the Odin satellite in February 2001, and the Scanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY (SCIAMACHY) launched on Envisat in March 2002. Envisat also carries the Global Ozone Monitoring by Occultation of Stars (GOMOS) instrument, which also measures limb-scattered sunlight under bright limb occultation conditions. These conditions occur during daytime occultation measurements. The global coverage of satellite measurements is far better than that of any other ozone measurement technique, but the measurements are still sparse in the spatial domain. Measurements are also repeated relatively rarely over a given area, and the composition of the Earth's atmosphere changes dynamically. Assimilation methods are therefore needed in order to combine the information from the measurements with the atmospheric model. In recent years, the focus of assimilation algorithm research has turned towards filtering methods. The traditional Extended Kalman filter (EKF) method takes into account not only the uncertainty of the measurements, but also the uncertainty of the evolution model of the system. However, the computational cost of a full-blown EKF increases rapidly as the number of model parameters increases. Therefore the EKF method cannot be applied directly to the stratospheric ozone assimilation problem. 
The work in this thesis is devoted to the development of inversion methods for satellite instruments and the development of assimilation methods used with atmospheric models.
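To make the EKF cost remark concrete: a Kalman-type update carries an n x n state covariance, so storage grows as n^2 and the update as roughly n^3, which is why a full-blown EKF becomes impractical when n counts every grid cell of an ozone field. A minimal linear Kalman filter step (an illustrative random-walk model, not the assimilation scheme developed in the thesis):

```python
import numpy as np

def kf_step(x, P, y, H, R, Q):
    """One predict + update step of a linear Kalman filter (random-walk state model).

    The n x n covariance P is what makes a full-blown (E)KF expensive for
    high-dimensional states: storage is O(n^2), the update roughly O(n^3)."""
    P = P + Q                            # predict: state model F = I, add model error
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (y - H @ x)              # state update with measurement y
    P = (np.eye(len(x)) - K @ H) @ P     # covariance update
    return x, P

n = 3                            # toy state; a gridded ozone field would be far larger
x, P = np.zeros(n), np.eye(n)
H = np.eye(n)[:1]                # observe only the first state component
x, P = kf_step(x, P, np.array([1.0]), H, R=0.5 * np.eye(1), Q=0.1 * np.eye(n))
print(x[0])                      # -> 0.6875
```

Ensemble and reduced-rank variants avoid carrying the full P, which is the usual escape from this scaling problem.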

Relevance: 100.00%

Abstract:

Different methods to determine total fat (TF) and fatty acids (FA), including trans fatty acids (TFA), in diverse foodstuffs were evaluated, incorporating gravimetric methods and gas chromatography with flame ionization detection (GC/FID), in accordance with a modified AOAC 996.06 method. Concentrations of TF and FA obtained through these different procedures diverged (p < 0.05), and TFA concentrations varied by more than 20% from the reference values. The modified AOAC 996.06 method satisfied both accuracy and precision requirements, was fast, and employed small amounts of low-toxicity solvents. The results therefore show that this methodology is viable for adoption in Brazil for nutritional labeling purposes.
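The "more than 20% from the reference values" criterion amounts to a simple relative-deviation check of each measured concentration against its reference value; a small sketch with hypothetical figures:

```python
def percent_deviation(measured, reference):
    """Relative deviation (%) of a measured concentration from its reference value."""
    return 100.0 * (measured - reference) / reference

# Hypothetical TFA results (g/100 g) checked against a reference value of 2.0
reference = 2.0
for measured in (1.5, 1.9, 2.6):
    dev = percent_deviation(measured, reference)
    flag = "outside" if abs(dev) > 20.0 else "within"
    print(f"{measured} g/100 g: {dev:+.0f}% ({flag} +/-20% of reference)")
```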

Relevance: 100.00%

Abstract:

The results shown in this thesis are based on selected publications from the 2000s. The work was carried out in several national and EC-funded public research projects and in close cooperation with industrial partners. The main objective of the thesis was to study and quantify the most important phenomena of circulating fluidized bed (CFB) combustors by developing and applying proper experimental and modelling methods using laboratory-scale equipment. An understanding of these phenomena plays an essential role in the development of combustion and emission performance, and of the availability and controls of CFB boilers. Experimental procedures to study fuel combustion behaviour under CFB conditions are presented in the thesis. Steady-state and dynamic measurements under well-controlled conditions were carried out to produce the data needed for the development of high-efficiency, utility-scale CFB technology. The importance of combustion control and furnace dynamics is emphasized when CFB boilers are scaled up with a once-through steam cycle. Qualitative information on fuel combustion characteristics was obtained directly by comparing flue gas oxygen responses during impulse-change experiments with the fuel feed. A one-dimensional, time-dependent model was developed to analyse the measurement data. Emission formation was studied together with fuel combustion behaviour. Correlations were developed for NO, N2O, CO and char loading as a function of temperature and oxygen concentration in the bed area. An online method to characterize char loading under CFB conditions was developed and validated with the pilot-scale CFB tests. Finally, a new method to control the air and fuel feeds in CFB combustion was introduced. The method is based on models and an analysis of the fluctuation of the flue gas oxygen concentration. 
The effect of high oxygen concentrations on fuel combustion behaviour was also studied to evaluate the potential of CFB boilers for applying oxygen-firing technology to CCS. In future studies, it will be necessary to go through the whole scale-up chain, from laboratory phenomena devices through pilot-scale test rigs to large-scale commercial boilers, in order to validate the applicability and scalability of the results. This thesis shows the chain between the laboratory-scale phenomena test rig (bench scale) and the CFB process test rig (pilot scale). CFB technology has been scaled up successfully from industrial scale to utility scale during the last decade. The work shown in the thesis has, for its part, supported this development by producing new detailed information on combustion under CFB conditions.
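Correlations "as a function of temperature and oxygen concentration" are often given an Arrhenius-type form. As an assumed illustration (this is not the thesis's actual correlation), a rate of the form k = A * exp(-E/(R*T)) * O2^n can be log-linearized and fitted by ordinary least squares to measurement data; synthetic noiseless data are used here so the fit recovers the generating parameters:

```python
import numpy as np

R_GAS = 8.314  # J/(mol K)

# Synthetic "measurements" of an emission-related rate k, generated from an
# assumed Arrhenius-type correlation k = A * exp(-E/(R*T)) * O2^n.
A_true, E_true, n_true = 5.0e3, 6.0e4, 0.8
rng = np.random.default_rng(0)
T = rng.uniform(1073.0, 1223.0, 40)    # bed temperature, K
O2 = rng.uniform(0.01, 0.08, 40)       # oxygen mole fraction
k = A_true * np.exp(-E_true / (R_GAS * T)) * O2 ** n_true

# Log-linearize: ln k = ln A - (E/R) * (1/T) + n * ln O2, then fit by OLS.
X = np.column_stack([np.ones_like(T), 1.0 / T, np.log(O2)])
beta, *_ = np.linalg.lstsq(X, np.log(k), rcond=None)
A_fit, E_fit, n_fit = np.exp(beta[0]), -beta[1] * R_GAS, beta[2]
print(round(E_fit / E_true, 3), round(n_fit, 3))  # noiseless data -> 1.0 0.8
```

With real, noisy bed measurements the same regression gives the fitted activation energy and oxygen exponent together with their uncertainties.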

Relevance: 100.00%

Abstract:

In order to evaluate the resolving power of several typing methods to identify relatedness among Brazilian strains of Enterobacter cloacae, we selected twenty isolates from different patients on three wards of a University Hospital (Orthopedics, Nephrology, and Hematology). Traditional phenotyping methods applied to isolates included biotyping, antibiotic sensitivity, phage-typing, and O-serotyping. Plasmid profile analysis, ribotyping, and macrorestriction analysis by pulsed-field gel electrophoresis (PFGE) were used as genotyping methods. Sero- and phage-typing were not useful since the majority of isolates could not be subtyped by these methods. Biotyping, antibiogram and plasmid profile permitted us to classify the samples into different groups depending on the method used, and consequently were not reliable. Ribotyping and PFGE were significantly correlated with the clinical epidemiological analysis. PFGE did not type strains containing nonspecific DNase. Ribotyping was the most discriminative method for typing Brazilian isolates of E. cloacae.
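Band-pattern methods such as ribotyping and PFGE are commonly compared between isolates with the Dice band-matching coefficient (the abstract does not specify the paper's clustering procedure, so this is a generic sketch on hypothetical fragment sizes):

```python
def dice_bands(a, b):
    """Dice band-matching coefficient between two band patterns (sets of band sizes).

    2 * shared bands / (bands in a + bands in b); 1.0 means identical patterns."""
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical PFGE fragment sizes (kb) for three isolates
iso1 = {48, 97, 145, 194, 242, 310}
iso2 = {48, 97, 145, 194, 242, 290}   # single band difference: closely related
iso3 = {33, 60, 120, 260, 340, 420}   # no shared bands: unrelated
print(round(dice_bands(iso1, iso2), 3), dice_bands(iso1, iso3))  # -> 0.833 0.0
```

In practice, band positions are matched within a size tolerance rather than exactly, and the pairwise coefficients feed a clustering (e.g. UPGMA) to group related isolates.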

Relevance: 100.00%

Abstract:

We analyzed the trends of scientific output of the University Hospital, Federal University of Rio de Janeiro. A total of 1420 publications were classified according to pattern and visibility. Most were non-research publications with domestic visibility. With time, there was a tendency to shift from non-research (or education-oriented) publications with domestic visibility to research publications with international visibility. This change may reflect new academic attitudes within the institution concerning the objectives of the hospital and the establishment of scientific research activities. The emphasis of this University Hospital had been on the training of new physicians. However, more recently, the production of new knowledge has been incorporated as a new objective. The analysis of the scientific production of the most productive sectors of the hospital also showed that most are developing non-research studies devoted to the local public while a few of the sectors are carrying out research studies published in journals with international status. The dilemma of quality versus quantity and of education versus research-oriented publication seems, however, to continue to exist within the specialized sectors. The methodology described here to analyze the scientific production of a university hospital can be used as a tool to better understand the evolution of medical research in Brazil and also to help formulate public policies and new strategies to include research among the major objectives of University Hospitals.

Relevance: 100.00%

Abstract:

Methods for reliable evaluation of spinal cord (SC) injury in rats at short periods (2 and 24 h) after lesion were tested to characterize the mechanisms implicated in primary SC damage. We measured the physiological changes occurring after several procedures for producing SC injury, with particular emphasis on sensorimotor functions. Segmental and suprasegmental reflexes were tested in 39 male Wistar rats weighing 250-300 g, divided into three control groups that were subjected to a) anesthesia, b) dissection of soft prevertebral tissue, and c) laminectomy of the vertebral segments between T10 and L1. In the lesion group the SC was completely transected, hemisected or subjected to vertebral compression. All animals were evaluated 2 and 24 h after the experimental procedure by the hind limb motility index, Bohlman motor score, open-field, hot-plate, tail flick, and paw compression tests. The locomotion scale proved to be less sensitive than the sensorimotor tests. A reduction in exploratory movements was detected in the animals 24 h after the procedures. The hot-plate was the most sensitive test for detecting sensorimotor deficiencies following light, moderate or severe SC injury, and it was also the simplest test of reflex function. The hemisection model produced reproducible moderate SC injury, which allowed us to quantify the resulting behavior and analyze the evolution of the lesion and its consequences during the first 24 h after injury. We conclude that hemisection permitted the quantitation of behavioral responses for evaluation of the development of deficits after lesions. Hind limb evaluation scores and spontaneous exploration events provided a sensitive index of immediate injury effects after SC lesion at 2 and 24 h. 
Taken together, locomotion scales, open-field, and hot-plate tests represent reproducible, quantitatively sensitive methods for detecting functional deficiencies within short periods of time, indicating their potential for the study of cellular mechanisms of primary injury and repair after traumatic SC injury.