976 results for "The aim of belief"


Abstract:

The increasing incidence of ciprofloxacin resistance in Streptococcus pneumoniae may limit the efficacy of the new quinolones in difficult-to-treat infections such as meningitis. The aim of the present study was to determine the efficacy of clinafloxacin alone and in combination with teicoplanin and rifampin in the therapy of ciprofloxacin-susceptible and ciprofloxacin-resistant pneumococcal meningitis in rabbits. Against a penicillin-resistant, ciprofloxacin-susceptible strain (clinafloxacin MIC 0.12 μg/ml), clinafloxacin at a dose of 20 mg/kg per day b.i.d. reduced the bacterial concentration by 5.10 log10 CFU/ml at 24 h. Combinations did not improve activity. The same clinafloxacin schedule against a penicillin- and ciprofloxacin-resistant strain (clinafloxacin MIC 0.5 μg/ml) was totally ineffective. Our data suggest that a moderate decrease in quinolone susceptibility, as indicated by the detection of any degree of ciprofloxacin resistance, may render these antibiotics unsuitable for the management of pneumococcal meningitis.
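As a side note on the reported kill, the 5.10 log10 CFU/ml figure is simply the difference of decimal logarithms of the bacterial counts; a minimal sketch, with a hypothetical starting titer since the abstract reports only the change:

```python
import math

# The abstract reports only the change (5.10 log10 CFU/ml), so the starting
# titer below is a hypothetical assumption used purely for illustration.
cfu_baseline = 5.0e7                    # CFU/ml at the start of therapy (assumed)
cfu_24h = cfu_baseline * 10 ** (-5.10)  # count consistent with the reported kill

delta_log = math.log10(cfu_baseline) - math.log10(cfu_24h)
print(f"Reduction: {delta_log:.2f} log10 CFU/ml")  # -> 5.10
```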

Abstract:

Thanks to the continuous progress made in recent years, medical imaging has become an important tool in the diagnosis of various pathologies. In particular, magnetic resonance imaging (MRI) makes it possible to obtain images with a remarkably high resolution without the use of ionizing radiation and is consequently widely applied for a broad range of conditions in all parts of the body. Contrast agents are used in MRI to improve tissue discrimination. Different categories of contrast agents are clinically available, the most widely used being gadolinium chelates. One can distinguish between extracellular gadolinium chelates, such as Gd-DTPA, and hepatobiliary gadolinium chelates, such as Gd-BOPTA. The latter are able to enter hepatocytes, from where they are partially excreted into the bile to an extent that depends on the contrast agent and the animal species. Owing to this property, hepatobiliary contrast agents are particularly interesting for MRI of the liver: a change in signal intensity can result from a change in transport functions, signaling the presence of impaired hepatocytes, e.g. in focal (such as cancer) or diffuse (such as cirrhosis) liver diseases. Although the excretion mechanism into the bile is well known, the uptake mechanisms of hepatobiliary contrast agents into hepatocytes are still not completely understood, and several hypotheses have been proposed. As a good knowledge of these transport mechanisms is required for an efficient MRI diagnosis of the functional state of the liver, more fundamental research is needed, and an efficient MRI-compatible in vitro model would be an asset. So far, most data concerning these transport mechanisms have been obtained by MRI with in vivo models, or with cellular or subcellular models using detection methods other than MRI. No in vitro model is currently available for the study and quantification of contrast agents by MRI, notably because high cellular densities are needed to allow detection and because no metallic devices can be used inside the magnet room, a constraint incompatible with most tissue or cell cultures, which require controlled temperature and oxygenation. The aim of this thesis is thus to develop an MRI-compatible in vitro cellular model to study the transport of hepatobiliary contrast agents, in particular Gd-BOPTA, into hepatocytes directly by MRI. A better understanding of this transport, and especially of its modification in case of hepatic disorder, could subsequently permit the extrapolation of this knowledge to humans and the use of the kinetics of hepatobiliary contrast agents as a tool for the diagnosis of hepatic diseases.
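The dependence of signal intensity on contrast-agent transport can be made concrete with the standard linear relaxivity relation, R1 = R1_0 + r1 * C: the more agent the hepatocytes take up, the shorter the tissue T1 and the brighter the T1-weighted image. A minimal sketch, with assumed (not measured) values for the intrinsic T1 and the relaxivity:

```python
# Standard linear relaxivity model: R1 = R1_0 + r1 * C. The numbers below are
# assumptions for illustration, not measured values for Gd-BOPTA in hepatocytes.
t1_0 = 1.2  # s, assumed intrinsic T1 of the cell preparation
r1 = 4.0    # s^-1 mM^-1, assumed longitudinal relaxivity of the agent

def t1_with_agent(concentration_mM: float) -> float:
    """Return the shortened T1 (in seconds) at a given agent concentration (mM)."""
    return 1.0 / (1.0 / t1_0 + r1 * concentration_mM)

for c in (0.0, 0.1, 0.5, 1.0):
    print(f"C = {c:.1f} mM -> T1 = {t1_with_agent(c) * 1000:.0f} ms")
```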

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also require management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has undergone a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as a paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing the maintenance of census data, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and obstacles to dispersal therefore also had to be modelled. To this end, a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialisation factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA, habitat suitability map computation, result validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The application domain of the ENFA was then explored by means of a simulated species distribution. Compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM), the ENFA proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with the demands of landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties - a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells - and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, under the influence of density dependence and stochasticity. A software package named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); and (2) running simulations. It allows the study of the spread of an invading species across a complex landscape composed of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by wildlife managers and government inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software packages were designed to build a complex, realistic model from raw data and benefit from intuitive user interfaces, they lend themselves to many applications in conservation biology. Moreover, these approaches can also address theoretical questions in the fields of population and landscape ecology.
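To make the demographic core concrete, here is a minimal sketch of an age-structured Leslie projection with density dependence and environmental stochasticity, in the spirit of the model underlying SIM-Ibex; the vital rates, the damping form and the carrying capacity are all invented for illustration, not the fitted ibex parameters:

```python
import numpy as np

# Illustrative 3-age-class Leslie matrix: fecundities on the first row,
# survival rates on the subdiagonal. Values are invented, not ibex estimates.
L = np.array([
    [0.0, 1.2, 1.6],
    [0.7, 0.0, 0.0],
    [0.0, 0.85, 0.0],
])
K = 500.0  # assumed carrying capacity
rng = np.random.default_rng(42)

def step(n: np.ndarray) -> np.ndarray:
    """One yearly step: Leslie projection, then density dependence
    (Beverton-Holt-style damping, an assumed form), then lognormal noise."""
    n_next = L @ n
    n_next *= K / (K + n.sum())
    n_next *= rng.lognormal(mean=0.0, sigma=0.1)
    return n_next

n = np.array([50.0, 30.0, 20.0])  # initial sizes of the three age classes
for _ in range(25):
    n = step(n)
print(f"Population after 25 years: {n.sum():.0f} individuals")
```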

Abstract:

There are several alternatives for valuing the future opportunities of firms. The traditional appraisal methods for single projects, such as net present value, internal rate of return and payback rules, have been criticized in recent years on the grounds that they do not take into account all of a firm's growth opportunities. At the company level, business valuation is traditionally based on financial and market information: yield estimates, net worth values and market values of shares are commonly used. Naturally, all valuation methods have their own strengths and shortcomings. Most estimation rules rest on the assumption that the future of the firm is fairly clear and predictable. In recent times, however, the business environment of most companies has become more unpredictable and the effects of uncertainty have increased, leading to growing interest in estimating the risks and values of future possibilities. The aim of the current paper is to describe the difference between the value of future opportunities in information technology firms and in forest companies, and to analyse the background of the observed gap.
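For context, the traditional single-project rules mentioned above reduce to a few lines of arithmetic; a minimal sketch with a hypothetical cash-flow series (the bisection-based IRR assumes a single sign change in the flows):

```python
# Hypothetical cash flows and discount rate, for illustration only.
def npv(rate: float, cash_flows: list[float]) -> float:
    """Net present value of cash flows, where cash_flows[0] occurs at t = 0."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows: list[float], lo: float = -0.99, hi: float = 10.0) -> float:
    """Internal rate of return via bisection (assumes one sign change)."""
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if npv(mid, cash_flows) > 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

flows = [-1000.0, 300.0, 400.0, 500.0, 200.0]  # initial outlay, then inflows
print(f"NPV at 10%: {npv(0.10, flows):.2f}")
print(f"IRR: {irr(flows):.2%}")
```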

Abstract:

This Ph.D. thesis was carried out in parallel with a European research project funded by the fourth framework program of the European Commission (DG XII, Standards, Measurement and Testing). This project, named SMT-CT98-2277, was funded, for the Swiss part, by the Federal Office of Education and Science (OFES, Bern, Switzerland). The aim of the project was to develop a harmonised, collaboratively tested method for the impurity profiling of illicit amphetamine by capillary gas chromatography. The work was divided into seven main tasks, covering the synthesis of amphetamine, the identification of impurities, the optimization of sample preparation and of the chromatographic system, the variability of the results, the investigation of numerical methods for the classification and comparison of profiles, and finally the application of the methodology to real illicit samples. The resulting method was not only shown to produce interchangeable data between different laboratories but was also found to be superior in many respects to previously published methods.
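As an illustration of the "classification and comparison of profiles" task, one common numerical approach (shown here only as a hedged sketch, not necessarily the method retained in the project) is to normalize the areas of the target impurity peaks and score pairs of samples with a similarity measure:

```python
import math

def normalize(profile: list[float]) -> list[float]:
    """Scale peak areas so they sum to 1 (one simple normalization choice)."""
    total = sum(profile)
    return [p / total for p in profile]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two normalized impurity profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical peak areas for the same target impurities in two seizures.
sample_1 = normalize([120.0, 45.0, 300.0, 8.0, 60.0])
sample_2 = normalize([115.0, 50.0, 280.0, 10.0, 55.0])
print(f"Profile similarity: {cosine_similarity(sample_1, sample_2):.3f}")
```

A high score would flag the two seizures as candidates for a common production batch; a low score would suggest unrelated sources.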

Abstract:

The aim of this thesis was to produce information for estimating the flow balance of wood resin in mechanical pulping and to demonstrate possibilities for improving the efficiency of deresination in practice. It was observed that chemical changes in wood resin take place only during peroxide bleaching, that a significant amount of water-dispersed wood resin is retained in the pulp mat during dewatering, and that the amount of wood resin in the solid phase of the process filtrates is very small. On this basis, three parameters related to the behaviour of wood resin determine the flow balance in the process:
1. the liberation of wood resin into the pulp water phase;
2. the retention of water-dispersed wood resin during dewatering;
3. the proportion of wood resin degraded in peroxide bleaching.
The effect of different factors on these parameters was evaluated with the help of laboratory studies and a literature survey. Information on the values of these parameters in existing processes was also obtained in mill measurements. With this information, it was possible to evaluate the deresination efficiency, and the effect of different factors on it, in a pulping plant producing low-freeness mechanical pulp. This evaluation showed that the wood resin content of mechanical pulp can be decreased significantly if the process includes a peroxide bleaching stage and a subsequent washing stage. With an optimal process configuration, a deresination efficiency as high as 85 percent appears possible at a water usage level of 8 m³/o.d.t.
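To see how the three parameters interact, a deliberately simplified, hypothetical single-pass resin balance is sketched below; the serial structure and every number are assumptions for illustration only, not the balance derived in the thesis:

```python
# Hypothetical single-pass wood-resin balance. The three parameters follow the
# list above; their values and this simple serial structure are assumptions.
resin_in = 10.0           # kg resin per o.d. tonne entering with the wood (assumed)
liberation = 0.6          # fraction liberated into the pulp water phase (parameter 1)
retention = 0.3           # fraction of dispersed resin retained at dewatering (parameter 2)
bleach_degradation = 0.4  # fraction degraded in peroxide bleaching (parameter 3)

liberated = resin_in * liberation
removed_with_filtrate = liberated * (1.0 - retention)
resin_in_pulp = resin_in - removed_with_filtrate
resin_out = resin_in_pulp * (1.0 - bleach_degradation)

efficiency = 1.0 - resin_out / resin_in
print(f"Resin leaving with the pulp: {resin_out:.2f} kg/o.d.t.")
print(f"Overall deresination efficiency: {efficiency:.0%}")
```

With these invented numbers the efficiency comes out at about 65%; the thesis evaluation found that an optimal configuration could reach 85 percent.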

Abstract:

Productivity and profitability are important concepts and measures describing the performance and success of a firm. It is well known that an increase in productivity decreases the cost per unit produced and leads to better profitability, but this common knowledge is not enough in the modern business environment. Productivity improvement is one means among others for increasing the profitability of operations. There are many means to increase productivity; their use presupposes operative decisions, and these decisions presuppose information about the effects of those means. Productivity improvement actions are generally taken at floor level, with machines, cells, activities and human beings, whereas profitability is most meaningful at the level of the whole firm. With traditional costing systems it has been very difficult, or even impossible, to analyze closely enough the economic aspects of changes at floor level; only recently have new ideas in accounting introduced elements that make it possible to consider these phenomena where they actually happen. The aim of this study is to support the selection of targets for productivity improvement and to develop a method for analyzing how a productivity change in an activity affects the profitability of a firm. The study develops a framework for systemizing the economic management of productivity improvement: a systematic two-stage procedure for analyzing the effects of productivity improvement actions in an activity on the profitability of a firm. In the first stage, a simple selection method based on the worth, possibility and necessity of the improvement actions in each activity is presented; this method is called Urgency Analysis. In the second stage, it is analyzed how much a given change of productivity in an activity affects the profitability of the firm: a theoretical calculation model is presented with which the effects of a productivity improvement can be analyzed in monetary terms, and on the basis of this model a tool is built for analysis at the firm level. The usefulness of the framework was empirically tested with data from the profit center of a medium-sized Finnish firm operating in the metal industry. The results indicate that the framework provides valuable information about the economic effects of productivity improvement to support management in its decision making.
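As an illustration of the second stage, a hypothetical calculation of the profit impact of a productivity change in a single activity is sketched below; the linear cost model and all figures are invented, not the calculation model developed in the study:

```python
# Hypothetical second-stage calculation: monetary effect of a productivity
# improvement in one activity. The linear cost model is an assumption made
# for illustration, not the theoretical model developed in the study.
activity_cost = 400_000.0    # EUR/year consumed by the activity (assumed)
output_units = 50_000        # units/year produced through the activity (assumed)
productivity_gain = 0.10     # 10% more output per unit of input (assumed)

old_unit_cost = activity_cost / output_units
new_unit_cost = old_unit_cost / (1.0 + productivity_gain)
annual_saving = (old_unit_cost - new_unit_cost) * output_units

profit_before = 1_200_000.0  # EUR/year firm-level profit (assumed)
print(f"Unit cost: {old_unit_cost:.2f} -> {new_unit_cost:.2f} EUR")
print(f"Profit impact: +{annual_saving:,.0f} EUR/year "
      f"({annual_saving / profit_before:.1%} of current profit)")
```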

Abstract:

In the European Union, the importance of mobile communications was realized early on. The process of mobile communications becoming ubiquitous has taken time as the innovation has diffused into society. The aim of this study is to find out how the evolution and spatial patterns of the diffusion of mobile communications within the European Union can be taken into account in forecasting the diffusion process. Compared with the territorial level, there is a relatively large body of research on innovation diffusion at the individual (micro) and country (macro) levels. Territorial, or spatial, diffusion refers either to the intra-country or to the inter-country diffusion of an innovation; in both settings, the diffusion of a technological innovation has received scarce attention. This study adds knowledge of diffusion between countries, focusing especially on the role of location in this process. The main findings of the study are the following. The penetration rates of the European Union member countries became more even over the period of observation, from 1981 to 2000; the common digital GSM system seems to have hastened this process. As to the role of location in the diffusion process, neighboring countries have had similar diffusion processes, and they can be grouped into three clusters: the Nordic countries, the central and southern European countries, and the remote southern European countries. The neighborhood effect also dominates in the gravity model used to model the adoption timing of the countries. The subsequent diffusion within a country, measured by the logistic model in the case of Finland, is affected positively by the country's economic situation and seems to level off at some 92%. These findings suggest that launching future mobile communications systems with a common standard should lead to an equal development between the countries, and that the launching time should be carefully selected, as diffusion is likely to be delayed in economic downturns. The location of a country, measured by distance, can be used in forecasting adoption and diffusion. Finally, the finding that penetration rates become more even implies that in a relatively homogeneous set of countries, such as the European Union member states, the estimated final penetration of a single country can be used to approximate the penetration of the others; the estimated eventual penetration of Finland, some 92%, should thus also be the eventual level for all the European Union countries and for the European Union as a whole.
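A minimal sketch of the logistic diffusion curve referred to above, with the saturation level set to the roughly 92% estimated for Finland (the growth rate and inflection year are invented for illustration):

```python
import math

# Logistic diffusion: penetration(t) = L_sat / (1 + exp(-k * (t - t0))).
# L_sat follows the text (~92% for Finland); k and t0 are invented here.
L_sat = 0.92   # eventual penetration level
k = 0.55       # assumed growth rate per year
t0 = 1997.0    # assumed inflection year

def penetration(year: float) -> float:
    """Mobile penetration rate predicted by the logistic model."""
    return L_sat / (1.0 + math.exp(-k * (year - t0)))

for year in (1985, 1990, 1995, 2000, 2005):
    print(f"{year}: {penetration(year):.1%}")
```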

Abstract:

Water delivered by dental units during routine dental practice is densely contaminated with bacteria. The aim of this study was to determine the actual recovery of microorganisms sprayed from dental unit water lines (DUWLs) when enrichment cultures are performed, and to compare the isolation frequencies with those obtained without enrichment cultures. The antimicrobial susceptibilities of the isolated microorganisms were also studied. Water samples were collected from one hundred dental units in use at the Dental Hospital of our University in order to evaluate the presence or absence of microorganisms and to perform their presumptive identification. Aliquots of all samples were inoculated into eight different media, including both enrichment and selective media. Minimum inhibitory concentrations (MICs) were determined by the broth dilution method. The results reported here demonstrate that most of the DUWLs were colonized by bacteria from the human oral cavity; when enrichment procedures were applied, the percentage of DUWLs with detectable human bacteria reached one hundred percent. The results show that, in order to evaluate the actual risk of infections spread by DUWLs, a pre-enrichment step should be included. Devices preventing bacterial contamination of DUWLs are a goal to be achieved in the near future and would contribute to maintaining safety in dental care.
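For reference, the broth dilution readout reduces to a simple rule: the MIC is the lowest concentration in the dilution series that shows no visible growth. A minimal sketch with a hypothetical twofold series and invented growth observations:

```python
# Minimal sketch of reading a broth dilution series; the concentrations and
# growth observations below are hypothetical examples, not study data.
def mic(concentrations_ug_ml: list[float], growth: list[bool]) -> float | None:
    """Return the lowest concentration with no visible growth, or None."""
    inhibitory = [c for c, grew in zip(concentrations_ug_ml, growth) if not grew]
    return min(inhibitory) if inhibitory else None

series = [64.0, 32.0, 16.0, 8.0, 4.0, 2.0, 1.0, 0.5]  # twofold dilutions
observed_growth = [False, False, False, False, True, True, True, True]
print(f"MIC = {mic(series, observed_growth)} ug/ml")  # -> 8.0 ug/ml
```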

Abstract:

Recent laboratory studies have suggested that heart rate variability (HRV) may be an appropriate criterion for training load (TL) quantification. The aim of this study was to validate a novel HRV index that may be used to assess TL in field conditions. Eleven well-trained long-distance male runners performed four exercises of different duration and intensity. TL was evaluated using the Foster and Banister methods. In addition, HRV measurements were performed 5 minutes before exercise and 5 and 30 minutes after exercise. We calculated an HRV index (TLHRV) based on the ratio between the HRV decrease during exercise and the HRV increase during recovery. The HRV decrease during exercise was strongly correlated with exercise intensity (R = -0.70; p < 0.01) but not with exercise duration or training volume. The TLHRV index was correlated with the Foster (R = 0.61; p = 0.01) and Banister (R = 0.57; p = 0.01) methods. This study confirms that HRV changes during exercise and the recovery phase are affected by both the intensity and the physiological impact of the exercise. Since the TLHRV formula takes into account both the disturbance and the return to homeostatic balance induced by exercise, this new method provides an objective and rational TL index. However, some simplification of the measurement protocol could be envisaged for field use.
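The abstract describes the index only as a ratio between the HRV decrease during exercise and the HRV increase during recovery; the sketch below is one hypothetical reading of that description, not the authors' published formula, and the RMSSD values are invented:

```python
# Hypothetical illustration of an HRV-based training-load index built as the
# ratio of the HRV drop during exercise to the HRV rebound during recovery.
# This is a reading of the abstract's wording, NOT the published TLHRV formula.
hrv_pre = 62.0        # RMSSD (ms) 5 min before exercise (invented)
hrv_exercise = 9.0    # RMSSD (ms) at the end of exercise (invented)
hrv_recovery = 38.0   # RMSSD (ms) 30 min after exercise (invented)

hrv_drop = hrv_pre - hrv_exercise          # disturbance induced by the exercise
hrv_rebound = hrv_recovery - hrv_exercise  # return toward homeostatic balance

tl_hrv = hrv_drop / hrv_rebound
print(f"TLHRV = {tl_hrv:.2f}")  # larger values = slower recovery, higher load
```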

Abstract:

The aim of this study was to examine the development of the metacognitive knowledge of a group of higher education students who participated actively in an experiment based on a Computer Supported Collaborative Learning environment called KnowCat. Eighteen university students took part in a 12-month learning project in which the KnowCat learning environment was used to support scaffolding processes among peers during problem-solving tasks. After using KnowCat, the students were interviewed about their work in this shared workspace. Qualitative analysis revealed that the educational application of KnowCat can favour and improve the development of metacognitive knowledge.

Abstract:

We present a case study of the redesign of the organizational presentation and content of the Virtual Library website at the Universitat Oberta de Catalunya (Open University of Catalonia, UOC), based on a user-centered design strategy. The aim of the redesign was to provide users with more intuitive, usable and understandable content (textual content, resources and services) by implementing criteria of customization, transparency and proximity. The study also presents a selection of best practices for applying these criteria to the design of other library websites.

Abstract:

Choosing a substrate is a determining factor for the seedling producer; thus, the aim of this study was to evaluate the effect of different types of substrates on the emergence of "araticum-de-terra-fria" (Annona emarginata (Schltdl.) H. Rainer) seedlings. The experiment was carried out in a greenhouse in a randomized block design, with three treatments and five replicates of 72 seeds per plot. The treatments consisted of the following substrates: coconut fiber, vermiculite and Plantmax® Citrus. The number of emerged seedlings was counted weekly for 105 days. Seedling height was recorded, and the emergence velocity index (EVI), mean emergence time (MET), total emergence percentage and emergence percentage over time were calculated. The results for total mean emergence percentage, seedling height, EVI and MET were subjected to analysis of variance, and means were compared by Tukey's test at 5% significance. The curves of emergence percentage over time were fitted with the logistic growth equation for each treatment, and the means of each parameter (A, B, C) were compared by Duncan's test at 5% significance. Vermiculite led to the highest emergence percentage, differing from Plantmax® Citrus but not from coconut fiber; moreover, vermiculite promoted greater seedling height in a shorter time. This substrate is therefore recommended for the initial development of "araticum-de-terra-fria" (Annona emarginata (Schltdl.) H. Rainer) seedlings.
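A hedged sketch of the statistical pipeline described above (one-way ANOVA followed by Tukey's test at 5%), using invented emergence percentages per substrate; it requires scipy and statsmodels:

```python
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Invented emergence percentages (five replicates per substrate), used only
# to illustrate the ANOVA + Tukey (5%) pipeline; these are not study data.
vermiculite = [78, 82, 75, 80, 79]
coconut_fiber = [72, 76, 74, 70, 73]
plantmax = [60, 63, 58, 65, 61]

f_stat, p_value = stats.f_oneway(vermiculite, coconut_fiber, plantmax)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

values = vermiculite + coconut_fiber + plantmax
groups = ["vermiculite"] * 5 + ["coconut fiber"] * 5 + ["Plantmax Citrus"] * 5
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```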

Abstract:

The last economic crisis raised huge challenges for nonprofit organizations. It is now critical for nonprofit organizations to demonstrate not only their social legitimacy but also their efficiency and competency when applying for grants (Kearns, Bell, Deem, & McShane, 2012). High Performance Work Practices (HPWP) are a way to foster performance and thus to meet the challenges nonprofit organizations are currently facing. However, such practices have so far been considered only in the corporate world. The entire philosophy behind nonprofit organizations contrasts radically with that of the for-profit sector, and human resources management in particular may differ as well. The aim of this article is precisely to analyze the challenges of implementing HPWP in nonprofit organizations. To explore these challenges, we study the HR practices of a UK-based nonprofit organization that fights poverty. The discussion of the results highlights good practices that could be applied across the nonprofit sector.

Abstract:

OBJECTIVE: Routinely collected health data, gathered for administrative and clinical purposes without specific a priori research questions, are increasingly used for observational studies, comparative effectiveness research, health services research, and clinical trials. The rapid evolution and availability of routinely collected data for research have brought to light specific issues not addressed by existing reporting guidelines. The aim of the present project was to determine the priorities of stakeholders in order to guide the development of the REporting of studies Conducted using Observational Routinely-collected health Data (RECORD) statement. METHODS: Two modified electronic Delphi surveys were sent to stakeholders. The first determined the themes deemed important to include in the RECORD statement and was analyzed using qualitative methods. The second determined the quantitative prioritization of those themes based on a categorization of manuscript headings. The surveys were followed by a meeting of the RECORD working committee and re-engagement with stakeholders via an online commentary period. RESULTS: The qualitative survey (76 responses to 123 surveys sent) generated 10 overarching themes and 13 themes derived from existing STROBE categories. The highest-rated items for inclusion were: disease/exposure identification algorithms; characteristics of the population included in databases; and characteristics of the data. In the quantitative survey (71 responses to 135 surveys sent), the importance assigned to each of the compiled themes varied depending on the manuscript section to which it was assigned. Following the working committee meeting, online ranking by stakeholders provided feedback and resulted in revision of the final checklist. CONCLUSIONS: The RECORD statement incorporated the suggestions provided by a large, diverse group of stakeholders to create a reporting checklist specific to observational research using routinely collected health data. Our findings point to unique aspects of studies conducted with routinely collected health data and to the perceived need for better reporting of methodological issues.