972 results for administrative databases of Quebec


Relevance: 100.00%

Abstract:

In the demanding environment of healthcare reform, reduction of unwanted physician practice variation is promoted, often through evidence-based guidelines. Guidelines represent innovations that direct change(s) in physician practice; however, compliance has been disappointing. Numerous studies have analyzed guideline development and dissemination, while few have evaluated the consequences of guideline adoption. The primary purpose of this study was to explore and analyze the relationship between physician adoption of the glycated hemoglobin test guideline for management of adult patients with diabetes, and the cost of medical care. The study also examined six personal and organizational characteristics of physicians and their association with innovativeness, or adoption of the guideline.

Cost was represented by approved charges from a managed care claims database. Total cost, and diabetes and related complications cost, first were compared for all patients of adopter physicians with those of non-adopter physicians. Then, data were analyzed controlling for disease severity based on insulin dependency, and for high cost cases. There was no statistically significant difference in any of eight cost categories analyzed. This study represented a twelve-month period, and did not reflect cost associated with future complications known to result from inadequate management of glycemia. Guideline compliance did not increase annual cost, which, combined with the future benefit of glycemic control, lends support to the cost effectiveness of the guideline in the long term. Physician adoption of the guideline was recommended to reduce the future personal and economic burden of this chronic disease.

Only half of physicians studied had adopted the glycated hemoglobin test guideline for at least 75% of their diabetic patients. No statistically significant relationship was found between any physician characteristic and guideline adoption. Instead, it was likely that the innovation-decision process and guideline dissemination methods were most influential.

A multidisciplinary, multi-faceted approach, including interventions for each stage of the innovation-decision process, was proposed to diffuse practice guidelines more effectively. Further, it was recommended that Organized Delivery Systems expand existing administrative databases to include clinical information, decision support systems, and reminder mechanisms, to promote and support physician compliance with this and other evidence-based guidelines.
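
As a rough illustration of the cost comparison described in this abstract, the sketch below contrasts approved charges between patients of adopter and non-adopter physicians, one cost category at a time. The file name and column names (physician_adopter, cost_category, approved_charges) are hypothetical stand-ins, and the rank-based test is only one plausible choice; the original study may have analyzed the data differently.

```python
import pandas as pd
from scipy import stats

# Hypothetical claims extract: one row per patient-year with approved charges
# broken down by cost category and a flag for whether the treating physician
# adopted the glycated hemoglobin guideline.
claims = pd.read_csv("claims_extract.csv")  # assumed file layout

for category, grp in claims.groupby("cost_category"):
    adopters = grp.loc[grp["physician_adopter"] == 1, "approved_charges"]
    non_adopters = grp.loc[grp["physician_adopter"] == 0, "approved_charges"]
    # Cost data are typically right-skewed, so a rank-based test is a reasonable choice.
    stat, p = stats.mannwhitneyu(adopters, non_adopters, alternative="two-sided")
    print(f"{category}: U={stat:.0f}, p={p:.3f}")
```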

Relevance: 100.00%

Abstract:

The purpose of this study was to determine if race/ethnicity was a significant risk factor for hospital mortality in children following congenital heart surgery in a contemporary sample of newborns with congenital heart disease. Unlike previous studies that utilized administrative databases, this study utilized clinical data collected at the point of care to examine racial/ethnic outcome differences in the context of the patients' clinical condition and their overall perioperative experience. A retrospective cohort design was used. The study sample consisted of 316 newborns (<31 days of age) who underwent congenital heart surgery between January 2007 and December 2009. A multivariate logistic regression model was used to determine the impact of race/ethnicity, insurance status, presence of a spatial anomaly, prenatal diagnosis, postoperative sepsis, cardiac arrest, respiratory failure, unplanned reoperation, and total length of stay in the intensive care unit on outcomes following congenital heart surgery in newborns. The study findings showed that the strongest predictors of hospital mortality following congenital heart surgery in this cohort were postoperative cardiac arrest, postoperative respiratory failure, having a spatial anomaly, and total ICU length of stay. Race/ethnicity and insurance status were not significant risk factors. The institution where this study was conducted is designated as a center of excellence for congenital heart disease. These centers have state-of-the-art facilities, extensive experience in caring for children with congenital heart disease, and superior outcomes. This study suggests that optimal care delivery for newborns requiring congenital heart surgery at a center of excellence portends exceptional outcomes, and that this benefit is conferred upon the entire patient population regardless of the race/ethnicity of the patients. From a public health and health services perspective, this study also contributes to the overall body of knowledge on racial/ethnic disparities in children with congenital heart defects and puts forward the possibility of a relationship between quality of care and racial/ethnic disparities. Further study is required to examine the impact of race/ethnicity on the long-term outcomes of these children as they encounter the disparate components of the health care delivery system. There is also an opportunity to study the role of race/ethnicity in hospital morbidity in these patients, considering that current expectations for hospital survival are very high and much of the current focus for quality improvement rests on minimizing the development of patient morbidities.
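
The multivariate model described above can be sketched as follows, assuming a point-of-care extract with one row per newborn. The file name, column names, and use of statsmodels are hypothetical placeholders for the study's variables and software, not the actual analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical clinical extract: one row per newborn undergoing surgery.
cohort = pd.read_csv("chd_newborn_cohort.csv")  # assumed file layout

# Multivariable logistic regression for in-hospital mortality, mirroring the
# predictor set listed in the abstract (column names are illustrative).
model = smf.logit(
    "mortality ~ C(race_ethnicity) + C(insurance) + spatial_anomaly"
    " + prenatal_dx + sepsis + cardiac_arrest + resp_failure"
    " + unplanned_reop + icu_los",
    data=cohort,
).fit()

print(model.summary())
print(np.exp(model.params).round(2))  # exponentiated coefficients = odds ratios
```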

Relevance: 100.00%

Abstract:

Introduction: Progression of chronic kidney disease (CKD) increases the risk of cardiovascular disease. Hypertension, diabetes and dyslipidemia are both risk factors for and comorbidities of CKD. In individuals with CKD, persistence with and adherence to the treatment of these risk factors, i.e. antihypertensive treatment (TAH), lipid-lowering treatment (THL) and antidiabetic treatment (TAD), help reduce the risk of cardiovascular mortality and morbidity. Nevertheless, persistence and adherence to these treatments remain little studied in individuals with CKD. Objectives: For each of the three treatments (TAH, THL and TAD), a cohort study was conducted to: 1) estimate persistence with the treatment one year after its initiation; 2) estimate adherence to the treatment during the year following initiation among persistent patients; 3) identify factors associated with persistence; and 4) identify factors associated with adherence. Methods: We used the administrative databases of the Régie de l'assurance maladie du Québec (RAMQ) to conduct three cohort studies in people aged 18 years or older. One study was conducted among individuals who initiated a TAH, another among patients who initiated a THL, and the last among new users of a TAD. Individuals still taking their treatment one year after initiation were considered persistent. Among the persistent patients, those with a proportion of days covered (PDC) ≥ 80% were considered adherent. Factors associated with persistence and with adherence were identified using modified Poisson regression. Results: Of the 7,119 patients who initiated a TAH, 78.8% were persistent and 87.7% of the persistent patients were adherent. Individuals more likely to be persistent were users of monotherapy with angiotensin-converting enzyme inhibitors (ACEI) (prevalence ratio (PR): 1.20; 95% confidence interval (CI): 1.13-1.27), angiotensin II receptor blockers (ARB) (1.22; 1.14-1.31), calcium channel blockers (CCB) (1.20; 1.14-1.26) or beta-blockers (BB) (1.16; 1.10-1.23), and users of multitherapy (1.31; 1.25-1.38) (reference: diuretic monotherapy). Individuals more likely to be adherent were users of ACEI monotherapy (1.08; 1.03-1.04), BB (1.10; 1.05-1.15), CCB (1.10; 1.05-1.15) and multitherapy. Of the 14,607 individuals who initiated a THL, 80.7% persisted with it; of these, 88.7% were adherent. Patients more likely to be persistent were those with low socioeconomic status (SES) (1.03; 1.01-1.06) (reference: high SES) and those whose initial treatment had been prescribed by a nephrologist (1.06; 1.04-1.09) (reference: general practitioner). Individuals more likely to be adherent were those aged ≥ 66 years (reference: 18-65) (1.04; 1.01-1.07), those with low SES (1.08; 1.06-1.10) and those who had taken more than 12 different medications (reference: <7) (1.03; 1.00-1.05). Of a total of 6,671 individuals who initiated a TAD, 76.9% persisted with the treatment; among these persistent patients, 87.9% were adherent. Individuals with low SES (1.04; 1.01-1.07) (reference: high SES) or on multitherapy (1.12; 1.08-1.16) (reference: metformin monotherapy) were more likely to be persistent, as were those with comorbidities such as hypertension (1.04; 1.01-1.07), dyslipidemia (1.06; 1.03-1.10), stroke (1.05; 1.01-1.11) or coronary artery disease (1.03; 1.01-1.06). Individuals more likely to be adherent were those with intermediate SES (1.03; 1.01-1.07) or on multitherapy (1.06; 1.03-1.09). Conclusion: Whatever the treatment initiated by individuals with CKD, about 30% of patients were not persistent one year after treatment initiation or not adherent during the year following initiation. Some factors were consistently associated with persistence, such as stroke, coronary artery disease and the number of medical visits, whereas age and SES were associated with adherence whether the initial treatment was a TAH, a THL or a TAD.
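
A minimal sketch of the two key methodological steps in this abstract, the proportion-of-days-covered (PDC) adherence definition and the modified Poisson regression, is shown below. File names, column names and covariates are hypothetical, overlapping fills are ignored for brevity, and the robust-variance Poisson GLM is only a common way to approximate modified Poisson regression, not necessarily the exact specification used in the study.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical dispensing records for the first year after treatment initiation.
rx = pd.read_csv("dispensings.csv", parse_dates=["fill_date"])

# Proportion of days covered over 365 days (overlapping fills ignored for
# brevity), capped at 1; adherent = PDC >= 0.80, as in the study definition.
pdc = (rx.groupby("patient_id")["days_supply"].sum() / 365.0).clip(upper=1.0)

patients = pd.read_csv("patients.csv")  # hypothetical patient-level covariates
patients["adherent"] = (patients["patient_id"].map(pdc).fillna(0.0) >= 0.80).astype(int)

# Modified Poisson regression approximated by a Poisson GLM with robust
# standard errors, yielding prevalence ratios for adherence.
model = smf.glm(
    "adherent ~ C(age_group) + C(ses) + C(prescriber) + n_distinct_drugs",
    data=patients,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")
print(np.exp(model.params).round(2))  # prevalence ratios
```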

Relevance: 100.00%

Abstract:

Selenium (Se) is an essential element, and deficit or excess of dietary Se is associated with health disorders. Relatively elevated Se levels have been reported in the Brazilian Amazon, where there are also important annual variations in the availability of different foods. The present study was conducted among six riparian communities of the Tapajos River to evaluate seasonal variations in blood Se and in sequential, centimetre-by-centimetre hair Se concentrations, and to examine the relationships between Se in blood and hair, and between Se in blood and urine. Two cross-sectional studies were conducted, at the descending water season (DWS, n = 259) and the rising water season (RWS, n = 137), with repeated measures for a subgroup (n = 112). Blood Se (B-Se), hair Se (H-Se) and urine Se (U-Se) were determined. Matched-pair analyses were used for seasonal comparisons, and the method of best fit was used to describe the relationships between biomarkers. B-Se levels presented a very large range (142-2447 µg/L) with no overall seasonal variation (medians 284 and 292 µg/L, respectively). Sequential analysis of 13 cm hair strands showed significant variations over time: Se concentrations at the DWS were significantly lower than at the RWS (medians: 0.7 and 0.9 µg/g; ranges: 0.2-4.3 µg/g and 0.2-5.4 µg/g, respectively). At both seasons, the relationships between B-Se and H-Se were linear and highly significant (r² = 67.9 and 63.6, respectively), while the relationship between B-Se and U-Se was best described by a sigmoid curve. Gender, age, education and smoking did not influence Se status or the biomarker relationships. The variations in H-Se suggest that there may be seasonal availability of Se sources in local food. For populations presenting a large range and/or elevated Se exposure, sequential analysis of H-Se may provide a good reflection of variations in Se status.
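
To make the biomarker relationships concrete, the sketch below fits a straight line for H-Se against B-Se and a logistic (sigmoid) curve for U-Se against B-Se. The data file and column names are hypothetical, and the particular sigmoid parameterization is just one reasonable choice, not necessarily the form used by the authors.

```python
import numpy as np
import pandas as pd
from scipy.optimize import curve_fit
from scipy.stats import linregress

# Hypothetical biomarker table with columns b_se, h_se, u_se (one row per person).
df = pd.read_csv("selenium_biomarkers.csv")

# Linear relationship between blood Se and hair Se.
lin = linregress(df["b_se"], df["h_se"])
print(f"H-Se vs B-Se: r^2 = {lin.rvalue**2:.2f}")

# Sigmoid (logistic) relationship between blood Se and urine Se.
def sigmoid(x, top, ec50, slope):
    return top / (1.0 + np.exp(-(x - ec50) / slope))

params, _ = curve_fit(sigmoid, df["b_se"], df["u_se"],
                      p0=[df["u_se"].max(), df["b_se"].median(), 100.0],
                      maxfev=10000)
print("U-Se vs B-Se sigmoid parameters (top, ec50, slope):", params.round(1))
```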

Relevance: 100.00%

Abstract:

The explosive growth in biotechnology combined with major advances in information technology has the potential to radically transform immunology in the postgenomics era. Not only do we now have ready access to vast quantities of existing data, but new data with relevance to immunology are being accumulated at an exponential rate. Resources for computational immunology include biological databases and methods for data extraction, comparison, analysis and interpretation. Publicly accessible biological databases of relevance to immunologists number in the hundreds and are growing daily. The ability to efficiently extract and analyse information from these databases is vital for efficient immunology research. Most importantly, a new generation of computational immunology tools enables modelling of peptide transport by the transporter associated with antigen processing (TAP), modelling of antibody binding sites, identification of allergenic motifs and modelling of T-cell receptor serial triggering.

Relevance: 100.00%

Abstract:

OBJECTIVE: To evaluate the potential advantages and limitations of the use of the Brazilian hospital admission authorization forms database and the probabilistic record linkage methodology for the validation of reported utilization of hospital care services in household surveys. METHODS: A total of 2,288 household interviews were conducted in the county of Duque de Caxias, Brazil. Information on the occurrence of at least one hospital admission in the year preceding the interview was obtained from a total of 10,733 household members. The 130 records of household members who reported at least one hospital admission in a public hospital were linked to a hospital database with 801,587 records, using an automatic probabilistic approach combined with an extensive clerical review. RESULTS: Seventy-four (57%) of the 130 household members were identified in the hospital database. Yet only 60 subjects (46%) had a record of hospitalization in the hospital database within the study period. Hospital admissions due to a surgical procedure were significantly more likely to have been identified in the hospital database. The low level of concordance seen in the study can be explained by the following factors: errors in the linkage process; a telescoping effect; and incomplete records in the hospital database. CONCLUSIONS: The use of hospital administrative databases and probabilistic linkage methodology may represent a methodological alternative for the validation of reported utilization of health care services, but strategies should be employed to minimize the problems related to the use of this methodology under non-ideal conditions. Ideally, a unique identifier, such as a personal health insurance number, and universal coverage of the database would be available.
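
The probabilistic linkage step can be illustrated with a small Fellegi-Sunter-style scoring sketch: each candidate pair of records receives a weight that sums log-likelihood ratios of field agreement. The file names, comparison fields, m/u probabilities and acceptance threshold below are illustrative assumptions, not the study's actual linkage parameters.

```python
import math
import pandas as pd

survey = pd.read_csv("household_survey.csv")        # hypothetical extracts with
hospital = pd.read_csv("hospital_admissions.csv")   # name, birth date, mother's name

# Illustrative m- (agreement among true matches) and u- (agreement among
# non-matches) probabilities for each comparison field.
FIELDS = {"name": (0.95, 0.01), "birth_date": (0.98, 0.003), "mother_name": (0.90, 0.02)}

def pair_weight(rec_a, rec_b):
    """Sum of log2 likelihood ratios over the comparison fields."""
    w = 0.0
    for field, (m, u) in FIELDS.items():
        if str(rec_a[field]).strip().lower() == str(rec_b[field]).strip().lower():
            w += math.log2(m / u)                # agreement weight
        else:
            w += math.log2((1 - m) / (1 - u))    # disagreement weight
    return w

# Brute-force comparison (a real application would block, e.g., on birth year).
links = [
    (a["survey_id"], b["admission_id"], pair_weight(a, b))
    for _, a in survey.iterrows()
    for _, b in hospital.iterrows()
]
matches = [t for t in links if t[2] > 10.0]  # illustrative acceptance threshold
print(f"{len(matches)} candidate links above threshold")
```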

Relevance: 100.00%

Abstract:

The higher education system in Europe is currently under stress, and the debates over its reform and future are gaining momentum. Now that most countries are in a period of change, in society at large and in the whole education system, the legal and political dimensions have gained prominence, but this has not been matched by a more integrative approach to the problem of order, its reform and the issue of regulation, beyond the typical static, classical cost-benefit analyses. The two classical approaches to studying the reform of the higher education system (and to designing its policy measures), cost-benefit analysis and legal scholarship description, have to be integrated. The argument of our paper is that this very integration of economic and legal approaches, what Warren Samuels called the legal-economic nexus, is meaningful and necessary, especially if we want to address the problem of order (as formulated by Joseph Spengler) and the overall regulation of the system. On the one hand, without neglecting the interest and insights gained from cost-benefit analysis or other value-for-money assessment approaches, we focus our study on the legal, social and political aspects of the regulation of the higher education system and its reform in Portugal. On the other hand, the economic and financial problems have to be taken into account, but in a more inclusive way, with regard to the indirect and other socio-economic costs not contemplated in traditional or standard assessments of policies for the tertiary education sector. In the first section of the paper, we discuss the theoretical and conceptual underpinnings of our analysis, focusing on the evolutionary approach, the role of critical institutions, the legal-economic nexus and the problem of order. All these elements are related to the institutional tradition, from Veblen and Commons to Spengler and Samuels. The second section states the problem of regulation in the higher education system and the issue of policy formulation for tackling the problem. The current situation is clearly one of crisis, with the expansion of the cohorts of young students coming to an end and recurrent scandals in private institutions. In the last decade, after a protracted period of expansion of the system, i.e. continuous growth in student numbers, universities and other institutions have been competing harder to attract students and have seen their financial situation put at risk. It seems that we are entering a period of radical uncertainty and higher competition, and the new configuration that is slowly building up is one of growth in intensity, which means upgrading the quality of higher learning and greater involvement in vocational training and lifelong learning. With this change, along with other deep changes in Portuguese society and the economy, the current regulation has shown signs of maladjustment. The third section presents our conclusions on the current issue of regulation and the policy challenge. First, we underline the importance of an evolutionary approach to a process of change that is essentially dynamic. Special attention is given to issues related to an evolutionary construal of policy analysis and formulation. Second, the integration of law and economics, through the notion of the legal-economic nexus, allows us to better define the issues of regulation and the concrete problems that the universities are facing.
One aspect is the instability of political measures regarding the public administration, on which the higher education system depends financially, legally and institutionally, to say the least. A corollary is the lack of a clear strategy in the policy reforms. Third, our research criticizes several studies, such as the one carried out by the OECD in late 2006 for the Ministry of Science, Technology and Higher Education, for being too static and for neglecting fundamental aspects of regulation, such as the logic of the actors, groups and organizations that are major players in the system. Finally, simply changing the legal rules will not, per se, necessarily change the behaviors that the authorities want to change. By this, we mean that it is remiss of the policy maker to ignore some of the critical issues of regulation, namely the continuing disregard by the academic management and administrative bodies of universities for legal rules that were once promulgated. Changing the rules does not, by itself, solve the problem, especially without the necessary debates from the different relevant quarters that make up the higher education system. The issues of social interaction remain intact. Our treatment of the matter is organized in the following way. In the first section, the theoretical principles are developed so that the transformation of higher education can be studied more adequately, using a modest evolutionary theory and a legal-economic nexus of the interactions of the system and the policy challenges. After describing, in the second section, the recent evolution and current workings of higher education in Portugal, we analyze the legal framework and the current regulatory practices and problems in light of the theoretical framework adopted. We end with some conclusions on the current problems of regulation and the policy measures that have been discussed in recent years.

Relevance: 100.00%

Abstract:

This Thesis describes the application of machine learning methods to a) the classification of organic and metabolic reactions, and b) the mapping of Potential Energy Surfaces (PES). The classification of reactions was approached with two distinct methodologies: a representation of chemical reactions based on NMR data, and a representation of chemical reactions from the reaction equation based on the physico-chemical and topological features of chemical bonds. NMR-based classification of photochemical and enzymatic reactions. Photochemical and metabolic reactions were classified by Kohonen Self-Organizing Maps (Kohonen SOMs) and Random Forests (RFs) taking as input the difference between the 1H NMR spectra of the products and the reactants. Such a representation can be applied to the automatic analysis of changes in the 1H NMR spectrum of a mixture and their interpretation in terms of the chemical reactions taking place. Examples of possible applications are the monitoring of reaction processes, evaluation of the stability of chemicals, or even the interpretation of metabonomic data. A Kohonen SOM trained with a data set of metabolic reactions catalysed by transferases was able to correctly classify 75% of an independent test set in terms of the EC number subclass. Random Forests improved the correct predictions to 79%. With photochemical reactions classified into 7 groups, an independent test set was classified with 86-93% accuracy. The data set of photochemical reactions was also used to simulate mixtures with two reactions occurring simultaneously. Kohonen SOMs and Feed-Forward Neural Networks (FFNNs) were trained to classify the reactions occurring in a mixture based on the 1H NMR spectra of the products and reactants. Kohonen SOMs allowed the correct assignment of 53-63% of the mixtures (in a test set). Counter-Propagation Neural Networks (CPNNs) gave similar results. Supervised learning techniques improved the results: correct assignments rose to 77% when an ensemble of ten FFNNs was used, and to 80% when Random Forests were used. This study was performed with NMR data simulated from the molecular structure by the SPINUS program. In the design of one test set, simulated data were combined with experimental data. The results support the proposal of linking databases of chemical reactions to experimental or simulated NMR data for the automatic classification of reactions and mixtures of reactions. Genome-scale classification of enzymatic reactions from their reaction equation. The MOLMAP descriptor relies on a Kohonen SOM that defines types of bonds on the basis of their physico-chemical and topological properties. The MOLMAP descriptor of a molecule represents the types of bonds available in that molecule. The MOLMAP descriptor of a reaction is defined as the difference between the MOLMAPs of the products and the reactants, and numerically encodes the pattern of bonds that are broken, changed, and made during a chemical reaction. The automatic perception of chemical similarities between metabolic reactions is required for a variety of applications, ranging from the computer validation of classification systems and the genome-scale reconstruction (or comparison) of metabolic pathways to the classification of enzymatic mechanisms.
Catalytic functions of proteins are generally described by EC numbers, which are simultaneously employed as identifiers of reactions, enzymes, and enzyme genes, thus linking metabolic and genomic information. Different methods should be available to automatically compare metabolic reactions and to automatically assign EC numbers to reactions not yet officially classified. In this study, the genome-scale data set of enzymatic reactions available in the KEGG database was encoded by the MOLMAP descriptors, and was submitted to Kohonen SOMs to compare the resulting map with the official EC number classification, to explore the possibility of predicting EC numbers from the reaction equation, and to assess the internal consistency of the EC classification at the class level. A general agreement with the EC classification was observed, i.e. a relationship between the similarity of MOLMAPs and the similarity of EC numbers. At the same time, MOLMAPs were able to discriminate between EC sub-subclasses. EC numbers could be assigned at the class, subclass, and sub-subclass levels with accuracies up to 92%, 80%, and 70% for independent test sets. The correspondence between the chemical similarity of metabolic reactions and their MOLMAP descriptors was applied to the identification of a number of reactions mapped into the same neuron but belonging to different EC classes, which demonstrated the ability of the MOLMAP/SOM approach to verify the internal consistency of classifications in databases of metabolic reactions. RFs were also used to assign the four levels of the EC hierarchy from the reaction equation. EC numbers were correctly assigned in 95%, 90%, 85% and 86% of the cases (for independent test sets) at the class, subclass, sub-subclass and full EC number level, respectively. Experiments for the classification of reactions from the main reactants and products were performed with RFs - EC numbers were assigned at the class, subclass and sub-subclass level with accuracies of 78%, 74% and 63%, respectively. In the course of the experiments with metabolic reactions, we suggested that the MOLMAP/SOM concept could be extended to the representation of other levels of metabolic information, such as metabolic pathways. Following the MOLMAP idea, the pattern of neurons activated by the reactions of a metabolic pathway is a representation of the reactions involved in that pathway - a descriptor of the metabolic pathway. This reasoning enabled the comparison of different pathways, the automatic classification of pathways, and a classification of organisms based on their biochemical machinery. The three levels of classification (from bonds to metabolic pathways) made it possible to map and perceive chemical similarities between metabolic pathways, even for pathways of different types of metabolism and pathways that do not share similarities in terms of EC numbers. Mapping of PES by neural networks (NNs). In a first series of experiments, ensembles of Feed-Forward NNs (EnsFFNNs) and Associative Neural Networks (ASNNs) were trained to reproduce PES represented by the Lennard-Jones (LJ) analytical potential function. The accuracy of the method was assessed by comparing the results of molecular dynamics simulations (thermal, structural, and dynamic properties) obtained from the NN-based PES and from the LJ function. The results indicated that, for LJ-type potentials, NNs can be trained to generate accurate PES to be used in molecular simulations. EnsFFNNs and ASNNs gave better results than single FFNNs.
The NN models showed a remarkable ability to interpolate between distant curves and to accurately reproduce potentials for use in molecular simulations. The purpose of the first study was to systematically analyse the accuracy of different NNs. Our main motivation, however, is reflected in the next study: the mapping of multidimensional PES by NNs to simulate, by Molecular Dynamics or Monte Carlo, the adsorption and self-assembly of solvated organic molecules on noble-metal electrodes. Indeed, for such complex and heterogeneous systems, the development of suitable analytical functions that fit quantum mechanical interaction energies is a non-trivial or even impossible task. The data consisted of energy values, from Density Functional Theory (DFT) calculations, at different distances, for several molecular orientations and three electrode adsorption sites. The results indicate that NNs require a data set large enough to cover well the diversity of possible interaction sites, distances, and orientations. NNs trained with such data sets can perform as well as, or even better than, analytical functions. Therefore, they can be used in molecular simulations, particularly for the ethanol/Au(111) interface, which is the case studied in the present Thesis. Once properly trained, the networks are able to produce, as output, any required number of energy points for accurate interpolations.
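
The first series of experiments lends itself to a compact sketch: sample energies from the Lennard-Jones potential, train a small feed-forward network on them, and check how well the network interpolates at distances it has not seen. scikit-learn's MLPRegressor stands in here for the ensembles and associative networks used in the Thesis, and the sampling range and network size are arbitrary illustrative choices.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def lennard_jones(r, epsilon=1.0, sigma=1.0):
    """LJ pair potential used as the reference PES."""
    return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)

# Training points along the interatomic distance, avoiding the steep repulsive core.
r_train = np.linspace(0.9, 3.0, 60).reshape(-1, 1)
e_train = lennard_jones(r_train).ravel()

# Small feed-forward network fitted to the LJ energies.
net = MLPRegressor(hidden_layer_sizes=(30, 30), activation="tanh",
                   max_iter=20000, random_state=0)
net.fit(r_train, e_train)

# Interpolation check at distances not seen during training.
r_test = np.linspace(0.95, 2.95, 17).reshape(-1, 1)
errors = np.abs(net.predict(r_test) - lennard_jones(r_test).ravel())
print(f"max abs error on unseen distances: {errors.max():.4f}")
```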