977 results for analytical approaches
Abstract:
Advances in antiretroviral therapy have transformed HIV infection from an inevitably fatal condition into a chronic disease. Despite this success, treatment failure and drug toxicity remain frequent. Inadequate response to treatment is clearly multifactorial, and individualisation of drug dosage based on patients' demographic and genetic factors and on total, free and/or cellular blood drug levels could improve both the efficacy and the tolerability of therapy, the latter certainly being a major issue for a treatment taken for life. The overall objective of this thesis was to better understand the pharmacokinetic (PK) and pharmacogenetic (PG) factors influencing exposure to antiretroviral drugs (ARVs), thereby providing a rational basis for optimising antiviral treatment and adjusting drug dosage in HIV-positive patients. Antiretroviral therapy tailored to the patient is likely to increase the probability of treatment efficacy and tolerability, allowing better long-term compliance and reducing the risk of emergence of resistance and of treatment failure. To that end, methods for quantifying total, free and cellular plasma concentrations of ARVs and some of their metabolites were developed and validated using liquid chromatography coupled with tandem mass spectrometry. These methods were applied to monitor ARV levels in various populations of HIV-positive patients. A clinical study was initiated within the Mother and Child Swiss HIV Cohort Study to determine whether pregnancy influences the kinetics of ARVs.
Total and free concentrations of lopinavir, atazanavir and nevirapine were determined in pregnant women followed during pregnancy, and were found not to be influenced by pregnancy to a clinically significant extent. Dosage adjustment of these ARVs is therefore not necessary in pregnant women. In a small study in treatment-experienced HIV-positive patients, the correlation between cellular and plasma exposure to new ARVs, notably raltegravir, was determined. A good correlation was obtained between plasma and cellular raltegravir levels, suggesting that monitoring of total levels is a satisfactory surrogate. However, substantial inter-patient variability was observed in the cellular accumulation ratios of raltegravir, which should encourage further investigation in patients failing on this treatment. The effectiveness of therapeutic drug monitoring (TDM) for adjusting efavirenz levels in patients with concentrations above the recommended therapeutic target was evaluated in a prospective study. TDM-based adjustment of efavirenz doses proved effective and safe, supporting the use of TDM in patients with concentrations outside the therapeutic target. The impact of genetic polymorphisms of cytochromes P450 (CYP) 2B6, 2A6 and 3A4/5 on the pharmacokinetics of efavirenz and its metabolites was studied: a population PK model integrating genetic and demographic covariates was built. Functional genetic variations in the main (CYP2B6) and accessory (CYP2A6 and 3A4/5) metabolic pathways of efavirenz have an impact on its disposition and can lead to extreme drug exposures. TDM-guided dose adjustment is therefore recommended in these patients, in accordance with their genetic polymorphisms. Thus, we have demonstrated that, by using a comprehensive approach taking into account both the PK and PG factors influencing ARV exposure in infected patients, it is possible, where necessary, to individualise antiretroviral therapy in a variety of situations. Optimisation of antiretroviral treatment is likely to contribute to better long-term therapeutic efficacy while reducing the occurrence of adverse effects.
Lay summary
Optimisation of antiretroviral therapy: pharmacokinetic and pharmacogenetic approaches
Progress in the treatment of infection with the human immunodeficiency virus (HIV) has transformed a fatal condition into a chronic disease treatable with increasingly effective drugs. Despite this success, a number of patients do not respond optimally to their treatment and/or suffer from adverse drug reactions, leading to frequent changes in their therapy. It has been shown that the efficacy of antiretroviral treatment is in most cases correlated with the drug concentrations measured in patients' blood. However, the virus replicates inside the cell, and only the fraction of drug not bound to plasma proteins can enter the cell and exert antiretroviral activity at the cellular level. There is also substantial variability in blood drug concentrations among patients taking the same dose of drug.
This variability may be due to demographic and/or genetic factors likely to influence the response to antiretroviral treatment. The aim of this thesis was to better understand the pharmacological and genetic factors influencing the efficacy and toxicity of antiretroviral drugs, with the goal of individualising antiviral therapy and improving the follow-up of HIV-positive patients. To that end, highly sensitive assay methods were developed to quantify antiretroviral drugs in blood and cells. These analytical methods were applied in various clinical studies conducted with patients. One clinical study investigated whether the physiological changes associated with pregnancy affect the concentrations of antiretroviral drugs. We were able to show that pregnancy does not influence the disposition of antiretroviral drugs in HIV-positive pregnant women to a clinically significant extent. Drug dosage therefore need not be modified in this patient population. Other studies addressed patients' genetic variations influencing the enzymatic activity of the proteins involved in the metabolism of antiretroviral drugs. We also studied the usefulness of monitoring drug concentrations in patients' blood (therapeutic drug monitoring) for individualising antiviral treatment. Significant relationships were found between exposure to antiretroviral drugs and the presence of certain genetic variations in patients. Our analyses also examined the relationships between concentrations in patients' blood and the levels measured in the cells where HIV replicates.
Moreover, measuring blood levels of antiretroviral drugs and interpreting them made it possible to adjust drug dosage in patients effectively and safely. The complementarity of pharmacological, genetic and viral knowledge thus forms part of a comprehensive patient-management strategy aimed at individualising antiretroviral therapy according to each individual's own characteristics. This approach contributes to the optimisation of antiretroviral treatment with a view to long-term treatment success while reducing the likelihood of adverse effects. - The improvement in antiretroviral therapy has transformed HIV infection from an inevitably fatal condition to a chronic, manageable disease. However, treatment failure and drug toxicity are frequent. Inadequate response to treatment is clearly multifactorial and, therefore, dosage individualisation based on demographic factors, genetic markers and measurement of total, free and/or cellular drug levels may increase both drug efficacy and tolerability. Drug tolerability is certainly a major issue for a treatment that must be taken indefinitely. The global objective of this thesis was to increase our current understanding of pharmacokinetic (PK) and pharmacogenetic (PG) factors influencing exposure to antiretroviral drugs (ARVs) in HIV-positive patients. In turn, this should provide a rational basis for antiviral treatment optimisation and drug dosage adjustment in HIV-positive patients.
A patient-tailored antiretroviral regimen is likely to enhance treatment effectiveness and tolerability, enabling better compliance over time and hence reducing the probability of emergence of viral resistance and of treatment failure. To that end, analytical methods for the measurement of total plasma, free and cellular concentrations of ARVs and some of their metabolites have been developed and validated using liquid chromatography coupled with tandem mass spectrometry. These assays have been applied to the monitoring of ARV levels in various populations of HIV-positive patients. A clinical study was initiated within the frame of the Mother and Child Swiss HIV Cohort Study to determine whether pregnancy influences the exposure to ARVs. Free and total plasma concentrations of lopinavir, atazanavir and nevirapine were determined in pregnant women followed during the course of pregnancy, and were found not to be influenced to a clinically significant extent by pregnancy. Dosage adjustment for these drugs is therefore not required in pregnant women. In a study in treatment-experienced HIV-positive patients, the correlation between cellular and total plasma exposure to new antiretroviral drugs, notably the HIV integrase inhibitor raltegravir, was determined. A good correlation was obtained between total and cellular levels of raltegravir, suggesting that monitoring of total levels is a satisfactory surrogate. However, significant inter-patient variability was observed in raltegravir cell accumulation, which should prompt further investigation in patients failing under an integrase inhibitor-based regimen. The effectiveness of therapeutic drug monitoring (TDM) to guide efavirenz dose reduction in patients with concentrations above the recommended therapeutic range was evaluated in a prospective study.
TDM-guided dosage adjustment of efavirenz was found feasible and safe, supporting the use of TDM in patients with efavirenz concentrations above the therapeutic target. The impact of genetic polymorphisms of cytochromes P450 (CYP) 2B6, 2A6 and 3A4/5 on the PK of efavirenz and its metabolites was studied: a population PK model was built integrating both genetic and demographic covariates. Functional genetic variations in the main (CYP2B6) and accessory (CYP2A6, CYP3A4/5) metabolic pathways of efavirenz have an impact on efavirenz disposition and may lead to extreme drug exposures. Dosage adjustment guided by TDM is thus required in those patients, in accordance with their pharmacogenetic polymorphisms. Thus, we have demonstrated, using a comprehensive approach taking into account both the PK and PG factors influencing ARV exposure in HIV-infected patients, the feasibility of individualising antiretroviral therapy in various situations. Antiviral treatment optimisation is likely to increase long-term treatment success while reducing the occurrence of adverse drug reactions.
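The abstract above describes a population PK model in which CYP2B6 genotype alters efavirenz clearance and hence exposure. As a rough illustration of that covariate idea only (not the thesis model), the sketch below uses a standard one-compartment oral-absorption solution; every parameter value and the genotype effect sizes are hypothetical placeholders, not fitted estimates from the study.

```python
# Illustrative sketch: a one-compartment oral PK model where apparent
# clearance CL/F depends on a CYP2B6 genotype covariate, in the spirit of
# population PK analyses of efavirenz. All numbers are assumptions.
import math

CL_TYPICAL = 11.0   # L/h, hypothetical typical apparent clearance
V = 300.0           # L, hypothetical apparent volume of distribution
KA = 0.6            # 1/h, hypothetical absorption rate constant
DOSE = 600.0        # mg, once-daily efavirenz dose

# Hypothetical fractional effect of CYP2B6 loss-of-function alleles on CL/F,
# keyed by the number of variant alleles carried (0, 1 or 2).
GENOTYPE_EFFECT = {0: 1.0, 1: 0.7, 2: 0.35}

def concentration(t_h, n_variant_alleles):
    """Plasma concentration (mg/L) at t hours after a single oral dose."""
    cl = CL_TYPICAL * GENOTYPE_EFFECT[n_variant_alleles]
    ke = cl / V  # elimination rate constant
    # Standard one-compartment, first-order absorption solution
    return (DOSE * KA / (V * (KA - ke))) * (math.exp(-ke * t_h) - math.exp(-KA * t_h))

# Carriers of two variant alleles (slow metabolisers) show higher
# trough-like concentrations at 24 h than extensive metabolisers.
c24_ext = concentration(24.0, 0)
c24_slow = concentration(24.0, 2)
```

With these placeholder values the 24 h concentration of a two-variant-allele carrier comes out well above that of an extensive metaboliser, which is the qualitative pattern motivating genotype-aware, TDM-guided dose adjustment.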
Abstract:
Metabolite profiling is critical in many areas of the life sciences, particularly natural product research. Obtaining precise information on the chemical composition of complex natural extracts (metabolomes), which are primarily obtained from plants or microorganisms, is a challenging task that requires sophisticated, advanced analytical methods. In this respect, significant advances in hyphenated chromatographic techniques (LC-MS, GC-MS and LC-NMR in particular), as well as in data mining and processing methods, have occurred over the last decade. Together with bioassay profiling methods, these tools serve an important role in metabolomics for the purposes of both peak annotation and dereplication in natural product research. In this review, a survey of the techniques used for generic and comprehensive profiling of secondary metabolites in natural extracts is provided. The various approaches (the chromatographic methods LC-MS, GC-MS and LC-NMR, and the direct spectroscopic methods NMR and DIMS) are discussed with respect to their resolution and sensitivity for extract profiling. In addition, the structural information that these techniques can generate, alone or in combination, is compared in relation to the identification of metabolites in complex mixtures. Analytical strategies with applications to natural extracts, as well as novel methods with strong potential regardless of how often they are currently used, are discussed with respect to their potential applications and future trends.
Abstract:
Since the first anti-doping tests in the 1960s, the analytical aspects of testing have remained challenging. The evolution of the analytical process in doping control is discussed in this paper, with particular emphasis on separation techniques such as gas chromatography and liquid chromatography. These approaches are improving in step with the requirements of increasing sensitivity and selectivity for detecting prohibited substances in biological samples from athletes. Moreover, fast analyses are mandatory to deal with the growing number of doping control samples and the short response times required during particular sport events. Recent developments in mass spectrometry and the expansion of accurate mass determination have improved anti-doping strategies, with the possibility of using elemental composition and isotope patterns for structural identification. These techniques must be able to distinguish unequivocally between negative and suspicious samples, with no false-negative or false-positive results. Therefore, a high degree of reliability must be reached in the identification of the major metabolites corresponding to suspected analytes. In line with current trends in the pharmaceutical industry, the analysis of proteins and peptides remains an important issue in doping control, and sophisticated analytical tools are still needed to improve their distinction from endogenous analogues. Finally, indirect approaches are discussed in the context of anti-doping, in which recent advances aim to examine the biological response to a doping agent in a holistic way.
Abstract:
The safe and responsible development of engineered nanomaterials (ENMs), nanotechnology-based materials and products, together with the definition of regulatory measures and the implementation of "nano"-legislation in Europe, requires a widely supported scientific basis and sufficient high-quality data upon which to base decisions. At the very core of such a scientific basis is a general agreement on key issues related to the risk assessment of ENMs, encompassing the key parameters used to characterise ENMs, appropriate methods of analysis, and the best way to express the effect of ENMs in widely accepted dose-response toxicity tests. The following major conclusions were drawn. Due to the high batch-to-batch variability in the characteristics of commercially available and, to a lesser degree, laboratory-made ENMs, it is not possible to make general statements regarding the toxicity resulting from exposure to ENMs. 1) Alongside the OECD priority list of ENMs, other criteria for selecting ENMs could be helpful, such as relevance for mechanistic (scientific) or risk-assessment-based studies, widespread availability (and thus high expected volumes of use), or consumer concern (the route of consumer exposure depending on the application). The OECD priority list focuses on the validity of OECD tests; therefore, source material will be first in scope for testing. For risk assessment, however, it is much more relevant to have toxicity data for the material as present in the products and matrices to which humans and the environment are exposed. 2) For most, if not all, characteristics of ENMs, standardized analytical methods, though not necessarily validated, are available. Generally, these methods can determine only a single characteristic, and some of them can be rather expensive. In practice, it is currently not feasible to characterise ENMs fully. Many techniques available to measure the same nanomaterial characteristic produce contrasting results (e.g. reported sizes of ENMs).
It was recommended that at least two complementary techniques be employed to determine any given metric of ENMs. The first great challenge is to prioritise the metrics that are relevant for assessing biological dose-response relations and to develop analytical methods for characterising ENMs in biological matrices. It was generally agreed that a single metric is not sufficient to describe ENMs fully. 3) Characterisation of ENMs in biological matrices starts with sample preparation. It was concluded that there is currently no standard approach or protocol for sample preparation to control agglomeration/aggregation and (re)dispersion. It was recommended that harmonization be initiated and that protocols be exchanged. The precise methods used to disperse ENMs should be specifically, yet succinctly, described in the experimental section of a publication. 4) ENMs need to be characterised in the matrix as it is presented to the test system (in vitro/in vivo). 5) Alternative approaches (e.g. biological or in silico systems) for the characterisation of ENMs are simply not possible with current knowledge. Contributors: Iseult Lynch, Hans Marvin, Kenneth Dawson, Markus Berges, Diane Braguer, Hugh J. Byrne, Alan Casey, Gordon Chambers, Martin Clift, Giuliano Elia, Teresa F. Fernandes, Lise Fjellsbø, Peter Hatto, Lucienne Juillerat, Christoph Klein, Wolfgang Kreyling, Carmen Nickel, and Vicki Stone.
Abstract:
The approaches of comparative studies and profile measurements, often used to detect post-depositional alterations of ceramics, have been applied simultaneously to two sets of Roman pottery, both of which include altered individuals. Neutron Activation Analysis and X-Ray Diffraction were used as analytical techniques. The two approaches lead to substantially different results, showing that they detect different levels of alteration and should complement each other rather than being used exclusively. For the particular process of glassy-phase decomposition followed by crystallization of the Na-zeolite analcime, the results suggest that it alters high-fired calcareous pottery rapidly, and so fundamentally that the results of various archaeometric techniques can be severely disturbed.
Abstract:
This paper presents the outcomes of a workshop of the European Network on the Health and Environmental Impact of Nanomaterials (NanoImpactNet). During the workshop, 45 experts in the field of safety assessment of engineered nanomaterials addressed the need to systematically study sets of engineered nanomaterials with specific metrics to generate a data set that would allow the establishment of dose-response relations. The group concluded that international cooperation and worldwide standardization of terminology, reference materials and protocols are needed to make progress in establishing lists of essential metrics. Obtaining high-quality data necessitates the development of harmonized study approaches and adequate reporting of data. Priority metrics can only be based on well-characterized dose-response relations derived from the systematic study of the bio-kinetics and bio-interactions of nanomaterials at both organism and (sub)cellular levels. In addition, increased effort is needed to develop and validate analytical methods to determine these metrics in a complex matrix.
Abstract:
This review presents the evolution of steroid analytical techniques, including gas chromatography coupled to mass spectrometry (GC-MS), immunoassay (IA) and targeted liquid chromatography coupled to mass spectrometry (LC-MS), and it evaluates the potential of extended steroid profiles obtained by a metabolomics-based approach, namely steroidomics. Steroids regulate essential biological functions including growth and reproduction, and perturbations of steroid homeostasis can generate serious physiological issues; therefore, specific and sensitive methods have been developed to measure steroid concentrations. GC-MS, which measures several steroids simultaneously, was historically the first standard method of analysis. Steroids were then quantified by immunoassay, allowing higher throughput; however, major drawbacks included the measurement of a single compound instead of a panel, and cross-reactivity. Targeted LC-MS methods with selected reaction monitoring (SRM) were then introduced for quantifying a small steroid subset without the problems of cross-reactivity. The next step was the integration of metabolomic approaches in the context of steroid analyses. As metabolomics aims to identify and quantify all the metabolites (i.e., the metabolome) in a specific system, appropriate strategies were proposed for discovering new biomarkers. Steroidomics, defined as the untargeted analysis of the steroid content of a sample, has been implemented in several fields, including doping analysis, clinical studies, in vivo and in vitro toxicology assays, and more. This review discusses the current analytical methods for assessing steroid changes and compares them to steroidomics. Steroids, their pathways, their implications in diseases and the biological matrices in which they are analysed will first be described. Then, the different analytical strategies will be presented with a focus on their ability to obtain relevant information on the steroid pattern.
The future technical requirements for improving steroid analysis will also be presented.
Abstract:
A theoretical model for the noise properties of n+nn+ diodes in the drift-diffusion framework is presented. In contrast with previous approaches, our model incorporates both the drift and diffusive parts of the current under inhomogeneous and hot-carrier conditions. Closed analytical expressions describing the transport and noise characteristics of submicrometer n+nn+ diodes, in which the diode base (n part) and the contacts (n+ parts) are coupled in a self-consistent way, are obtained.
Abstract:
Through advances in technology, System-on-Chip design is moving towards integrating tens to hundreds of intellectual property blocks into a single chip. In such a many-core system, on-chip communication becomes a performance bottleneck for high-performance designs. Network-on-Chip (NoC) has emerged as a viable solution to the communication challenges in highly complex chips. The NoC architecture paradigm, based on a modular packet-switched mechanism, can address many on-chip communication challenges such as wiring complexity, communication latency, and bandwidth. Furthermore, the combined benefits of 3D IC and NoC schemes make it possible to design a high-performance system in a limited chip area. The major advantages of 3D NoCs are considerable reductions in average latency and power consumption. Several factors degrade the performance of NoCs. In this thesis, we investigate three main performance-limiting factors: network congestion, faults, and the lack of efficient multicast support. We address these issues by means of routing algorithms. Congestion of data packets may lead to increased network latency and power consumption; we therefore propose three different approaches for alleviating congestion in the network. The first approach is based on measuring congestion information in different regions of the network, distributing this information over the network, and utilizing it when making routing decisions. The second approach employs a learning method to dynamically find less congested routes according to the underlying traffic. The third approach is based on a fuzzy-logic technique that makes better routing decisions when traffic information for the different routes is available. Faults affect performance significantly, as packets must then take longer paths to be routed around the faults, which in turn increases congestion around the faulty regions.
We propose four methods to tolerate faults at the link and switch level by using only shortest paths as long as such a path exists. The characteristic unique to these methods is that they tolerate faults while also maintaining the performance of the NoC. To the best of our knowledge, these algorithms are the first approaches to bypass faults before reaching them while avoiding unnecessary misrouting of packets. Current implementations of multicast communication result in a significant performance loss for unicast traffic, because the routing rules of multicast packets limit the adaptivity of unicast packets. We present an approach in which both unicast and multicast packets can be efficiently routed within the network. While providing more efficient multicast support, the proposed approach does not affect the performance of unicast routing at all. In addition, to reduce the overall path length of multicast packets, we present several partitioning methods along with analytical models for measuring their latency. This approach is discussed in the context of 3D mesh networks.
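The congestion-aware routing idea summarised above (pick, among the output ports that keep the route minimal, the one whose neighbour reports the least congestion) can be sketched in a few lines for a 2D mesh. This is an illustrative toy, not any of the thesis algorithms; the router coordinates, direction names and congestion numbers are assumptions.

```python
# Toy sketch of minimal adaptive, congestion-aware routing in a 2D mesh NoC.
# Each router is a (x, y) coordinate; `congestion` maps an output direction
# to a load metric (e.g. buffer occupancy) reported by that neighbour.

def shortest_path_ports(cur, dst):
    """Output directions that keep the route minimal (along X and/or Y)."""
    (cx, cy), (dx, dy) = cur, dst
    ports = []
    if dx != cx:
        ports.append('E' if dx > cx else 'W')
    if dy != cy:
        ports.append('N' if dy > cy else 'S')
    return ports

def route(cur, dst, congestion):
    """Pick the least congested admissible port; None if already at dst."""
    ports = shortest_path_ports(cur, dst)
    if not ports:
        return None  # packet has arrived
    return min(ports, key=lambda p: congestion.get(p, 0))

# Example: destination lies to the north-east; the eastern neighbour is
# heavily loaded, so the packet is first forwarded north.
choice = route((0, 0), (3, 3), {'E': 7, 'N': 2})
```

Because both admissible ports still lie on a shortest path, deferring the congested direction costs no extra hops; this is the basic trade-off the regional congestion-awareness approach exploits.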
Abstract:
Biological systems exhibit rich and complex behavior through the orchestrated interplay of a large array of components. It is hypothesized that separable subsystems with some degree of functional autonomy exist; deciphering their independent behavior and functionality would greatly facilitate understanding the system as a whole. Discovering and analyzing such subsystems are hence pivotal problems in the quest to gain a quantitative understanding of complex biological systems. In this work, using approaches from machine learning, physics and graph theory, methods for the identification and analysis of such subsystems were developed. A novel methodology, based on a recent machine learning algorithm known as non-negative matrix factorization (NMF), was developed to discover such subsystems in a set of large-scale gene expression data. This set of subsystems was then used to predict functional relationships between genes, and this approach was shown to score significantly higher than conventional methods when benchmarked against existing databases. Moreover, a mathematical treatment was developed for simple network subsystems based only on their topology (independent of particular parameter values). Application to a problem of experimental interest demonstrated the need for extensions to the conventional model to fully explain the experimental data. Finally, the notion of a subsystem was evaluated from a topological perspective. A number of different protein networks were examined to analyze their topological properties with respect to separability, seeking to find separable subsystems. These networks were shown to exhibit separability in a nonintuitive fashion, and the separable subsystems were of strong biological significance. It was demonstrated that the separability property found was not due to incomplete or biased data, but is likely to reflect biological structure.
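The NMF technique named above factors a non-negative data matrix V (e.g. genes by samples) into non-negative factors W and H, whose columns and rows can be read as candidate subsystems and their activations. The sketch below implements the classic multiplicative-update rules for the Frobenius objective in pure Python as a minimal illustration; the toy matrix, rank and iteration count are assumptions, not the thesis setup.

```python
# Toy sketch of multiplicative-update NMF (Frobenius objective):
# V ≈ W @ H with all entries non-negative.
import random

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def nmf(V, k, iters=500, eps=1e-9, seed=0):
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    for _ in range(iters):
        # H <- H * (W^T V) / (W^T W H)
        WtV = matmul(transpose(W), V)
        WtWH = matmul(transpose(W), matmul(W, H))
        H = [[H[i][j] * WtV[i][j] / (WtWH[i][j] + eps) for j in range(m)] for i in range(k)]
        # W <- W * (V H^T) / (W H H^T)
        VHt = matmul(V, transpose(H))
        WHHt = matmul(W, matmul(H, transpose(H)))
        W = [[W[i][j] * VHt[i][j] / (WHHt[i][j] + eps) for j in range(k)] for i in range(n)]
    return W, H

# A rank-2 non-negative matrix should be reconstructed closely with k = 2.
V = [[1.0, 2.0, 0.0], [2.0, 4.0, 0.0], [0.0, 0.0, 3.0], [1.0, 2.0, 3.0]]
W, H = nmf(V, k=2)
R = matmul(W, H)
max_err = max(abs(R[i][j] - V[i][j]) for i in range(4) for j in range(3))
```

The multiplicative form of the updates is what keeps W and H non-negative throughout, which is the property that makes the factors interpretable as additive "parts" (subsystems) rather than signed components as in PCA.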
Abstract:
Feed samples received by commercial analytical laboratories are often undefined or mixed varieties of forages, originate from various agronomic or geographical areas of the world, are mixtures (e.g., total mixed rations) and are often described incompletely or not at all. Six unified single-equation approaches to predict the metabolizable energy (ME) value of feeds, determined in sheep fed at maintenance ME intake, were evaluated utilizing 78 individual feeds representing 17 different forages, grains, protein meals and by-product feedstuffs. The predictive approaches evaluated were two each from the National Research Council [National Research Council (NRC), Nutrient Requirements of Dairy Cattle, seventh revised ed., National Academy Press, Washington, DC, USA, 2001], the University of California at Davis (UC Davis) and ADAS (Stratford, UK). Slopes and intercepts for the two ADAS approaches, which utilized in vitro digestibility of organic matter and either measured gross energy (GE) or a prediction of GE from component assays, and for one UC Davis approach, based upon in vitro gas production and some component assays, differed from unity and zero, respectively, while this was not the case for the two NRC approaches and the other UC Davis approach. However, within these latter three approaches, the goodness of fit (r²) increased from the NRC approach utilizing lignin (0.61) to the NRC approach utilizing 48 h in vitro digestion of neutral detergent fibre (NDF; 0.72) and to the UC Davis approach utilizing a 30 h in vitro digestion of NDF (0.84). The reason for the difference in precision between the NRC procedures was the failure of assayed lignin values to accurately predict 48 h in vitro digestion of NDF.
However, differences among the six predictive approaches in the number of supporting assays and their costs, as well as the fact that the NRC approach is actually three related equations requiring a categorical description of feeds (making them unsuitable for mixed feeds) while the ADAS and UC Davis approaches are single equations, suggest that the procedure of choice will vary depending upon local conditions, specific objectives and the feedstuffs to be evaluated. In contrast to the evaluation of the procedures among feedstuffs, no procedure was able to consistently discriminate the in vivo ME values of individual feeds within feedstuffs, suggesting that an accurate and precise ME predictive approach that holds both among and within feeds may remain to be identified.
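The goodness-of-fit values quoted above (0.61, 0.72, 0.84) are coefficients of determination comparing predicted ME against in vivo ME. A minimal sketch of that statistic is below; the ME numbers are made-up illustrations, not data from the study.

```python
# Coefficient of determination r^2 between observed (in vivo) values and
# the values produced by a single-equation predictive approach.

def r_squared(observed, predicted):
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((y - mean_obs) ** 2 for y in observed)   # total variation
    ss_res = sum((y - p) ** 2 for y, p in zip(observed, predicted))  # residual
    return 1.0 - ss_res / ss_tot

# Hypothetical ME values (MJ/kg DM): a good predictor tracks the in vivo
# values closely and scores near 1.
me_in_vivo   = [8.2, 9.1, 10.4, 11.0, 12.3]
me_predicted = [8.0, 9.3, 10.1, 11.2, 12.5]
fit = r_squared(me_in_vivo, me_predicted)
```

A perfect predictor gives r² = 1; the closer the residual sum of squares is to the total variation, the closer r² falls to 0, which is the scale on which the lignin-based (0.61) and NDF-digestion-based (0.72, 0.84) approaches were ranked.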
Abstract:
The main aim of this chapter is to offer an overview of research that has adopted the methodology of Corpus Linguistics to study aspects of language use in the media. The overview begins by introducing the key principles and analytical tools adopted in corpus research. To demonstrate the contribution of corpus approaches to media linguistics, a selection of recent corpus studies is subsequently discussed. The final section summarises the strengths and limitations of corpus approaches and discusses avenues for further research.
Abstract:
DANTAS, Rodrigo Assis Neves; NÓBREGA, Walkíria Gomes da; MORAIS FILHO, Luiz Alves; MACÊDO, Eurides Araújo Bezerra de; FONSECA, Patrícia de Cássia Bezerra; ENDERS, Bertha Cruz; MENEZES, Rejane Maria Paiva de; TORRES, Gilson de Vasconcelos. Paradigms in health care and its relationship to the nursing theories: an analytical test. Revista de Enfermagem UFPE on line, v. 4, n. 2, p. 16-24, abr./jun. 2010. Available at: <http://www.ufpe.br/revistaenfermagem/index.php/revista>.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)