23 results for THEORETICAL BASIS
in SciELO Saúde Pública - SP
Abstract:
There is a great demand for simpler and less costly laboratory techniques and for more accessible procedures for orchid breeders who do not have the necessary theoretical basis to use the traditional methods of in vitro seed and clone production of orchids. The aim of this study was to assess the use of sodium hypochlorite (NaClO) as a decontaminant in the process of inoculating adult orchid explants of Arundina bambusifolia and Epidendrum ibaguenses. Solutions of NaClO (1,200, 2,400, 3,600, 4,800 and 6,000 mg L-1, equivalent to 50, 100, 150, 200 and 250 mL L-1 of commercial bleach - CB) were sprayed on the explants (1.0 mL) and the culture medium (GB5), in the presence or absence of activated charcoal (2 g L-1). The explants used were nodal segments of field-grown adult plants. The procedures for inoculating the explants were conducted outside the laminar flow chamber (LFC), except for the control treatment (autoclaved medium and explant inoculation inside the LFC). The best results for fresh weight yield, height and number of shoots were obtained using NaClO in solution at 1,200 mg L-1 (equivalent to 50 mL L-1 of commercial bleach) with activated charcoal in the culture medium. Fresh weight figures were 1.10 g/jar for Arundina bambusifolia and 0.16 g/jar for Epidendrum ibaguenses. Spraying the NaClO solutions controlled contamination of the culture medium already inoculated with the explants.
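The stated equivalences between NaClO concentration and commercial bleach volume all imply a single bleach strength, which a quick check confirms (assuming the mg L-1 figures refer to active NaClO):

```python
# Sanity check of the stated equivalence between NaClO doses and
# commercial bleach (CB) volumes: every pair implies the same CB strength.
pairs = [(1200, 50), (2400, 100), (3600, 150), (4800, 200), (6000, 250)]

for naclo_mg_per_l, cb_ml_per_l in pairs:
    # implied active-NaClO content of the bleach, in mg per mL (= g/L)
    strength = naclo_mg_per_l / cb_ml_per_l
    assert strength == 24.0  # i.e. 24 g/L, about 2.4% (m/v) active NaClO

print("all dilutions assume bleach with 24 g/L active NaClO")
```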
Abstract:
A recent and comprehensive review of the use of race and ethnicity in research addressing health disparities in epidemiology and public health is provided. First, we describe the theoretical basis on which race and ethnicity differ, drawing on previous work in anthropology, social science and public health. Second, we present a review of 280 articles published in high-impact-factor public health and epidemiology journals from 2009-2011. An analytical grid enabled the examination of conceptual, theoretical and methodological questions related to the use of both concepts. The majority of the articles reviewed were grounded in a theoretical framework and provided interpretations from various models. However, key problems identified include: a) a failure by researchers to differentiate between the concepts of race and ethnicity; b) inappropriate use of racial categories to ascribe ethnicity; c) a lack of transparency in the methods used to assess both concepts; and d) a failure to address the limits associated with the construction of racial or ethnic taxonomies and their use. In conclusion, future studies examining health disparities should clearly establish the distinction between race and ethnicity, develop theoretically driven research and address specific questions about the relationships between race, ethnicity and health. We argue that one way to think about ethnicity, race and health is to dichotomize research into two sets of questions about the relationship between human diversity and health.
Abstract:
Introduction: The aim of this study was to explore the environment of Echinococcus granulosus (E. granulosus) protoscolices and their relationship with their host. Methods: Proteins from the hydatid-cyst fluid (HCF) of E. granulosus were identified by proteomics. An inductively coupled plasma atomic emission spectrometer (ICP-AES) was used to determine the elements, an automatic biochemical analyzer was used to detect the types and levels of biochemical indices, and an automatic amino acid analyzer was used to detect the types and levels of amino acids in the E. granulosus HCF. Results: I) Approximately 30 protein spots and 21 peptide mass fingerprints (PMF) were acquired in the two-dimensional gel electrophoresis (2-DE) pattern of hydatid fluid; II) we detected 10 chemical elements in the cyst fluid, including sodium, potassium, calcium, magnesium, copper, and zinc; III) we measured 19 biochemical metabolites in the cyst fluid, and the amount of most of these metabolites was lower than that in normal human serum; IV) we detected 17 free amino acids and measured some of these, including alanine, glycine, and valine. Conclusions: We identified and measured many chemical components of the cyst fluid, providing a theoretical basis for developing new drugs to prevent and treat hydatid disease by inhibiting or blocking nutrition, metabolism, and other functions of the pathogen.
Abstract:
The main object of the present paper consists in giving formulas and methods which enable us to determine the minimum number of repetitions or of individuals necessary to guarantee, to some extent, the success of an experiment. The theoretical basis of all the processes consists essentially in the following. Knowing the frequency of the desired events p and of the non-desired events q, we may calculate the frequency of all possible combinations to be expected in n repetitions by expanding the binomial (p + q)^n. Determining which of these combinations we want to avoid, we calculate their total frequency, selecting the value of the exponent n of the binomial in such a way that this total frequency is equal to or smaller than the accepted limit of precision:

n! p^n [ (1/n!) (q/p)^n + (1/(1!(n-1)!)) (q/p)^(n-1) + (1/(2!(n-2)!)) (q/p)^(n-2) + (1/(3!(n-3)!)) (q/p)^(n-3) + ... ] <= P(lim) ----(1b)

There does not exist an absolute limit of precision, since its value depends not only upon psychological factors in our judgement but is at the same time a function of the number of repetitions. For this reason I have proposed (1,56) two relative values, one equal to 1:5n as the lowest value of probability and the other equal to 1:10n as the highest value of improbability, leaving between them what may be called the "region of doubt". However, these formulas cannot be applied in our case, since this number n is just the unknown quantity. Thus we have to use, instead of the more exact values of these two formulas, the conventional limits of P(lim) equal to 0.05 (precision 5%), 0.01 (precision 1%) and 0.001 (precision 0.1%). The binomial formula as explained above (cf. formula 1, pg. 85), however, is of rather limited applicability owing to the excessive calculation necessary, and we thus have to procure approximations as substitutes.
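The search over n implied by this binomial-tail inequality can be sketched directly; the frequency, the number of desired individuals and the precision limit below are illustrative choices, not values from the paper:

```python
from math import comb

def min_repetitions(p, m, p_lim):
    """Smallest n for which the chance of seeing at most m individuals
    of the desired type (frequency p) is <= p_lim, i.e. the smallest n
    satisfying the binomial-tail inequality."""
    q = 1.0 - p
    n = m + 1
    while True:
        # binomial tail: P(X <= m) for X ~ Binomial(n, p)
        tail = sum(comb(n, j) * p**j * q**(n - j) for j in range(m + 1))
        if tail <= p_lim:
            return n
        n += 1

# e.g. to be reasonably sure of seeing at least one (and at least two)
# individuals of a type expected with frequency 0.25, at the 5% limit:
print(min_repetitions(0.25, 0, 0.05))   # 11
print(min_repetitions(0.25, 1, 0.05))   # 18
```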
We may use, without loss of precision, the following approximations: a) the normal or Gaussian distribution when the expected frequency p has any value between 0.1 and 0.9 and when n is at least superior to ten; b) the Poisson distribution when the expected frequency p is smaller than 0.1. Tables V to VII show for some special cases that these approximations are very satisfactory. The practical solution of the problems stated in the introduction can now be given:

A) What is the minimum number of repetitions necessary in order to avoid that any one of a treatments, varieties, etc. may accidentally always be the best, or the best and second best, or the first, second and third best, or, finally, one of the m best treatments, varieties, etc.? Using the first term of the binomial, we have the following equation for n:

n = log P(lim) / log (m/a) = log P(lim) / (log m - log a) ----(5)

B) What is the minimum number of individuals necessary in order that a certain type, expected with the frequency p, may appear in at least one, two, three or a = m + 1 individuals?

1) For p between 0.1 and 0.9, and using the Gaussian approximation, we start from np - δ √(n p (1-p)) = a - 1 = m and have:

b = δ √((1-p)/p) and c = m/p ----(7)

n = [ (b + √(b² + 4c)) / 2 ]² ; n' = 1/p ; n(cor) = n + n' ----(8)

We have to use the correction n' when p has a value between 0.25 and 0.75. The Greek letter delta represents in the present case the unilateral limits of the Gaussian distribution for the three conventional limits of precision: 1.64, 2.33 and 3.09 respectively. If we are interested only in having at least one individual, m becomes equal to zero and the formula reduces to:

c = 0 for a = 1 ; n = [ (b + √(b²)) / 2 ]² = b² = δ² (1-p)/p ; n' = 1/p ; n(cor) = n + n' ----(9)

2) If p is smaller than 0.1 we may use Table 1 in order to find the mean m of a Poisson distribution and determine n = m/p.

C) What is the minimum number of individuals necessary for distinguishing two frequencies p1 and p2?
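The Gaussian approximation of formulas (7)-(9) translates into a short calculation; δ = 1.64, p = 0.25 and m = 0 below are illustrative choices:

```python
from math import sqrt

def min_individuals_gauss(p, m, delta):
    """Formulas (7)-(8): Gaussian approximation to the minimum number of
    individuals needed so that a type of frequency p appears in at least
    a = m + 1 of them. delta is the unilateral Gaussian limit
    (1.64, 2.33 or 3.09 for the 5%, 1% and 0.1% precision limits)."""
    b = delta * sqrt((1.0 - p) / p)
    c = m / p
    n = ((b + sqrt(b * b + 4.0 * c)) / 2.0) ** 2
    return n + 1.0 / p    # correction n' = 1/p (used for 0.25 <= p <= 0.75)

# at least one individual (m = 0) of a type with frequency p = 0.25 at the
# 5% limit: n = delta^2 (1 - p)/p ~ 8.1, corrected to ~12.1, in good
# agreement with the exact binomial answer of 11.
print(min_individuals_gauss(0.25, 0, 1.64))
```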
1) When p1 and p2 are values between 0.1 and 0.9 we have:

n = [ δ (√(p1(1-p1)) + √(p2(1-p2))) / (p1 - p2) ]² ; n' = 1/(p1 - p2) ; n(cor) = n + n' ----(13)

We again have to use the unilateral limits of the Gaussian distribution. The correction n' should be used if at least one of the values p1 or p2 lies between 0.25 and 0.75. A more complicated formula may be used in cases where we want to increase the precision: starting from n(p1 - p2) - δ √(n [p1(1-p1) + p2(1-p2)]) = m,

b = δ √(p1(1-p1) + p2(1-p2)) / (p1 - p2) ; c = m/(p1 - p2) ; n = [ (b + √(b² + 4c)) / 2 ]² ; n' = 1/(p1 - p2) ; n(cor) = n + n' ----(14)

2) When both p1 and p2 are smaller than 0.1 we determine the quotient (p1 : p2) and procure the corresponding number m2 of a Poisson distribution in Table 2. The value of n is found by the equation:

n = m2 / p2 ----(15)

D) What is the minimum number necessary for distinguishing three or more frequencies p1, p2, p3...? 1) If the frequencies p1, p2, p3 are values between 0.1 and 0.9 we have to solve the individual equations and use the highest value of n thus determined:

n(1,2) = [ δ (√(p1(1-p1)) + √(p2(1-p2))) / (p1 - p2) ]², etc. ----(16)

Delta now represents the bilateral limits of the Gaussian distribution: 1.96, 2.58 and 3.29. 2) No table was prepared for the relatively rare cases of a comparison of three or more frequencies below 0.1; in such cases extremely high numbers would be required.

E) A process is given which serves to solve two problems of informatory nature: a) if a special type appears among n individuals with a frequency p(obs), what may be the corresponding ideal value of p(esp); or b) if we study samples of n individuals and expect a certain type with a frequency p(esp), what may be the extreme limits of p(obs) in individual families? 1) If we are dealing with values between 0.1 and 0.9 we may use Table 3.
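The two-frequency formula (13) can likewise be sketched; the segregation frequencies below are illustrative, not values from the paper:

```python
from math import sqrt

def n_to_distinguish(p1, p2, delta):
    """Formula (13): minimum number of individuals for distinguishing two
    frequencies p1 and p2 (both between 0.1 and 0.9). delta is the
    unilateral Gaussian limit for the chosen limit of precision."""
    n = (delta * (sqrt(p1 * (1 - p1)) + sqrt(p2 * (1 - p2))) / (p1 - p2)) ** 2
    return n + 1.0 / abs(p1 - p2)  # correction n', used when a p lies in 0.25-0.75

# distinguishing frequencies 0.50 and 0.25 (e.g. a 1:1 from a 3:1
# segregation) at the 5% limit of precision:
print(n_to_distinguish(0.50, 0.25, 1.64))   # ~41.5, i.e. 42 individuals
```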
To solve the first question we select the respective horizontal line for p(obs), determine which column corresponds to our value of n, and find the respective value of p(esp) by interpolating between columns. In order to solve the second problem we start with the respective column for p(esp) and find the horizontal line for the given value of n, either directly or by approximation and interpolation. 2) For frequencies smaller than 0.1 we have to use Table 4 and transform the fractions p(esp) and p(obs) into numbers of the Poisson series by multiplication with n. In order to solve the first problem, we verify in which line the lower Poisson limit is equal to m(obs) and transform the corresponding value of m into the frequency p(esp) by dividing through n. The observed frequency may thus be a chance deviate of any value between 0.0... and the value given by dividing the value of m in the table by n. In the second case we first transform the expectation p(esp) into a value of m and procure, in the horizontal line corresponding to m(esp), the extreme values of m, which then must be transformed, by dividing through n, into values of p(obs).

F) Partial and progressive tests may be recommended in all cases where there is lack of material, or where the loss of time is less important than the cost of large-scale experiments, since in many cases the minimum number necessary to guarantee the results within the limits of precision is rather large. One should not forget that the minimum number really represents at the same time a maximum number, necessary only if one takes into consideration essentially the unfavorable variations; smaller numbers may frequently already give satisfactory results. For instance, by definition, we know that a frequency of p means that we expect one individual in every total of 1/p. If there were no chance variations, this number 1/p would be sufficient, and if there were favorable variations a still smaller number might yield one individual of the desired type.
Thus, trusting to luck, one may start the experiment with numbers smaller than the minimum calculated according to the formulas given above, and increase the total until the desired result is obtained, and this may well be before the "minimum number" is reached. Some concrete examples of this partial or progressive procedure are given from our genetical experiments with maize.
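The progressive strategy of section F can be illustrated by a small simulation; the frequency p = 0.25 and the seed are illustrative choices, not values from the paper:

```python
import random

# Progressive testing: instead of raising the full guaranteed "minimum
# number" at once, add individuals until the desired type appears.
random.seed(42)
p = 0.25
trials = 10_000

totals = []
for _ in range(trials):
    n = 1
    while random.random() >= p:   # keep adding individuals until one success
        n += 1
    totals.append(n)

mean_needed = sum(totals) / trials
print(f"average individuals needed: {mean_needed:.2f}")
# on average about 1/p = 4 individuals suffice, far below the guaranteed
# minimum of 11 (p = 0.25, at the 5% limit of precision)
```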
Abstract:
This paper describes the theoretical basis and the experimental requirements for the application of the Taylor dispersion technique to measurements of diffusion coefficients in liquids, emphasizing its simplicity and accuracy in comparison with other usual techniques. Some examples are discussed describing the use of this methodology in studies of solute-solvent interactions, solute aggregation, solute partitioning into macromolecular systems and the assessment of nanoparticle sizes.
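As a sketch of the technique's usual working equation in the long-time Taylor regime, D = R²·t_r/(24·σ_t²); the capillary radius and peak parameters below are assumed for illustration, not taken from the paper:

```python
# Taylor dispersion analysis: in the long-time limit the diffusion
# coefficient follows from the capillary radius R, the peak retention
# time t_r and the temporal variance sigma^2 of the dispersed peak:
#     D = R**2 * t_r / (24 * sigma**2)
# Numbers below are illustrative assumptions.

R = 2.6e-4        # capillary radius, m
t_r = 600.0       # retention time, s
sigma = 58.2      # temporal standard deviation of the peak, s

D = R**2 * t_r / (24.0 * sigma**2)
print(f"D = {D:.2e} m^2/s")   # order 1e-10-1e-9, typical of small molecules in water
```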
Abstract:
This paper is the first part of an article aimed at presenting the theoretical basis, as well as some applications, of two infrared reflection techniques: specular reflection and reflection-absorption. It is emphasized how the Kramers-Kronig analysis of reflection data can be useful both in retrieving optical constants and in making spectral analysis possible. Examples of vitreous, powdered and liquid samples are given.
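The Kramers-Kronig machinery invoked above can be illustrated numerically on a model dielectric function rather than on actual reflectance data; the Lorentz-oscillator parameters below are assumptions made for the sketch:

```python
import numpy as np

# Kramers-Kronig sketch on a single Lorentz oscillator (illustrative
# parameters): recover eps'(w) from eps''(w) via
#   eps'(w) = eps_inf + (2/pi) P∫ w' eps''(w') / (w'^2 - w^2) dw'
# The principal value is handled by subtracting the singular term.

eps_inf, wp2, w0, gamma = 2.0, 0.5, 1.0, 0.1
W = 50.0                                    # truncation frequency
wg = np.linspace(1e-4, W, 200_001)          # integration grid

def eps2_f(w):
    """Imaginary part of the Lorentz dielectric function (absorption)."""
    return wp2 * gamma * w / ((w0**2 - w**2) ** 2 + (gamma * w) ** 2)

def eps1_f(w):
    """Real part, analytic reference for checking the transform."""
    return eps_inf + wp2 * (w0**2 - w**2) / ((w0**2 - w**2) ** 2 + (gamma * w) ** 2)

g = wg * eps2_f(wg)                         # numerator of the KK integrand

def eps1_kk(w):
    """eps'(w) obtained from eps'' by the Kramers-Kronig relation."""
    gw = w * eps2_f(w)
    denom = wg**2 - w**2
    # subtract g(w) so the integrand stays finite at w' = w
    integrand = np.where(np.abs(denom) > 1e-9, (g - gw) / denom, 0.0)
    integral = 0.5 * np.sum((integrand[1:] + integrand[:-1]) * np.diff(wg))
    # analytic principal value of the subtracted term over [0, W]
    integral += gw * np.log((W - w) / (W + w)) / (2.0 * w)
    return eps_inf + (2.0 / np.pi) * integral

for w in (0.5, 1.5, 2.0):
    print(f"w = {w}: KK {eps1_kk(w):.4f}  vs  analytic {eps1_f(w):.4f}")
```

The same subtraction trick carries over when the integrand is ln R(ω′) from measured reflectance, with an extra tail correction for the truncated high-frequency range.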
Abstract:
This paper is the second part of an article aimed at presenting the theoretical basis, as well as some applications, of two infrared reflection techniques: specular reflection and reflection-absorption. It is emphasized how much spectral simulation can aid spectral analysis. The usefulness of reflection-absorption spectroscopy as a thin-film characterization technique is stressed. Optical effects such as LO-TO splittings and their observation as the Berreman effect are also addressed.
Abstract:
Since the last decade, the combined use of chemometrics and molecular spectroscopic techniques has become a new alternative for direct drug determination, without the need for physical separation. Among the new methodologies developed, the application of PARAFAC to the decomposition of spectrofluorimetric data should be highlighted. The first objective of this article is to describe the theoretical basis of PARAFAC. For this purpose, a discussion of the order of chemometric methods used in multivariate calibration and of the development of multi-dimensional methods is presented first. The other objective of this article is to present to the Brazilian chemical community the potential of the PARAFAC/spectrofluorimetry combination for the determination of drugs in complex biological matrices. For this purpose, two applications aimed at determining, respectively, doxorubicin and salicylate in human plasma are presented.
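A bare-bones sketch of the trilinear PARAFAC decomposition by alternating least squares may make the theoretical basis concrete; the tensor sizes, rank and synthetic data below are illustrative, not from the article:

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker (Khatri-Rao) product of U (J x R) and V (K x R)."""
    R = U.shape[1]
    return np.einsum('jr,kr->jkr', U, V).reshape(-1, R)

def parafac_als(X, rank, n_iter=500, seed=0):
    """Bare-bones PARAFAC via alternating least squares:
    X[i, j, k] ~= sum_r A[i, r] * B[j, r] * C[k, r]."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X0 = X.reshape(I, -1)                      # mode-1 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, -1)   # mode-2 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, -1)   # mode-3 unfolding
    for _ in range(n_iter):
        A = X0 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = X1 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X2 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# synthetic noiseless rank-2 tensor standing in for an
# excitation x emission x sample fluorescence data cube
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 2)) for d in (6, 5, 4))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)

A, B, C = parafac_als(X, rank=2)
err = np.linalg.norm(X - np.einsum('ir,jr,kr->ijk', A, B, C)) / np.linalg.norm(X)
print(f"relative reconstruction error: {err:.1e}")
```

In real applications the factors carry physical meaning (excitation profiles, emission profiles, relative concentrations), which is what makes the "second-order advantage" possible.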
Abstract:
This work is part of a study that analyzed the contributions of didactic activities concerning the rhetorical characteristics of scientific language, aimed at developing students' ability to identify such characteristics in scientific chemistry texts and to read those texts critically. In this study, we present the theoretical basis adopted to determine the characteristics of scientific discourse and to produce the didactic material used in those activities. Studies by Latour, Coracini and Campanario on the persuasive rhetorical strategies present in scientific articles guided the production of this material.
Abstract:
The need for effective and reliable quality control in products from pharmaceutical industries renders the analyses of their active ingredients and constituents of great importance. This study presents the theoretical basis of ¹H NMR for quantitative analyses and an example of the method validation according to Resolution RE Nº 899 by the Brazilian National Health Surveillance Agency (ANVISA), in which the compound paracetamol was the active ingredient. All evaluated parameters (selectivity, linearity, accuracy, repeatability and robustness) showed satisfactory results. It was concluded that a single NMR measurement provides structural and quantitative information of active components and excipients in the sample.
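The internal-standard relation that underlies quantitative ¹H NMR can be sketched as follows; the choice of maleic acid as standard and all numerical values are illustrative assumptions, not data from the study:

```python
# Quantitation by 1H NMR against an internal standard:
#   m_a = (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * m_std * P_std
# where I = integral, N = number of protons behind the signal,
# M = molar mass, m = weighed mass, P = purity of the standard.
# Standard and numbers below are illustrative assumptions.

M_a, N_a = 151.16, 2        # paracetamol: molar mass (g/mol), aromatic 2H signal
M_std, N_std = 116.07, 2    # maleic acid: molar mass (g/mol), vinylic 2H singlet
I_a, I_std = 1.000, 1.234   # measured integrals
m_std, P_std = 10.00, 0.999 # weighed standard mass (mg) and purity

m_a = (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * m_std * P_std
print(f"paracetamol in sample: {m_a:.2f} mg")
```

Because the integral per proton is the same for every compound in the tube, a single spectrum quantifies analyte and excipients alike, which is the point made in the abstract.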
Abstract:
Magnesium and its alloys have recently been used in the development of lightweight, biodegradable implant materials. However, the corrosion properties of magnesium limit its clinical application. The purpose of this study was to comprehensively evaluate the degradation behavior and biomechanical properties of magnesium materials treated with micro-arc oxidation (MAO), which is a new promising surface treatment for developing corrosion resistance in magnesium, and to provide a theoretical basis for its further optimization and clinical application. The degradation behavior of MAO-treated magnesium was studied systematically by immersion and electrochemical tests, and its biomechanical performance when exposed to simulated body fluids was evaluated by tensile tests. In addition, the cell toxicity of MAO-treated magnesium samples during the corrosion process was evaluated, and its biocompatibility was investigated under in vivo conditions. The results of this study showed that the oxide coating layers could elevate the corrosion potential of magnesium and reduce its degradation rate. In addition, the MAO-coated sample showed no cytotoxicity and more new bone was formed around it during in vivo degradation. MAO treatment could effectively enhance the corrosion resistance of the magnesium specimen and help to keep its original mechanical properties. The MAO-coated magnesium material had good cytocompatibility and biocompatibility. This technique has an advantage for developing novel implant materials and may potentially be used for future clinical applications.
Abstract:
Macro-analysis, regulation and method: an alternative to methodological holism and individualism for a historical and institutionalist macroeconomics. The paper examines the epistemological conditions that make "regulationist" macro-analysis a possible alternative to the traditional equilibrium approaches. It shows how these analyses make it possible to overcome the structure-agent dilemma through the concept of contextual rationality and a hol-individualist methodology which, combined with the notion of strong historicity, finds a broad theoretical basis in Bourdieu's sociology, in Braudel's work on economic history and in Lukács's ontology of social being. The paper also explains the historical origins of this approach and concludes with a synthesis of the method and of the steps necessary to carry out this type of macroeconomic analysis.
Abstract:
Abstract This article presents the increasing demands over the Brazilian Ministry of Foreign Affairs (Itamaraty) for opening its doors to other actors. This discussion will be followed by relevant theoretical and methodological analysis. We will defend the need to overcome problems related to: 1) conceptual vagueness about what the concept of participation means; 2) lack of clarity in the baseline to which comparisons are made; 3) fragile empirical basis; 4) limitations on the use of sources; and 5) how to understand the impact exerted by systemic forces.
Abstract:
INTRODUCTION: The evolution of virulence in host-parasite relationships has been the subject of several publications. In the case of HIV, some authors suggest that the evolution of HIV virulence correlates with the rate of acquisition of new sexual partners. In contrast, other authors argue that the level of HIV virulence is independent of the sexual activity of the host population. METHODS: A mathematical model for the study of the potential influence of human sexual behaviour on the evolution of the virulence of HIV is provided. RESULTS: The results indicated that, when the probability of acquisition of infection is a function both of the sexual activity and of the virulence level of HIV strains, the evolution of HIV virulence correlates positively with the rate of acquisition of new sexual partners. CONCLUSIONS: In the case of a host population with a low (high) rate of exchange of sexual partners, the evolution of HIV virulence is such that the less (more) virulent strain prevails.
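The qualitative conclusion can be illustrated with a toy strain-competition calculation; this is not the authors' model, only a sketch in which the per-partnership transmission probability saturates with partnership duration 1/c:

```python
from math import exp

# Toy illustration (not the authors' model): with per-partnership
# transmission probability q(v) = 1 - exp(-v/c), the basic reproduction
# number of a strain of virulence v (extra mortality) is
#   R0(v, c) = c * (1 - exp(-v / c)) / (mu + v)
# where c is the partner-change rate and mu the background removal rate.
# The strain with the larger R0 prevails.

mu = 0.02
v_low, v_high = 0.05, 0.50   # less and more virulent strains

def R0(v, c):
    return c * (1.0 - exp(-v / c)) / (mu + v)

for c in (0.5, 10.0):         # low vs high rate of partner exchange
    winner = "less" if R0(v_low, c) > R0(v_high, c) else "more"
    print(f"c = {c}: the {winner} virulent strain has the higher R0")
```

With few, long partnerships transmission saturates and low virulence wins; with many short partnerships higher infectivity pays despite shorter infectious life, reproducing the low/high dichotomy stated in the conclusion.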
Abstract:
Theory building is one of the most crucial challenges faced by basic, clinical and population research, which together form the scientific foundations of health practices in contemporary societies. The objective of this study is to propose a Unified Theory of Health-Disease as a conceptual tool for modeling health-disease-care in the light of complexity approaches. With this aim, the epistemological basis of theoretical work in the health field and concepts of complexity theory as they relate to health problems are discussed first. Secondly, the concepts of model-object, multi-planes of occurrence, modes of health and the disease-illness-sickness complex are introduced and integrated into a unified theoretical framework. Finally, in the light of recent epistemological developments, the concept of Health-Disease-Care Integrals is updated as a complex reference object fit for modeling health-related processes and phenomena.