911 results for Morrison, Maynard
Abstract:
We present the first measurement of photoproduction of J/ψ and of two-photon production of high-mass e⁺e⁻ pairs in electromagnetic (or ultra-peripheral) nucleus–nucleus interactions, using Au+Au data at √s_NN = 200 GeV. The events are tagged with forward neutrons emitted following Coulomb excitation of one or both Au* nuclei. The event sample consists of 28 events with m(e⁺e⁻) > 2 GeV/c² with zero like-sign background. The measured cross sections at midrapidity, dσ/dy (J/ψ + Xn, y = 0) = 76 ± 33 (stat) ± 11 (syst) μb and d²σ/dm dy (e⁺e⁻ + Xn, y = 0) = 86 ± 23 (stat) ± 16 (syst) μb/(GeV/c²) for m(e⁺e⁻) ∈ [2.0, 2.8] GeV/c², have been compared and found to be consistent with models for photoproduction of J/ψ and QED-based calculations of two-photon production of e⁺e⁻ pairs. (C) 2009 Elsevier B.V. All rights reserved.
Abstract:
PHENIX has measured the electron-positron pair mass spectrum from 0 to 8 GeV/c² in p+p collisions at √s = 200 GeV. The contributions from light meson decays to e⁺e⁻ pairs have been determined based on measurements of hadron production cross sections by PHENIX. Within the systematic uncertainty of ~20% they account for all e⁺e⁻ pairs in the mass region below ~1 GeV/c². The e⁺e⁻ pair yield remaining after subtracting these contributions is dominated by semileptonic decays of charmed hadrons correlated through flavor conservation. Using the spectral shape predicted by PYTHIA, we estimate the charm production cross section to be 544 ± 39 (stat) ± 142 (syst) ± 200 (model) μb, which is consistent with QCD calculations and measurements of single leptons by PHENIX. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Oxidized bases are common types of DNA modifications. Their accumulation in the genome is linked to aging and degenerative diseases. These modifications are commonly repaired by the base excision repair (BER) pathway. Oxoguanine DNA glycosylase (OGG1) initiates BER of oxidized purine bases. A small number of protein interactions have been identified for OGG1, while very few appear to have functional consequences. We report here that OGG1 interacts with the recombination protein RAD52 in vitro and in vivo. This interaction has reciprocal functional consequences as OGG1 inhibits RAD52 catalytic activities and RAD52 stimulates OGG1 incision activity, likely increasing its turnover rate. RAD52 colocalizes with OGG1 after oxidative stress to cultured cells, but not after the direct induction of double-strand breaks by ionizing radiation. Human cells depleted of RAD52 via small interfering RNA knockdown, and mouse cells lacking the protein via gene knockout showed increased sensitivity to oxidative stress. Moreover, cells depleted of RAD52 show higher accumulation of oxidized bases in their genome than cells with normal levels of RAD52. Our results indicate that RAD52 cooperates with OGG1 to repair oxidative DNA damage and enhances the cellular resistance to oxidative stress. Our observations suggest a coordinated action between these proteins that may be relevant when oxidative lesions positioned close to strand breaks impose a hindrance to RAD52 catalytic activities.
Abstract:
Mitochondrial transcription factor A (TFAM) is an essential component of mitochondrial nucleoids. TFAM plays an important role in mitochondrial transcription and replication. TFAM has been previously reported to inhibit nucleotide excision repair (NER) in vitro, but NER has not yet been detected in mitochondria, whereas base excision repair (BER) has been comprehensively characterized in these organelles. The BER proteins are associated with the inner membrane in mitochondria and thus with the mitochondrial nucleoid, where TFAM is also situated. However, a function for TFAM in BER has not yet been investigated. This study examines the role of TFAM in BER. In vitro studies with purified recombinant TFAM indicate that it preferentially binds to DNA containing 8-oxoguanines, but not to abasic sites, uracils, or a gap in the sequence. TFAM inhibited the in vitro incision activity of 8-oxoguanine DNA glycosylase (OGG1), uracil-DNA glycosylase (UDG), and apurinic endonuclease 1 (APE1), as well as nucleotide incorporation by DNA polymerase gamma (pol gamma). On the other hand, a DNA binding-defective TFAM mutant, L58A, showed less inhibition of BER in vitro. Characterization of TFAM knockdown (KD) cells revealed that their lysates had higher 8-oxoG incision activity without changes in αOGG1 protein levels. TFAM KD cells had mild resistance to menadione and increased damage accumulation in the mtDNA when compared to the control cells. In addition, we found that the tumor suppressor p53, which has been shown to interact with and alter the DNA binding activity of TFAM, alleviates TFAM-induced inhibition of BER proteins. Together, the results suggest that TFAM modulates BER in mitochondria by virtue of its DNA binding activity and protein interactions. Published by Elsevier B.V.
Abstract:
Cedric Gael Bryant, Lee Family Professor of English, reading Beloved: A Novel, by Toni Morrison (PS3563 O8749 B4 1987)
Abstract:
This essay aims to contrast the neoclassical approach to economics as a positive science with the Keynesian model and Kalecki's vision of economic dynamics, in which economics is treated from the perspective of a moral and normative science. To that end, we analyze the theoretical foundations of each model: its assumptions, fundamental laws, and main conclusions. Given the didactic purpose of the text, we take care to explain the antecedents, axioms, laws, and functional relations of each model, with special emphasis on those that arise from the critique of earlier postulates, since we accept that each model embodies its own values, assumptions, and methodology, whose critique is essential for the advancement of science. Neoclassical economics assumes rational agents, complete information, and immediate actions and outcomes. Its method of analysis is constrained optimization. The ordering principle of economic activity, both necessary and sufficient, is the rational behavior of agents. This model has its own political and ethical conception of economic relations, consistent with its assumptions, as becomes evident, for example, in its theory of income distribution. With the introduction of concepts such as historical time, the monetary character of production, liquidity preference, the subjective behavior of agents, the predominance of demand over supply, and expectations and uncertainty about the future, Keynes's macroeconomics manages to break with the earlier paradigm of automatic market adjustment through continuous and stable feedback capable of guaranteeing full-employment equilibrium. Although Keynesian analysis allowed the precise treatment of questions as important as the nature and causes of unemployment, the role of money and credit, the determination of interest rates, and the determinants of investment, it still lacked a theory of prices, of distribution, and of the business cycle, on which the work of M. Kalecki certainly seems to have advanced.
This author starts from a cultural and ideological context that allows him to address the nature of capitalism without constraints. His micro- and macroeconomic approach is integrated and rests on the assumption of imperfect competition. In the Keynesian universe markets may be in equilibrium, though not at full employment; for Kalecki, the business cycle is inevitable. In both cases markets are not perfect and do not tend naturally toward equilibrium, which justifies the regulatory action of the State, in accordance with its political stance and a pre-established code of values. One may suppose that each analytical model is conditioned by the prevailing set of values of its time, which does not invalidate the character of economics as a social science. For example, from the individualist perspective of market exchange, economics presents itself with the methodology of a pure science; taking social class relations into account, however, it is a moral science.
Abstract:
Situated in the field of the history of economic theory, this dissertation undertakes an integrated and critical assessment of the readings of the foundations of the economics of employment formulated by John Maynard Keynes carried out by four strands of contemporary economic theory.
Abstract:
This research aims to contribute to the understanding of the loyalty ties that guide the conduct of Brazilian public servants. After a literature review, interviews were conducted to collect and analyze the professionals' perceptions. The study focused on public security personnel of the Federal District (DF), such as forensic experts, police chiefs, and police officers, covering those working on the front line as well as heads and directors. The theoretical frameworks of Maynard-Moody and Musheno (2003) and of De Graaf (2010) on the loyalty of public servants were adapted, and we sought to gather narratives illustrating everyday situations in which decisions are made and discretion is exercised. In this sense, we sought to investigate the most representative instances of accountability, as well as possible tensions and conflicts, especially in a landscape in which governance and accountability are in the spotlight. Are regulations always strictly followed? Or is there a judgment of moral weighing encompassing other facets and interests? Once these questions were answered, the results obtained were compared with those from the reference studies. Finally, we also sought to outline topics that could be developed as follow-ups to this research.
Abstract:
The influence of 2 different levels of the inspired oxygen fraction (FiO2) on blood gas variables was evaluated in dogs with high intracranial pressure (ICP) during propofol anesthesia (induction followed by a continuous rate infusion [CRI] of 0.6 mg/kg/min) and intermittent positive pressure ventilation (IPPV). Eight adult mongrel dogs were anesthetized on 2 occasions, 21 d apart, and received oxygen at an FiO2 of 1.0 (G100) or 0.6 (G60) in a randomized crossover fashion. A fiberoptic catheter was implanted on the surface of the right cerebral cortex for assessment of the ICP. An increase in the ICP was induced by temporary ligation of the jugular vein 50 min after induction of anesthesia and immediately after baseline measurement of the ICP. Blood gas measurements were taken 20 min later and then at 15-min intervals for 1 h. Numerical data were submitted to Morrison's multivariate statistical methods. The ICP, the cerebral perfusion pressure, and the mean arterial pressure did not differ significantly between FiO2 levels or measurement times after jugular ligation. The only blood gas values that differed significantly (P < 0.05) were the arterial oxygen partial pressure, which was greater with G100 than with G60 throughout the procedure, and the venous hemoglobin saturation, which was greater with G100 than with G60 at M0. There were no significant differences between FiO2 levels or measurement times in the following blood gas variables: arterial carbon dioxide partial pressure, arterial hemoglobin saturation, base deficit, bicarbonate concentration, pH, venous oxygen partial pressure, venous carbon dioxide partial pressure, and the arterial-to-end-tidal carbon dioxide difference.
Abstract:
The idea of considering imprecision in probabilities is old, beginning with the work of George Boole, who in 1854 sought to reconcile classical logic, which allows the modeling of complete ignorance, with probabilities. In 1921, John Maynard Keynes, in his book, made explicit use of intervals to represent imprecision in probabilities. But it was only with Walley's work in 1991 that principles were established that a probability theory dealing with imprecision should respect. With the emergence of the theory of fuzzy sets by Lotfi Zadeh in 1965, another way of dealing with uncertainty and imprecision of concepts appeared. Several ways of bringing Zadeh's ideas into probability were soon proposed in order to handle imprecision, either in the events associated with the probabilities or in the probability values themselves. In particular, from 2003 onward James Buckley began to develop a probability theory in which the values of the probabilities are fuzzy numbers. This fuzzy probability follows principles analogous to Walley's imprecise probabilities. On the other hand, the use of real numbers between 0 and 1 as truth degrees, as originally proposed by Zadeh, has the drawback of using very precise values to deal with uncertainty (can one really distinguish an element that satisfies a property to degree 0.423 from one that satisfies it to degree 0.424?). This motivated the development of several extensions of fuzzy set theory that include some kind of imprecision. This work considers the extension proposed by Krassimir Atanassov in 1983, which adds an extra degree of uncertainty to model the hesitation at the moment of assigning the membership degree: one value indicates the degree to which the object belongs to the set, while the other indicates the degree to which it does not belong. In Zadeh's fuzzy set theory, this non-membership degree is, by default, the complement of the membership degree.
Thus, in this approach the non-membership degree is somewhat independent of the membership degree, and the difference between the non-membership degree and the complement of the membership degree reveals the hesitation at the moment of assigning a membership degree. This extension is today called Atanassov's intuitionistic fuzzy set theory. It is worth noting that the term "intuitionistic" here bears no relation to the same term as used in the context of intuitionistic logic. In this work, two proposals for interval probability are developed: the restricted interval probability and the unrestricted interval probability; two notions of fuzzy probability are also introduced: the constrained fuzzy probability and the unconstrained fuzzy probability; and, finally, two notions of intuitionistic fuzzy probability are introduced: the restricted intuitionistic fuzzy probability and the unrestricted intuitionistic fuzzy probability.
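The membership, non-membership, and hesitation degrees described in this abstract can be sketched in a few lines of Python. This is only a minimal illustration of the standard Atanassov definition (μ + ν ≤ 1, hesitation π = 1 − μ − ν), not code from the dissertation; the class name `IFSDegree` is our own invention:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFSDegree:
    """An Atanassov intuitionistic fuzzy assessment of one element:
    mu = membership degree, nu = non-membership degree, with mu + nu <= 1."""
    mu: float
    nu: float

    def __post_init__(self):
        if not (0.0 <= self.mu <= 1.0 and 0.0 <= self.nu <= 1.0):
            raise ValueError("degrees must lie in [0, 1]")
        if self.mu + self.nu > 1.0:
            raise ValueError("mu + nu must not exceed 1")

    @property
    def hesitation(self) -> float:
        # pi = 1 - mu - nu: the uncertainty left over when assigning degrees
        return 1.0 - self.mu - self.nu

# An ordinary (Zadeh) fuzzy degree is the special case nu = 1 - mu:
zadeh = IFSDegree(mu=0.7, nu=0.3)      # hesitation 0.0
atanassov = IFSDegree(mu=0.6, nu=0.2)  # hesitation 0.2
```

When ν is forced to be 1 − μ the hesitation vanishes, recovering Zadeh's theory; any slack between ν and 1 − μ is precisely the extra degree of uncertainty Atanassov's extension models.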
Abstract:
This text aims to highlight an aspect that has not been treated with due depth in the literature on the formalization of John Maynard Keynes's General Theory of Employment, Interest and Money (1936). More precisely, the text draws attention to the formalization strategy adopted by David G. Champernowne in his article entitled "Unemployment, Basic and Monetary: the classical analysis and the keynesian", published in 1935-36 in the Review of Economic Studies. We call attention to the fact that he distinguishes classical theory from Keynes's theory not only by the assumptions adopted by each theory, but mainly by the construction of subsystems from a general system, with distinct recursive characteristics (causal relations). The explanations in prose, the algebraic description of the behavioral functions and equilibrium conditions, the illustration by means of diagrams, and the choice of specific sets of variables to represent each of the theories and their different versions are aspects of Champernowne's article that deserve closer analysis.
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)