995 results for Classical-quantum interfaces
Abstract:
The computation of the optical conductivity of strained and deformed graphene is discussed within the framework of quantum field theory in curved spaces. The analytical solutions of the Dirac equation in an arbitrary static background geometry for one-dimensional periodic deformations are computed, together with the corresponding Dirac propagator. Analytical expressions are given for the optical conductivity of strained and deformed graphene associated with both intraband and interband transitions. The special case of small deformations is discussed and the result compared to the prediction of the tight-binding model.
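As a generic reminder of the framework invoked above (not a formula quoted from the paper), the massless Dirac equation in a static curved background can be written as

\[
i\,\gamma^{a}\, e_{a}^{\ \mu}(x)\,\bigl(\partial_{\mu} + \Omega_{\mu}(x)\bigr)\,\psi(x) \;=\; 0 ,
\]

where the frame field \(e_{a}^{\ \mu}\) and the spin connection \(\Omega_{\mu}\) encode the strain-induced geometry; conventions for factors of \(v_F\) and \(\hbar\) vary between references. The optical conductivity is then obtained from the current-current correlator built with the corresponding Dirac propagator (a Kubo-type formula), with a normalization that depends on the conventions adopted.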
Abstract:
OBJECTIVE: To identify in the literature elements that may explain a possible association between attention-deficit/hyperactivity disorder (ADHD) and epilepsy, and to provide guidance on the clinical management of patients who present both disorders. METHODS: A review of the literature of the last 10 years was carried out in the MedLine and Lilacs databases using the combination of the descriptors "attention deficit hyperactivity disorder", "ADHD" and "epilepsy". RESULTS: ADHD symptoms are frequent in idiopathic epileptic syndromes. Several factors may contribute to the coexistence of these disorders: 1) the possibility of a shared genetic predisposition; 2) the involvement of the neurotransmitters noradrenaline and dopamine both in ADHD and in the modulation of neuronal excitability; 3) structural brain abnormalities observed in epileptic patients with ADHD; 4) the influence of the chronic effects of seizures and of interictal epileptiform discharges on attention; 5) adverse effects of antiepileptic drugs on cognition. CONCLUSIONS: Current evidence indicates that epileptic seizures and ADHD may share common neurobiological bases. Studies evaluating dysfunctions in the signaling pathways of brain catecholamines and the role of interictal epileptiform discharges in the generation of symptoms are essential for investigating these mechanisms. Psychostimulant drugs are safe and effective for the treatment of ADHD in most patients with epilepsy.
Abstract:
The bottom of the Red Sea harbors over 25 deep hypersaline anoxic basins that are geochemically distinct and characterized by vertical gradients of extreme physicochemical conditions. Because of strong changes in density, particulate and microbial debris become entrapped in the brine-seawater interface (BSI), resulting in increased dissolved organic carbon, reduced dissolved oxygen toward the brines, and enhanced microbial activities in the BSI. These features, coupled with the deep-sea prevalence of ammonia-oxidizing archaea (AOA) in the global ocean, make the BSI a suitable environment for studying the osmotic adaptations and ecology of these important players in the marine nitrogen cycle. Using phylogenomic-based approaches, we show that the local archaeal community of five different BSI habitats (with up to 18.2% salinity) is composed mostly of a single, highly abundant Nitrosopumilus-like phylotype that is phylogenetically distinct from the bathypelagic thaumarchaea; ammonia-oxidizing bacteria were absent. The composite genome of this novel Nitrosopumilus-like subpopulation (RSA3), co-assembled from multiple single-cell amplified genomes (SAGs) from one such BSI habitat, further revealed that it shares ~54% of its predicted genomic inventory with sequenced Nitrosopumilus species. RSA3 also carries several, albeit variable, gene sets that further illuminate the phylogenetic diversity and metabolic plasticity of this genus. Specifically, it encodes a putative proline-glutamate 'switch' with a potential role in osmotolerance and an indirect impact on carbon and energy flows. Metagenomic fragment recruitment analyses against the composite RSA3 genome, Nitrosopumilus maritimus, and SAGs of mesopelagic thaumarchaea also reiterate the divergence of the BSI genotypes from other AOA.
Abstract:
Analog single event transients (ASETs) are produced by the interaction of a heavy ion or a high-energy proton with a sensitive device of an analog circuit. The interaction of the ion with a bipolar or MOS field-effect transistor induces electron-hole pairs that cause spikes, which can propagate to the output of the analog component and produce transients capable of inducing failures at the system level. The most serious problems due to this phenomenon occur in the space environment, which is very rich in heavy ions; typical cases are the on-board computers of satellites and other spacecraft. However, owing to the continuous shrinking of transistor dimensions (which brings an increase in sensitivity), this phenomenon has begun to be observed at sea level, caused mainly by the impact of atmospheric neutrons. These effects can cause severe problems for computing systems with analog interfaces from which they obtain data for processing, and they have become one of the most serious issues faced by designers of highly integrated systems. Typical cases are Systems-on-Chip that include high-performance processing modules together with analog interfaces.

The general objective of the project is to study the susceptibility of computing systems to ASETs in their analog sections, proposing strategies for error mitigation. The specific objectives are: to propose new ASET models based on device-level simulations solved by the finite element method; to use these models to identify the sections most likely to produce errors and therefore the best candidates for radiation-hardening techniques; to use these models to study the nature of the errors produced in data processing systems; to propose novel solutions for mitigating these effects within the analog circuits themselves, preventing their propagation to the digital sections; and to propose solutions for mitigating the effects at the system level.

To carry out the project, a bottom-up research procedure is planned, starting with physical-level descriptions and then raising the level of abstraction at which the circuit is modeled. Physical modeling of the MOS devices, solved by the finite element method, is proposed. Charge injection in the sensitive regions of the models will make it possible to determine the profiles of the current pulses that must be injected at the circuit level to emulate these effects. These procedures will be carried out for the different building blocks of analog interfaces, proposing error-mitigation strategies at different levels.

The expected results of this project include hardware for error detection and tolerance to this type of event that increases the reliability of information processing systems, as well as new data on radiation effects in semiconductors, new transient fault models that allow these events to be simulated at the circuit level, and the identification of sensitive regions of typical analog interfaces that must be radiation-hardened.
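The circuit-level charge-injection step described above is commonly emulated with a double-exponential current pulse. The sketch below is not taken from the project; it only illustrates that standard model, and the collected charge and time constants (q_coll, tau_rise, tau_fall) are illustrative placeholder values.

# Minimal sketch of the double-exponential current pulse commonly used to
# inject analog single-event transients (ASETs) in circuit-level simulation.
# q_coll, tau_rise and tau_fall are illustrative values, not project data.
import numpy as np

def aset_pulse(t, q_coll=0.5e-12, tau_rise=50e-12, tau_fall=200e-12):
    """Current waveform whose time integral equals the collected charge q_coll."""
    return q_coll / (tau_fall - tau_rise) * (np.exp(-t / tau_fall) - np.exp(-t / tau_rise))

t = np.linspace(0.0, 2e-9, 2001)    # 0 to 2 ns
i_t = aset_pulse(t)                 # current in amperes
print(f"peak ~ {i_t.max() * 1e3:.2f} mA, charge ~ {np.trapz(i_t, t) * 1e12:.2f} pC")

The resulting waveform would then be injected, for example as a piecewise-linear current source, at the node identified as sensitive by the device-level simulation.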
Abstract:
The procedure of reversing the collective dynamics (the hasty Loschmidt daemon) by means of a radio-frequency pulse makes it possible to generate a Loschmidt echo, that is, the refocusing of a localized excitation. Alternatively, in acoustics it is possible to implement a time-reversal mirror, which consists of the progressive injection of a weak ultrasonic excitation at the periphery of a system in order to build an excitation that propagates "backwards". Thus, we can state that it is possible to reverse and control the dynamics. However, a detailed understanding of the mechanisms governing these procedures is still lacking. This project seeks to answer the questions that make this understanding possible.
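As a point of reference (the standard textbook definition, not a formula quoted from this project), the Loschmidt echo quantifies the overlap between a state evolved forward with a Hamiltonian \(\hat H_0\) and evolved backward with a perturbed Hamiltonian \(\hat H_0 + \hat\Sigma\):

\[
M(t) \;=\; \bigl| \langle \psi_0 \,|\, e^{\,i(\hat H_0+\hat\Sigma)t/\hbar}\, e^{-i \hat H_0 t/\hbar} \,|\, \psi_0 \rangle \bigr|^{2},
\]

so that \(M(t)=1\) for a perfect reversal, while its decay measures how imperfections in the reversal procedure degrade the refocused excitation.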
Abstract:
The classical central limit theorem states the uniform convergence of the distribution functions of the standardized sums of independent and identically distributed square-integrable real-valued random variables to the standard normal distribution function. While first versions of the central limit theorem are already due to de Moivre (1730) and Laplace (1812), a systematic study of this topic started at the beginning of the last century with the fundamental work of Lyapunov (1900, 1901). Meanwhile, extensions of the central limit theorem are available for a multitude of settings. This includes, e.g., Banach space valued random variables as well as substantial relaxations of the assumptions of independence and identical distributions. Furthermore, explicit error bounds have been established and asymptotic expansions are employed to obtain better approximations. Classical error estimates like the famous bound of Berry and Esseen are stated in terms of absolute moments of the random summands and therefore do not reflect a potential closeness of the distributions of the single random summands to a normal distribution. Non-classical approaches take this issue into account by providing error estimates based on, e.g., pseudomoments. The latter field of investigation was initiated by work of Zolotarev in the 1960s and is still in its infancy compared to the development of the classical theory. For example, non-classical error bounds for asymptotic expansions seem not to be available up to now ...
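For concreteness, the two classical statements referred to above can be recalled in their standard formulations; the non-classical pseudomoment bounds discussed in the text are not reproduced here. For i.i.d. random variables \(X_1, X_2, \dots\) with mean \(\mu\), variance \(\sigma^2 \in (0,\infty)\), \(\mathbb{E}|X_1|^3 < \infty\) and \(S_n = X_1 + \dots + X_n\):

\[
\sup_{x\in\mathbb{R}} \left| P\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \;\xrightarrow[n\to\infty]{}\; 0
\qquad\text{(central limit theorem)},
\]
\[
\sup_{x\in\mathbb{R}} \left| P\!\left( \frac{S_n - n\mu}{\sigma\sqrt{n}} \le x \right) - \Phi(x) \right| \;\le\; \frac{C\,\mathbb{E}\lvert X_1-\mu\rvert^{3}}{\sigma^{3}\sqrt{n}}
\qquad\text{(Berry-Esseen bound)},
\]

where \(\Phi\) is the standard normal distribution function and \(C\) is an absolute constant (slightly below 0.5 by current estimates). The bound depends only on the third absolute moment, which is why it cannot reflect closeness of the individual summand distributions to the normal law.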
Abstract:
Magdeburg, Univ., Faculty of Natural Sciences, dissertation, 2009
Abstract:
Magdeburg, Univ., Faculty of Natural Sciences, dissertation, 2010
Abstract:
Magdeburg, Univ., Faculty of Computer Science, dissertation, 2015
Abstract:
The last 20 years have seen a significant evolution in the literature on horizontal inequity (HI) and have generated two major and "rival" methodological strands, namely, classical HI and reranking. We propose in this paper a class of ethically flexible tools that integrate these two strands. This is achieved using a measure of inequality that merges the well-known Gini coefficient and Atkinson indices and that allows a decomposition of the total redistributive effect of taxes and transfers into a vertical equity effect and a loss of redistribution due to either classical HI or reranking. An inequality-change approach and a money-metric cost-of-inequality approach are developed. The latter approach makes aggregate classical HI decomposable across groups. As in recent work, equals are identified through a nonparametric estimation of the joint density of gross and net incomes. An illustration using Canadian data from 1981 to 1994 shows a substantial, increasing, and robust erosion of redistribution attributable both to classical HI and to reranking, but does not reveal which of reranking or classical HI is more important, since this requires a judgement that is fundamentally normative in nature.
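For reference, the two building blocks named in the abstract are the standard Gini coefficient and Atkinson indices, recalled below for incomes \(x_1,\dots,x_n\) with mean \(\bar{x}\); the paper's merged, ethically flexible index and its decomposition are not reproduced here.

\[
G \;=\; \frac{1}{2 n^{2} \bar{x}} \sum_{i=1}^{n}\sum_{j=1}^{n} \lvert x_i - x_j \rvert ,
\qquad
A_{\varepsilon} \;=\; 1 - \frac{1}{\bar{x}} \left( \frac{1}{n} \sum_{i=1}^{n} x_i^{\,1-\varepsilon} \right)^{\!\frac{1}{1-\varepsilon}}
\quad (\varepsilon \ge 0,\ \varepsilon \neq 1),
\]

where larger values of the inequality-aversion parameter \(\varepsilon\) place more weight on the lower end of the distribution (for \(\varepsilon = 1\), the Atkinson index is defined through the geometric mean).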
Abstract:
The interfaces between the intrapsychic, interactional, and intergenerational domains are a new frontier. As a pilot, we exposed ourselves to a complex but controllable situation as viewed by people whose main interest lies in one of the three interfaces; we also fully integrated the subjects into the team, to learn about their subjective perspectives and to provide them with an enriching experience. We started with a brief "triadification" sequence (i.e., moving from a "two plus one" to a "three together" family organization). Considering this sequence as representing, at a micro level, many larger family transitions, we proceeded with a microanalytic interview, a psychodynamic investigation, and a family interview. As expected, larger patterns of correspondences are emerging. Central questions under debate are: What are the most appropriate units at each level of description, and what are the articulations between these levels? What is the status of "triadification"?
Abstract:
Three exceptional modular invariants of SU(4) exist at levels 4, 6 and 8. They can be obtained from appropriate conformal embeddings, and the corresponding graphs have self-fusion. From these embeddings, or from their associated modular invariants, we determine the algebras of quantum symmetries, obtain their generators, and, as a by-product, recover the known graphs E4, E6 and E8 describing exceptional quantum subgroups of type SU(4). We also obtain characteristic numbers (quantum cardinalities, dimensions) for each of them and for their associated quantum groupoids.
Abstract:
Only a few cases of classical phenylketonuria (PKU) in premature infants have been reported. Treatment of these patients is challenging due to the lack of a phenylalanine-free amino acid solution for parenteral infusion. The boy was born at 27 weeks of gestation with a weight of 1000 g (P10). He received parenteral nutrition with a protein intake of 3 g/kg/day. On day 7 he was diagnosed with classical PKU (genotype IVS10-11G>A/IVS12+1G>A) due to a highly elevated phenylalanine (Phe) level in newborn screening (2800 micromol/L). His maximum plasma Phe level reached 3696 micromol/L. Phe intake was stopped for 4 days. During this time the boy received intravenous glucose and lipids as well as small amounts of Phe-free formula by a nasogastric tube. Due to a deficit of essential amino acids and insufficient growth, parenteral nutrition rich in branched-chain amino acids and relatively poor in Phe was added, in order to promote protein synthesis without Phe overload. Under this regimen, plasma Phe levels normalized on day 19, when intake of natural protein was started. The boy now has a corrected age of 2 years. He shows normal growth parameters and psychomotor development. Despite a long period of highly elevated Phe levels in the postnatal period, our patient shows good psychomotor development. The management of premature infants with PKU depends on the child's tolerance to enteral nutrition. It demands intensive follow-up by an experienced team and a dedicated dietician. Appropriate Phe-free parenteral nutrition would be necessary, especially in case of gastrointestinal complications of prematurity.