Abstract:
Studies of alcoholism etiology often focus on genetic or psychosocial approaches, but not both. Greater understanding of the etiology of alcohol, tobacco and other addictions will come from integration of these research traditions. A research approach is outlined to test three models for the etiology of addictions (behavioral undercontrol, pharmacologic vulnerability, negative affect regulation), addressing key questions including (i) mediators of genetic effects, (ii) genotype-environment correlation effects, (iii) genotype × environment interaction effects, (iv) the developmental unfolding of genetic and environmental effects, (v) subtyping, including identification of distinct trajectories of substance involvement, (vi) identification of individual genes that contribute to risk, and (vii) the consequences of excessive use. By using coordinated research designs, including prospective assessment of adolescent twins and their siblings and parents; of adult substance dependent and control twins and their MZ and DZ cotwins, the spouses of these pairs, and their adolescent offspring; and of regular families; by selecting for gene-mapping approaches sibships screened for extreme concordance or discordance on quantitative indices of substance use; and by using experimental (drug challenge) as well as survey approaches, a number of key questions concerning addiction etiology can be addressed. We discuss complementary strengths and weaknesses of different sampling strategies, as well as methods to implement such an integrated approach, illustrated for the study of alcoholism etiology. A coordinated program of twin and family studies will allow a comprehensive dissection of the interplay of genetic and environmental risk factors in the etiology of alcoholism and other addictions.
Abstract:
This dissertation presents a proposal for a system capable of bridging the gap between legislative documents in PDF format and legislative documents in an open format. The main objective is to map the knowledge present in these documents so as to represent the collection as linked information. The system is composed of several components responsible for executing three proposed phases: data extraction, knowledge organization, and information access. The first phase proposes an approach to extracting structure, text and entities from PDF documents so as to obtain the desired information, according to the user's parameterization. This approach uses two different extraction methods, corresponding to the two phases of document processing: document analysis and document understanding. The criterion used to group text objects is the font used in the text objects, as defined in the PDF source code (Content Stream). The approach is divided into three parts: document analysis, document understanding, and conjunction. The first part deals with the extraction of text segments, adopting a geometric approach; the result is a list of the document's text lines. The second part groups the text objects according to the stipulated criterion, producing an XML document with the result of that extraction. The third and final part joins the results of the two previous parts and applies structural and logical rules in order to obtain the final XML document. The second phase proposes an ontology in the legal domain capable of organizing the information extracted by the extraction process of the first phase; it is also responsible for indexing the text of the documents. The proposed ontology has three characteristics: it is small, interoperable, and shareable.
The first characteristic relates to the fact that the ontology does not focus on a detailed description of the concepts involved, proposing instead a more abstract description of the entities present. The second is motivated by the need for interoperability with other legal-domain ontologies, as well as with the standard ontologies in general use. The third is defined so that knowledge expressed according to the proposed ontology is independent of factors such as country, language, or jurisdiction. The third phase addresses the question of access to and reuse of the knowledge by users external to the system, through the development of a Web Service. This component provides access to the information by exposing a set of resources to external actors who wish to access it. The Web Service follows the REST architecture. An Android mobile application was also developed to provide visualizations of the requested information. The final result is a system capable of transforming collections of documents in PDF format into collections in an open format, allowing access and reuse by other users. This system directly addresses the needs of the open-data community and of governments, which hold many collections of this kind without the ability to reason over the information they contain and to turn it into data that citizens and professionals can visualize and use.
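The grouping criterion described above (geometric merging of text runs into lines, then grouping lines by font into XML blocks) can be sketched as follows. This is a minimal illustration with hypothetical in-memory text objects, not the dissertation's actual implementation; the data structure and tag names are assumptions.

```python
import xml.etree.ElementTree as ET
from itertools import groupby

# Hypothetical text objects as they might be read from a PDF Content
# Stream: (font name, baseline y-coordinate, text run).
objects = [
    ("F1-Bold", 700, "Article 1"),
    ("F2", 680, "The provisions of this law"),
    ("F2", 680, " apply to all citizens."),
    ("F1-Bold", 640, "Article 2"),
    ("F2", 620, "This law enters into force immediately."),
]

def group_lines(objs, tol=2):
    """Geometric step: consecutive runs in the same font whose baselines
    lie within `tol` units of each other are merged into one line."""
    lines = []
    for obj in sorted(objs, key=lambda o: -o[1]):
        if lines and lines[-1][0] == obj[0] and abs(lines[-1][1] - obj[1]) <= tol:
            lines[-1] = (obj[0], obj[1], lines[-1][2] + obj[2])
        else:
            lines.append(obj)
    return lines

def to_xml(lines):
    """Font step: consecutive lines sharing a font become one <block>,
    mirroring the XML produced by the extraction phase."""
    root = ET.Element("document")
    for font, runs in groupby(lines, key=lambda l: l[0]):
        block = ET.SubElement(root, "block", font=font)
        for _, _, text in runs:
            ET.SubElement(block, "line").text = text
    return root

root = to_xml(group_lines(objects))
print(ET.tostring(root, encoding="unicode"))
```

The split between `group_lines` and `to_xml` mirrors the document-analysis and document-understanding phases; a conjunction step applying structural rules would operate on the resulting tree.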
Abstract:
The purpose of this paper is twofold. First, we construct a DSGE model which spells out explicitly the instrumentation of monetary policy. The interest rate is determined every period depending on the supply and demand for reserves, which in turn are affected by fundamental shocks: unforeseeable changes in cash withdrawal, autonomous factors, technology and government spending. Unexpected changes in the monetary conditions of the economy are interpreted as monetary shocks. We show that these monetary shocks have the usual effects on economic activity without the need to impose additional frictions such as limited participation in asset markets or sticky prices. Second, we show that this view of monetary policy may have important consequences for empirical research. In the model, the contemporaneous correlations between interest rates, prices and output are due to the simultaneous effect of all fundamental shocks. We provide an example where these contemporaneous correlations may be misinterpreted as a Taylor rule. In addition, we use the sign of the impact responses of all shocks on output, prices and interest rates derived from the model to identify the sources of shocks in the data.
Abstract:
Although the sensitivity to light of thioridazine and its metabolites has been described, the problem does not seem to be widely acknowledged. Indeed, a survey of the literature shows that assays of these compounds under light-protected conditions have been performed in only a few of the numerous analytical studies on this drug. In the present study, thioridazine, its metabolites, and 18 other neuroleptics were tested for their sensitivity to light under conditions used for their analysis. The results show that light significantly affects the analysis of thioridazine and its metabolites. It readily causes the racemization of the isomeric pairs of thioridazine 5-sulphoxide and greatly decreases the concentration of thioridazine. This sensitivity to light varied with the medium used (most sensitive in acidic media) and also with the molecule (in order of decreasing sensitivity: thioridazine > mesoridazine > sulforidazine). Degradation in neutral or basic media was slow, with the exception of mesoridazine in a neutral medium. Twelve other phenothiazines tested, as well as chlorprothixene, a thioxanthene drug, were found to be sensitive to light in acidic media, whereas flupenthixol and zuclopenthixol (two thioxanthenes), clozapine, fluperlapine, and haloperidol (a butyrophenone) did not seem to be affected. In addition to being sensitive to light, some compounds may be readily oxidized by peroxide-containing solvents.
Abstract:
Two granitic plutons, the Tso Morari gneiss and the Rupshu metagranite, crop out in the Tso Morari area. The Polokongka La granite, classically interpreted as a young intrusion in the Tso Morari gneiss, has been recognized as the undeformed facies of the latter. Conventional isotope dilution U-Pb zircon dating on single-grain and small multi-grain fractions yielded magmatic ages of 479 ± 2 Ma for the Tso Morari gneiss and the Polokongka La granite, and 482.5 ± 1 Ma for the Rupshu granite. There is a great difference in zircon morphology between the Tso Morari gneiss (peraluminous type) and the Rupshu granite (alkaline type). This difference is confirmed by whole-rock chemistry. The Tso Morari gneiss is a typical deformed S-type granite, resulting from crustal anatexis. On the other hand, the Rupshu granite is an essentially metaluminous alkali-calcic intrusion derived from a different source material. Data compilation from other Himalayan Cambro-Ordovician granites reveals huge and widespread magmatic activity all along and beyond the northern Indian plate between 570 and 450 Ma, with a peak at 500-480 Ma. A major, continental-scale tectonic event is required to generate such a large magmatic belt; it has been tentatively compared to the Variscan post-orogenic extensional regime of Western Europe, as a late evolution stage of a Pan-African orogenic event.
Abstract:
The brain requires a constant and substantial energy supply to maintain its main functions. For decades, it was assumed that glucose was the major if not the only significant source of energy for neurons. This view was supported by the expression of specific facilitative glucose transporters on cerebral blood vessels, as well as neurons. Despite the fact that glucose remains a key energetic substrate for the brain, growing evidence suggests a different scenario. Thus astrocytes, a major type of glial cells that express their own glucose transporter, play a critical role in coupling synaptic activity with glucose utilization. It was shown that glutamatergic activity triggers an enhancement of aerobic glycolysis in this cell type. As a result, lactate is provided to neurons as an additional energy substrate. Indeed, lactate has proven to be a preferential energy substrate for neurons under various conditions. A family of proton-linked carriers known as monocarboxylate transporters has been described and specific members have been found to be expressed by endothelial cells, astrocytes and neurons. Moreover, these transporters are subject to fine regulation of their expression levels and localization, notably in neurons, which suggests that lactate supply could be adjusted as a function of their level of activity. Considering the importance of energetics in the aetiology of several neurodegenerative diseases, a better understanding of its cellular and molecular underpinnings might have important implications for the future development of neuroprotective strategies.
Abstract:
This thesis reviews the role of nuclear and conventional power plants in the future energy system. The review is done by utilizing freely accessible publications, in addition to generating load duration and ramping curves for the Nordic energy system. As the aim of the future energy system is to reduce GHG emissions and avoid further global warming, the need for flexible power generation increases with the increased share of intermittent renewables. The goal of this thesis is to offer an extensive understanding of the possibilities and restrictions that nuclear power and conventional power plants have regarding flexible and sustainable generation. As a conclusion, nuclear power is the only technology that is able to provide large-scale GHG-free power output variations with good ramping values. Most of the currently operating plants are able to take part in load following, as this capability is already required to be included in the plant design. The load duration and ramping curves produced show that nuclear power is able to cover most of the annual generation variation and ramping needs in the Nordic energy system. Of the conventional power generation methods, only biomass combustion can be considered GHG-free, because biomass is considered carbon neutral. Biomass combusted in CFB boilers has good load-following capability, with good ramping and turndown ratios. All the other conventional power generation technologies generate GHG emissions, and therefore the use of these technologies should be reduced.
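The two analytical tools named above, load duration curves and ramping curves, are simple transformations of a demand time series. The following sketch uses a short hypothetical hourly demand series in MW; a real analysis would use the Nordic TSOs' published load data.

```python
# Hypothetical hourly demand series (MW), for illustration only.
demand = [420, 455, 480, 510, 530, 525, 490, 460, 430, 410, 400, 415]

# Load duration curve: the same values sorted in descending order, so
# the x-axis reads as "number of hours the load exceeded this level".
duration_curve = sorted(demand, reverse=True)

# Ramping: difference between consecutive hours; the extremes indicate
# how fast flexible plants must be able to change their output.
ramps = [b - a for a, b in zip(demand, demand[1:])]

print("peak load:", duration_curve[0], "MW")
print("max upward ramp:", max(ramps), "MW/h")
print("max downward ramp:", min(ramps), "MW/h")
```

Plotting `duration_curve` over its index gives the familiar monotonically decreasing curve; the spread of `ramps` quantifies the flexibility requirement the thesis discusses.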
Abstract:
Hunting foxes with hounds has been a countryside pursuit in Britain since the 17th Century, but its effect nationally on habitat management is little understood by the general public. A survey questionnaire was distributed to 163 mounted fox hunts of England and Wales to quantify their management practices in woodland and other habitat. Ninety-two hunts (56%), covering 75,514 km², returned details on woodland management motivated by the improvement of their sport. The management details were verified via on-site visits for a sample of 200 woodlands. Following verification, the area of woodlands containing the management was conservatively estimated at 24,053 (± 2,241) ha, comprising 5.9% of woodland area within the whole of the area hunted by the 92 hunts. Management techniques included: tree planting, coppicing, felling, ride and perimeter management. A case study in five hunt countries in southern England examined, through the use of botanical survey and butterfly counts, the consequences of the hunt management on woodland ground flora and butterflies. Managed areas had, within the last 5 years, been coppiced and rides had been cleared. Vegetation cover in managed and unmanaged sites averaged 86% and 64%, respectively, and managed areas held on average 4 more plant species and a higher plant diversity than unmanaged areas (Shannon index of diversity: 2.25 vs. 1.95). Both the average number of butterfly species (2.2 vs. 0.3) and individuals counted (4.6 vs. 0.3) were higher in the managed than unmanaged sites.
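The Shannon index values reported above (2.25 vs. 1.95) come from a standard diversity formula, H' = -Σ pᵢ ln pᵢ over species proportions. A minimal sketch, using invented quadrat counts purely to illustrate the calculation (not the study's data):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species
    proportions; higher values mean more species, more evenly spread."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical plant counts per species: a managed (coppiced) plot with
# more, more evenly distributed species vs. a dominated unmanaged plot.
managed = [12, 10, 9, 8, 7, 6, 5, 4, 3, 2]
unmanaged = [30, 12, 5, 3, 2, 1]

print(round(shannon_index(managed), 2))
print(round(shannon_index(unmanaged), 2))
```

Note that H' equals ln(S) when all S species are equally abundant, which is why both richness and evenness raise the index.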
Abstract:
Humans’ unique cognitive abilities are usually attributed to a greatly expanded neocortex, which has been described as “the crowning achievement of evolution and the biological substrate of human mental prowess” [1]. The human cerebellum, however, contains four times more neurons than the neocortex [2] and is attracting increasing attention for its wide range of cognitive functions. Using a method for detecting evolutionary rate changes along the branches of phylogenetic trees, we show that the cerebellum underwent rapid size increase throughout the evolution of apes, including humans, expanding significantly faster than predicted by the change in neocortex size. As a result, humans and other apes deviated significantly from the general evolutionary trend for neocortex and cerebellum to change in tandem, having significantly larger cerebella relative to neocortex size than other anthropoid primates. These results suggest that cerebellar specialization was a far more important component of human brain evolution than hitherto recognized and that technical intelligence was likely to have been at least as important as social intelligence in human cognitive evolution. Given the role of the cerebellum in sensory-motor control and in learning complex action sequences, cerebellar specialization is likely to have underpinned the evolution of humans’ advanced technological capacities, which in turn may have been a preadaptation for language.
Abstract:
Video exposure monitoring (VEM) is a group of methods used in occupational hygiene studies. The method is based on the combined use of video recordings and measurements taken with real-time monitoring instruments. A commonly used name for VEM is PIMEX. Since PIMEX was invented in the mid-1980s, the method has been implemented and developed in a number of countries. With the aim of giving an updated picture of how VEM methods are used, and of investigating needs for further development, a number of workshops have been organised in Finland, the UK, the Netherlands, Germany and Austria. Field studies have also been made to examine to what extent the PIMEX method can improve workers' motivation to take an active part in actions aimed at workplace improvements. The results from the workshops illustrate clearly that there is an impressive amount of experience and ideas for the use of VEM within the network of groups participating in the workshops. The sharing of these experiences between the groups, as well as their dissemination to wider audiences, is, however, limited. The field studies made together with a number of welders indicate that their motivation to take part in workplace improvements increased after the PIMEX intervention. The results are, however, not fully conclusive, and further studies focusing on motivation are called for. It is recommended that strategies for VEM, covering interventions in single workplaces as well as exposure categorisation and the production of training material, be developed further. It is also recommended to conduct a research project to evaluate the effects of the use of VEM, and to disseminate knowledge about the potential of VEM to occupational hygiene experts and others who may benefit from its use.
Abstract:
CD40 ligand (CD40L) deficiency, or X-linked hyper-IgM syndrome (X-HIGM), is a well-described primary immunodeficiency in which Pneumocystis jiroveci pneumonia is a common clinical feature. We have identified an unusually high incidence of fungal infections, as well as other infections not previously described, in a cohort of 11 X-HIGM patients from nine unrelated Brazilian families. Among these, we describe the first case of paracoccidioidomycosis (PCM) in X-HIGM. The molecular genetic analysis of CD40L was performed by gene sequencing and evaluation of CD40L protein expression. Nine of these 11 patients (82%) had fungal infections. These included fungal species common to CD40L deficiency (P. jiroveci and Candida albicans) as well as Paracoccidioides brasiliensis. One patient presented with PCM at age 11 years and is now doing well at 18 years of age. Additionally, one patient presented with a simultaneous infection with Klebsiella and Acinetobacter, and one with condyloma caused by human papilloma virus. Molecular analysis revealed four previously described CD40L mutations, two novel missense mutations (c.433 T>G and c.476 G>C) resulting in the absence of CD40L protein expression by activated CD4(+) cells, and one novel insertion (c.484_485insAA) within the TNFH domain leading to a frameshift and premature stop codon. These observations demonstrate that the susceptibility to fungal infections in X-HIGM extends beyond those typically associated with X-HIGM (P. jiroveci and C. albicans) and that these patients need to be monitored for those pathogens.
Abstract:
The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of the external body findings and of injury-inflicting instruments. The correlation of the injuries of the body to the injury-inflicting object and the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, of the involved vehicles and injury-inflicting tools, as well as the analysis of the acquired data. The body surface and the accident vehicles with their damages were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data into 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damages, geometric determination of the impact situation, and evaluation of further findings of the accident. In the following article, the benefits of 3D documentation and computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damages to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are shown on two examined cases.
Abstract:
The effectiveness of fluoride in caries prevention has been convincingly proven. In recent years, researchers have investigated the preventive effects of different fluoride formulations on erosive tooth wear with positive results, but their action on caries and erosion prevention must be based on different requirements, because there is no sheltered area in the erosive process as there is in the subsurface carious lesions. Thus, any protective mechanism from fluoride concerning erosion is limited to the surface or the near surface layer of enamel. However, reports on other protective agents show superior preventive results. The mechanism of action of tin-containing products is related to tin deposition onto the tooth surface, as well as the incorporation of tin into the near-surface layer of enamel. These tin-rich deposits are less susceptible to dissolution and may result in enhanced protection of the underlying tooth. Titanium tetrafluoride forms a protective layer on the tooth surface. It is believed that this layer is made up of hydrated hydrogen titanium phosphate. Products containing phosphates and/or proteins may adsorb either to the pellicle, rendering it more protective against demineralization, or directly to the dental hard tissue, probably competing with H(+) at specific sites on the tooth surface. Other substances may further enhance precipitation of calcium phosphates on the enamel surface, protecting it from additional acid impacts. Hence, the future of fluoride alone in erosion prevention looks grim, but the combination of fluoride with protective agents, such as polyvalent metal ions and some polymers, has much brighter prospects.
Abstract:
Evidence for the dissolution of biogenic silica at the base of pelagic sections supports the hypothesis that much of the chert formed in the Pacific derives from the dissolution and reprecipitation of this silica by hydrothermal waters. As ocean bottom waters flow into and through the crust, they become warmer. Initially they remain less saturated with respect to dissolved silica than pore water in the overlying sediments. With the diffusion of heat, dissolved ions, and to some extent the advection of water itself, biogenic silica in the basal part of the sedimentary section is dissolved. Upon conductively cooling, these pore waters precipitate chert layers. The most common thickness for the basal silica-free zone (20 m) lies below the most common height of the top of the chert interval above basement (50 m). This mode of chert formation explains the frequent occurrence of chert layers at very shallow subbottom depths in pelagic sections of the Pacific. It is also consistent with the common occurrence of cherts ≤150 m above basement.