Abstract:
Understanding the overall catalytic activity trend for rational catalyst design is one of the core goals in heterogeneous catalysis. In the past two decades, the development of density functional theory (DFT) and surface kinetics has made it feasible to theoretically evaluate and predict the variation of catalytic activity among catalysts within a descriptor-based framework. Within this framework, the concept of the volcano curve, which reveals the general activity trend, usually constitutes the basic foundation of catalyst screening. However, although it is a widely accepted concept in heterogeneous catalysis, its origin lacks a clear physical picture and a definite interpretation. Herein, starting with a brief review of the development of the catalyst screening framework, we use a two-step kinetic model to refine and clarify the origin of the volcano curve with a fully analytical treatment that integrates surface kinetics and the results of first-principles calculations. It is mathematically demonstrated that the volcano curve is an essential property in catalysis, which results from the self-poisoning effect accompanying the catalytic adsorption process. Specifically, when adsorption is strong, it is the rapid depletion of free surface sites, rather than the increase of energy barriers, that inhibits the overall reaction rate and results in the volcano curve. Some interesting points and implications for catalyst screening are also discussed based on the kinetic derivation. Moreover, recent applications of the volcano curve for catalyst design in two important photoelectrocatalytic processes (the hydrogen evolution reaction and dye-sensitized solar cells) are also briefly discussed.
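As an illustration of how a volcano curve can emerge from a generic two-step kinetic model, the sketch below solves the steady state of A(g) + * → A* followed by A* → B(g) + * as a function of the adsorption-energy descriptor. This is a textbook-style toy, not the authors' derivation; the BEP-like slopes, offsets, and temperature scale are hypothetical values chosen only to show the trend.

```python
import numpy as np

# Generic two-step mechanism on a surface site *:
#   (1) A(g) + *  -> A*        adsorption,   rate r1 = k1 * p_A * theta_free
#   (2) A*        -> B(g) + *  surface step, rate r2 = k2 * theta_A
# Descriptor: adsorption energy dE (more negative = stronger binding).
# BEP-like barrier relations with purely hypothetical slopes/offsets, for illustration only.
kB_T = 0.05  # "temperature" in eV, arbitrary

def steady_state_rate(dE, p_A=1.0):
    Ea1 = max(0.0, 0.3 + 0.5 * dE)     # adsorption barrier grows as binding weakens
    Ea2 = max(0.0, 0.3 - 0.5 * dE)     # surface-step barrier grows as binding strengthens
    k1 = np.exp(-Ea1 / kB_T)
    k2 = np.exp(-Ea2 / kB_T)
    theta_free = k2 / (k1 * p_A + k2)  # free-site coverage at steady state
    return k1 * p_A * theta_free, theta_free  # overall rate and remaining free sites

for dE in np.linspace(-0.5, 0.5, 11):
    r, free = steady_state_rate(dE)
    print(f"dE = {dE:+.2f} eV   rate = {r:.3e}   free sites = {free:.3f}")
```

On the strong-binding side the rate collapses because the free-site coverage vanishes (self-poisoning), while on the weak-binding side adsorption itself becomes limiting, giving the maximum in between.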
Abstract:
Models of neutrino-driven core-collapse supernova explosions have matured considerably in recent years. Explosions of low-mass progenitors can routinely be simulated in 1D, 2D, and 3D. Nucleosynthesis calculations indicate that these supernovae could be contributors of some lighter neutron-rich elements beyond iron. The explosion mechanism of more massive stars remains under investigation, although first 3D models of neutrino-driven explosions employing multi-group neutrino transport have become available. Together with earlier 2D models and more simplified 3D simulations, these have elucidated the interplay between neutrino heating and hydrodynamic instabilities in the post-shock region that is essential for shock revival. However, some physical ingredients may still need to be added/improved before simulations can robustly explain supernova explosions over a wide range of progenitors. Solutions recently suggested in the literature include uncertainties in the neutrino rates, rotation, and seed perturbations from convective shell burning. We review the implications of 3D simulations of shell burning in supernova progenitors for the ‘perturbations-aided neutrino-driven mechanism,’ whose efficacy is illustrated by the first successful multi-group neutrino hydrodynamics simulation of an 18 solar mass progenitor with 3D initial conditions. We conclude with speculations about the impact of 3D effects on the structure of massive stars through convective boundary mixing.
Abstract:
Collisionless shocks, that is, shocks mediated by electromagnetic processes, are common in space physics and in astrophysics. They are found in a great variety of objects and environments: magnetospheric and heliospheric shocks, supernova remnants, pulsar winds and their nebulæ, active galactic nuclei, gamma-ray bursts, and shock waves in clusters of galaxies. Collisionless shock microphysics enters at different stages of shock formation, shock dynamics, and particle energization and/or acceleration. The shock phenomenon turns out to be a multi-scale non-linear problem in time and space, further complicated in astrophysical environments by the impact of high-energy cosmic rays. This review addresses the physics of shock formation, shock dynamics, and particle acceleration based on a close examination of available multi-wavelength and in situ observations, together with analytical and numerical developments. Particular emphasis is placed on the different instabilities triggered during shock formation and in association with particle acceleration processes, with regard to the properties of the background upstream medium. Among the most important parameters, the background magnetic field, through the magnetization and its obliquity, appears to be the dominant one. The shock velocity, which can reach relativistic values, also has a strong impact on the development of the micro-instabilities and the fate of particle acceleration. Recent developments in laboratory shock experiments have started to bring new insights into the physics of space plasma and astrophysical shock waves. A special section is dedicated to new laser plasma experiments probing shock physics.
Abstract:
Doctoral thesis, Chemistry, specialization in Organic Chemistry, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2016
Abstract:
In knowledge technology work, as expressed by the scope of this conference, there are a number of communities, each uncovering new methods, theories, and practices. The Library and Information Science (LIS) community is one such community. This community, through tradition and innovation, theories and practice, organizes knowledge and develops knowledge technologies formed by iterative research hewn to the values of equal access and discovery for all. The Information Modeling community is another contributor to knowledge technologies. It concerns itself with the construction of symbolic models that capture the meaning of information and organize it in ways that are computer-based but human-understandable. A recent paper that examines certain assumptions in information modeling builds a bridge between these two communities, offering a forum for a discussion on common aims from a common perspective. In a June 2000 article, Parsons and Wand separate classes from instances in information modeling in order to free instances from what they call the “tyranny” of classes. They attribute a number of problems in information modeling to inherent classification – that is, the disregard for the fact that instances can be conceptualized independently of any class assignment. By separating instances from classes, Parsons and Wand strike a sonorous chord with classification theory as understood in LIS. In the practice community and in the publications of LIS, faceted classification has shifted the paradigm of knowledge organization theory in the twentieth century. Here, with the proposal of inherent classification and the resulting layered information modeling, a clear line joins the LIS classification theory community and the information modeling community. Both communities have their eyes turned toward networked resource discovery, and with this conceptual conjunction a new paradigmatic conversation can take place. Parsons and Wand propose that the layered information model can facilitate schema integration, schema evolution, and interoperability. These three spheres in information modeling have their own connotations, but they are not distant from the aims of classification research in LIS. In this new conceptual conjunction, established by Parsons and Wand, information modeling, through the layered information model, can expand the horizons of classification theory beyond LIS, promoting a cross-fertilization of ideas on the interoperability of subject access tools like classification schemes, thesauri, taxonomies, and ontologies. This paper examines the common ground between the layered information model and faceted classification, establishing a vocabulary and outlining some common principles. It then turns to the issue of schema, the horizons of conventional classification, and the differences between Information Modeling and Library and Information Science. Finally, a framework is proposed that deploys an interpretation of the layered information modeling approach in a knowledge technologies context. In order to design subject access systems that will integrate, evolve, and interoperate in a networked environment, knowledge organization specialists must consider the kind of semantic class independence that Parsons and Wand propose for information modeling.
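To make the contrast between class-bound and class-independent modelling concrete, here is a minimal sketch of the idea that instances carry properties on their own, with classes applied afterwards as membership rules. It is only an illustration of the general notion, not a reconstruction of Parsons and Wand's model, and all property and class names are hypothetical.

```python
# Instances are stored as class-independent property sets;
# "classes" are defined later as predicates (views) over those properties.
# All property and class names below are hypothetical examples.

instances = [
    {"id": 1, "title": "Colon Classification", "creator": "Ranganathan", "medium": "print"},
    {"id": 2, "title": "Dublin Core",          "maintainer": "DCMI",      "medium": "web"},
]

# A "class" is just a membership rule evaluated against an instance's properties.
classes = {
    "Book":           lambda inst: inst.get("medium") == "print",
    "OnlineResource": lambda inst: inst.get("medium") == "web",
    "HasCreator":     lambda inst: "creator" in inst,
}

def classify(instance, class_defs):
    """Return every class whose rule the instance satisfies (an instance may belong to many, or none)."""
    return [name for name, rule in class_defs.items() if rule(instance)]

for inst in instances:
    print(inst["id"], classify(inst, classes))
```

Because class membership is computed rather than fixed at creation time, new classes (or facets) can be added later without touching the instances, which is the flavour of schema evolution and interoperability discussed above.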
Abstract:
Biogeography and metacommunity ecology provide two different perspectives on species diversity. Both are spatial in nature, but their spatial scales do not necessarily match. With the recent boom of metacommunity studies, there is an increasing need for clear discrimination of the spatial scales relevant to both perspectives. This discrimination is a necessary prerequisite for improved understanding of ecological phenomena across scales. Here we provide a case study to illustrate some spatial scale-dependent concepts in recent metacommunity studies and identify potential pitfalls. We present the diversity patterns of Neotropical lepidopterans and spiders viewed from both metacommunity and biogeographical perspectives. Specifically, we investigated how the relative importance of niche- and dispersal-based processes for community assembly changes at two spatial scales: the metacommunity scale, i.e. within a locality, and the biogeographical scale, i.e. among localities widely scattered along a macroclimatic gradient. As expected, niche-based processes dominated community assembly at the metacommunity scale, while dispersal-based processes played a major role at the biogeographical scale for both taxonomic groups. However, we also observed small but significant spatial effects at the metacommunity scale and environmental effects at the biogeographical scale. We also observed differences in diversity patterns between the two taxonomic groups corresponding to differences in their dispersal modes. Our results thus support the idea of the continuity of processes interactively shaping diversity patterns across scales and emphasize the need to integrate metacommunity and biogeographical perspectives.
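One common way to weigh niche- (environmental) against dispersal- (spatial) based processes is variation partitioning of community data on environmental versus spatial predictor blocks. The sketch below uses entirely synthetic data and a simple linear R² partition purely to illustrate that logic; it is not the authors' analysis, and the predictor names are hypothetical.

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)

# Synthetic example: one community response, environmental and spatial predictor blocks.
n = 200
env   = rng.normal(size=(n, 2))            # e.g. climate, habitat structure (hypothetical)
space = rng.normal(size=(n, 2))            # e.g. spatial eigenvectors / coordinates (hypothetical)
y = 1.0 * env[:, 0] + 0.5 * space[:, 0] + rng.normal(scale=1.0, size=n)

def r2(X, y):
    """Coefficient of determination of a linear fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    coef, *_ = lstsq(X1, y, rcond=None)
    resid = y - X1 @ coef
    return 1 - resid.var() / y.var()

r2_env   = r2(env, y)                             # environment alone
r2_space = r2(space, y)                           # space alone
r2_both  = r2(np.column_stack([env, space]), y)   # both blocks together

pure_env   = r2_both - r2_space                   # variation explained only by environment
pure_space = r2_both - r2_env                     # variation explained only by space
shared     = r2_both - pure_env - pure_space      # jointly explained fraction
print(f"pure env = {pure_env:.2f}, pure space = {pure_space:.2f}, shared = {shared:.2f}")
```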
Abstract:
Simopelta minima (Brandão, 1989) was originally described based on four workers collected in soil samples from a small cocoa plantation in Ilhéus, state of Bahia, northeastern Brazil. In the years following the description, this cocoa plantation was eliminated and the species was then considered extinct by Brazilian environmental institutions. The recent rediscovery of S. minima workers in subterranean pitfall trap samples from Viçosa, state of Minas Gerais, southeastern Brazil, over 1,000 km from the type locality, suggests that the rarity and vulnerability status of some ant species may be explained by insufficient sampling of suitable microhabitats in time and space.
Abstract:
Ayahuasca is a hallucinogenic beverage prepared by the decoction of plants native to the Amazon Basin region. The beverage has been used throughout the world by members of some syncretic religious movements. Despite the recent legalization of ayahuasca in Brazil for religious purposes, there is little pre-clinical and clinical information attesting to its safety, particularly in relation to use during pregnancy. The aim of the current work was to determine the effects of perinatal exposure to ayahuasca (from the 6th day of pregnancy to the 10th day of lactation) on physical, reflex, and neurobehavioral parameters of Wistar rat offspring. The offspring showed no statistically significant changes in the physical and reflex parameters evaluated. However, in adult rats perinatally exposed to ayahuasca, we observed an increase in the frequency of entries into the open arms in the elevated plus-maze test, a decrease in the total interaction time in the social interaction test, a decrease in the latency to start swimming, and a decrease in the minimum convulsant dose of pentylenetetrazol. In conclusion, our results showed that the use of ayahuasca by mothers during pregnancy and lactation reduced the general anxiety and social motivation of the rat offspring. In addition, it promoted a higher sensitivity to the initiation and spread of seizure activity.
Abstract:
Purpose - The aim of this paper is to briefly present aspects of public brownfield management policies from the Brazilian and German points of view. Design/methodology/approach - The data collection method combined literature and documentary research. The bibliography included Brazilian and German literature about brownfield management. The documentary research includes Brazilian and German legislation and official documents published by CETESB, the Environmental Company of the State of São Paulo, Brazil. Furthermore, publications of German governmental research institutions have been integrated into the paper. Findings - In Brazil, despite the lack of a federal public policy, the State of São Paulo has approved specific rules to deal with contaminated sites. Topics that could be targets of scientific studies have been identified. Experiences in Germany show that it is essential to have political will and cooperation between the different political levels and technical disciplines. Partnerships between German and Brazilian universities would be welcome, as there is a wide range of opportunities for academic post-graduate studies and research focusing on the training of human resources in environmental management. Originality/value - The paper makes an original contribution by exploring an area (brownfield management) that is at the forefront of discussion in academia and industry.
Abstract:
The structural engineering community in Brazil faces new challenges with the recent occurrence of high-intensity tornados. Satellite surveillance data show that the area covering the south-east of Brazil, Uruguay and part of Argentina is one of the world's most tornado-prone areas, second only to the infamous tornado alley in the central United States. The design of structures subject to tornado winds is a typical example of decision making in the presence of uncertainty. Structural design involves finding a good balance between the competing goals of safety and economy. This paper presents a methodology to find the optimum balance between these goals in the presence of uncertainty. Reliability-based risk optimization is used to find the optimal safety coefficient that minimizes the total expected cost of a steel frame communications tower subject to extreme storm and tornado wind loads. The technique is not new, but it is applied to a practical problem of increasing interest to Brazilian structural engineers. The problem is formulated in the partial safety factor format used in current design codes, with an additional partial factor introduced to serve as the optimization variable. The expected cost of failure (or risk) is defined as the product of a limit state exceedance probability and the corresponding limit state exceedance cost. These costs include the costs of repairing, rebuilding, and paying compensation for injury and loss of life. The total expected failure cost is the sum of the individual expected costs over all failure modes. The steel frame communications tower studied here has become very common in Brazil due to increasing mobile phone coverage. The study shows that the optimum reliability is strongly dependent on the cost (or consequences) of failure. Since failure consequences depend on the actual tower location, it turns out that different optimum designs should be used in different locations. Failure consequences are also different for the different parties involved in the design, construction and operation of the tower. Hence, it is important that risk is well understood by the parties involved, so that proper contracts can be made. The investigation shows that when non-structural terms dominate design costs (e.g., in residential or office buildings) it is not too costly to over-design; this observation is in agreement with observed practice for non-optimized structural systems. In this situation, it is much easier to lose money by under-design. When structural material cost is a significant part of the design cost (e.g., a concrete dam or bridge), one is likely to lose significant money by over-design. In this situation, a cost-risk-benefit optimization analysis is highly recommended. Finally, the study also shows that under time-varying loads such as tornados, the optimum reliability is strongly dependent on the selected design life.
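The core of reliability-based risk optimization described above is the minimization of total expected cost (construction cost plus the sum of failure probabilities times failure costs) over the additional partial safety factor. The toy sketch below only illustrates that trade-off; all cost figures and the probability model are hypothetical placeholders, not the tower data of this study.

```python
import numpy as np

# Toy reliability-based risk optimization: choose the partial safety factor lam
# that minimizes  C_total(lam) = C_construction(lam) + sum_i Pf_i(lam) * Cf_i.
# All numbers below are hypothetical placeholders.

def construction_cost(lam):
    return 100.0 + 40.0 * (lam - 1.0)            # a heavier (over-designed) structure costs more

def failure_probability(lam):
    # Failure probability drops as the design is strengthened (illustrative exponential decay).
    return 1e-2 * np.exp(-4.0 * (lam - 1.0))

failure_costs = {"repair": 50.0, "rebuild": 300.0, "compensation": 2000.0}

def total_expected_cost(lam):
    pf = failure_probability(lam)
    return construction_cost(lam) + pf * sum(failure_costs.values())

lams = np.linspace(1.0, 2.0, 201)
costs = [total_expected_cost(l) for l in lams]
best = lams[int(np.argmin(costs))]
print(f"optimum partial factor ≈ {best:.2f}, expected total cost ≈ {min(costs):.1f}")
```

Raising the compensation (consequence) terms pushes the optimum factor up, which mirrors the paper's conclusion that the optimum reliability depends strongly on the failure consequences at a given tower location.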
Abstract:
Xylella fastidiosa is a Gram-negative plant pathogen causing many economically important diseases, and analyses of completely sequenced X. fastidiosa genomes have allowed the identification of many prophage-like elements and possible phage remnants, accounting for up to 15% of the genome composition. To better evaluate the recent evolution of the X. fastidiosa chromosome backbone among distinct pathovars, we assessed the number and location of prophage-like regions in two finished genomes (9a5c and Temecula1) and in two draft genomes (Ann1 and Dixon). Based on comparative best bidirectional hit analyses, the majority (51%) of the predicted genes in the X. fastidiosa prophage-like regions are related to structural phage genes belonging to the Siphoviridae family. Electron micrographs reveal putative viral particles with morphology similar to that of lambda phages in bacterial cells in planta. Moreover, analysis of microarray data indicates that the 9a5c strain cultivated under stress conditions presents enhanced expression of phage anti-repressor genes, suggesting a switch of phages from the lysogenic to the lytic cycle under stress-induced situations. Furthermore, virulence-associated proteins and toxins are found within these prophage-like elements, suggesting an important role in host adaptation. Finally, clustering analyses of phage integrase genes based on multiple alignment patterns reveal that they group into five lineages, all possessing a tyrosine recombinase catalytic domain, and phylogenetically close to other integrases found in phages that are genetic mosaics and able to perform generalized and specialized transduction. Integration sites and tRNA associations are also identified. In summary, we present comparative and experimental evidence supporting the contribution of phage activity to the differentiation of Xylella genomes.
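Best bidirectional (reciprocal best) hit analysis can be sketched very simply: given two pre-computed best-hit tables (each gene mapped to its best match in the other genome), only pairs that point at each other are kept. The sketch below is a generic illustration, not the pipeline used in the study, and the gene identifiers are hypothetical.

```python
# Generic best-bidirectional-hit (reciprocal best hit) sketch.
# best_a_to_b / best_b_to_a would normally come from BLAST-style searches;
# the toy gene identifiers below are hypothetical.

best_a_to_b = {"XF_0001": "PD_0010", "XF_0002": "PD_0020", "XF_0003": "PD_0030"}
best_b_to_a = {"PD_0010": "XF_0001", "PD_0020": "XF_0099", "PD_0030": "XF_0003"}

def bidirectional_best_hits(a_to_b, b_to_a):
    """Keep only pairs (a, b) where a's best hit is b and b's best hit is a."""
    return [(a, b) for a, b in a_to_b.items() if b_to_a.get(b) == a]

print(bidirectional_best_hits(best_a_to_b, best_b_to_a))
# -> [('XF_0001', 'PD_0010'), ('XF_0003', 'PD_0030')]
```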
Abstract:
Based on high-resolution (R ≈ 42 000–48 000) and high signal-to-noise (S/N ≈ 50–150) spectra obtained with UVES/VLT, we present detailed elemental abundances (O, Na, Mg, Al, Si, Ca, Ti, Cr, Fe, Ni, Zn, Y, and Ba) and stellar ages for 12 new microlensed dwarf and subgiant stars in the Galactic bulge. Including previous microlensing events, the sample of homogeneously analysed bulge dwarfs has now grown to 26. The analysis is based on equivalent width measurements and standard 1-D LTE MARCS model stellar atmospheres. We also present NLTE Li abundances based on line synthesis of the ⁷Li line at 670.8 nm. The results from the 26 microlensed dwarf and subgiant stars show that the bulge metallicity distribution function (MDF) is double-peaked, with one peak at [Fe/H] ≈ -0.6 and one at [Fe/H] ≈ +0.3, and a dearth of stars around solar metallicity. This is in contrast to the MDF derived from red giants in Baade's window, which peaks at this exact value. A simple significance test shows that it is extremely unlikely to obtain such a gap in the microlensed dwarf star MDF if the dwarf stars are drawn from the giant star MDF. To resolve this issue we discuss several possibilities, but we cannot settle on a conclusive solution for the observed differences. We further find that the metal-poor bulge dwarf stars are predominantly old, with ages greater than 10 Gyr, while the metal-rich bulge dwarf stars show a wide range of ages. The metal-poor bulge sample is very similar to the Galactic thick disk in terms of average metallicity, elemental abundance trends, and stellar ages. Speculatively, the metal-rich bulge population might be the manifestation of the inner thin disk. If so, the two bulge populations could support the recent findings, based on kinematics, that there are no signatures of a classical bulge and that the Milky Way is a pure-disk galaxy. Also, recent claims of a flat IMF in the bulge based on the MDF of giant stars may have to be revised in light of the MDF and abundance trends probed by our microlensed dwarf stars.
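The logic of such a significance test can be sketched by drawing mock samples of 26 stars from the giant-star MDF and asking how often a dearth around solar metallicity appears by chance. The sketch below uses a placeholder Gaussian for the giant MDF and a hypothetical gap interval; it only illustrates the Monte Carlo idea and is not the test performed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder giant-star MDF: a Gaussian centred near solar metallicity (illustrative only).
def draw_from_giant_mdf(size):
    return rng.normal(loc=0.0, scale=0.4, size=size)

n_dwarfs = 26
gap = (-0.2, 0.1)          # hypothetical "dearth" interval around solar [Fe/H]
n_trials = 100_000

# Fraction of mock dwarf samples with no star falling inside the gap.
samples = draw_from_giant_mdf((n_trials, n_dwarfs))
has_star_in_gap = ((samples > gap[0]) & (samples < gap[1])).any(axis=1)
p_empty_gap = 1.0 - has_star_in_gap.mean()

print(f"probability of an empty gap by chance ≈ {p_empty_gap:.2e}")
```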
Abstract:
We present Monte Carlo simulations for a molecular motor system found in virtually all eukaryotic cells, the acto-myosin motor system, composed of a group of organic macromolecules. The cell motors were mapped onto an Ising-like model, where the interaction field is transmitted through a tropomyosin polymer chain. The presence of Ca²⁺ induces tropomyosin to block or unblock binding sites of the myosin motor, leading to its activation or deactivation. We used the Metropolis algorithm to find the transient and equilibrium states of the acto-myosin system composed of solvent, actin, tropomyosin, troponin, Ca²⁺, and myosin-S1 at a given temperature, including the spatial configuration of tropomyosin on the actin filament surface. Our model describes the short- and long-range cooperativity during actin-myosin binding, which emerges from the bending stiffness of the tropomyosin complex. We obtained all transition rates between the states using only the interaction energies of the constituents. The agreement between our model and experimental data also supports the recent theory of flexible tropomyosin.
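For readers unfamiliar with the sampling step, here is a minimal sketch of the Metropolis algorithm on a one-dimensional Ising-like chain of blocked/unblocked units. The nearest-neighbour coupling J, the field h, and the chain length are hypothetical values, and this is far simpler than the full actin-tropomyosin-troponin-myosin model of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# 1D Ising-like chain: s_i = +1 "unblocked" / -1 "blocked" site along the actin filament.
# J couples neighbours (stands in for tropomyosin bending stiffness), h biases the states
# (e.g. toward unblocking at high Ca2+). Both values are hypothetical, for illustration only.
N, J, h, kT = 200, 1.0, 0.2, 1.0
s = rng.choice([-1, 1], size=N)

def delta_energy(s, i):
    """Energy change for flipping spin i, with H = -J * sum s_i s_{i+1} - h * sum s_i (periodic chain)."""
    left, right = s[(i - 1) % N], s[(i + 1) % N]
    return 2 * s[i] * (J * (left + right) + h)

for sweep in range(2000):
    for _ in range(N):
        i = rng.integers(N)
        dE = delta_energy(s, i)
        # Metropolis acceptance rule: always accept downhill moves, accept uphill with exp(-dE/kT)
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            s[i] = -s[i]

print("fraction of unblocked sites:", np.mean(s == 1))
```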
Abstract:
We prove an extension of the classical isomorphic classification of Banach spaces of continuous functions on ordinals. As a consequence, we give complete isomorphic classifications of some Banach spaces K(X, Y^η), η ≥ ω, of compact operators from X to Y^η, the space of all continuous Y-valued functions defined in the interval of ordinals [1, η] and equipped with the supremum norm. In particular, under the Continuum Hypothesis, we extend a recent result of C. Samuel by classifying, up to isomorphism, the spaces K(X^ξ, c₀(Γ)^η), where ω ≤ ξ < ω₁, η ≥ ω, Γ is a countable set, X contains no complemented copy of ℓ₁, X* has the Mazur property, and the density character of X** is less than or equal to ℵ₁.
Abstract:
Obesity has been recognized as a worldwide public health problem. It significantly increases the chances of developing several diseases, including Type II diabetes. The roles of insulin and leptin in obesity involve reactions that can be better understood when they are presented step by step. The aim of this work was to design software using data from some of the most recent publications on obesity, especially those concerning the roles of insulin and leptin in this metabolic disturbance. The most notable characteristic of this software is the use of animations representing the cellular response, together with the presentation of recently discovered mechanisms of the participation of insulin and leptin in processes leading to obesity. The software was field-tested in the Biochemistry of Nutrition web-based course. After using the software and discussing its contents in chat rooms, students were asked to complete an evaluation survey about the whole activity and the usefulness of the software within the learning process. The teaching assistants (TAs) evaluated the software as a tool to help in the teaching process. The students' and TAs' satisfaction was very evident and encouraged us to move forward with the software development and to improve the use of this kind of educational tool in biochemistry classes.