594 results for Accelerating universes
Abstract:
In accelerating dark energy models, the estimates of the Hubble constant, H0, from the Sunyaev-Zel'dovich effect (SZE) and the X-ray surface brightness of galaxy clusters may depend on the matter content (Ω_M), the curvature (Ω_K) and the equation of state parameter (ω). In this article, by using a sample of 25 angular diameter distances of galaxy clusters described by the elliptical beta model obtained through the SZE/X-ray technique, we constrain H0 in the framework of a general ΛCDM model (arbitrary curvature) and a flat XCDM model with a constant equation of state parameter ω = p_x/ρ_x. In order to avoid the use of priors on the cosmological parameters, we apply a joint analysis involving the baryon acoustic oscillations (BAO) and the CMB shift parameter signature. By taking into account the statistical and systematic errors of the SZE/X-ray technique, we obtain for the nonflat ΛCDM model H0 = 74 (+5.0, -4.0) km s^-1 Mpc^-1 (1σ), whereas for a flat universe with constant equation of state parameter we find H0 = 72 (+5.5, -4.0) km s^-1 Mpc^-1 (1σ). By assuming that galaxy clusters are described by a spherical beta model, these results change to H0 = 6 (+8.0, -7.0) and H0 = 59 (+9.0, -6.0) km s^-1 Mpc^-1 (1σ), respectively. The results from the elliptical description are in good agreement with independent studies from the Hubble Space Telescope key project and recent estimates based on the Wilkinson Microwave Anisotropy Probe, thereby suggesting that the combination of these three independent phenomena provides an interesting method to constrain the Hubble constant. As an extra bonus, the adoption of the elliptical description is revealed to be a quite realistic assumption. Finally, by comparing these results with a recent determination for a flat ΛCDM model using only the SZE/X-ray technique and BAO, we see that the geometry has a very weak influence on H0 estimates for this combination of data.
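The SZE/X-ray technique constrains H0 because the inferred angular diameter distance scales inversely with it. As a minimal illustration (not the paper's actual pipeline: the cluster data points, Ω_M = 0.3 and the grid of H0 values below are assumptions for demonstration only), a flat-XCDM distance and toy chi-square can be sketched as:

```python
import math

C_KMS = 299792.458  # speed of light in km/s

def E(z, om, w):
    """Dimensionless Hubble rate E(z) = H(z)/H0 for a flat XCDM model."""
    return math.sqrt(om * (1 + z) ** 3 + (1 - om) * (1 + z) ** (3 * (1 + w)))

def angular_diameter_distance(z, h0, om=0.3, w=-1.0, n=1000):
    """D_A(z) in Mpc for a flat universe, via trapezoidal integration."""
    dz = z / n
    integral = sum(0.5 * dz * (1 / E(i * dz, om, w) + 1 / E((i + 1) * dz, om, w))
                   for i in range(n))
    return C_KMS / h0 * integral / (1 + z)

def chi2(h0, data, om=0.3, w=-1.0):
    """Toy chi-square comparing model D_A to (z, D_A, sigma) triples."""
    return sum(((angular_diameter_distance(z, h0, om, w) - d) / s) ** 2
               for z, d, s in data)

# Hypothetical cluster distances: (redshift, D_A in Mpc, error in Mpc).
data = [(0.2, 700.0, 70.0), (0.3, 950.0, 95.0)]
best = min(range(50, 100), key=lambda h: chi2(float(h), data))
```

Because D_A ∝ 1/H0 at fixed Ω_M and ω, each cluster distance translates directly into an H0 estimate; in the analysis above this degeneracy with the other cosmological parameters is broken by combining with BAO and the CMB shift parameter.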
Abstract:
Dense enough compact objects were recently shown to lead to an exponentially fast increase of the vacuum energy density for some free scalar fields properly coupled to the spacetime curvature as a consequence of a tachyonic-like instability. Once the effect is triggered, the star energy density would be overwhelmed by the vacuum energy density in a few milliseconds. This demands that eventually geometry and field evolve to a new configuration to bring the vacuum back to a stationary regime. Here, we show that the vacuum fluctuations built up during the unstable epoch lead to particle creation in the final stationary state when the tachyonic instability ceases. The amount of created particles depends mostly on the duration of the unstable epoch and final stationary configuration, which are open issues at this point. We emphasize that the particle creation coming from the tachyonic instability will occur even in the adiabatic limit, where the spacetime geometry changes arbitrarily slowly, and therefore is quite distinct from the usual particle creation due to the change in the background geometry.
Abstract:
Background: Accelerating bone healing around dental implants can reduce the long-term period between the insertion of implants and functional rehabilitation. Objective: This in vivo study evaluated the effect of a constant electromagnetic field (CEF) on bone healing around dental implants in dogs. Materials and methods: Eight dental implants were placed immediately after extraction of the first premolar and molar teeth in the mandibles of two male dogs and divided into experimental (CEF) and control groups. A CEF at a magnetic intensity of 0.8 mT with a pulse width of 25 μs and a frequency of 1.5 MHz was applied to the implants for 20 min per day for 2 weeks. Results and conclusion: After qualitative histological analysis, a small quantity of newly formed bone was observed in the gap between the implant surface and the alveolar bone in both groups.
Abstract:
The Rio+20 Conference mobilizes the global community in 2012 to take part in a challenging debate on the global environmental situation and the existing modus operandi regarding the broad, generic themes of development and the environment. One of the structuring themes of this meeting is the transition to a green economy in the context of sustainable development and poverty eradication. The theme of Global Environmental Governance, one of the flagship topics of the Rio+20 debate, aimed at promoting and accelerating the transition towards sustainable societies, involves the often controversial construction of the conditions for defining new institutional spaces and shared decision-making processes. This article invites readers to reflect on what kind of sustainability lies behind the green economy, on its applicability, and on what should be prioritized in the discussion of environmental governance. This is justified insofar as there is a need to change the deeply unjust mechanisms of resource use, which block progress in decision-making processes, since the decisions of a few have shaped a perverse logic of expropriation of natural resources and of failure to resolve social exclusion.
Abstract:
The current cosmological dark sector (dark matter plus dark energy) is challenging our comprehension of the physical processes taking place in the Universe. Recently, some authors tried to falsify the basic underlying assumptions of such a dark matter/dark energy paradigm. In this Letter, we show that oversimplifications of the measurement process may produce false positives in any consistency test based on the globally homogeneous and isotropic Λ cold dark matter (ΛCDM) model and its expansion history based on distance measurements. In particular, when local inhomogeneity effects due to clumped matter or voids are taken into account, an apparent violation of the basic assumptions (Copernican Principle) seems to be present. Conversely, the amplitude of the deviations also probes the degree of reliability underlying the phenomenological Dyer-Roeder procedure by confronting its predictions with the accuracy of the weak lensing approach. Finally, a new method is devised to reconstruct the effects of the inhomogeneities in a ΛCDM model, and some suggestions of how to distinguish clumpiness (or void) effects from different cosmologies are discussed.
Abstract:
The National Institute for Clinical Excellence (NICE) guidelines recommend the use of bare-metal stents (BMS) in non-complex lesions with a low risk of restenosis (diameter ≥ 3 mm and lesion length ≤ 15 mm) and the use of drug-eluting stents (DES) in more complex lesions with a high risk of restenosis (diameter < 3.0 mm or lesion length > 15 mm). However, the guidelines were created based on studies evaluating BMS and DES only. We performed an analysis of patients undergoing non-urgent percutaneous coronary intervention with the novel endothelial cell capturing stent (ECS). The ECS is coated with CD34(+) antibodies that attract circulating endothelial progenitor cells to the stent surface, thereby accelerating the endothelialization of the stented area. We analyzed all patients enrolled in the worldwide e-HEALING registry who met the NICE criteria for either low-risk or high-risk lesions and were treated with ≥ 1 ECS. The main study outcome was target vessel failure (TVF) at 12-month follow-up, defined as the composite of cardiac death or MI and target vessel revascularization (TVR). A total of 4,241 patients were assessed in the current analysis. At 12-month follow-up, TVF occurred in 7.0% of the patients with low-risk lesions and in 8.8% of the patients with high-risk lesions (p = 0.045). When evaluating the diabetic patients versus the non-diabetic patients per risk group, no significant differences were found in TVF, MI or TVR in either risk group. The ECS shows good clinical outcomes in lesions carrying either a high or a low risk of restenosis according to the NICE guidelines, with comparable rates of cardiac death, myocardial infarction, and stent thrombosis. The TVF rate with ECS was slightly higher in patients with high-risk lesions, driven by higher clinically driven TLR.
The risk of restenosis with ECS in patients carrying high-risk lesions needs to be carefully considered relative to other risks associated with DES. Furthermore, the presence of diabetes mellitus did not influence the incidence of TVF in either risk group.
Abstract:
We discuss the gravitational collapse of a spherically symmetric massive core of a star in which the fluid component is interacting with a growing vacuum energy density. The influence of the variable vacuum in the collapsing core is quantified by a phenomenological beta parameter, as predicted by dimensional arguments and the renormalization group approach. For all reasonable values of this free parameter, we find that the vacuum energy density increases the collapsing time, but it cannot prevent the formation of a singular point. However, the nature of the singularity depends on the value of beta. In the radiation case, a trapped surface is formed for beta <= 1/2, whereas for beta > 1/2 a naked singularity is developed. In general, the critical value is beta_c = 1 - 2/[3(1 + omega)], where omega is the parameter describing the equation of state of the fluid component.
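The critical value can be checked directly against the radiation case quoted above (the bracketing of the formula is reconstructed so that the two statements agree):

```latex
\beta_c \;=\; 1-\frac{2}{3(1+\omega)}\,,
\qquad
\omega=\tfrac{1}{3}\ (\text{radiation})
\;\Rightarrow\;
\beta_c \;=\; 1-\frac{2}{3\left(1+\tfrac{1}{3}\right)}
\;=\; 1-\frac{2}{4} \;=\; \frac{1}{2}\,,
```

which reproduces the trapped-surface threshold beta = 1/2 stated for the radiation fluid.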
Abstract:
After decades of a successful hot big-bang paradigm, cosmology still lacks a framework in which the early inflationary phase of the universe smoothly matches the radiation epoch and evolves to the present “quasi” de Sitter spacetime. No less intriguing is that the current value of the effective vacuum energy density is vastly smaller than the value that triggered inflation. In this paper, we propose a new class of cosmologies capable of overcoming, or highly alleviating, some of these acute cosmic puzzles. Powered by a decaying vacuum energy density, the spacetime emerges from a pure nonsingular de Sitter vacuum stage, “gracefully” exits from inflation to a radiation phase followed by dark matter and vacuum regimes, and, finally, evolves to a late-time de Sitter phase.
Abstract:
[EN] BACKGROUND: A classic, unresolved physiological question is whether central cardiorespiratory and/or local skeletal muscle circulatory factors limit maximal aerobic capacity (VO2max) in humans. Severe heat stress drastically reduces VO2max, but the mechanisms have never been studied. METHODS AND RESULTS: To determine the main contributing factor that limits VO2max with and without heat stress, we measured hemodynamics in 8 healthy males performing intense upright cycling exercise until exhaustion starting with either high or normal skin and core temperatures (+10 degrees C and +1 degrees C). Heat stress reduced VO2max, 2-legged VO2, and time to fatigue by 0.4+/-0.1 L/min (8%), 0.5+/-0.2 L/min (11%), and 2.2+/-0.4 minutes (28%), respectively (all P<0.05), despite heart rate and core temperature reaching similar peak values. However, before exhaustion in both heat stress and normal conditions, cardiac output, leg blood flow, mean arterial pressure, and systemic and leg O2 delivery declined significantly (all 5% to 11%, P<0.05), yet arterial O2 content and leg vascular conductance remained unchanged. Despite increasing leg O2 extraction, leg VO2 declined 5% to 6% before exhaustion in both heat stress and normal conditions, accompanied by enhanced muscle lactate accumulation and ATP and creatine phosphate hydrolysis. CONCLUSIONS: These results demonstrate that in trained humans, severe heat stress reduces VO2max by accelerating the declines in cardiac output and mean arterial pressure that lead to decrements in exercising muscle blood flow, O2 delivery, and O2 uptake. Furthermore, the impaired systemic and skeletal muscle aerobic capacity that precedes fatigue with or without heat stress is largely related to the failure of the heart to maintain cardiac output and O2 delivery to locomotive muscle.
Abstract:
Cruise tourism is a way of spending leisure time in our society. The activity has grown strongly since the 1980s, and its presence in Europe, and particularly in the Canary Islands, has accelerated since the following decade. Many of the studies on cruise tourism have focused on the characteristics of demand (the profile of tourists, spending power, the impacts the activity causes, etc.). However, the literature on residents' perception of this tourism, on which the present study focuses, is rather scarce, particularly for the area immediately surrounding the Port of La Luz and Las Palmas.
Abstract:
This research argues for an analysis of the textual and cultural forms of the American horror film (1968-1998) by defining its so-called postmodern characters. The term “postmodern” will not denote a period in the history of cinema, but a series of forms and strategies recognizable in many American films. From a bipolar, re-mediation and cognitive point of view, the postmodern phenomenon has been considered as a formal and epistemological re-configuration of the cultural “modern” system. The first section of the work examines theoretical problems around the “postmodern phenomenon” by defining its cultural and formal constants in different areas (epistemology, economy, mass media): convergence, fragmentation, manipulation and immersion represent the former, while “excess” is the morphology of change, realizing the “fluctuation” of the previously consolidated system. The second section classifies the textual and cultural forms of American postmodern film, generally non-horror. The “classic narrative” structure - a coherent and consequent chain of causal cues toward a conclusion - is scattered by the postmodern constant of “fragmentation”. New textual models arise, fragmenting the narrative ones into aggregations of data without causal-temporal logic. Considering the processes of “transcoding” and “remediation” between media, and the principle of “convergence” in the phenomenon, the essay aims to define these structures in postmodern film as “database forms” and “navigable space forms.” The third section applies this classification to the American horror film (1968-1998).
The formal constant of “excess” in the horror genre works on the paradigm of “vision”: if postmodern film shows a crisis of “truth” in vision, in horror movies the excess of vision becomes “hyper-vision” - that is, the “multiplication” of death/blood/torture visions - and “intra-vision”, which shows the impossibility of distinguishing the “real” vision from the virtual/imaginary one. In this perspective, the textual and cultural forms and strategies of postmodern horror film are predominantly: the “database-accumulation” forms, where the events result from a very simple “remote cause” serving as a pretext (as in Night of the Living Dead); and the “database-catalogue” forms, where the events follow one another displaying a “central” character or theme. In the latter case, the catalogue syntagms are connected by “consecutive” elements, building stories linked by the actions of a single character (usually the killer), or connected by non-consecutive episodes around a general theme: examples of the first kind are built on the model of The Wizard of Gore; the second, on films such as Mario Bava’s I tre volti della paura. The “navigable space” forms are defined as: hyperlink a, where a single universe fluctuates between reality and dream, as in Rosemary’s Baby; hyperlink b, where two non-hierarchical universes converge, one real and the other fictional, as in the Nightmare series; hyperlink c, where several worlds are separated but become contiguous in the last sequence, as in Targets; and the last form, navigable-loop, which includes a textual line that suddenly stops and starts again, reflecting the pattern of a “loop” (as in Lost Highway). This essay analyses in detail the organization of “visual space” in the postmodern horror film by tracing representative patterns. It concludes by examining the “convergence” of the technologies and cognitive structures of cinema and new media.
Abstract:
The miniaturization race in the hardware industry, aiming at a continuous increase of transistor density on a die, no longer brings corresponding application performance improvements. One of the most promising alternatives is to exploit the heterogeneous nature of common applications in hardware. Supported by reconfigurable computation, which has already proved its efficiency in accelerating data-intensive applications, this concept promises a breakthrough in contemporary technology development. Memory organization in such heterogeneous reconfigurable architectures becomes very critical. Two primary aspects introduce a sophisticated trade-off. On the one hand, a memory subsystem should provide a well-organized distributed data structure and guarantee the required data bandwidth. On the other hand, it should hide the heterogeneous hardware structure from the end user, in order to support feasible high-level programmability of the system. This thesis work explores heterogeneous reconfigurable hardware architectures and presents possible solutions to the problem of memory organization and data structure. Using the example of the MORPHEUS heterogeneous platform, the discussion follows the complete design cycle, from decision making and justification to hardware realization. Particular emphasis is placed on methods to support high system performance, meet application requirements, and provide a user-friendly programmer interface. As a result, the research introduces a complete heterogeneous platform enhanced with a hierarchical memory organization, which copes with its task by separating computation from communication, providing the reconfigurable engines with computation and configuration data, and unifying the heterogeneous computational devices by means of local storage buffers.
It is distinguished from the related solutions by distributed data-flow organization, specifically engineered mechanisms to operate with data on local domains, particular communication infrastructure based on Network-on-Chip, and thorough methods to prevent computation and communication stalls. In addition, a novel advanced technique to accelerate memory access was developed and implemented.
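The separation of computation from communication via local storage buffers described above can be illustrated in software with a classic double-buffering pattern. This is a generic sketch, not the MORPHEUS platform's actual interface; the chunk contents, buffer count and the summing "kernel" are stand-ins chosen for demonstration:

```python
import threading
import queue

def transfer(chunks, buffers):
    """Communication side: stream data chunks into the local buffers."""
    for chunk in chunks:
        buffers.put(chunk)          # blocks when all buffers are in use
    buffers.put(None)               # end-of-stream marker

def compute(buffers, results):
    """Computation side: consume one buffer while the next is being filled."""
    while (chunk := buffers.get()) is not None:
        results.append(sum(chunk))  # stand-in for the real compute kernel

chunks = [[1, 2], [3, 4], [5, 6]]
buffers = queue.Queue(maxsize=2)    # two local buffers -> double buffering
results = []
t = threading.Thread(target=transfer, args=(chunks, buffers))
t.start()
compute(buffers, results)
t.join()
# results == [3, 7, 11]
```

The bounded queue plays the role of the local storage: the producer stalls only when both buffers are full, so data movement overlaps with computation instead of serializing with it.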
Abstract:
Many potential diltiazem-related L-VDCC blockers were developed using a multidisciplinary approach. The current study aimed to investigate and compare diltiazem with the newly developed compounds in the mouse Langendorff-perfused heart, in Ca2+-transient measurements, and on recombinant L-VDCC. Twenty compounds were selected by a ligand-based virtual screening (LBVS) procedure. Of these, five (5b, M2, M7, M8 and P1) showed a potent and selective inotropic activity on guinea-pig left atria driven at 1 Hz. Further assays revealed an interesting negative inotropic effect of M2, M8, P1 and M7 on guinea-pig isolated left papillary muscle driven at 1 Hz, a relevant vasorelaxant activity of 5b, M2, M7, M8 and P1 on K+-depolarized guinea-pig ileum longitudinal smooth muscle, and a significant inhibition of contraction by 5b, M2, M8 and P1 in carbachol-stimulated ileum longitudinal smooth muscle. Wild-type human heart and rabbit lung α1 subunits were expressed (combined with the regulatory α2δ and β3 subunits) in Xenopus laevis oocytes and studied with the two-electrode voltage-clamp technique. Diltiazem is a benzothiazepine Ca2+ channel blocker used clinically for its antihypertensive and antiarrhythmic effects. Previous radioligand binding assays revealed a complex interaction of M2, M7 and M8 with the benzothiazepine binding site (Carosati E. et al., J. Med. Chem. 2006, 49, 5206). In agreement with these findings, the relative order of increased rates of contraction and relaxation at lower concentrations (≤10^-6 M) in unpaced hearts was M7>M2>M8>P1. Similar increases in the Ca2+ transient were observed in cardiomyocytes. Diltiazem showed negative inotropic effects, whereas 5b had no significant effect. Diltiazem blocks the Ca2+ current in a use-dependent manner, accelerating the inactivation and decelerating the recovery from inactivation. In contrast to diltiazem, the new analogs showed no pronounced use-dependence.
Application of 100 μM M8 or M2 produced ~10% tonic block; in addition, M8, M2 and P1 shifted the steady-state inactivation in the hyperpolarized direction, and the current inactivation time was significantly decreased compared with control (219.6 ± 11.5 ms, 226 ± 14.5 ms and 199.28 ± 8.19 ms vs. 269 ± 12.9 ms). Contrary to diltiazem, the recovery from block by M8 and M2 was comparable to control. Only P1 showed a significant decrease of the time for recovery from inactivation. All of the compounds displayed the same sensitivity on the rabbit lung α1 Ca2+ channel, except P1. Taken together, these findings suggest that M8, M2 and P1 might directly decrease the binding affinity of, or allow rapid dissociation from, the benzothiazepine binding site.
Abstract:
The aim of this Thesis is to investigate the possibility that observations related to the epoch of reionization can probe not only the evolution of the IGM state, but also the cosmological background in which this process occurs. In fact, the history of IGM ionization is affected by the evolution of the sources of ionizing photons which, under the assumption of a structure formation paradigm determined by the hierarchic growth of matter fluctuations, turns out to be strongly dependent on the characteristics of the background universe. For the purpose of our investigation, we have analysed the reionization history in innovative cosmological frameworks, still in agreement with the recent observational tests related to the SNIa and CMB probes, comparing our results with the reionization scenario predicted by the commonly used LCDM cosmology. In particular, in this Thesis we have considered two different alternative universes. The first one is a flat universe dominated at late epochs by a dynamic dark energy component, characterized by an equation of state evolving in time. The second cosmological framework we have assumed is a LCDM model characterized by a primordial overdensity field having a non-Gaussian probability distribution. The reionization scenario has been investigated, in this Thesis, through semi-analytic approaches based on the hierarchic growth of matter fluctuations and on suitable assumptions concerning the ionization and recombination of the IGM. We make predictions for the evolution and distribution of the HII regions, and for the global features of reionization, that can be constrained by future observations. Finally, we briefly discuss the possible future prospects of this Thesis work.
Abstract:
The irrigation scheme Eduardo Mondlane, situated in Chókwè District - in the southern part of Gaza province and within the Limpopo River Basin - is the largest in the country, covering approximately 30,000 hectares of land. Built by the Portuguese colonial administration in the 1950s to exploit the agricultural potential of the area through cash-cropping, after Independence it became one of Frelimo’s flagship projects aiming at the “socialization of the countryside” and at agricultural economic development through the creation of a state farm and of several cooperatives. The failure of Frelimo’s economic reforms, several infrastructural constraints and local farmers’ resistance to collective forms of production led the scheme to a state of severe degradation, aggravated by the floods of the year 2000. A project of technical rehabilitation initiated after the floods is currently accompanied by a strong “efficiency” discourse from the managing institution, which strongly opposes the use of irrigated land for subsistence agriculture, historically a major livelihood strategy for small farmers, particularly for women. In fact, the area has been characterized, since the end of the 19th century, by a stable pattern of male migration towards the South African mines, which has resulted in a steady increase of women-headed households (both de jure and de facto). The relationship between land reform, agricultural development, poverty alleviation and gender equality in Southern Africa has long been debated in the academic literature. Within this debate, the role of agricultural activities in irrigation schemes is particularly interesting considering that, in a drought-prone area, having access to water for irrigation means increased possibilities of improving food and livelihood security, and income levels.
In the case of Chókwè, local government institutions are endorsing the development of commercial agriculture through initiatives such as partnerships with international cooperation agencies or joint ventures with private investors. While these business models can sometimes lead to positive outcomes in terms of poverty alleviation, it is important to recognize that decentralization and neoliberal reforms occur in the context of a financial and political crisis of the State, which lacks the resources to efficiently manage infrastructures such as irrigation systems. This kind of institutional and economic reform risks accelerating processes of social and economic marginalisation, including landlessness, in particular for poor rural women who mainly use irrigated land for subsistence production. The study combines an analysis of the historical and geographical context with the study of relevant literature and original fieldwork. Fieldwork was conducted between February and June 2007 (when I mainly collected secondary data, maps and statistics and made a preliminary visit to Chókwè) and from October 2007 to March 2008. The fieldwork methodology was qualitative and used semi-structured interviews with central and local Government officials, technical experts of the irrigation scheme, civil society organisations, international NGOs, rural extensionists, and water users from the irrigation scheme, in particular women small farmers who are members of local farmers’ associations. Thanks to the collaboration with the Union of Farmers’ Associations of Chókwè, I was able to participate in members’ meetings and in education and training activities addressed to women farmer members of the Union, and to organize a group discussion. In the Chókwè irrigation scheme, women account for 32% of the water users of the family sector (comprising plot-holders with less than 5 hectares of land) and for just 5% of the private sector.
If one considers the farmers’ associations of the family sector (a legacy of Frelimo’s cooperatives), women are 84% of total members. However, the security given to them by the land title that they have acquired through occupation is severely endangered by the use they make of the land, which is considered “non-efficient” by the irrigation scheme authority. Due to reduced access to marketing possibilities and to inputs, training, information and credit, women in actual fact risk seeing their right of access to land and water revoked because they are not able to sustain the increasing cost of the water fee. The myth of the “efficient producer” does not take into consideration the inequality and gender discrimination characteristic of the neo-liberal market. Expecting small farmers, and in particular women, to be able to compete in the globalized agricultural market seems unrealistic, and can perpetuate unequal gendered access to resources such as land and water.