998 results for massive gravitational models


Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

In the past few decades, detailed observations of radio and X-ray emission from massive binary systems have revealed a whole new physics present in such systems. Both the thermal and non-thermal components of this emission indicate that most of the radiation in these bands originates in shocks. O- and B-type stars and Wolf-Rayet (WR) stars present supersonic and massive winds that, when colliding, emit largely via free-free radiation. The non-thermal radio and X-ray emissions are due to synchrotron and inverse Compton processes, respectively. In this case, magnetic fields are expected to play an important role in the emission distribution. In the past few years the modelling of the free-free and synchrotron emission from massive binary systems has been based on purely hydrodynamical simulations and ad hoc assumptions regarding the distribution of magnetic energy and the field geometry. In this work we provide the first full magnetohydrodynamic numerical simulations of wind-wind collision in massive binary systems. We study the free-free emission, characterizing its dependence on the stellar and orbital parameters. We also study self-consistently the evolution of the magnetic field at the shock region, and obtain the synchrotron energy distribution integrated along different lines of sight. We show that the magnetic field in the shocks is larger than that obtained when proportionality between B and the plasma density is assumed. We also show that the contribution of the synchrotron emission to the total radio emission has been underestimated.
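
As a point of reference for the scalings involved (standard radiative-transfer results, not taken from this work), the thermal and non-thermal emissivities and the ad hoc field prescription mentioned above can be written as

\[
j_\nu^{\rm ff} \propto n_e\, n_i\, T^{-1/2}\, \bar g_{\rm ff}\, e^{-h\nu/kT},
\qquad
j_\nu^{\rm sync} \propto N_0\, B^{(p+1)/2}\, \nu^{-(p-1)/2},
\qquad
B \propto \rho \ \ \text{(ad hoc prescription)},
\]

where $N_0$ and $p$ describe the power-law distribution of relativistic electrons, $N(E) = N_0 E^{-p}$. The strong dependence of the synchrotron emissivity on $B$ is what makes a self-consistent MHD treatment of the shocked field important.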

Relevance:

30.00%

Publisher:

Abstract:

The concept behind a biodegradable ligament device is to temporarily replace the biomechanical functions of the ruptured ligament while it progressively regenerates its capacities. However, there is a lack of methods to predict how the mechanical behaviour of biodegradable devices evolves during degradation, which is an important aspect of the project. In this work, a hyperelastic constitutive model is used to predict the mechanical behaviour of a biodegradable rope made of aliphatic polyesters. A numerical approach using ABAQUS is presented, in which the material parameters of the proposed model are automatically updated according to the degradation time by means of a PYTHON script. In this method we also use a User Material subroutine (UMAT) to apply a failure criterion based on a strength that decreases according to a first-order differential equation. The parameterization of the proposed material model for different degradation times was achieved by fitting the theoretical curves to experimental data from tensile tests on fibres. To model the behaviour of the whole rope, a homogenisation step was included, taking into account the fibre architecture in an elementary volume.
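
A minimal sketch of the degradation logic described above, assuming a first-order decay law for the strength and a simple interpolation of fitted hyperelastic parameters over degradation time (all numbers and names are illustrative assumptions, not the actual ABAQUS/PYTHON/UMAT implementation):

    import numpy as np

    # Hypothetical first-order strength decay, S'(t) = -k*S, giving S(t) = S0*exp(-k*t).
    S0 = 60.0   # initial tensile strength [MPa] (assumed)
    k = 0.05    # degradation rate constant [1/week] (assumed)

    def strength(t_weeks):
        """Remaining failure strength after t_weeks of hydrolytic degradation."""
        return S0 * np.exp(-k * t_weeks)

    # Hyperelastic parameters fitted at discrete degradation times (made-up numbers);
    # a driver script could interpolate these and rewrite the solver input deck.
    fit_times = np.array([0.0, 4.0, 8.0, 16.0])   # weeks
    mu_fitted = np.array([12.0, 10.5, 8.0, 4.5])  # shear-like modulus [MPa]

    def material_parameters(t_weeks):
        return np.interp(t_weeks, fit_times, mu_fitted)

    for t in (0, 8, 16):
        print(f"t = {t:2d} weeks: mu = {material_parameters(t):5.2f} MPa, "
              f"failure stress = {strength(t):5.2f} MPa")

In the actual workflow the updated parameters would be written back into the ABAQUS input and the failure check performed inside the UMAT at each increment.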

Relevance:

30.00%

Publisher:

Abstract:

We use the stacked gravitational lensing mass profile of four high-mass (M ≳ 10^15 M_⊙) galaxy clusters around z ≈ 0.3 from Umetsu et al. to fit density profiles of phenomenological [Navarro–Frenk–White (NFW), Einasto, Sérsic, Stadel, Baltz–Marshall–Oguri (BMO) and Hernquist] and theoretical (non-singular isothermal sphere, DARKexp and Kang & He) models of the dark matter distribution. We account for large-scale structure effects by including a two-halo term in the analysis. We find that the BMO model provides the best fit to the data as measured by the reduced χ². It is followed by the Stadel profile, the generalized NFW profile with a free inner slope, and the Einasto profile. The NFW model provides the best fit if we neglect the two-halo term, in agreement with results from Umetsu et al. Among the theoretical profiles, the DARKexp model with a single form parameter has the best performance, very close to that of the BMO profile. This may indicate a connection between this theoretical model and the phenomenology of dark matter haloes, shedding light on the dynamical basis of the empirical profiles that emerge from numerical simulations.
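
A minimal sketch of this kind of profile fitting, using the NFW form ρ(r) = ρ_s / [(r/r_s)(1 + r/r_s)²] on mock data (illustrative only; not the cluster data, the full set of profiles, or the two-halo term used in the paper):

    import numpy as np
    from scipy.optimize import curve_fit

    def nfw(r, rho_s, r_s):
        """NFW density profile."""
        x = r / r_s
        return rho_s / (x * (1.0 + x) ** 2)

    # Mock, noisy profile generated from made-up "true" parameters
    rng = np.random.default_rng(0)
    r = np.logspace(-1, 1, 25)                  # radius (assumed units: Mpc)
    truth = nfw(r, rho_s=5e14, r_s=0.5)
    sigma = 0.1 * truth
    data = truth + rng.normal(0.0, sigma)

    popt, pcov = curve_fit(nfw, r, data, p0=[1e14, 1.0], sigma=sigma)
    chi2 = np.sum(((data - nfw(r, *popt)) / sigma) ** 2)
    dof = len(r) - len(popt)
    print("best-fit rho_s, r_s:", popt, " reduced chi2:", chi2 / dof)

The same reduced-χ² comparison, repeated over a set of candidate profiles, is what ranks the models against one another.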

Relevance:

30.00%

Publisher:

Abstract:

Neutron stars are some of the most fascinating objects in Nature. Essentially all aspects of physics seem to be represented inside them. Their cores are likely to contain deconfined quarks, hyperons and other exotic phases of matter in which the strong interaction is the dominant force. The inner region of their solid crust is penetrated by superfluid neutrons, and their magnetic fields may reach well over 10^12 Gauss. Moreover, their extreme mean densities, well above the densities of nuclei, and their rapid rotation rates make them truly relativistic, both in the special and in the general sense. This thesis deals with a small subset of these phenomena. In particular, the exciting possibility of trapping of gravitational waves is examined from a theoretical point of view. It is shown that the standard condition R < 3M is not essential to the trapping mechanism. This point is illustrated using the elegant tool provided by optical geometry. It is also shown that a realistic equation of state proposed in the literature allows stable neutron star models with closed circular null orbits, something which is closely related to trapped gravitational waves. Furthermore, the general relativistic theory of elasticity is reviewed and applied to stellar models. Both static equilibrium and radially oscillating configurations with elastic sources are examined. Finally, Killing tensors are considered and their applicability to the modelling of stars is discussed.
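
The often-quoted condition comes from the photon sphere of the exterior Schwarzschild geometry: a star with R < 3M fits inside its own closed circular null orbit. For a null geodesic with energy E and angular momentum L (geometric units),

\[
\left(\frac{dr}{d\lambda}\right)^2 = E^2 - \frac{L^2}{r^2}\left(1 - \frac{2M}{r}\right),
\qquad
\frac{d}{dr}\left[\frac{1}{r^2}\left(1 - \frac{2M}{r}\right)\right] = 0
\;\Longrightarrow\; r = 3M,
\]

so the effective potential for photons peaks at r = 3M. The thesis argues that this condition, while suggestive, is not in fact essential to the trapping mechanism.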

Relevance:

30.00%

Publisher:

Abstract:

Flood disasters are a major cause of fatalities and economic losses, and several studies indicate that global flood risk is currently increasing. In order to reduce and mitigate the impact of river flood disasters, the current trend is to integrate existing structural defences with non-structural measures. This calls for a wider application of advanced hydraulic models for flood hazard and risk mapping, engineering design, and flood forecasting systems. Within this framework, two different hydraulic models for large-scale analysis of flood events have been developed. The two models, named CA2D and IFD-GGA, adopt an integrated approach based on the diffusive shallow water equations and a simplified finite volume scheme. The models are also designed for massive code parallelization, which is of key importance in reducing run times in large-scale, high-detail applications. The two models were first applied to several numerical test cases to assess the reliability and accuracy of the different model versions. The most effective versions were then applied to real flood events and flood scenarios. The IFD-GGA model showed serious problems that prevented further applications. The CA2D model, on the contrary, proved to be fast and robust, and able to reproduce 1D and 2D flow processes in terms of water depth and velocity. In most applications the accuracy of the model results was good and adequate for large-scale analysis. Where complex flow processes occurred, local errors were observed due to the model approximations; however, these did not compromise the correct representation of the overall flow processes. In conclusion, the CA2D model can be a valuable tool for the simulation of a wide range of flood event types, including lowland and flash flood events.
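
A minimal sketch of the kind of explicit diffusive-wave (zero-inertia) update such models build on, here in one dimension with Manning friction (illustrative only; the CA2D and IFD-GGA schemes, boundary conditions and parallelization are not reproduced, and all parameter values are assumed):

    import numpy as np

    n_cells, dx, dt = 100, 10.0, 0.5        # cells, grid spacing [m], time step [s]
    manning = 0.03                           # Manning roughness coefficient (assumed)
    z = np.linspace(1.0, 0.0, n_cells)       # bed elevation: gentle downhill slope
    h = np.zeros(n_cells)
    h[0] = 0.5                               # initial pool at the upstream end [m]

    for _ in range(2000):
        H = z + h                            # water-surface elevation
        dH = (H[1:] - H[:-1]) / dx           # surface slope at cell interfaces
        # flow depth at the interface (depth above the higher of the two bed levels)
        h_flow = np.clip(np.maximum(H[1:], H[:-1]) - np.maximum(z[1:], z[:-1]), 0.0, None)
        # Manning-type unit discharge, directed down the water-surface gradient
        q = -np.sign(dH) * h_flow ** (5.0 / 3.0) * np.sqrt(np.abs(dH)) / manning
        # crude limiter: never transfer enough water to reverse the surface gradient
        q_max = np.abs(dH) * dx * dx / (4.0 * dt)
        q = np.clip(q, -q_max, q_max)
        h[1:-1] += dt / dx * (q[:-1] - q[1:])   # continuity update, interior cells
        h[0] += dt / dx * (-q[0])               # upstream cell only drains
        h[-1] += dt / dx * q[-1]                # downstream end is closed: water ponds
        h = np.clip(h, 0.0, None)

    print("final water volume per unit width [m^2]:", (h * dx).sum())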

Relevance:

30.00%

Publisher:

Abstract:

This work considers the reconstruction of strong gravitational lenses from their observed effects on the light distribution of background sources. After reviewing the formalism of gravitational lensing and the most common and relevant lens models, new analytical results on the elliptical power-law lens are presented, including new expressions for the deflection, potential, shear and magnification, which naturally lead to a fast numerical scheme for practical calculation. The main part of the thesis investigates lens reconstruction with extended sources by means of the forward reconstruction method, in which the lenses and sources are given by parametric models. The numerical realities of the problem make it necessary to find targeted optimisations for the forward method in order to make it feasible for general application to modern, high-resolution images. The result of these optimisations is presented in the \textsc{Lensed} algorithm. Subsequently, a number of tests for general forward reconstruction methods are created to decouple the influence of the source from the lens reconstruction, in order to objectively demonstrate the constraining power of the reconstruction. The final chapters on lens reconstruction contain two sample applications of the forward method. The first is the analysis of images from a strong-lensing survey. Such surveys today contain $\sim 100$ strong lenses, and much larger sample sizes are expected in the future, making it necessary to quickly and reliably analyse catalogues of lenses with a fixed model. The second application deals with the opposite situation of a single observation that is to be confronted with different lens models, where the forward method allows for natural model building. This is demonstrated using an example reconstruction of the ``Cosmic Horseshoe''. An appendix presents an independent work on the use of weak gravitational lensing to investigate theories of modified gravity which exhibit screening in the non-linear regime of structure formation.
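
A toy version of the forward method described above, with a singular isothermal sphere (SIS) lens and a Gaussian source standing in for the parametric models (illustrative only; not the \textsc{Lensed} implementation, and all parameter values are assumed):

    import numpy as np

    def sis_deflection(x, y, theta_E):
        """SIS deflection: constant magnitude theta_E, pointing radially."""
        r = np.hypot(x, y) + 1e-12
        return theta_E * x / r, theta_E * y / r

    def gaussian_source(bx, by, x0, y0, sigma, flux):
        return flux * np.exp(-((bx - x0) ** 2 + (by - y0) ** 2) / (2.0 * sigma ** 2))

    def predicted_image(params, x, y):
        theta_E, x0, y0, sigma, flux = params
        ax, ay = sis_deflection(x, y, theta_E)
        bx, by = x - ax, y - ay            # lens equation: beta = theta - alpha(theta)
        return gaussian_source(bx, by, x0, y0, sigma, flux)

    # Image-plane grid and mock data generated from "true" parameters
    grid = np.linspace(-2.0, 2.0, 128)
    X, Y = np.meshgrid(grid, grid)
    truth = (1.0, 0.05, 0.0, 0.1, 1.0)
    rng = np.random.default_rng(1)
    data = predicted_image(truth, X, Y) + rng.normal(0.0, 0.01, X.shape)

    def chi2(params):
        model = predicted_image(params, X, Y)
        return np.sum((data - model) ** 2 / 0.01 ** 2)

    print("chi2 at truth:", chi2(truth),
          " chi2 with wrong theta_E:", chi2((1.2, 0.05, 0.0, 0.1, 1.0)))

An optimiser or sampler would then vary the joint lens-plus-source parameter vector to minimise this χ², which is the essence of the forward reconstruction approach.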

Relevance:

30.00%

Publisher:

Abstract:

21 cm cosmology opens an observational window onto previously unexplored cosmological epochs such as the Epoch of Reionization (EoR), the Cosmic Dawn and the Dark Ages, using powerful radio interferometers such as the planned Square Kilometre Array (SKA). Among the many applications that can potentially improve the understanding of standard cosmology, we study the promising opportunity offered by measuring the weak gravitational lensing sourced by 21 cm radiation. We performed this study for two different cosmological epochs, at a typical EoR redshift and subsequently at a post-EoR redshift. We show how the lensing signal can be reconstructed using a three-dimensional optimal quadratic lensing estimator in Fourier space, using a single frequency band or combining measurements from multiple frequency bands. To this purpose, we implemented a simulation pipeline capable of dealing with issues that cannot be treated analytically. Considering the current SKA plans, we studied the performance of the quadratic estimator at typical EoR redshifts, for different survey strategies and comparing two thermal noise models for the SKA-Low array. The simulations take into account the beam of the telescope and the discreteness of visibility measurements. We found that an SKA-Low interferometer should obtain high-fidelity images of the underlying mass distribution in its phase 1 only if several bands are stacked together, covering a redshift range from z = 7 to z = 11.5. SKA-Low phase 2, modelled so as to improve the sensitivity of the instrument by almost an order of magnitude, should be capable of providing images of good quality even when the signal is detected within a single frequency band. Considering also the serious effect that foregrounds could have on these detections, we discuss the limits of these results as well as the possibility, offered by these models, of measuring an accurate lensing power spectrum.
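
For orientation, quadratic lensing estimators of this type take the generic (Hu-Okamoto-style) form below; the specific three-dimensional 21 cm version used in this work differs in its details:

\[
\hat\phi(\mathbf{L}) = A(\mathbf{L}) \int \frac{d^2\boldsymbol{\ell}}{(2\pi)^2}\,
\tilde T(\boldsymbol{\ell})\, \tilde T(\mathbf{L}-\boldsymbol{\ell})\,
g(\boldsymbol{\ell},\mathbf{L}),
\]

where the weights $g$ are chosen to minimise the estimator variance and the normalisation $A(\mathbf{L})$ is fixed by requiring the estimator to be unbiased. In the 21 cm case the temperature modes carry an extra line-of-sight wavenumber, and independent $k_\parallel$ modes within a band (or across bands) are combined to reduce the reconstruction noise.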

Relevance:

30.00%

Publisher:

Abstract:

Open collaborative projects are moving to the foreground of knowledge production. Some online user communities develop into long-term projects that generate output which is highly valuable and at the same time freely accessible. Traditional copyright law, organized around the idea of a single creative entity, is not well equipped to accommodate the needs of these forms of collaboration. In order to enable this peculiar network type of interaction, participants instead draw on public licensing models that determine the freedoms to use individual contributions. With the help of these access rules, the operational logic of the project can be implemented successfully. However, as the case of the Wikipedia GFDL-CC license transition demonstrates, the adaptation of access rules in networks to new circumstances raises collective action problems and suffers from pitfalls caused by the fact that public licensing is grounded in individual copyright. Legal governance of open collaboration projects is a largely unexplored field. The article argues that the license steward of a public license assumes the position of a fiduciary of the knowledge commons generated under the license regime. Ultimately, the governance of decentralized networks translates into a composite of organizational and contractual elements. It is concluded that the production of global knowledge commons relies on rules of transnational private law.

Relevance:

30.00%

Publisher:

Abstract:

Context. According to the sequential accretion model (or core-nucleated accretion model), giant planet formation begins with the formation of a solid core which, when massive enough, can gravitationally bind gas from the nebula to form the envelope. The most critical part of the model is the formation time of the core: to trigger the accretion of gas, the core has to grow to several Earth masses before the gas component of the protoplanetary disc dissipates. Aims: We calculate planetary formation models including a detailed description of the dynamics of the planetesimal disc, taking into account both gas drag and the excitation produced by forming planets. Methods: We computed the formation of planets, considering the oligarchic regime for the growth of the solid core. Embryos growing in the disc stir their neighbouring planetesimals, exciting their relative velocities, which makes accretion more difficult. Here we introduce a more realistic treatment of the evolution of the planetesimals' relative velocities, which directly impacts the formation timescale. For this, we computed the excitation state of the planetesimals resulting from stirring by forming planets and from gas-solid interactions. Results: We find that the formation of giant planets is favoured by the accretion of small planetesimals, as their random velocities are more easily damped by the gas drag of the nebula. Moreover, the capture radius of a protoplanet with a (tiny) envelope is also larger for small planetesimals. However, planets migrate as a result of disc-planet angular momentum exchange, with important consequences for their survival: owing to the slow growth of a protoplanet in the oligarchic regime, rapid inward type I migration has important implications for intermediate-mass planets that have not yet started their runaway gas accretion phase. Most of these planets are lost into the central star. Surviving planets have masses either below 10 M⊕ or above several Jupiter masses. Conclusions: To form giant planets before the dissipation of the disc, small planetesimals (~0.1 km) have to be the major contributors to the solid accretion process. However, the combination of oligarchic growth and fast inward migration leads to the absence of intermediate-mass planets. Other processes must therefore be at work to explain the population of extrasolar planets that are presently known.
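
A back-of-the-envelope illustration of why small planetesimals are damped so much more easily: in the quadratic drag regime the damping timescale of random velocities scales linearly with the planetesimal radius, t_drag ~ (8/3) ρ_p R / (C_D ρ_gas v_rel). All numbers below are assumed, order-of-magnitude values, not those of the paper's model:

    import numpy as np

    rho_p = 2000.0    # planetesimal bulk density [kg/m^3] (assumed)
    rho_gas = 1e-6    # nebular gas density at a few AU [kg/m^3] (assumed)
    C_D = 1.0         # drag coefficient, order unity
    v_rel = 100.0     # random velocity relative to the gas [m/s] (assumed)
    year = 3.15e7     # seconds per year

    for R in (100.0, 1e3, 1e4, 1e5):   # planetesimal radius: 0.1 km .. 100 km
        t_drag = (8.0 / 3.0) * rho_p * R / (C_D * rho_gas * v_rel)
        print(f"R = {R/1e3:6.1f} km  ->  damping time ~ {t_drag/year:9.2e} yr")

The linear scaling with R is the reason ~0.1 km bodies can keep their random velocities low enough for efficient accretion while 100 km bodies cannot.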

Relevance:

30.00%

Publisher:

Abstract:

Aims. We study the link between gravitational slopes and the surface morphology on the nucleus of comet 67P/Churyumov-Gerasimenko and provide constraints on the mechanical properties of the cometary material (tensile, shear, and compressive strengths). Methods. We computed the gravitational slopes for five regions on the nucleus that are representative of the different morphologies observed on the surface (Imhotep, Ash, Seth, Hathor, and Agilkia), using two shape models computed from OSIRIS images by the stereo-photoclinometry (SPC) and stereo-photogrammetry (SPG) techniques. We estimated the tensile, shear, and compressive strengths using different surface morphologies (overhangs, collapsed structures, boulders, cliffs, and Philae's footprint) and mechanical considerations. Results. The different regions show a similar general pattern in the relation between gravitational slopes and terrain morphology: i) low-slope terrains (0-20 degrees) are covered by fine material and contain a few large (>10 m) and isolated boulders; ii) intermediate-slope terrains (20-45 degrees) are mainly fallen consolidated materials and debris fields, with numerous intermediate-size boulders, the majority between <1 m and 10 m; and iii) high-slope terrains (45-90 degrees) are cliffs that expose consolidated material and do not show boulders or fine materials. The best range for the tensile strength of overhangs is 3-15 Pa (upper limit of 150 Pa), 4-30 Pa for the shear strength of fine surface materials and boulders, and 30-150 Pa for the compressive strength of overhangs (upper limit of 1500 Pa). The strength-to-gravity ratio is similar for 67P and weak rocks on Earth. As a result of the low compressive strength, the interior of the nucleus may have been compressed sufficiently to initiate diagenesis, which could have contributed to the formation of layers. Our value for the tensile strength is comparable to that of dust aggregates formed by gravitational instability and tends to favour a formation of comets by the accretion of pebbles at low velocities.
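
The order of magnitude of these strengths can be recovered with a simple, hedged estimate: material at the base of an overhang of height h must support roughly its own weight, so

\[
\sigma_T \sim \rho\, g\, h \approx 500\ \mathrm{kg\,m^{-3}} \times 2\times10^{-4}\ \mathrm{m\,s^{-2}} \times (10\text{--}100\ \mathrm{m}) \approx 1\text{--}10\ \mathrm{Pa},
\]

taking typical values for the bulk density and surface gravity of 67P. This is consistent with the few-pascal tensile strengths quoted above; the paper's own estimates rest on more detailed mechanical considerations.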

Relevance:

30.00%

Publisher:

Abstract:

Aims. We extend the results of planetary formation synthesis by computing the long-term evolution of synthetic systems from the clearing of the gas disk into the dynamical evolution phase. Methods. We use the symplectic integrator SyMBA to numerically integrate the orbits of planets for 100 Myr, using populations from previous studies as initial conditions. Results. We show that, within the populations studied, the mass and semimajor axis distributions experience only minor changes from post-formation evolution. We also show that, depending upon their initial distribution, planetary eccentricities can statistically increase or decrease as a result of gravitational interactions. We find that the planetary masses and orbital spacings provided by planet formation models do not result in eccentricity distributions comparable to observed exoplanet eccentricities, requiring other phenomena, such as stellar fly-bys, to account for them.
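
A minimal kick-drift-kick leapfrog sketch of the kind of symplectic integration involved (illustrative only; SyMBA additionally uses a democratic heliocentric splitting and special handling of close encounters, and the bodies below are made-up analogues):

    import numpy as np

    G = 4 * np.pi ** 2                               # AU^3 / (Msun * yr^2)
    m = np.array([1.0, 1e-3, 3e-4])                  # star, "Jupiter", "Saturn" [Msun]
    r = np.array([[0.0, 0.0], [5.2, 0.0], [9.5, 0.0]])           # positions [AU]
    v = np.array([[0.0, 0.0],
                  [0.0, np.sqrt(G / 5.2)],
                  [0.0, np.sqrt(G / 9.5)]])                      # circular speeds [AU/yr]

    def accelerations(r):
        a = np.zeros_like(r)
        for i in range(len(m)):
            for j in range(len(m)):
                if i != j:
                    d = r[j] - r[i]
                    a[i] += G * m[j] * d / np.linalg.norm(d) ** 3
        return a

    dt = 0.01                                        # yr; well below the inner period
    for _ in range(50_000):                          # 500 yr of evolution
        v += 0.5 * dt * accelerations(r)             # kick
        r += dt * v                                  # drift
        v += 0.5 * dt * accelerations(r)             # kick

    print("final Jupiter-analogue distance from star [AU]:",
          np.linalg.norm(r[1] - r[0]))

The symplectic splitting is what keeps energy errors bounded over the very long (100 Myr) integrations used in the study.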

Relevance:

30.00%

Publisher:

Abstract:

Maximizing data quality may be especially difficult in trauma-related clinical research. Strategies are needed to improve data quality and assess the impact of data quality on clinical predictive models. This study had two objectives. The first was to compare missing data between two multi-center trauma transfusion studies: a retrospective study (RS) using medical chart data with minimal data quality review, and the PRospective Observational Multi-center Major Trauma Transfusion (PROMMTT) study with standardized quality assurance. The second objective was to assess the impact of missing data on clinical prediction algorithms by evaluating blood transfusion prediction models using PROMMTT data. RS (2005-06) and PROMMTT (2009-10) investigated trauma patients receiving ≥ 1 unit of red blood cells (RBC) at ten Level I trauma centers. Missing data were compared for 33 variables collected in both studies using mixed-effects logistic regression (including random intercepts for study site). Massive transfusion (MT) patients received ≥ 10 RBC units within 24 h of admission. Correct classification percentages for three MT prediction models were evaluated using complete-case analysis and multiple imputation based on the multivariate normal distribution. A sensitivity analysis for missing data was conducted to estimate the upper and lower bounds of correct classification under best- and worst-case assumptions about the missing data. Most variables (17/33 = 52%) had <1% missing data in both RS and PROMMTT. Of the remaining variables, 50% had less missingness in PROMMTT, 25% had less missingness in RS, and 25% were similar between studies. Missing percentages for MT prediction variables in PROMMTT ranged from 2.2% (heart rate) to 45% (respiratory rate). For variables missing >1%, study site was associated with missingness (all p ≤ 0.021). Survival time predicted missingness for 50% of RS and 60% of PROMMTT variables. Complete-case proportions for the MT models ranged from 41% to 88%. Complete-case analysis and multiple imputation demonstrated similar correct classification results. Sensitivity analysis upper-lower bound ranges for the three MT models were 59-63%, 36-46%, and 46-58%. Prospective collection of ten-fold more variables with data quality assurance reduced overall missing data. Study site and patient survival were associated with missingness, suggesting that data were not missing completely at random and that complete-case analysis may lead to biased results. Evaluating clinical prediction model accuracy may be misleading in the presence of missing data, especially with many predictor variables. The proposed sensitivity analysis, which estimates correct classification under upper (best-case) and lower (worst-case) bounds, may be more informative than multiple imputation, which provided results similar to complete-case analysis.
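
A minimal sketch of the comparison between complete-case analysis and multiple imputation, on synthetic data whose missingness depends on the outcome (illustrative only: the variable names, missingness mechanism and models are assumptions, not the PROMMTT analysis, and the imputer here is regression-based rather than based on the multivariate normal distribution):

    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import IterativeImputer
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 2000
    hr = rng.normal(100, 20, n)                   # "heart rate"-like predictor
    rr = rng.normal(20, 6, n)                     # "respiratory rate"-like predictor
    logit = -8 + 0.05 * hr + 0.12 * rr
    y = rng.random(n) < 1 / (1 + np.exp(-logit))  # "massive transfusion"-like outcome

    X = np.column_stack([hr, rr])
    miss = rng.random(n) < np.clip(0.1 + 0.4 * y, 0, 1)  # missingness depends on outcome
    X_obs = X.copy()
    X_obs[miss, 1] = np.nan                               # second predictor goes missing

    # Complete-case analysis: drop rows with any missing predictor
    cc = ~np.isnan(X_obs[:, 1])
    model_cc = LogisticRegression(max_iter=1000).fit(X_obs[cc], y[cc])
    acc_cc = model_cc.score(X_obs[cc], y[cc])

    # Multiple imputation: average correct classification over several imputed data sets
    accs = []
    for seed in range(5):
        Xi = IterativeImputer(random_state=seed, sample_posterior=True).fit_transform(X_obs)
        accs.append(LogisticRegression(max_iter=1000).fit(Xi, y).score(Xi, y))

    print(f"complete-case accuracy: {acc_cc:.3f}   multiple-imputation accuracy: {np.mean(accs):.3f}")

Correct classification is computed on the fitted data here purely for illustration; a real evaluation would use held-out data, as in the study's model comparisons.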