970 results for Black box approach


Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Abstract:

We solve the Einstein equations on the brane to derive the exact form of the brane-world-corrected perturbations of Kerr-Newman singularities, using the Randall-Sundrum and Arkani-Hamed-Dimopoulos-Dvali (ADD) models. It is a consequence of such models that Kerr-Newman mini black holes can be produced at the LHC. We use this approach to derive a normalized correction to the Schwarzschild Myers-Perry radius of a static (4+n)-dimensional mini black hole, using more realistic approaches arising from the Kerr-Newman mini-black-hole analysis. In addition, we prove that there are four Kerr-Newman black hole horizons in the brane-world scenario we use, although only the outer horizon is relevant to physically measurable processes. Parton cross sections at the LHC and the Hawking temperature are also investigated as functions of the Planck mass (in the LHC range 1-10 TeV), the mini-black-hole mass, and the number of large extra dimensions in brane-world large-extra-dimensional scenarios. In this case a more realistic brane-effect-corrected formalism can more precisely determine the effective extra-dimensional Planck mass and the number of large extra dimensions (in the ADD model) or the size of the warped extra dimension (in the Randall-Sundrum formalism).
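For orientation, the horizon radius and Hawking temperature of a static, uncharged (4+n)-dimensional mini black hole have standard textbook forms in the ADD scenario. The sketch below uses those uncorrected expressions (not the brane-corrected formulas derived in the abstract above), in natural units with all masses in TeV:

```python
import math

def horizon_radius(m_bh, m_planck, n):
    """Horizon radius r_H (in 1/TeV) of a static (4+n)-D mini black hole.

    Standard ADD-scenario expression (natural units, hbar = c = 1):
        r_H = (1/(sqrt(pi)*M_*)) * [ (M_BH/M_*) * 8*Gamma((n+3)/2) / (n+2) ]^(1/(n+1))
    """
    prefactor = 1.0 / (math.sqrt(math.pi) * m_planck)
    bracket = (m_bh / m_planck) * 8.0 * math.gamma((n + 3) / 2.0) / (n + 2.0)
    return prefactor * bracket ** (1.0 / (n + 1.0))

def hawking_temperature(m_bh, m_planck, n):
    """Hawking temperature T_H = (n+1)/(4*pi*r_H), in TeV."""
    return (n + 1.0) / (4.0 * math.pi * horizon_radius(m_bh, m_planck, n))

# A 5 TeV mini black hole with effective Planck mass M_* = 1 TeV,
# for n = 2..6 large extra dimensions (hypothetical LHC-range values):
for n in range(2, 7):
    print(n, round(hawking_temperature(5.0, 1.0, n), 3))
```

Note that the temperature rises with the number of extra dimensions, which is why n enters the parton cross-section and Hawking-evaporation estimates mentioned above.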

Abstract:

In this work we develop an approach to obtain analytical expressions for potentials in an impenetrable box. For this kind of system, the expressions have the advantage of being valid for arbitrary values of the box length and of respecting the correct quantum limits. The similarity of this kind of problem with quasi-exactly solvable potentials is explored in order to accomplish our goals. Problems related to the breaking of symmetries and to simultaneous eigenfunctions of commuting operators are discussed.
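The "correct quantum limits" referred to above are those of the bare impenetrable box. As a point of reference, a minimal sketch of the infinite-well spectrum in natural units (this is the limiting case, not the paper's analytical expressions for potentials inside the box):

```python
import math

def infinite_well_energy(n, L, m=1.0, hbar=1.0):
    """Energy level n of a particle in an impenetrable box of length L:
        E_n = n^2 * pi^2 * hbar^2 / (2 m L^2)
    (natural units). Any analytical expression for a potential inside the
    box must reduce to these levels in the appropriate limit."""
    return (n * math.pi * hbar) ** 2 / (2.0 * m * L ** 2)

# Doubling the box length reduces every level by a factor of four:
for n in (1, 2, 3):
    print(n, infinite_well_energy(n, 1.0), infinite_well_energy(n, 2.0))
```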

Abstract:

Studies have been carried out on heat transfer in a packed bed of glass beads percolated by air at moderate flow rates. A rigorous statistical analysis of the experimental data was carried out, and the traditional two-parameter model was used to represent them. The parameters estimated by the least squares method were the effective radial thermal conductivity, k, and the wall coefficient, h. The results were evaluated with respect to the bed inlet boundary temperature, T0, the number of terms in the solution series, and the number of experimental points used in the estimation. Results indicated that a small difference in T0 was sufficient to promote great modifications in the estimated parameters and in the statistical properties of the model. The use of replicas at points of high parametric information of the model improved the results, although analysis of the residuals resulted in the rejection of this alternative. In order to evaluate the non-linearity of the model, the Bates and Watts (1988) curvature measures and the Box (1971) biases of the coefficients were calculated. The intrinsic curvatures of the model (IN) tend to be concentrated at low bed heights, while those due to parameter effects (PE) are spread all over the bed. The Box biases indicated both parameters as responsible for the PE curvatures, h being somewhat more problematic. (C) 2000 Elsevier B.V. Ltd. All rights reserved.
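Least-squares estimation of a two-parameter model, as used above for k and h, can be sketched generically with a Gauss-Newton iteration. The exponential model below is a hypothetical stand-in for the paper's series solution, just to show the mechanics:

```python
import math

def gauss_newton_2p(model, xs, ys, p0, iters=50, eps=1e-6):
    """Least-squares fit of a two-parameter model y = f(x; p1, p2)
    by Gauss-Newton iteration with forward-difference Jacobians.
    Generic sketch: the paper's series-solution model in k and h
    would take the place of `model`."""
    p = list(p0)
    for _ in range(iters):
        r = [y - model(x, *p) for x, y in zip(xs, ys)]
        # Jacobian rows: [df/dp1, df/dp2] at each x_i
        J = []
        for x in xs:
            row = []
            for j in range(2):
                q = list(p)
                q[j] += eps
                row.append((model(x, *q) - model(x, *p)) / eps)
            J.append(row)
        # Normal equations (J^T J) dp = J^T r, solved directly (2x2 system)
        a = sum(row[0] * row[0] for row in J)
        b = sum(row[0] * row[1] for row in J)
        c = sum(row[1] * row[1] for row in J)
        g0 = sum(row[0] * ri for row, ri in zip(J, r))
        g1 = sum(row[1] * ri for row, ri in zip(J, r))
        det = a * c - b * b
        dp0 = (c * g0 - b * g1) / det
        dp1 = (a * g1 - b * g0) / det
        p[0] += dp0
        p[1] += dp1
        if abs(dp0) + abs(dp1) < 1e-10:
            break
    return p

# Recover parameters from synthetic noiseless data y = 2.0 * exp(-0.5 x):
xs = [0.1 * i for i in range(20)]
ys = [2.0 * math.exp(-0.5 * x) for x in xs]
p1, p2 = gauss_newton_2p(lambda x, a, b: a * math.exp(-b * x), xs, ys, [1.0, 1.0])
print(round(p1, 3), round(p2, 3))  # ≈ 2.0 0.5
```

The paper's observation that small changes in T0 strongly perturb the estimates corresponds, in this picture, to near-collinear Jacobian columns (a small determinant of the normal equations).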

Abstract:

Predation is a primary driver of tadpole assemblages, and the activity rate is a good predictor of tadpoles' tolerance of predation risk. The conflicting demands of activity and exposure to predation can generate suboptimal behaviours. Because morphological components, such as body colouration, may affect the activity of tadpoles, we predicted that environmental features that enhance or match the tadpole colouration should affect their survival or activity rate in the presence of a predator. We tested this prediction experimentally by assessing the mortality rate and activity time of tadpoles of Rhinella schneideri and Eupemphix nattereri on two artificial background types: one bright-coloured and one black-coloured. We found no difference in tadpole mortality due to the background type. However, R. schneideri tadpoles were more active than E. nattereri tadpoles, and the activity of R. schneideri was reduced less in the presence of the predator than that of E. nattereri. Although the background colouration did not affect the tadpole mortality rate, it was a stimulus that elicited behavioural responses in the tadpoles, leading them to adjust their activity rate to the type of background colour. © 2013 Dipartimento di Biologia, Università degli Studi di Firenze, Italia.

Abstract:

The objective of this descriptive-exploratory study was to identify the health beliefs of black individuals with hypertension regarding the barriers to and benefits of diet for controlling the disease, including the sociodemographic factors associated with the health beliefs surrounding diet control. One hundred and six black adults with hypertension were interviewed using a specific instrument. The data were analyzed considering percentages, frequency of cases, scores, and prevalence ratios. The global analysis of beliefs showed a preponderance of beliefs regarding the benefits of diet control. It was observed that male sex, younger age, lack of a partner, and low educational level and income were related to beliefs regarding the benefits of adopting a healthy diet. In conclusion, health promotion among the black population requires an interdisciplinary approach and specific health policies addressing this population's needs, aimed at both preventive and curative aspects.

Abstract:

We show that a single imperfect fluid can be used as a source to obtain a mass-varying black hole in an expanding universe. This approach generalizes the well-known McVittie spacetime, by allowing the mass to vary thanks to a novel mechanism based on the presence of a temperature gradient. This fully dynamical solution, which does not require phantom fields or fine-tuning, is a step forward in a new direction in the study of systems whose local gravitational attraction is coupled to the expansion of the universe. We present a simple but instructive example for the mass function and briefly discuss the structure of the apparent horizons and the past singularity.

Abstract:

We discuss the gravitational collapse of a spherically symmetric massive stellar core in which the fluid component interacts with a growing vacuum energy density. The influence of the variable vacuum on the collapsing core is quantified by a phenomenological parameter beta, as predicted by dimensional arguments and the renormalization group approach. For all reasonable values of this free parameter, we find that the vacuum energy density increases the collapse time, but it cannot prevent the formation of a singular point. However, the nature of the singularity depends on the value of beta. In the radiation case, a trapped surface is formed for beta <= 1/2, whereas for beta > 1/2 a naked singularity develops. In general, the critical value is beta = 1 - 2/[3(1 + omega)], where omega is the parameter describing the equation of state of the fluid component.

Abstract:

Background: Despite evidence that health and disease occur in social contexts, the vast majority of studies addressing dental pain exclusively assess information gathered at the individual level. Objectives: To assess the association between dental pain and contextual and individual characteristics in Brazilian adolescents. In addition, we aimed to test whether the contextual Human Development Index is independently associated with dental pain after adjusting for individual-level socio-demographic and dental variables. Methods: The study used data from an oral health survey carried out in São Paulo, Brazil, which included dental pain, dental exams, individual socioeconomic and demographic conditions, and the area-level Human Development Index for 4,249 12-year-old and 1,566 15-year-old schoolchildren. Poisson multilevel analysis was performed. Results: Dental pain was found among 25.6% (95% CI = 24.5-26.7) of the adolescents and was 33% less prevalent among those living in more developed areas of the city than among those living in less developed areas. Girls, blacks, those whose parents have low income and low schooling, those studying at public schools, and those with dental treatment needs presented higher dental-pain prevalence than their counterparts. Area HDI remained associated with dental pain after adjusting for individual-level socio-demographic and dental variables. Conclusions: Girls, students whose parents have low schooling, those with low per capita income, those classified as having black skin color, and those with dental treatment needs had higher dental-pain prevalence than their counterparts. Students from areas with a low Human Development Index had a higher prevalence of dental pain than those from more developed areas, regardless of individual characteristics.

Abstract:

The main problem with cone-beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the large amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results for the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 x 70 mm2, and that it depends strongly on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high-quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which could be patented by Empa.
The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool for optimizing the design of the CT system and for evaluating the contribution of the scattered radiation to the image. In addition, it has provided the basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a factor-10 gain in reconstruction time. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
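A scatter correction of the kind described above subtracts an MC-estimated scatter contribution from each measured projection before computing the attenuation line integrals. A minimal sketch with hypothetical count values (all numbers invented for illustration; the real scatter estimates would come from the GEANT4 model):

```python
import math

I0 = 1.0e5  # unattenuated beam intensity (hypothetical flat-field counts)
measured = [52000.0, 31000.0, 22000.0]  # primary + scatter, per detector pixel
scatter  = [12000.0, 11000.0, 10500.0]  # MC-estimated scatter, per pixel

def attenuation(signal, flat=I0):
    """Attenuation line integral, p = -ln(I / I0)."""
    return -math.log(signal / flat)

uncorrected = [attenuation(m) for m in measured]
corrected = [attenuation(m - s) for m, s in zip(measured, scatter)]

# Scatter adds a roughly constant offset, flattening the profile (the
# 'cupping' artefact); subtracting it restores contrast between pixels:
print([round(p, 3) for p in uncorrected])
print([round(p, 3) for p in corrected])
```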

Abstract:

In this work we aim to propose a new approach for preliminary epidemiological studies on Standardized Mortality Ratios (SMRs) collected in many spatial regions. A preliminary study on SMRs aims to formulate hypotheses to be investigated via individual epidemiological studies, which avoid the bias carried by aggregated analyses. Starting from collecting disease counts and calculating expected disease counts by means of reference-population disease rates, in each area an SMR is derived as the MLE under the Poisson assumption on each observation. Such estimators have high standard errors in small areas, i.e. where the expected count is low, either because of the low population underlying the area or because of the rarity of the disease under study. Disease mapping models and other techniques for screening disease rates across the map, aiming to detect anomalies and possible high-risk areas, have been proposed in the literature according to both the classic and the Bayesian paradigm. Our proposal approaches this issue with a decision-oriented method, which focuses on multiple testing control, without however leaving the preliminary-study perspective that an analysis of SMR indicators is asked for. We implement control of the FDR, a quantity largely used to address multiple comparison problems in the field of microarray data analysis but not usually employed in disease mapping. Controlling the FDR means providing an estimate of the FDR for a set of rejected null hypotheses. The small-areas issue raises difficulties in applying traditional methods for FDR estimation, which are usually based only on knowledge of the p-values (Benjamini and Hochberg, 1995; Storey, 2003). Tests evaluated by a traditional p-value provide weak power in small areas, where the expected number of disease cases is small. Moreover, the tests cannot be assumed independent when spatial correlation between SMRs is expected, nor are they identically distributed when the population underlying the map is heterogeneous.
The Bayesian paradigm offers a way to overcome the inappropriateness of p-value-based methods. Another peculiarity of the present work is to propose a hierarchical fully Bayesian model for FDR estimation when testing many null hypotheses of absence of risk. We use concepts from Bayesian disease mapping models, referring in particular to the Besag, York and Mollié model (1991), often used in practice for its flexible prior assumption on the distribution of risks across regions. The borrowing of strength between prior and likelihood typical of a hierarchical Bayesian model has the advantage of evaluating a single test (i.e. a test in a single area) by means of all the observations in the map under study, rather than just by means of the single observation. This improves the power of the test in small areas and addresses more appropriately the spatial correlation issue, which suggests that relative risks are closer in spatially contiguous regions. The proposed model aims to estimate the FDR by means of the MCMC-estimated posterior probabilities b_i of the null hypothesis (absence of risk) for each area. An estimate of the expected FDR conditional on the data (denoted FDR-hat) can be calculated for any set of b_i's relative to areas declared at high risk (where the null hypothesis is rejected), by averaging the b_i's themselves. FDR-hat can be used to provide an easy decision rule for selecting high-risk areas, i.e. selecting as many areas as possible such that FDR-hat does not exceed a prefixed value; we call these FDR-hat-based decision (or selection) rules. The sensitivity and specificity of such a rule depend on the accuracy of the FDR estimate, over-estimation of the FDR causing a loss of power and under-estimation of the FDR producing a loss of specificity. Moreover, our model has the interesting feature of still being able to provide an estimate of the relative risk values, as in the Besag, York and Mollié model (1991).
A simulation study was set up to evaluate the model's performance in terms of FDR estimation accuracy, sensitivity and specificity of the decision rule, and goodness of estimation of the relative risks. We chose a real map from which we generated several spatial scenarios whose disease counts vary according to the degree of spatial correlation, the size of the areas, the number of areas where the null hypothesis is true, and the risk level in the latter areas. In summarizing the simulation results we always consider the FDR estimation in sets constituted by all the b_i's below a threshold t. We show graphs of FDR-hat and of the true FDR (known by simulation) plotted against the threshold t to assess the FDR estimation. By varying the threshold we can learn which FDR values can be accurately estimated by a practitioner willing to apply the model (from the closeness between FDR-hat and the true FDR). By plotting the calculated sensitivity and specificity (both known by simulation) against FDR-hat, we can check the sensitivity and specificity of the corresponding FDR-hat-based decision rules. To investigate the over-smoothing of the relative risk estimates, we compare box-plots of such estimates in high-risk areas (known by simulation) obtained by both our model and the classic Besag, York and Mollié model. All these summary tools are worked out for all simulated scenarios (54 scenarios in total). Results show that the FDR is well estimated (in the worst case we get an over-estimation, hence a conservative FDR control) in the small-areas, low-risk-level and spatially-correlated-risks scenarios, which are our primary aim. In such scenarios we have good estimates of the FDR for all values less than or equal to 0.10. The sensitivity of FDR-hat-based decision rules is generally low, but the specificity is high. In such scenarios the use of a selection rule based on FDR-hat = 0.05 or FDR-hat = 0.10 can be suggested.
In cases where the number of true alternative hypotheses (the number of true high-risk areas) is small, FDR values up to 0.15 are also well estimated, and decision rules based on FDR-hat = 0.15 gain power while maintaining a high specificity. On the other hand, in non-small-areas and non-small-risk-level scenarios the FDR is under-estimated except for very small values (much lower than 0.05), resulting in a loss of specificity of an FDR-hat = 0.05 based decision rule. In such scenarios, decision rules based on FDR-hat = 0.05 or, even worse, FDR-hat = 0.10 cannot be suggested, because the true FDR is actually much higher. As regards relative risk estimation, our model achieves almost the same results as the classic Besag, York and Mollié model. For this reason, our model is interesting for its ability to perform both the estimation of relative risk values and FDR control, except in non-small-areas and large-risk-level scenarios. A case study is finally presented to show how the method can be used in epidemiology.
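The FDR-hat decision rule described above fits in a few lines: given posterior null probabilities b_i (here invented values standing in for MCMC output), the estimated FDR of a selected set of areas is simply the mean of its b_i's:

```python
def fdr_hat(b, threshold):
    """Estimated FDR of the rule 'declare area i high-risk if b_i <= threshold',
    where b_i is the posterior probability of the null hypothesis (absence of
    risk) in area i. The estimate is the mean b_i over the selected areas."""
    selected = [bi for bi in b if bi <= threshold]
    if not selected:
        return 0.0, []
    return sum(selected) / len(selected), selected

# Hypothetical posterior null probabilities for 8 areas (e.g. from MCMC):
b = [0.01, 0.03, 0.40, 0.02, 0.70, 0.05, 0.90, 0.04]

# Raising the threshold selects more areas but drives the estimated FDR up:
for t in (0.05, 0.10, 0.50):
    est, sel = fdr_hat(b, t)
    print(t, len(sel), round(est, 3))
```

In practice one would take the largest threshold whose FDR-hat stays below the prefixed target (e.g. 0.05 or 0.10), per the selection rule described above.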

Abstract:

In this work we are concerned with the analysis and numerical solution of Black-Scholes-type equations arising in the modeling of incomplete financial markets, and with an inverse problem of determining the local volatility function in a generalized Black-Scholes model from observed option prices. In the first chapter, a fully nonlinear Black-Scholes equation that models transaction costs arising in option pricing is discretized by a new high-order compact scheme. The compact scheme is proved to be unconditionally stable and non-oscillatory, and is very efficient compared to classical schemes. Moreover, it is shown that the finite difference solution converges locally uniformly to the unique viscosity solution of the continuous equation. In the next chapter we turn to the calibration problem of computing local volatility functions from market data in a generalized Black-Scholes setting. We follow an optimal control approach in a Lagrangian framework. We show the existence of a global solution and study first- and second-order optimality conditions. Furthermore, we propose an algorithm based on a globalized sequential quadratic programming method and a primal-dual active set strategy, and present numerical results. In the last chapter we consider a quasilinear parabolic equation with quadratic gradient terms, which arises in the modeling of an optimal portfolio in incomplete markets. The existence of weak solutions is shown by considering a sequence of approximate solutions; the main difficulty of the proof is to infer the strong convergence of the sequence. Furthermore, we prove the uniqueness of weak solutions under a smallness condition on the derivatives of the covariance matrices with respect to the solution, but without additional regularity assumptions on the solution. The results are illustrated by a numerical example.
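The calibration problem above inverts observed option prices for volatility. As a baseline, the constant-volatility special case can be inverted with the classical Black-Scholes formula and bisection; this is a sketch of the principle only, not the thesis's local-volatility optimal control method:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Classical Black-Scholes European call price (constant volatility)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert the BS formula for sigma by bisection (the call price is
    strictly increasing in sigma, so bisection is safe)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip: price at sigma = 0.2, then recover it from the price.
p = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)
print(round(implied_vol(p, 100.0, 100.0, 1.0, 0.05), 4))  # ≈ 0.2
```

Local volatility calibration generalizes this scalar inversion to recovering a whole function sigma(S, t) from a surface of option prices, which is why it is posed as an optimal control problem rather than a root-finding one.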

Abstract:

This PhD thesis is devoted to the accurate analysis of the physical properties of Active Galactic Nuclei (AGN) and the AGN/host-galaxy interplay. Because of the broad-band AGN emission (from radio to hard X-rays), a multi-wavelength approach is mandatory. Our research is carried out over the COSMOS field, within the context of the XMM-Newton wide-field survey. To date, the COSMOS field is a unique area for comprehensive multi-wavelength studies, allowing us to define a large and homogeneous sample of QSOs with a well-sampled spectral coverage and to keep selection effects under control. Moreover, the broad-band information contained in the COSMOS database is well-suited to a detailed analysis of AGN SEDs, bolometric luminosities and bolometric corrections. In order to investigate the nature of both obscured (Type-2) and unobscured (Type-1) AGN, the observational approach is complemented by a theoretical modelling of AGN/galaxy co-evolution. The X-ray to optical properties of an X-ray-selected Type-1 AGN sample are discussed in the first part. The relationship between X-ray and optical/UV luminosities, parametrized by the spectral index α_ox, provides a first indication of the nature of the central engine powering the AGN. Since a Type-1 AGN outshines its surrounding environment, it is extremely difficult to constrain the properties of its host galaxy. Conversely, in Type-2 AGN the host-galaxy light is the dominant component of the optical/near-IR SEDs, severely affecting the recovery of the intrinsic AGN emission. Hence a multi-component SED-fitting code is developed to disentangle the emission of the stellar population of the galaxy from that associated with mass accretion. Bolometric corrections, luminosities, stellar masses and star-formation rates, correlated with the morphology of Type-2 AGN hosts, are presented in the second part, while the final part concerns a physically motivated model for the evolution of spheroidal galaxies with a central SMBH.
The model is able to reproduce two important stages of galaxy evolution, namely the obscured cold-phase and the subsequent quiescent hot-phase.

Abstract:

Tethered bilayer lipid membranes (tBLMs) are a promising model system for the natural cell membrane. They consist of a lipid bilayer that is covalently coupled to a solid support via a spacer group. In this study, we developed a suitable approach to increase the submembrane space in tBLMs. The challenge is to create a membrane with a lower lipid density in order to increase the membrane fluidity, while avoiding the defects that might appear due to an increase in the lateral space within the tethered monolayers. Therefore, various synthetic strategies and different monolayer preparation techniques were examined. Synthetic attempts to achieve a large ion reservoir were made in two directions: increasing the spacer length of the tether lipids and increasing the lateral distribution of the lipids in the monolayer. The first resulted in the synthesis of a small library of tether lipids (DPTT, DPHT and DPOT) characterized by 1H and 13C NMR, FD-MS, ATR, DSC and TGA. The synthetic strategy for their preparation includes the synthesis of a precursor with a double-bond anchor that can easily be modified for different substrates (e.g. metal and metal oxide). Here, the double bond was modified into a thiol group suitable for a gold surface. Another approach towards the preparation of homogeneous monolayers with decreased two-dimensional packing density was the synthesis of two novel anchor lipids: DPHDL and DDPTT. DPHDL is a “self-diluted” tether lipid containing two lipoic anchor moieties. DDPTT has an extended lipophilic part that should lead to the preparation of diluted, leakage-free proximal layers that facilitate the completion of the bilayer. Our toolbox of tether lipids was completed with two fluorescently labelled lipid precursors, with one and two phytanyl chains, respectively, in the hydrophobic region and a dansyl group as the fluorophore. The use of such fluorescently marked lipids is supposed to give additional information on the lipid distribution at the air-water interface.
The Langmuir film balance was used to investigate the monolayer properties of four of the synthesized thiolated anchor lipids. The packing density and mixing behaviour were examined. The results showed that mixing anchor lipids with free lipids can homogeneously dilute the anchor-lipid monolayers. Moreover, an increase in the hydrophilicity (PEG chain length) of the anchor lipids leads to a higher packing density; a decrease in temperature results in a similar trend. However, increasing the number of phytanyl chains per lipid molecule was shown to decrease the packing density. LB monolayers based on pure and mixed lipids at different ratios and transfer pressures were tested to form tBLMs with diluted inner layers. A combination of LB-monolayer transfer with the solvent-exchange method successfully accomplished the formation of tBLMs based on pure DPOT. Some preliminary investigations of the electrical sealing properties and protein incorporation of self-assembled DPOT- and DDPTT-based tBLMs were conducted. Bilayer formation performed by solvent exchange resulted in membranes with high resistances and low capacitances. The appearance of space beneath the membrane is clearly visible in the impedance spectra, expressed by a second RC element. This leads to the conclusion that the longer spacer in DPOT and the larger lateral space between the DDPTT molecules in the investigated systems essentially influence the electrical parameters of the membrane. Finally, we could show the functional incorporation of the small ion carrier valinomycin in both types of membranes.

Abstract:

Using findings from a qualitative investigation based on in-depth email interviews with 47 Black and South Asian gay men in Britain, this paper explores the cross-cutting identities and discourses in relation to being both gay and from an ethnic minority background. Taking an intersectional approach, detailed accounts of identity negotiation, cultural pressures, experiences of discrimination and exclusion and the relationship between minority ethnic gay men and mainstream White gay culture are presented and explored. The major findings common to both groups were: cultural barriers limiting disclosure of sexuality to family and wider social networks; experiences of discrimination by White gay men that included exclusion as well as objectification; a lack of positive gay role models and imagery relating to men from minority ethnic backgrounds. Among South Asian gay men, a major theme was regret at being unable to fulfil family expectations regarding marriage and children, while among Black gay men, there was a strong belief that same-sex behaviour subverted cultural notions related to how masculinity is configured. The paper concludes by highlighting the importance of social location, particularly education and income, when examining the intersection of ethnicity and sexuality in future research.