982 results for predictions


Relevance:

20.00%

Publisher:

Abstract:

Quantitative computed tomography (QCT)-based finite element (FE) models of the vertebral body predict vertebral strength better than dual-energy X-ray absorptiometry. However, most models were validated against compression of vertebral bodies with endplates embedded in polymethylmethacrylate (PMMA). Yet, since loading is as important as bone density, the absence of the intervertebral disc (IVD) affects the measured strength. Accordingly, the aim was to assess the strength predictions of the classic FE models (vertebral body embedded) against the in vitro and in silico strengths of vertebral bodies loaded via IVDs. High-resolution peripheral QCT (HR-pQCT) scans were performed on 13 segments (T11/T12/L1). T11 and L1 were augmented with PMMA and the samples were tested under a 4° wedge compression until failure of T12. A specimen-specific model was generated for each T12 from the HR-pQCT data. Two FE sets were created: FE-PMMA refers to the classical vertebral-body-embedded model under axial compression; FE-IVD to loading via a hyperelastic IVD model under the wedge compression as conducted experimentally. Results showed that the FE-PMMA models overestimated the experimental strength, yet their strength prediction was satisfactory considering the different experimental set-up. On the other hand, the FE-IVD models did not prove significantly better (Exp/FE-PMMA: R² = 0.68; Exp/FE-IVD: R² = 0.71, p = 0.84). In conclusion, FE-PMMA correlates well with the in vitro strength of human vertebral bodies loaded via real IVDs, and FE-IVD models with hyperelastic IVDs do not significantly improve this correlation. Therefore, it seems not worthwhile to add IVDs to vertebral body models until fully validated patient-specific IVD models become available.

Relevance:

20.00%

Publisher:

Abstract:

Finite element analysis is an accepted method to predict vertebral body compressive strength. This study compares measurements obtained from in vitro tests with those from two different simulation models: clinical quantitative computed tomography (QCT)-based homogenized finite element (hFE) models and pre-clinical high-resolution peripheral QCT (HR-pQCT)-based hFE models. Thirty-seven vertebral body sections were prepared by removing endplates and posterior elements, scanned with QCT (390/450 μm voxel size) as well as HR-pQCT (82 μm voxel size), and tested in compression up to failure. Non-linear viscous-damage hFE models were created from the QCT/HR-pQCT images and compared to the experimental results in terms of stiffness and ultimate load. As expected, the predictability of the QCT/HR-pQCT-based hFE models for both apparent stiffness (r² = 0.685/0.801) and strength (r² = 0.774/0.924) increased with better image resolution. An analysis of the damage distribution showed similar damage locations in all cases. In conclusion, HR-pQCT-based hFE models increased the predictability considerably and need no tuning of input parameters. In contrast, QCT-based hFE models usually need some tuning but are, at the moment, the only clinically feasible choice.
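The predictability figures quoted above are coefficients of determination between measured and FE-predicted quantities. As a minimal illustration of how such an r² is obtained (with synthetic numbers, not the study's data), it is the squared Pearson correlation between experimental and predicted ultimate loads:

```python
# Illustrative only: computing the predictability (r^2) of a model's
# strength predictions against experimental values. Data are synthetic.
def r_squared(experimental, predicted):
    """Squared Pearson correlation between two equal-length samples."""
    n = len(experimental)
    mean_e = sum(experimental) / n
    mean_p = sum(predicted) / n
    cov = sum((e - mean_e) * (p - mean_p)
              for e, p in zip(experimental, predicted))
    var_e = sum((e - mean_e) ** 2 for e in experimental)
    var_p = sum((p - mean_p) ** 2 for p in predicted)
    return cov * cov / (var_e * var_p)

# Perfectly linear predictions give r^2 = 1; scatter reduces it.
ultimate_load_exp = [3.1, 4.7, 2.8, 5.2, 3.9]   # kN, synthetic
ultimate_load_fe  = [3.0, 4.9, 2.6, 5.5, 4.1]   # kN, synthetic
print(round(r_squared(ultimate_load_exp, ultimate_load_fe), 3))
```

A higher-resolution model would simply produce a `predicted` vector that tracks the experimental one more closely, raising this value.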

Relevance:

20.00%

Publisher:

Abstract:

Compared to μ→eγ and μ→eee, the process μ→e conversion in nuclei receives enhanced contributions from Higgs-induced lepton flavor violation. Upcoming μ→e conversion experiments with drastically increased sensitivity will be able to put extremely stringent bounds on Higgs-mediated μ→e transitions. We point out that the theoretical uncertainties associated with these Higgs effects, encoded in the couplings of quark scalar operators to the nucleon, can be accurately assessed using our recently developed approach based on SU(2) chiral perturbation theory that cleanly separates two- and three-flavor observables. We emphasize that with input from lattice QCD for the coupling to strangeness, f_s^N, hadronic uncertainties are appreciably reduced compared to the traditional approach where f_s^N is determined from the pion-nucleon σ term by means of an SU(3) relation. We illustrate this point by considering Higgs-mediated lepton flavor violation in the standard model supplemented with higher-dimensional operators, the two-Higgs-doublet model with generic Yukawa couplings, and the minimal supersymmetric standard model. Furthermore, we compare bounds from present and future μ→e conversion and μ→eγ experiments.

Relevance:

20.00%

Publisher:

Abstract:

Weak radiative decays of the B mesons belong to the most important flavor-changing processes that provide constraints on physics at the TeV scale. In the derivation of such constraints, accurate standard model predictions for the inclusive branching ratios play a crucial role. In the current Letter we present an update of these predictions, incorporating all our results for the O(α_s²) and lower-order perturbative corrections that have been calculated after 2006. New estimates of nonperturbative effects are taken into account, too. For the CP- and isospin-averaged branching ratios, we find B_sγ = (3.36 ± 0.23) × 10⁻⁴ and B_dγ = (1.73 +0.12/−0.22) × 10⁻⁵, for E_γ > 1.6 GeV. Both results remain in agreement with the current experimental averages. Normalizing their sum to the inclusive semileptonic branching ratio, we obtain R_γ ≡ (B_sγ + B_dγ)/B_cℓν = (3.31 ± 0.22) × 10⁻³. A new bound from B_sγ on the charged Higgs boson mass in the two-Higgs-doublet model II reads M_H± > 480 GeV at 95% C.L.

Relevance:

20.00%

Publisher:

Abstract:

The interaction of a comet with the solar wind undergoes various stages as the comet's activity varies along its orbit. For a comet like 67P/Churyumov–Gerasimenko, the target comet of ESA's Rosetta mission, the various features include the formation of a Mach cone, the bow shock, and, close to perihelion, even a diamagnetic cavity. There are different approaches to simulating this complex interplay between the solar wind and the comet's extended neutral gas coma, including magnetohydrodynamic (MHD) and hybrid-type models. The first treats the plasma as fluids (one fluid in basic single-fluid MHD), while the latter treats the ions as individual particles under the influence of the local electric and magnetic fields. The electrons are treated as a charge-neutralizing fluid in both cases. Given the different approaches, the two models yield different results, in particular for a comet with a low production rate. In this paper we show that these differences can be reduced by using a multifluid instead of a single-fluid MHD model and by increasing the resolution of the hybrid model. We show that some major features obtained with the hybrid-type approach, such as the gyration of the cometary heavy ions and the formation of the Mach cone, can be partially reproduced with the multifluid-type model.

Relevance:

20.00%

Publisher:

Abstract:

The cerebellum is the major brain structure that contributes to our ability to improve movements through learning and experience. We have combined computer simulations with behavioral and lesion studies to investigate how modification of synaptic strength at two different sites within the cerebellum contributes to a simple form of motor learning: Pavlovian conditioning of the eyelid response. These studies are based on the wealth of knowledge about the intrinsic circuitry and physiology of the cerebellum and the straightforward manner in which this circuitry is engaged during eyelid conditioning. Thus, our simulations are constrained by the well-characterized synaptic organization of the cerebellum; further, the activity of cerebellar inputs during simulated eyelid conditioning is based on existing recording data. These simulations have allowed us to make two important predictions regarding the mechanisms underlying cerebellar function, which we have tested and confirmed with behavioral studies. The first prediction describes the mechanisms by which one of the sites of synaptic modification, the granule to Purkinje cell synapses (gr → Pkj) of the cerebellar cortex, could generate two time-dependent properties of eyelid conditioning: response timing and the ISI function. An empirical test of this prediction using small, electrolytic lesions of the cerebellar cortex revealed the pattern of results predicted by the simulations. The second prediction made by the simulations is that modification of synaptic strength at the other site of plasticity, the mossy fiber to deep nuclei synapses (mf → nuc), is under the control of Purkinje cell activity. The analysis predicts that this property should confer on mf → nuc synapses resistance to extinction. Thus, while extinction processes erase plasticity at the first site, residual plasticity at mf → nuc synapses remains. The residual plasticity at the mf → nuc site confers on the cerebellum the capability for rapid relearning long after the learned behavior has been extinguished. We confirmed this prediction using a lesion technique that reversibly disconnected the cerebellar cortex at various stages during extinction and reacquisition of eyelid responses. The results of these studies represent significant progress toward a complete understanding of how the cerebellum contributes to motor learning.

Relevance:

20.00%

Publisher:

Abstract:

The primary interest was in predicting the distribution of runs in a sequence of Bernoulli trials. Difference-equation techniques were used to express the number of runs of a given length k in n trials under three assumptions: (1) no runs of length greater than k, (2) no runs of length less than k, (3) no other assumptions about the length of runs. Generating functions were utilized to obtain the distributions of the future number of runs, the future number of minimum-length runs, and the future number of maximum-length runs, unconditional on the number of successes and failures in the Bernoulli sequence. When the model was applied to Texas hydrology data, it provided an adequate fit in eight of the ten regions. Suggested health applications of this approach to run theory are provided.
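The quantity being modeled — the number of runs of a given length k in n Bernoulli trials — is easy to pin down concretely. The sketch below (our own illustration, not the thesis's derivation) counts maximal success runs of exactly length k and estimates their distribution by Monte Carlo rather than by the generating-function approach described above:

```python
import random

def runs_of_length(seq, k):
    """Count maximal runs of successes (1s) of exactly length k in a 0/1 list."""
    count = run = 0
    for x in seq + [0]:          # trailing 0 acts as a sentinel ending any final run
        if x == 1:
            run += 1
        else:
            if run == k:
                count += 1
            run = 0
    return count

def simulate_run_counts(n, p, k, trials=10000, seed=0):
    """Monte Carlo estimate of the distribution of runs of length k in n trials."""
    rng = random.Random(seed)
    dist = {}
    for _ in range(trials):
        seq = [1 if rng.random() < p else 0 for _ in range(n)]
        c = runs_of_length(seq, k)
        dist[c] = dist.get(c, 0) + 1
    return {c: m / trials for c, m in sorted(dist.items())}

# e.g. distribution of the number of success runs of exactly length 2
# in 20 fair Bernoulli trials:
print(simulate_run_counts(20, 0.5, 2))
```

The generating-function approach yields these distributions exactly; the simulation is just a quick way to see what the exact results should look like.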

Relevance:

20.00%

Publisher:

Abstract:

The determination of the Stark broadening parameters of Sn ions is useful for astrophysicists interested in determining the electron density in stellar atmospheres. In this paper, we report calculated values of the Stark broadening parameters for 171 lines of Sn III arising from the 4d¹⁰5sns (n = 6–9), 4d¹⁰5snp (n = 5, 6), 4d¹⁰5p², 4d¹⁰5snd (n = 5–7), 4d¹⁰5s4f and 4d¹⁰5s5g configurations. Stark linewidths and line shifts are presented for an electron density of 10²³ m⁻³ and temperatures T = 11 000–75 000 K. These have been calculated using a semi-empirical approach, with a set of wavefunctions obtained from relativistic Hartree–Fock calculations including core-polarization effects. The results obtained have been compared with the available experimental data and can be used to assess the influence of Stark broadening effects in A-type stellar atmospheres.

Relevance:

20.00%

Publisher:

Abstract:

In recent years, challenged by the climate scenarios put forward by the IPCC and their potential impact on plant distribution, numerous predictive techniques, including the so-called habitat suitability models (HSMs), have been developed. Yet, as the different methods produce different distribution areas, developing validation tools is a strong need in order to reduce uncertainties. Focusing on the Iberian Peninsula, we propose a palaeo-based method to increase the robustness of HSMs, developing an ecological approach to understand the mismatches between palaeoecological information and HSM projections. Here, we present the results of (1) investigating causal relationships between environmental variables and the presence of Pinus sylvestris L. and P. nigra Arn. available from the 3rd Spanish Forest Inventory, (2) developing present and past presence predictions with the MaxEnt model for 6 and 21 kyr BP, and (3) assessing these models through comparisons with biomized palaeoecological data available from the European Pollen Database for the Iberian Peninsula.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we introduce the idea of attaching a reliability measure to the predictions made by recommender systems based on collaborative filtering. This reliability measure is based on the usual notion that the more reliable a prediction, the less likely it is to be wrong. We define a general reliability measure suitable for any arbitrary recommender system, and we also show a method for obtaining specific reliability measures tailored to the needs of different specific recommender systems.
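To make the notion concrete, here is a minimal sketch of one plausible reliability measure — not the paper's actual definition. It assumes a neighbourhood-based collaborative filter, where a prediction is the mean of the neighbours' ratings, and takes reliability as inversely related to the spread of those ratings: the more the neighbours agree, the more reliable the prediction. All names and the 1–5 rating scale are our own assumptions.

```python
from statistics import mean, pstdev

def predict_with_reliability(neighbour_ratings, max_rating=5.0):
    """Return (prediction, reliability in [0, 1]) from the neighbours' ratings.

    Illustrative measure only: reliability falls linearly with the
    population standard deviation of the neighbours' ratings.
    """
    prediction = mean(neighbour_ratings)
    spread = pstdev(neighbour_ratings)
    reliability = 1.0 - min(spread / max_rating, 1.0)
    return prediction, reliability

# Unanimous neighbours -> maximally reliable prediction:
print(predict_with_reliability([4, 4, 4]))       # (4, 1.0)
# Split neighbours -> same scale of prediction, lower reliability:
print(predict_with_reliability([1, 5, 1, 5]))
```

Any system-specific measure in the paper's sense would replace the spread term with whatever uncertainty signal that particular recommender exposes, while keeping the same "more reliable, less likely wrong" interpretation.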