17 results for 240503 Thermodynamics and Statistical Physics

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two Boltzmannian accounts of the Second Law, viz. a globalist and a localist one. In both cases, the probabilities fail to be chances because they have rivals that are roughly equally good. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.

Relevance: 100.00%

Abstract:

Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning.
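The shape metric used above, the ratio of smallest to largest principal moment of inertia, can be sketched for a point-cloud representation of the tumor material. This is a generic illustration on synthetic data, not the oncosimulator's code; the `inertia_ratio` helper and the sample blob are made up for the example:

```python
import numpy as np

def inertia_ratio(points, masses=None):
    """Ratio of smallest to largest principal moment of inertia
    for a cloud of material points (1.0 = perfectly isotropic shape)."""
    pts = np.asarray(points, dtype=float)
    m = np.ones(len(pts)) if masses is None else np.asarray(masses, float)
    r = pts - np.average(pts, axis=0, weights=m)   # center-of-mass frame
    # inertia tensor: I_ij = sum_k m_k (|r_k|^2 delta_ij - r_ki r_kj)
    r2 = np.sum(r**2, axis=1)
    I = np.einsum('k,k,ij->ij', m, r2, np.eye(3)) \
        - np.einsum('k,ki,kj->ij', m, r, r)
    w = np.linalg.eigvalsh(I)                      # ascending eigenvalues
    return w[0] / w[-1]

rng = np.random.default_rng(0)
# an elongated blob (stretched along z) gives a ratio well below 1,
# mimicking the kind of anisotropic growth the coupled model corrects
blob = rng.normal(size=(5000, 3)) * [1.0, 1.0, 3.0]
print(inertia_ratio(blob))                         # roughly 0.2 for this blob
```

An isotropic point cloud would yield a ratio near 1, so a drop in this ratio quantifies how far the simulated tumor departs from a spherical shape.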

Relevance: 100.00%

Abstract:

A method for quantifying nociceptive withdrawal reflex receptive fields in human volunteers and patients is described. The reflex receptive field (RRF) for a specific muscle denotes the cutaneous area from which a muscle contraction can be evoked by a nociceptive stimulus. The method is based on random stimulations presented in a blinded sequence to 10 stimulation sites. The sensitivity map is derived by interpolating the reflex responses evoked from the 10 sites. A set of features describing the size and location of the RRF is presented based on statistical analysis of the sensitivity map within every subject. The features include RRF area, volume, peak location and center of gravity. The method was applied to 30 healthy volunteers. Electrical stimuli were applied to the sole of the foot evoking reflexes in the ankle flexor tibialis anterior. The RRF area covered a fraction of 0.57 ± 0.06 (S.E.M.) of the foot and was located on the medial, distal part of the sole of the foot. An intramuscular injection of capsaicin into flexor digitorum brevis was performed in one spinal cord injured subject to attempt modulation of the reflex receptive field. The RRF area, RRF volume and location of the peak reflex response appear to be the most sensitive measures for detecting modulation of spinal nociceptive processing. This new method has important potential applications for exploring aspects of central plasticity in volunteers and patients. It may be utilized as a new diagnostic tool for central hypersensitivity and quantification of therapeutic interventions.
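The sensitivity-map step above can be illustrated with a toy interpolation. The inverse-distance weighting, the `rrf_area_fraction` helper and the unit-square "foot" are stand-ins for illustration only; the paper does not specify this particular scheme:

```python
import numpy as np

def rrf_area_fraction(sites, responses, foot_mask, grid_xy, threshold):
    """Interpolate reflex responses from discrete stimulation sites onto a
    grid covering the foot sole (inverse-distance weighting as a simple
    stand-in), then report the fraction of the foot area whose
    interpolated response exceeds `threshold`."""
    d = np.linalg.norm(grid_xy[:, None, :] - sites[None, :, :], axis=2)
    w = 1.0 / np.maximum(d, 1e-9)**2               # avoid division by zero
    sens = (w @ responses) / w.sum(axis=1)         # sensitivity map
    active = foot_mask & (sens > threshold)
    return np.count_nonzero(active) / np.count_nonzero(foot_mask)

rng = np.random.default_rng(1)
sites = rng.uniform(0, 1, size=(10, 2))            # 10 stimulation sites
responses = rng.uniform(0, 1, size=10)             # reflex amplitudes
gx, gy = np.meshgrid(np.linspace(0, 1, 50), np.linspace(0, 1, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
mask = np.ones(len(grid), dtype=bool)              # whole square as "foot"
frac = rrf_area_fraction(sites, responses, mask, grid, threshold=0.5)
print(frac)                                        # fraction in [0, 1]
```

Features such as peak location or center of gravity would then be read off the same interpolated map.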

Relevance: 100.00%

Abstract:

BACKGROUND AND PURPOSE Intensity-modulated radiotherapy (IMRT) credentialing for an EORTC study was performed using an anthropomorphic head phantom from the Radiological Physics Center (RPC; RPC(PH)). Institutions were retrospectively requested to irradiate their institutional phantom (INST(PH)) using the same treatment plan in the framework of a Virtual Phantom Project (VPP) for IMRT credentialing. MATERIALS AND METHODS The CT data set of the institutional phantom and measured 2D dose matrices were requested from centers and sent to a dedicated secure EORTC uploader. Data from the RPC(PH) and INST(PH) were thereafter centrally analyzed and inter-compared by the QA team using commercially available software (RIT; ver. 5.2; Colorado Springs, USA). RESULTS Eighteen institutions participated in the VPP. The measurements of 6 (33%) institutions could not be analyzed centrally. All other centers passed both the VPP and the RPC ±7%/4 mm credentialing criteria. At the 5%/5 mm gamma criterion (90% of pixels passing), 11 (92%) centers passed the credentialing process with the RPC(PH), as compared to 12 (100%) with the INST(PH) (p = 0.29). The corresponding pass rates for the 3%/3 mm gamma criterion (90% of pixels passing) were 2 (17%) and 9 (75%; p = 0.01), respectively. CONCLUSIONS IMRT dosimetry gamma evaluations in a single plane for a H&N prospective trial using the INST(PH) measurements showed agreement at the gamma index criterion of ±5%/5 mm (90% of pixels passing) for a small number of VPP measurements. With more stringent criteria, the RPC(PH) and INST(PH) comparison showed disagreement. More data are urgently required within the framework of prospective studies.
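A brute-force version of the gamma evaluation behind pass rates like "5%/5 mm, 90% of pixels passing" might look as follows. This is a minimal global-normalization sketch, not the RIT software's algorithm:

```python
import numpy as np

def gamma_pass_rate(ref, evl, spacing_mm, dd_pct=5.0, dta_mm=5.0):
    """Global 2D gamma analysis (brute force): for each reference pixel,
    gamma = min over evaluated pixels of sqrt((dose diff / dd)^2 +
    (distance / dta)^2); returns the fraction of pixels with gamma <= 1."""
    ny, nx = ref.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    coords = np.column_stack([ys.ravel(), xs.ravel()]) * spacing_mm
    dd = dd_pct / 100.0 * ref.max()        # global dose-difference criterion
    ev = evl.ravel()
    gam = np.empty(ref.size)
    for i, (p, d0) in enumerate(zip(coords, ref.ravel())):
        dist2 = np.sum((coords - p)**2, axis=1) / dta_mm**2
        dose2 = (ev - d0)**2 / dd**2
        gam[i] = np.sqrt(np.min(dist2 + dose2))
    return np.mean(gam <= 1.0)

ref = np.ones((20, 20))                    # flat reference dose plane
print(gamma_pass_rate(ref, ref, spacing_mm=2.0))   # identical planes: 1.0
```

Tightening `dd_pct`/`dta_mm` from 5%/5 mm to 3%/3 mm shrinks the acceptance ellipsoid, which is why the stricter criterion in the study produced markedly lower pass counts.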

Relevance: 100.00%

Abstract:

Finite element (FE) analysis is an important computational tool in biomechanics. However, its adoption into clinical practice has been hampered by its computational complexity and the high level of technical competence it requires of clinicians. In this paper we propose a supervised learning approach to predict the outcome of FE analysis. We demonstrate our approach on clinical CT and X-ray femur images for FE predictions (FEP), with features extracted, respectively, from a statistical shape model and from 2D-based morphometric and density information. Using leave-one-out experiments and sensitivity analysis on a database of 89 clinical cases, our method is capable of predicting the distribution of stress values for a walking loading condition with average correlation coefficients of 0.984 and 0.976 for CT and X-ray images, respectively. These findings suggest that supervised learning approaches have the potential to leverage the clinical integration of mechanical simulations for the treatment of musculoskeletal conditions.
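The leave-one-out protocol with a per-case correlation coefficient can be sketched with a plain least-squares surrogate standing in for the paper's learned model; the data, feature counts and `loo_correlations` helper are invented for the example:

```python
import numpy as np

def loo_correlations(X, Y):
    """Leave-one-out evaluation of a linear surrogate mapping image-derived
    features X (n_cases x n_feat) to FE stress outputs Y (n_cases x n_nodes);
    returns one Pearson r (predicted vs. true stress field) per held-out case."""
    n = len(X)
    rs = []
    for i in range(n):
        idx = np.arange(n) != i                    # hold out case i
        W, *_ = np.linalg.lstsq(X[idx], Y[idx], rcond=None)
        pred, true = X[i] @ W, Y[i]
        rs.append(np.corrcoef(pred, true)[0, 1])
    return np.array(rs)

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 5))                       # 30 cases, 5 features
Y = X @ rng.normal(size=(5, 50)) + 0.01 * rng.normal(size=(30, 50))
rs = loo_correlations(X, Y)
print(rs.mean())                                   # near 1 for low noise
```

The reported 0.984/0.976 figures are averages of exactly this kind of per-case correlation, computed with the authors' actual regressor and features.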

Relevance: 100.00%

Relevance: 100.00%

Abstract:

The cyclotron laboratory for radioisotope production and multi-disciplinary research at the Bern University Hospital (Inselspital) is based on an 18-MeV proton accelerator, equipped with a specifically conceived 6-m long external beam line ending in a separate bunker. This facility allows daily positron emission tomography (PET) radioisotope production and research activities to run in parallel. Some of the latest developments in accelerator and detector physics are reported. They encompass novel detectors for beam monitoring and studies of low-current beams.

Relevance: 100.00%

Abstract:

We obtain upper bounds for the total variation distance between the distributions of two Gibbs point processes in a very general setting. Applications are provided to various well-known processes and settings from spatial statistics and statistical physics, including the comparison of two Lennard-Jones processes, hard core approximation of an area interaction process and the approximation of lattice processes by a continuous Gibbs process. Our proof of the main results is based on Stein's method. We construct an explicit coupling between two spatial birth-death processes to obtain Stein factors, and employ the Georgii-Nguyen-Zessin equation for the total bound.
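For discrete distributions, the total variation distance bounded in the paper reduces to a simple half L1 norm; here two Poisson laws serve as a finite-dimensional stand-in for the point-process setting (the Gibbs processes in the paper live on configuration space, but the metric is the same idea):

```python
import numpy as np
from math import exp, factorial

def tv_distance(p, q):
    """Total variation distance between two discrete distributions:
    d_TV(P, Q) = sup_A |P(A) - Q(A)| = 0.5 * sum_x |p(x) - q(x)|."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return 0.5 * np.abs(p - q).sum()

def poisson_pmf(lam, kmax=60):
    """Truncated Poisson pmf; the tail beyond kmax is negligible here."""
    return np.array([exp(-lam) * lam**k / factorial(k) for k in range(kmax)])

p4, p45 = poisson_pmf(4.0), poisson_pmf(4.5)
print(tv_distance(p4, p45))        # small but nonzero, well below 1
```

Stein's method, as used in the paper, produces explicit upper bounds on exactly this quantity without having to compute either distribution in closed form.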

Relevance: 100.00%

Abstract:

Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies, viz. a globalist one, as suggested by Loewer, and a localist one, as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, while ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.

Relevance: 100.00%

Abstract:

Hyper-Kamiokande will be a next generation underground water Cherenkov detector with a total (fiducial) mass of 0.99 (0.56) million metric tons, approximately 20 (25) times larger than that of Super-Kamiokande. One of the main goals of Hyper-Kamiokande is the study of CP asymmetry in the lepton sector using accelerator neutrino and anti-neutrino beams. In this paper, the physics potential of a long baseline neutrino experiment using the Hyper-Kamiokande detector and a neutrino beam from the J-PARC proton synchrotron is presented. The analysis uses the framework and systematic uncertainties derived from the ongoing T2K experiment. With a total exposure of 7.5 MW × 10⁷ s integrated proton beam power (corresponding to 1.56 × 10²² protons on target with a 30 GeV proton beam) to a 2.5-degree off-axis neutrino beam, it is expected that the leptonic CP phase δCP can be determined to better than 19 degrees for all possible values of δCP, and CP violation can be established with a statistical significance of more than 3σ (5σ) for 76% (58%) of the δCP parameter space. Using both νe appearance and νµ disappearance data, the expected 1σ uncertainty of sin²θ₂₃ is 0.015 (0.006) for sin²θ₂₃ = 0.5 (0.45).

Relevance: 100.00%

Abstract:

PURPOSE OF REVIEW: Predicting asthma episodes is notoriously difficult but has potentially significant consequences for the individual, as well as for healthcare services. The purpose of this review is to describe recent insights into the prediction of acute asthma episodes in relation to classical clinical, functional or inflammatory variables, as well as present a new concept for evaluating asthma as a dynamically regulated homeokinetic system. RECENT FINDINGS: Risk prediction for asthma episodes or relapse has been attempted using clinical scoring systems, considerations of environmental factors and lung function, as well as inflammatory and immunological markers in induced sputum or exhaled air, and these are summarized here. We have recently proposed that newer mathematical methods derived from statistical physics may be used to understand the complexity of asthma as a homeokinetic, dynamic system consisting of a network comprising multiple components, and also to assess the risk for future asthma episodes based on fluctuation analysis of long time series of lung function. SUMMARY: Apart from the classical analysis of risk factor and functional parameters, this new approach may be used to assess asthma control and treatment effects in the individual as well as in future research trials.

Relevance: 100.00%

Abstract:

Although assessment of asthma control is important to guide treatment, it is difficult since the temporal pattern and risk of exacerbations are often unpredictable. In this Review, we summarise the classic methods to assess control with unidimensional and multidimensional approaches. Next, we show how ideas from the science of complexity can explain the seemingly unpredictable nature of bronchial asthma and emphysema, with implications for chronic obstructive pulmonary disease. We show that fluctuation analysis, a method used in statistical physics, can be used to gain insight into asthma as a dynamic disease of the respiratory system, viewed as a set of interacting subsystems (eg, inflammatory, immunological, and mechanical). The basis of the fluctuation analysis methods is the quantification of the long-term temporal history of lung function parameters. We summarise how this analysis can be used to assess the risk of future asthma episodes, with implications for asthma severity and control both in children and adults.
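Detrended fluctuation analysis (DFA) is one standard technique from statistical physics for quantifying the long-term temporal history of a time series such as daily lung-function readings. The following is a minimal sketch on synthetic data, not the authors' pipeline; scale choices and the `dfa_alpha` helper are illustrative:

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """First-order detrended fluctuation analysis: integrate the series,
    remove a linear trend in windows of each scale n, and fit
    log F(n) vs log n. The slope alpha quantifies long-range correlations
    (alpha ~ 0.5 for white noise, > 0.5 for persistent fluctuations)."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        nwin = len(y) // n
        segs = y[:nwin * n].reshape(nwin, n)
        t = np.arange(n)
        ms = [np.mean((s - np.polyval(np.polyfit(t, s, 1), t))**2)
              for s in segs]                   # residual variance per window
        F.append(np.sqrt(np.mean(ms)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

rng = np.random.default_rng(2)
print(dfa_alpha(rng.normal(size=4096)))        # white noise: alpha ~ 0.5
```

For a persistent series the slope rises above 0.5, and it is this kind of scaling exponent, computed on long peak-flow or FEV1 records, that has been related to the risk of future asthma episodes.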

Relevance: 100.00%

Abstract:

We study the sensitivity of large-scale xenon detectors to low-energy solar neutrinos, to coherent neutrino-nucleus scattering and to neutrinoless double beta decay. As a concrete example, we consider the xenon part of the proposed DARWIN (Dark Matter WIMP Search with Noble Liquids) experiment. We perform detailed Monte Carlo simulations of the expected backgrounds, considering realistic energy resolutions and thresholds in the detector. In a low-energy window of 2–30 keV, where the sensitivity to solar pp and ⁷Be-neutrinos is highest, an integrated pp-neutrino rate of 5900 events can be reached in a fiducial mass of 14 tons of natural xenon, after 5 years of data. The pp-neutrino flux could thus be measured with a statistical uncertainty around 1%, reaching the precision of solar model predictions. These low-energy solar neutrinos will be the limiting background to the dark matter search channel for WIMP-nucleon cross sections below ~2×10⁻⁴⁸ cm² and WIMP masses around 50 GeV/c², for an assumed 99.5% rejection of electronic recoils due to elastic neutrino-electron scatters. Nuclear recoils from coherent scattering of solar neutrinos will limit the sensitivity to WIMP masses below ~6 GeV/c² to cross sections above ~4×10⁻⁴⁵ cm². DARWIN could reach a competitive half-life sensitivity of 5.6×10²⁶ y to the neutrinoless double beta decay of ¹³⁶Xe after 5 years of data, using 6 tons of natural xenon in the central detector region.
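The quoted ~1% statistical precision on the pp-neutrino flux follows directly from Poisson counting on the expected 5900 events:

```python
import math

n_pp = 5900                        # expected pp-neutrino events (5 y, 14 t)
rel_stat = 1.0 / math.sqrt(n_pp)   # Poisson counting: sigma_N / N = 1/sqrt(N)
print(f"{100 * rel_stat:.2f}%")    # prints 1.30%, i.e. around the quoted 1%
```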

Relevance: 100.00%

Abstract:

The next generation neutrino observatory proposed by the LBNO collaboration will address fundamental questions in particle and astroparticle physics. The experiment consists of a far detector, in its first stage a 20 kt LAr double-phase TPC and a magnetised iron calorimeter, situated 2300 km from CERN, and a near detector based on a high-pressure argon gas TPC. The long baseline provides a unique opportunity to study neutrino flavour oscillations over their first and second oscillation maxima, exploring the L/E behaviour and distinguishing effects arising from δCP and matter. In this paper we have reevaluated the physics potential of this setup for determining the mass hierarchy (MH) and discovering CP-violation (CPV), using a conventional neutrino beam from the CERN SPS with a power of 750 kW. We use conservative assumptions on the knowledge of oscillation parameter priors and systematic uncertainties. The impact of each systematic error and of the precision of the oscillation priors is shown. We demonstrate that the first stage of LBNO can unambiguously determine the MH at > 5σ C.L. over the whole phase space. We show that the statistical treatment of the experiment is of very high importance, resulting in the conclusion that LBNO has ~100% probability to determine the MH in at most 4-5 years of running. Since knowledge of the MH is indispensable to extract δCP from the data, the first LBNO phase can convincingly give evidence for CPV at the 3σ C.L. using today's knowledge of oscillation parameters and realistic assumptions on the systematic uncertainties.