899 results for "The Studio Model"


Computational neuroscience develops mathematical models to describe neuronal systems, with the aim of better understanding the nervous system. Historically, the integrate-and-fire model, developed by Lapicque in 1907, was the first model of a neuron. In 1952, Hodgkin and Huxley [8] described the so-called Hodgkin-Huxley model in the article “A Quantitative Description of Membrane Current and Its Application to Conduction and Excitation in Nerve”. The Hodgkin-Huxley model is one of the most successful and widely used biological neuron models. Based on experimental data from the squid giant axon, Hodgkin and Huxley formulated their model as a four-dimensional system of first-order ordinary differential equations. One of these equations describes the evolution of the membrane potential in time, whereas the other three govern the opening and closing of sodium and potassium ion channels. The rate of change of the membrane potential is proportional to the sum of the ionic currents flowing across the membrane and an externally applied current. The membrane potential behaves differently for different types of external input. This thesis considers the following three types: (i) Rinzel and Miller [15] calculated an interval of amplitudes of a constant applied current for which the membrane potential spikes repetitively; (ii) Aihara, Matsumoto and Ikegaya [1] showed that, depending on the amplitude and frequency of a periodic applied current, the membrane potential responds periodically; (iii) Izhikevich [12] stated that brief pulses of positive and negative current with different amplitudes and frequencies can lead to a periodic response of the membrane potential. In chapter 1 the Hodgkin-Huxley model is introduced following Izhikevich [12]. Besides the definition of the model, several biological and physiological remarks are made, and further concepts are illustrated by examples. Moreover, the numerical methods used to solve the equations of the Hodgkin-Huxley model in the computer simulations of chapters 2 and 3 are presented. In chapter 2 the statements for the three inputs (i), (ii) and (iii) are verified, and periodic behavior for inputs (ii) and (iii) is investigated. In chapter 3 the inputs are embedded in an Ornstein-Uhlenbeck process to assess the influence of noise on the results of chapter 2.
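The four equations and their numerical treatment can be sketched with a minimal forward-Euler integration. The parameter values below are the standard modern-convention Hodgkin-Huxley set; they are not necessarily the values or the integration scheme used in the thesis:

```python
import math

def simulate_hh(i_ext, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the four Hodgkin-Huxley ODEs
    (modern convention: membrane potential in mV, time in ms).
    Returns the voltage trace."""
    c_m = 1.0                                  # membrane capacitance, uF/cm^2
    g_na, e_na = 120.0, 50.0                   # sodium conductance / reversal
    g_k,  e_k  = 36.0, -77.0                   # potassium conductance / reversal
    g_l,  e_l  = 0.3, -54.387                  # leak conductance / reversal

    # Gating-rate functions, with guards at the removable singularities
    def a_m(v):
        x = v + 40.0
        return 1.0 if abs(x) < 1e-7 else 0.1 * x / (1.0 - math.exp(-x / 10.0))
    def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
    def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
    def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    def a_n(v):
        x = v + 55.0
        return 0.1 if abs(x) < 1e-7 else 0.01 * x / (1.0 - math.exp(-x / 10.0))
    def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

    v, m, h, n = -65.0, 0.0529, 0.5961, 0.3177  # resting state
    trace = [v]
    for _ in range(int(t_max / dt)):
        i_na = g_na * m**3 * h * (v - e_na)
        i_k  = g_k * n**4 * (v - e_k)
        i_l  = g_l * (v - e_l)
        dv = (i_ext - i_na - i_k - i_l) / c_m   # membrane equation
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        v += dt * dv
        trace.append(v)
    return trace

trace = simulate_hh(i_ext=10.0)   # constant current in the repetitive-spiking range
v_peak = max(trace)
```

With a constant input of 10 µA/cm², within the interval found by Rinzel and Miller, the trace shows repetitive spiking; with zero input the potential stays near the resting value of -65 mV.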


In this work, the well-known Monte Carlo code FLUKA was used to simulate the GE PETtrace cyclotron (16.5 MeV) installed at the “S. Orsola-Malpighi” University Hospital (Bologna, IT) and routinely used in the production of positron-emitting radionuclides. Simulations yielded estimates of various quantities of interest, including: the effective dose distribution around the equipment; the effective number of neutrons produced per incident proton and their spectral distribution; the activation of the structure of the cyclotron and of the vault walls; the activation of the ambient air, in particular the production of 41Ar; and the saturation yield of radionuclides used in nuclear medicine. The simulations were validated against experimental measurements in terms of the physical and transport parameters to be used in the energy range of interest in the medical field. The validated model was then used extensively in several practical applications, including: the direct cyclotron production of non-standard radionuclides such as 99mTc; the production of medical radionuclides at the TRIUMF (Vancouver, CA) TR13 cyclotron (13 MeV); the complete design of the new PET facility of the “Sacro Cuore – Don Calabria” Hospital (Negrar, IT), including the ACSI TR19 (19 MeV) cyclotron; the dose field around the energy selection system (degrader) of a proton therapy cyclotron; the design of plug-doors for a new cyclotron facility in which a 70 MeV cyclotron will be installed; and the partial decommissioning of a PET facility, including the replacement of a Scanditronix MC17 cyclotron with a new TR19 cyclotron.
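The saturation-yield assessment rests on the standard activation relation A(t) = A_sat (1 - e^(-λt)). The sketch below computes the fraction of saturation reached for an illustrative nuclide and irradiation time; the numbers are examples, not results from this study:

```python
import math

def saturation_fraction(t_irr_min, half_life_min):
    """Fraction of the saturation activity A_sat reached after
    irradiating for t_irr, given the product's half-life."""
    lam = math.log(2.0) / half_life_min    # decay constant, 1/min
    return 1.0 - math.exp(-lam * t_irr_min)

# Illustrative: F-18 (half-life ~109.77 min) after a 2 h irradiation
frac = saturation_fraction(120.0, 109.77)
```

After one half-life the fraction is 50%, and it approaches 100% for irradiations much longer than the half-life, which is why yields are conveniently quoted at saturation.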


The uncertainties in the determination of the stratigraphic profile of natural soils are one of the main problems in geotechnics, in particular for landslide characterization and modeling. This study presents a new approach to geotechnical modeling that relies on the stochastic generation of different soil-layer distributions following a Boolean logic; the method is therefore called BoSG (Boolean Stochastic Generation). In this way, it is possible to randomize the presence of a specific material interdigitated in a uniform matrix. When building a geotechnical model, it is common to discard some stratigraphic data in order to simplify the model, assuming that the significance of the results would not be affected. With the proposed technique it is possible to quantify the error associated with this simplification. Moreover, the technique can be used to determine the most significant zones, where further investigations and surveys would be most effective for building the geotechnical model of the slope. The commercial software FLAC was used for the 2D and 3D geotechnical models. The distribution of the materials was randomized through a purpose-written MATLAB program that automatically generates text files, each representing a specific soil configuration. In addition, a routine was designed to automate the FLAC computations over the different data files in order to maximize the number of samples. The methodology is applied to a simplified slope in 2D, a simplified slope in 3D and an actual landslide, namely the Mortisa mudslide (Cortina d’Ampezzo, BL, Italy). It could, however, be extended to numerous other cases, especially hydrogeological analyses and landslide stability assessments in different geological and geomorphological contexts.
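The Boolean randomization of an interdigitated material in a uniform matrix can be sketched as follows. This is an illustrative Python reimplementation (the study used MATLAB), with hypothetical grid and lens dimensions:

```python
import random

def bosg_realization(nx, ny, n_lenses, lens_w, lens_h, seed=None):
    """One Boolean-stochastic realization: rectangular lenses of
    material 1 placed at random inside a material-0 matrix.
    Returns an ny-by-nx grid of material codes."""
    rng = random.Random(seed)
    grid = [[0] * nx for _ in range(ny)]
    for _ in range(n_lenses):
        x0 = rng.randrange(nx)             # lower-left corner of the lens
        y0 = rng.randrange(ny)
        for j in range(y0, min(y0 + lens_h, ny)):
            for i in range(x0, min(x0 + lens_w, nx)):
                grid[j][i] = 1             # mark the interdigitated material
    return grid

grid = bosg_realization(100, 40, n_lenses=8, lens_w=15, lens_h=3, seed=42)
fraction = sum(map(sum, grid)) / (100 * 40)
```

Each call with a different seed yields a different admissible stratigraphy; running the geotechnical computation over many realizations gives the spread of results attributable to the unknown layer distribution.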


This thesis was carried out within ESA's ESEO mission and focuses on the design of one of the secondary payloads carried on board the spacecraft: a GNSS receiver for orbit determination. The purpose of the project is to test real-time orbit determination technology built from commercial components. The architecture of the receiver comprises a custom part, the navigation computer, and a commercial part: a NovAtel front-end, with the COCOM limitation removed, and a GNSS antenna. This choice is motivated by the goal of demonstrating correct operation in orbit, enabling widespread use of this technology while lowering the cost and assembly time of the device. The commercial front-end performs GNSS signal acquisition, tracking and data demodulation, and provides raw GNSS data to the custom computer. The computer processes these raw observables, which are both transferred to the On-Board Computer for transmission to Earth and fed to the on-board recursive estimation filter, which combines them with a dynamic model to obtain an accurate position of the spacecraft. The main purpose of this thesis is the detailed design and development of this GNSS receiver up to the ESEO project Critical Design Review, including requirements definition, hardware design and the design of the preliminary breadboard test phase.
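The on-board recursive estimation filter is not specified in detail here. As a generic illustration of such a filter, not the receiver's actual algorithm, a linear Kalman filter with a constant-velocity dynamic model and position-only measurements can be written in a few lines:

```python
import random

def kalman_cv(measurements, dt=1.0, q=0.01, r=4.0):
    """Linear Kalman filter, constant-velocity model, position-only
    measurements. State x = [position, velocity]; q is a simplified
    (diagonal) process noise, r the measurement variance.
    Returns the filtered positions."""
    x = [measurements[0], 0.0]            # initial state
    P = [[r, 0.0], [0.0, 1.0]]            # initial covariance
    out = []
    for z in measurements:
        # Predict: x <- F x, P <- F P F' + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with scalar position measurement z (H = [1, 0])
        s = P[0][0] + r                   # innovation variance
        k = [P[0][0] / s, P[1][0] / s]    # Kalman gain
        y = z - x[0]                      # innovation
        x = [x[0] + k[0] * y, x[1] + k[1] * y]
        P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
             [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
        out.append(x[0])
    return out

rng = random.Random(0)
truth = [0.5 * t for t in range(100)]               # target moving at 0.5 m/s
noisy = [p + rng.gauss(0.0, 2.0) for p in truth]    # noisy "pseudo-measurements"
filtered = kalman_cv(noisy)
```

An orbital filter replaces the toy constant-velocity model with the spacecraft dynamic model and the scalar measurement with the GNSS observables, but the predict/update cycle is the same.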


The European Community has stressed the importance of achieving a common understanding for dealing with environmental noise through community actions of the Member States. This implies the use of harmonized indicators and of specific information regarding the values of those indicators, the exceedance of limits, and the number of people and dwellings exposed to noise. The D.Lgs. 149/2005, in compliance with the European Directive 2002/49/EC, defines the methodologies, noise indicators and types of output required. This dissertation reports the work done for the noise mapping of the highly trafficked roads of the Province of Bologna. The study accounts for the environmental noise generated by the road infrastructure outside the urban agglomeration of Bologna; roads with an annual traffic greater than three million vehicles are considered. The process of data collection and validation is reported, as well as the implementation of the calculation method in the software and the procedure used to create and calibrate the calculation model. Results are provided as required by the legislation, in the form of maps and tables. Moreover, the results for the individual road sections are combined to gain a general understanding of the situation of the overall study area. Although knowing the noise levels and the number of people exposed is paramount, it is not sufficient for developing noise abatement strategies. A further step is therefore addressed: the creation of priority maps as the basis of action plans for organizing and prioritizing noise reduction and abatement measures. Noise reduction measures are reported qualitatively in the annex and constitute preliminary research.
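The harmonized indicators required by Directive 2002/49/EC include the day-evening-night level Lden, defined in Annex I as an energy average over a 12 h day, a 4 h evening with a +5 dB penalty and an 8 h night with a +10 dB penalty. A direct transcription of that definition:

```python
import math

def l_den(l_day, l_evening, l_night):
    """Day-evening-night level Lden (dB) per Annex I of Directive
    2002/49/EC: 12 h day, 4 h evening (+5 dB), 8 h night (+10 dB)."""
    return 10.0 * math.log10(
        (12.0 * 10.0 ** (l_day / 10.0)
         + 4.0 * 10.0 ** ((l_evening + 5.0) / 10.0)
         + 8.0 * 10.0 ** ((l_night + 10.0) / 10.0)) / 24.0)

# Illustrative levels: all three penalized terms equal, so Lden = Lday
lden = l_den(65.0, 60.0, 55.0)
```

Because of the penalties, Lden always exceeds the plain 24 h equivalent level whenever evening or night traffic is non-negligible, which is what drives the exposure counts reported in the maps and tables.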


Air-sea interactions are a key process in the forcing of the ocean circulation and the climate. Water Mass Formation is a phenomenon related to extreme air-sea exchanges and heavy heat losses from the water column; it is capable of transferring water properties from the surface to great depth and constitutes a fundamental component of the thermohaline circulation of the ocean. Wind-driven Coastal Upwelling, on the other hand, can induce intense heat gain in the water column, which makes the phenomenon important for climate change; furthermore, it can noticeably influence many mechanisms of pelagic ecosystems. To study some of the fundamental characteristics of Water Mass Formation and Coastal Upwelling in the Mediterranean Sea, physical reanalyses obtained from the Mediterranean Forecasting System (MFS) model were used for the period from 1987 to 2012. The first chapter of this dissertation gives a basic description of the Mediterranean Sea circulation, the MFS model implementation, and the physics of air-sea interaction. In the second chapter, the problem of Water Mass Formation in the Mediterranean Sea is approached, including ad-hoc numerical simulations to study the heat balance components. The third chapter studies Mediterranean Coastal Upwelling in particular areas of the Mediterranean Basin (Sicily, the Gulf of Lion, the Aegean Sea) and introduces a new Upwelling Index to characterize and predict upwelling features using only surface estimates of air-sea fluxes. Our conclusions are that latent heat flux is the driving air-sea heat balance component in Water Mass Formation, while sensible heat exchanges are fundamental in the Coastal Upwelling process. It is shown that the proposed upwelling index reproduces the vertical velocity patterns in Coastal Upwelling areas. Evaluations of the nondimensional Marshall numbers for the open-ocean convection process in the Gulf of Lion show that it is a fully turbulent, three-dimensional phenomenon.


We present a geospatial model to predict the radiofrequency electromagnetic field from fixed site transmitters for use in epidemiological exposure assessment. The proposed model extends an existing model toward the prediction of indoor exposure, that is, at the homes of potential study participants. The model is based on accurate operation parameters of all stationary transmitters of mobile communication base stations, and radio broadcast and television transmitters for an extended urban and suburban region in the Basel area (Switzerland). The model was evaluated by calculating Spearman rank correlations and weighted Cohen's kappa (kappa) statistics between the model predictions and measurements obtained at street level, in the homes of volunteers, and in front of the windows of these homes. The correlation coefficients of the numerical predictions with street level measurements were 0.64, with indoor measurements 0.66, and with window measurements 0.67. The kappa coefficients were 0.48 (95%-confidence interval: 0.35-0.61) for street level measurements, 0.44 (95%-CI: 0.32-0.57) for indoor measurements, and 0.53 (95%-CI: 0.42-0.65) for window measurements. Although the modeling of shielding effects by walls and roofs requires considerable simplifications of a complex environment, we found a comparable accuracy of the model for indoor and outdoor points.
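The rank correlations used in this evaluation are simply the Pearson correlation of the ranks. A self-contained sketch, with hypothetical field-strength values rather than the study's data:

```python
def ranks(values):
    """Average ranks (1-based); tied values receive the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                       # extend over the tie group
        avg = (i + j) / 2.0 + 1.0        # mean rank of the group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

predicted = [0.1, 0.4, 0.2, 0.8, 0.5]   # hypothetical field strengths, V/m
measured  = [0.2, 0.5, 0.1, 0.9, 0.4]
rho = spearman(predicted, measured)
```

Rank-based agreement measures like this (and the weighted kappa on exposure categories) are preferred in exposure assessment because the absolute field levels span orders of magnitude while epidemiological analyses mostly need a correct ordering of subjects.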


Non-invasive excitability studies of motor axons in patients with amyotrophic lateral sclerosis (ALS) have revealed a changing pattern of abnormal membrane properties with disease progression, but the heterogeneity of the changes has made it difficult to relate them to pathophysiology. The SOD1(G93A) mouse model of ALS displays more synchronous motoneuron pathology. Multiple excitability measures of caudal and sciatic nerves in mutant and wild-type mice were compared before onset of signs and during disease progression (4-19 weeks), and they were related to changes in muscle fiber histochemistry. Excitability differences indicated a modest membrane depolarization in SOD1(G93A) axons at about the time of symptom onset (8 weeks), possibly due to deficient energy supply. Previously described excitability changes in ALS patients, suggesting altered sodium and potassium conductances, were not seen in the mice. This suggests that those changes relate to features of the human disease that are not well represented in the animal model.


This abstract presents the biomechanical model used in the European ContraCancrum project, which aims at simulating tumor evolution in the brain and lung. The construction of the finite element model and a simulation of tumor growth are shown. The construction of the mesh is fully automatic and therefore compatible with clinical application. The biomechanical model will later be combined with a cellular-level simulator also developed in the project.


The optical quality of the human eye depends mainly on the refractive performance of the cornea. The shape of the cornea is a mechanical balance between intraocular pressure and the intrinsic stiffness of the tissue. Several surgical procedures in ophthalmology alter the biomechanics of the cornea to provoke local or global curvature changes for vision correction. Motivated by the large number of surgical interventions performed every day, the demand for a deeper understanding of corneal biomechanics is rising, in order to improve the safety of procedures and medical devices. The aim of our work is to propose a numerical model of corneal biomechanics based on the stromal microstructure. Our novel anisotropic constitutive material law features a probabilistic weighting approach to model the collagen fiber distribution as observed in the human cornea by X-ray scattering analysis (Aghamohammadzadeh et al., Structure, February 2004). Furthermore, collagen cross-linking is explicitly included in the strain energy function. Results showed that the proposed model successfully reproduces both inflation and extensometry experimental data (Elsheikh et al., Curr Eye Res, 2007; Elsheikh et al., Exp Eye Res, May 2008). In addition, the mechanical properties calculated for patients of different age groups (Group A: 65-79 years; Group B: 80-95 years) indicate increased collagen cross-linking and a decrease in collagen fiber elasticity from younger to older specimens. These findings correspond to what is known about maturing fibrous biological tissue. Since the presented model can handle different loading situations and includes the anisotropic distribution of collagen fibers, it has the potential to simulate clinical procedures involving nonsymmetrical tissue interventions. In the future, such a mechanical model can be used to improve surgical planning and the design of next-generation ophthalmic devices.
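Probabilistic weighting of collagen fiber orientations is commonly expressed through a π-periodic von Mises-type density. The sketch below is a generic example of such a weighting, not the authors' exact constitutive law; mu and b are an assumed preferred direction and concentration parameter:

```python
import math

def fiber_density(theta, mu, b, n_grid=1000):
    """Pi-periodic von Mises-type orientation density, normalized
    numerically so that its integral over [0, pi) equals 1.
    mu: preferred fiber direction (rad); b: concentration parameter."""
    def unnorm(t):
        return math.exp(b * math.cos(2.0 * (t - mu)))   # period pi in t
    h = math.pi / n_grid
    norm = sum(unnorm(i * h) * h for i in range(n_grid))  # one full period
    return unnorm(theta) / norm

# Density peaks at the preferred direction and decays away from it
vals = [fiber_density(t, mu=0.0, b=2.0) for t in (0.0, math.pi / 4, math.pi / 2)]
```

In a fiber-reinforced strain energy function, such a density weights the contribution of each fiber direction; large b describes strongly aligned lamellae, b = 0 an isotropic fiber distribution.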


Seventeen bones (sixteen cadaveric bones and one plastic bone) were used to validate a method for reconstructing a surface model of the proximal femur from 2D X-ray radiographs and a statistical shape model constructed from thirty training surface models. Unlike previously published validation studies, in which surface-based distance errors were used to evaluate reconstruction accuracy, here we propose to use errors measured on clinically relevant morphometric parameters. For this purpose, a program was developed to robustly extract those morphometric parameters from the thirty training surface models (the training population), from the seventeen surface models reconstructed from X-ray radiographs, and from the seventeen ground-truth surface models obtained either by a CT-scan reconstruction method or by a laser-scan reconstruction method. A statistical analysis was then performed to classify the seventeen test bones into two categories, normal cases and outliers, depending on the measured parameters of each test bone: if all parameters of a test bone were covered by the training population's parameter ranges, the bone was classified as normal, otherwise as an outlier. Our experimental results showed that, statistically, there was no significant difference between the morphometric parameters extracted from the reconstructed surface models of the normal cases and those extracted from the reconstructed surface models of the outliers. Therefore, our statistical-shape-model-based reconstruction technique can be used to reconstruct not only the surface model of a normal bone but also that of an outlier bone.
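The range-based classification step can be sketched directly. The parameter names and ranges below are hypothetical, chosen only to illustrate the rule:

```python
def classify_bone(params, training_ranges):
    """Classify a test bone as 'normal' if every morphometric parameter
    lies within the training population's [min, max] range, else 'outlier'."""
    for name, value in params.items():
        lo, hi = training_ranges[name]
        if not (lo <= value <= hi):
            return "outlier"
    return "normal"

# Hypothetical training-population ranges for two parameters
ranges = {"neck_shaft_angle_deg": (120.0, 140.0),
          "head_radius_mm": (20.0, 28.0)}
label_a = classify_bone({"neck_shaft_angle_deg": 128.0, "head_radius_mm": 24.0}, ranges)
label_b = classify_bone({"neck_shaft_angle_deg": 145.0, "head_radius_mm": 24.0}, ranges)
```

A single parameter outside its training range is enough to flag a bone as an outlier, which makes the rule conservative with respect to the shapes the statistical model has actually seen.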


Osteoarticular allograft transplantation is a popular treatment method for wide surgical resections with large defects, and hospitals are therefore building bone banks. Performing the optimal allograft selection from a bone bank is crucial to the surgical outcome and patient recovery; however, current approaches are very time-consuming, hindering an efficient selection. We present an automatic method based on registration of femur bones to overcome this limitation. We introduce a new regularization term for the log-domain demons algorithm that replaces the standard Gaussian smoothing with a femur-specific polyaffine model. The polyaffine femur model is constructed from two affine transformations (femoral head and condyles) and one rigid transformation (shaft). Our main contribution in this paper is to show that the demons algorithm can be improved in specific cases with an appropriate model. We are not trying to find the optimal polyaffine model of the femur, but the simplest model with a minimal number of parameters: there is no need to optimize over the number of regions, the boundaries or the choice of weights, since this fine-tuning is done automatically by a final demons relaxation step with Gaussian smoothing. The newly developed approach provides a clear, anatomically motivated modeling contribution through the specific three-component transformation model, and shows a clear performance improvement, in terms of anatomically meaningful correspondences, on 146 CT images of femurs compared to a standard multiresolution demons. In addition, this simple model improves the robustness of the demons algorithm while preserving its accuracy. The ground truth consists of manual measurements performed by medical experts.
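The fusion of the three regional transformations can be illustrated by a direct weighted averaging of per-region affine displacements; the log-Euclidean fusion used in the log-domain demons is more involved, so this is a simplified 1-D sketch along a hypothetical femur axis, with made-up region centers, widths and maps:

```python
import math

def polyaffine_displacement(x, regions):
    """Fuse per-region affine displacements with normalized Gaussian
    weights. regions: list of (center, sigma, a, t), where the region's
    affine map is x -> a*x + t (1-D for simplicity).
    Returns the fused displacement at x."""
    ws, ds = [], []
    for center, sigma, a, t in regions:
        w = math.exp(-0.5 * ((x - center) / sigma) ** 2)  # region weight
        ws.append(w)
        ds.append((a * x + t) - x)        # this region's displacement at x
    return sum(w * d for w, d in zip(ws, ds)) / sum(ws)

# Hypothetical femur axis (mm): head near 0, shaft near 200, condyles near 400
femur = [(0.0,   50.0, 1.02,  1.0),   # head: small scaling plus shift
         (200.0, 80.0, 1.00,  0.0),   # shaft: rigid (identity here)
         (400.0, 50.0, 0.98, -1.0)]   # condyles: small scaling plus shift
d_head = polyaffine_displacement(0.0, femur)
d_shaft = polyaffine_displacement(200.0, femur)
```

Near a region center the fused field follows that region's affine map, while the Gaussian weights blend the maps smoothly in between; this is the behavior the femur-specific regularizer imposes in place of plain Gaussian smoothing.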


Intrahepatic cholestasis of pregnancy may be complicated by fetal arrhythmia, fetal hypoxia, preterm labor and, in severe cases, intrauterine death. The precise etiology of fetal death is not known; however, taurocholate has been demonstrated to cause arrhythmia and abnormal calcium dynamics in cardiomyocytes. To identify the underlying reason for the increased susceptibility of fetal cardiomyocytes to arrhythmia, we studied myofibroblasts (MFBs), which appear during structural remodeling of the adult diseased heart. In vitro, they depolarize rat cardiomyocytes via heterocellular gap-junctional coupling. Recently, it has been hypothesized that ventricular MFBs might appear in the developing human heart, triggered by physiological fetal hypoxia; however, their presence in the fetal heart (FH) and their proarrhythmogenic effects had not been systematically characterized. Immunohistochemistry demonstrated that ventricular MFBs transiently appear in the human FH during gestation. We established two in vitro models, of the maternal heart (MH) and of the FH, both exposed to increasing doses of taurocholate. The MH model consisted of confluent strands of rat cardiomyocytes, whereas for the FH model we added cardiac MFBs on top of the cardiomyocytes. In the FH model, but not in the MH model, taurocholate slowed conduction velocity from 19 to 9 cm/s, induced early afterdepolarizations and resulted in sustained re-entrant arrhythmias. These arrhythmic events were prevented by ursodeoxycholic acid, which hyperpolarized the MFB membrane potential by modulating potassium conductance. Conclusion: these results illustrate that the appearance of MFBs in the FH may contribute to arrhythmias. The mechanism described above represents a new therapeutic approach for cardiac arrhythmias at the level of the MFB.


Professor Sir David R. Cox (DRC) is widely acknowledged as among the most important scientists of the second half of the twentieth century. He inherited the mantle of statistical science from Pearson and Fisher, advanced their ideas, and translated statistical theory into practice so as to forever change the application of statistics in many fields, especially biology and medicine. The logistic and proportional hazards models he substantially developed are arguably among the most influential biostatistical methods in current practice. This paper looks forward over the period from DRC's 80th to 90th birthdays to speculate about the future of biostatistics, drawing lessons from DRC's contributions along the way. We consider "Cox's model" (CM) of biostatistics, an approach to statistical science that: formulates scientific questions or quantities in terms of parameters gamma in probability models f(y; gamma) that represent the underlying scientific mechanisms in a parsimonious fashion (Cox, 1997); partitions the parameters gamma = (theta, eta) into a subset of interest theta and other "nuisance parameters" eta necessary to complete the probability distribution (Cox and Hinkley, 1974); develops methods of inference about the scientific quantities that depend as little as possible upon the nuisance parameters (Barndorff-Nielsen and Cox, 1989); and thinks critically about the appropriate conditional distribution on which to base inferences. We briefly review exciting biomedical and public health challenges that are capable of driving statistical developments in the next decade. We discuss the statistical models and model-based inferences central to the CM approach, contrasting them with the computationally intensive strategies for prediction and inference advocated by Breiman and others (e.g. Breiman, 2001) and with more traditional design-based methods of inference (Fisher, 1935). We discuss the hierarchical (multi-level) model as an example of the future challenges and opportunities for model-based inference. We then consider the role of conditional inference, a second key element of the CM. Recent examples from genetics are used to illustrate these ideas. Finally, the paper examines causal inference and statistical computing, two other topics we believe will be central to biostatistics research and practice in the coming decade. Throughout the paper, we attempt to indicate how DRC's work and the "Cox Model" have set a standard of excellence to which all can aspire in the future.