836 results for Framework Model


Relevance: 30.00%

Abstract:

A newly developed global atmospheric chemistry and circulation model (ECHAM5/MESSy1) was used to investigate the chemistry and transport of ozone precursors, with a focus on non-methane hydrocarbons. For this purpose the model was extensively evaluated by comparing its results with measurements of various origins. The analysis shows that the model predicts the distribution of ozone realistically, both in terms of abundance and of seasonal cycle. At the tropopause, the model correctly reproduces the exchange between stratosphere and troposphere without prescribed fluxes or concentrations. The model simulates the ozone precursors with varying quality compared to the measurements. While the alkanes are reproduced well, some deviations arise for the alkenes. Of the oxygenated species, formaldehyde (HCHO) is reproduced correctly, whereas the correlations between observations and model results for methanol (CH3OH) and acetone (CH3COCH3) are considerably poorer. To improve the model's performance for oxygenated species, several sensitivity studies were carried out. These species are influenced by emissions from and deposition to the ocean, and knowledge of the gas exchange with the ocean is subject to large uncertainties. To improve the results of ECHAM5/MESSy1, the new submodel AIRSEA was developed and integrated into the MESSy structure. This submodel accounts for the gas exchange between ocean and atmosphere, including the oxygenated species. AIRSEA, which requires information on the liquid-phase concentration of the gas in the ocean surface water, was tested extensively. Application of the new submodel slightly improves the model results for acetone and methanol, although the use of a prescribed liquid-phase concentration strongly limits the success of the method, since measurement data are not available in sufficient quantity. This work provides new insights into organic species. It highlights the importance of the coupling between ocean and atmosphere for the budgets of many gases.
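To make the gas-exchange step concrete, submodels of this kind typically rest on a two-film (transfer-velocity) formulation of the air-sea flux. The sketch below is a generic illustration of that formulation, not the AIRSEA code; the Wanninkhof-style transfer velocity and all numerical values are assumptions chosen for illustration.

```python
# Two-film air-sea gas flux, F = k_w * (C_water - C_air / H), where H is the
# dimensionless Henry constant (C_air / C_water at equilibrium). Illustrative
# only; not the AIRSEA implementation.

def transfer_velocity(u10_m_s, schmidt_number):
    """Wanninkhof-type quadratic wind-speed parameterization (k660 = 0.31 u^2 cm/h)."""
    k_cm_h = 0.31 * u10_m_s**2 * (660.0 / schmidt_number) ** 0.5
    return k_cm_h / 3.6e5  # cm/h -> m/s

def air_sea_flux(c_water, c_air, henry_dimensionless, u10, schmidt):
    """Positive flux = outgassing from ocean to atmosphere (mol m^-2 s^-1)."""
    k_w = transfer_velocity(u10, schmidt)
    return k_w * (c_water - c_air / henry_dimensionless)

# Example: acetone-like tracer with a prescribed surface-water concentration.
print(air_sea_flux(c_water=1.5e-5, c_air=5.0e-7,
                   henry_dimensionless=0.003, u10=7.0, schmidt=1100.0))
```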

Relevance: 30.00%

Abstract:

This thesis is mainly concerned with a model calculation for generalized parton distributions (GPDs). We calculate vector and axial GPDs for the N→N and N→Δ transitions in the framework of a light-front quark model. This requires elaborating the connection between transition amplitudes and GPDs. We provide the first quark model calculations of N→Δ GPDs. The examination of the transition amplitudes leads to various model-independent consistency relations. These relations are not exactly obeyed by our model calculation, since the use of the impulse approximation in the light-front quark model leads to a violation of Poincaré covariance. We explore the impact of this covariance breaking on the GPDs and form factors determined in our model calculation and find large effects. The reference-frame dependence of our results, which originates from the breaking of Poincaré covariance, can be eliminated by introducing spurious covariants. We extend this formalism in order to obtain frame-independent results from our transition amplitudes.

Relevance: 30.00%

Abstract:

The development of a multibody model of a motorbike engine cranktrain is presented in this work, with an emphasis on flexible component model reduction. A modelling methodology based upon the adoption of non-ideal joints at interface locations, and the inclusion of component flexibility, is developed: both are necessary tasks if one wants to capture dynamic effects which arise in lightweight, high-speed applications. With regard to the first topic, both a ball bearing model and a journal bearing model are implemented, in order to properly capture the dynamic effects of the main connections in the system: angular contact ball bearings are modelled according to a five-DOF nonlinear scheme in order to capture the behaviour of the crankshaft main bearings, while an impedance-based hydrodynamic bearing model is implemented, providing an enhanced operation prediction at the conrod big end locations. Concerning the second matter, flexible models of the crankshaft and the connecting rod are produced. The well-established Craig-Bampton reduction technique is adopted as a general framework to obtain reduced model representations which are suitable for the subsequent multibody analyses. A particular component mode selection procedure is implemented, based on the concept of Effective Interface Mass, allowing an assessment of the accuracy of the reduced models prior to the nonlinear simulation phase. In addition, a procedure to alleviate the effects of modal truncation, based on the Modal Truncation Augmentation approach, is developed. In order to assess the performance of the proposed modal reduction schemes, numerical tests are performed on the crankshaft and conrod models in both the frequency and modal domains. A multibody model of the cranktrain is eventually assembled and simulated using commercial software. Numerical results are presented, demonstrating the effectiveness of the implemented flexible model reduction techniques. The advantages over the conventional frequency-based truncation approach are discussed.
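For reference, the following is a minimal numpy sketch of the standard Craig-Bampton transformation (static constraint modes plus a truncated set of fixed-interface normal modes). It is a generic textbook version, not the thesis implementation, and it omits the Effective Interface Mass mode selection and the Modal Truncation Augmentation step.

```python
import numpy as np
from scipy.linalg import eigh

def craig_bampton(K, M, boundary_dofs, n_modes):
    """Generic Craig-Bampton reduction: boundary DOFs are kept physically,
    interior DOFs are represented by constraint modes plus a truncated set
    of fixed-interface normal modes."""
    b = np.asarray(boundary_dofs)
    i = np.setdiff1d(np.arange(K.shape[0]), b)

    Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
    Psi = -np.linalg.solve(Kii, Kib)          # static constraint modes
    w2, Phi = eigh(Kii, M[np.ix_(i, i)])      # fixed-interface eigenproblem
    Phi = Phi[:, :n_modes]                    # keep lowest-frequency modes

    # Transformation from (boundary DOFs, modal coordinates) to physical DOFs.
    T = np.zeros((K.shape[0], len(b) + n_modes))
    T[np.ix_(b, np.arange(len(b)))] = np.eye(len(b))
    T[np.ix_(i, np.arange(len(b)))] = Psi
    T[np.ix_(i, len(b) + np.arange(n_modes))] = Phi

    return T.T @ M @ T, T.T @ K @ T, T        # reduced mass, stiffness, basis
```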

Relevance: 30.00%

Abstract:

The research aims at developing a framework for semantic-based digital survey of architectural heritage. Rooted in knowledge-based modeling, which extracts mathematical constraints on geometry from architectural treatises, as-built information obtained from image-based modeling is integrated with the ideal model in a BIM platform. Knowledge-based modeling transforms the geometry and parametric relations of architectural components from 2D prints into 3D digital models and, thanks to parametric modeling, creates a large number of variations in real time based on shape grammars. It also provides prior knowledge for semantically segmenting unorganized survey data. The emergence of SfM (Structure from Motion) makes it possible to reconstruct large, complex architectural scenes with high flexibility, low cost and full automation, but with low reliability of metric accuracy. We address this problem by combining photogrammetric approaches consisting of camera configuration, image enhancement, and bundle adjustment. Experiments show that the accuracy of image-based modeling following our workflow is comparable to that of range-based modeling. We also demonstrate positive results of our optimized approach in the digital reconstruction of a portico, where low-texture vaults and dramatic transitions in illumination cause major difficulties for the non-optimized workflow. Once the as-built model is obtained, it is integrated with the ideal model in the BIM platform, which allows multiple forms of data enrichment. In spite of its promising prospects in the AEC industry, BIM has been developed with limited consideration of reverse engineering from survey data. Besides representing the architectural heritage in parallel ways (ideal model and as-built model) and comparing their differences, we address how to create the as-built model in BIM software, which is still an open problem. The research is intended to be fundamental for research on architectural history, the documentation and conservation of architectural heritage, and the renovation of existing buildings.
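To illustrate the knowledge-based, parametric side of such a workflow, the toy sketch below generates a family of component variants from a single driving module, in the spirit of treatise-derived shape rules. The Column class, the ratios and the numerical values are hypothetical and are not taken from any specific treatise or from the framework itself.

```python
from dataclasses import dataclass

@dataclass
class Column:
    base_diameter: float      # the module D driving all other dimensions
    shaft_height: float
    capital_height: float

def treatise_column(module):
    # Hypothetical treatise-style ratios: shaft = 7 D, capital = 0.5 D.
    return Column(base_diameter=module,
                  shaft_height=7.0 * module,
                  capital_height=0.5 * module)

# A family of variants generated in real time by sweeping the module.
variants = [treatise_column(d) for d in (0.4, 0.5, 0.6, 0.75)]
for c in variants:
    print(c)
```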

Relevance: 30.00%

Abstract:

This thesis focuses on the design and characterization of a novel, artificial minimal model membrane system with chosen physical parameters, intended to mimic a nanoparticle uptake process driven exclusively by adhesion and the softness of the bilayer. The realization is based on polymersomes composed of poly(dimethylsiloxane)-b-poly(2-methyloxazoline) (PDMS-b-PMOXA) and nanoscopic colloidal particles (polystyrene, silica), together with powerful characterization techniques.

PDMS-b-PMOXA polymersomes with a radius Rh ~ 100 nm, a size polydispersity PD = 1.1 and a membrane thickness h = 16 nm were prepared using the film rehydration method. Owing to their suitable mechanical properties (Young's modulus of ~17 MPa and a bending modulus of ~7·10^-8 J), long-term stability and modifiability, these polymersomes can be used as model membranes to study physical and physicochemical aspects of the transmembrane transport of nanoparticles. A combination of photon (PCS) and fluorescence (FCS) correlation spectroscopies provides the species selectivity necessary for a unique internalization study encompassing two main efforts.

As a proof of concept, the first effort focused on the interaction of nanoparticles (Rh,NP,SiO2 = 14 nm, Rh,NP,PS = 16 nm; cNP = 0.1 g/L) and polymersomes (Rh,P = 112 nm; cP = 0.045 g/L) of fixed size and concentration. Identification of a modified form factor of the polymersome entities, selectively seen in the PCS experiment, enabled precise monitoring and a quantitative description of the incorporation process. Combining PCS and FCS led to an estimate of the number of incorporated particles per polymersome (about 8 in the examined system) and to an appropriate methodology for studying the kinetics and dynamics of the internalization process.

The second effort aimed at establishing the phenomenology needed to facilitate comparison with theory. The size and concentration of the nanoparticles were chosen as the most important system variables (Rh,NP = 14-57 nm; cNP = 0.05-0.2 g/L). It was revealed that the incorporation process can be controlled to a significant extent by changing the nanoparticle size and concentration. On average, 7 to 11 NPs with Rh,NP = 14 nm and 3 to 6 NPs with Rh,NP = 25 nm can be internalized into the present polymersomes by varying the initial nanoparticle concentration in the range 0.1-0.2 g/L. Rapid internalization of the particles by the polymersomes is observed only above a critical threshold particle concentration, which depends on the nanoparticle size.

With regard to possible pathways for particle uptake, cryogenic transmission electron microscopy (cryo-TEM) revealed two different incorporation mechanisms depending on the size of the involved nanoparticles: cooperative incorporation of groups of nanoparticles, or incorporation of single nanoparticles. Conditions for nanoparticle uptake and the controlled filling of polymersomes were established.

In the framework of this thesis, the transmembrane transport of spherical PS and SiO2 NPs into polymersomes via an internalization process was observed experimentally and examined quantitatively for the first time.

In summary, the work performed in this thesis may have a significant impact on the development of cell model systems and thus on an improved understanding of transmembrane transport processes. The present experimental findings help create the missing phenomenology necessary for a detailed understanding of a phenomenon of great relevance to transmembrane transport. The fact that transmembrane transport of nanoparticles can be performed by an artificial model system without any additional stimuli has a fundamental impact on the understanding not only of the nanoparticle invagination process but also of the interaction of nanoparticles with biological as well as polymeric membranes.
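For context on the form factor analysis, the sketch below evaluates the standard form factor of a thin hollow spherical shell (a vesicle-like object) as commonly used in light-scattering work, with the radius and membrane thickness quoted above plugged in. It is a generic expression, not the exact modified form factor fitted in the thesis.

```python
import numpy as np

def sphere_amplitude(q, R):
    """Normalized scattering amplitude of a homogeneous sphere of radius R."""
    x = q * R
    return 3.0 * (np.sin(x) - x * np.cos(x)) / x**3

def vesicle_form_factor(q, R_out, thickness):
    """Form factor of a thin spherical shell (hollow sphere)."""
    R_in = R_out - thickness
    V_out, V_in = R_out**3, R_in**3           # proportional to the two volumes
    A = V_out * sphere_amplitude(q, R_out) - V_in * sphere_amplitude(q, R_in)
    return (A / (V_out - V_in)) ** 2

q = np.linspace(1e-4, 0.05, 500)              # nm^-1, light-scattering range
P = vesicle_form_factor(q, R_out=112.0, thickness=16.0)
```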

Relevance: 30.00%

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and Cyber-Physical Systems (CPS) are nowadays a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and the physical entities: there are many models, written in different languages, that need to exchange information with each other. Normally an orchestrator is used that takes care of the simulation of the models and the exchange of information. This orchestrator is developed manually, which is tedious and time-consuming work. Our proposal is to generate the orchestrator automatically through the use of co-modeling, i.e. by modeling the coordination itself. Before achieving this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied FMI, a technology used for co-simulation in industry. In order to better understand the FMI standard, I implemented an automatic export, in the FMI format, of the models created in an existing tool for discrete modeling: TimeSquare. I also developed a simple physical model in the existing open-source OpenModelica tool. Finally, I began studying how an orchestrator works by developing a simple one; this will be useful in the future for generating an orchestrator automatically.
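The following sketch shows what a minimal fixed-step co-simulation master (orchestrator) can look like. The Simulator protocol is a hypothetical stand-in for an FMI-style co-simulation interface, not the FMI API itself, and the wiring scheme is an assumption chosen for brevity.

```python
from typing import Dict, Protocol, Tuple

class Simulator(Protocol):
    """Hypothetical FMI-style co-simulation interface (not the FMI API)."""
    def set_inputs(self, values: Dict[str, float]) -> None: ...
    def do_step(self, t: float, dt: float) -> None: ...
    def get_outputs(self) -> Dict[str, float]: ...

def orchestrate(models: Dict[str, Simulator],
                wiring: Dict[Tuple[str, str], Tuple[str, str]],
                t_end: float, dt: float) -> None:
    """Fixed-step master: exchange data, then advance every model by one macro step.
    wiring maps (dst_model, dst_port) <- (src_model, src_port)."""
    t = 0.0
    while t < t_end:
        outputs = {name: m.get_outputs() for name, m in models.items()}
        for (dst, dst_port), (src, src_port) in wiring.items():
            models[dst].set_inputs({dst_port: outputs[src][src_port]})
        for m in models.values():
            m.do_step(t, dt)
        t += dt
```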

Relevance: 30.00%

Abstract:

This paper presents a kernel density correlation-based non-rigid point set matching method and shows its application in statistical model-based 2D/3D reconstruction of a scaled, patient-specific model from an uncalibrated X-ray radiograph. In this method, both the reference point set and the floating point set are first represented using kernel density estimates. A correlation measure between these two kernel density estimates is then optimized to find a displacement field such that the floating point set is moved onto the reference point set. Regularizations based on the overall deformation energy and the motion smoothness energy are used to constrain the displacement field for robust point set matching. Incorporating this non-rigid point set matching method into a statistical model-based 2D/3D reconstruction framework, we can reconstruct a scaled, patient-specific model from noisy edge points that are extracted directly from the X-ray radiograph by an edge detector. Our experiment, conducted on datasets of two patients and six cadavers, demonstrates a mean reconstruction error of 1.9 mm.
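As an illustration of the correlation measure at the core of such methods, the sketch below computes a Gaussian kernel correlation between two point sets and maximizes it over a pure translation. The paper itself optimizes a regularized non-rigid displacement field, so this is only a simplified stand-in, with the function names and parameters chosen for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def kernel_correlation(X, Y, sigma):
    """Gaussian kernel correlation between two point sets (higher = better aligned)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2)).sum()

def register_translation(floating, reference, sigma=2.0):
    """Toy registration: maximize kernel correlation over a pure translation."""
    cost = lambda t: -kernel_correlation(floating + t, reference, sigma)
    res = minimize(cost, x0=np.zeros(floating.shape[1]), method="Nelder-Mead")
    return floating + res.x
```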

Relevance: 30.00%

Abstract:

Synaptic strength depresses for low and potentiates for high activation of the postsynaptic neuron. This feature is a key property of the Bienenstock–Cooper–Munro (BCM) synaptic learning rule, which has been shown to maximize the selectivity of the postsynaptic neuron, and thereby offers a possible explanation for experience-dependent cortical plasticity such as orientation selectivity. However, the BCM framework is rate-based and a significant amount of recent work has shown that synaptic plasticity also depends on the precise timing of presynaptic and postsynaptic spikes. Here we consider a triplet model of spike-timing–dependent plasticity (STDP) that depends on the interactions of three precisely timed spikes. Triplet STDP has been shown to describe plasticity experiments that the classical STDP rule, based on pairs of spikes, has failed to capture. In the case of rate-based patterns, we show a tight correspondence between the triplet STDP rule and the BCM rule. We analytically demonstrate the selectivity property of the triplet STDP rule for orthogonal inputs and perform numerical simulations for nonorthogonal inputs. Moreover, in contrast to BCM, we show that triplet STDP can also induce selectivity for input patterns consisting of higher-order spatiotemporal correlations, which exist in natural stimuli and have been measured in the brain. We show that this sensitivity to higher-order correlations can be used to develop direction and speed selectivity.
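For reference, a minimal simulation of a triplet rule (in the standard formulation with two presynaptic and two postsynaptic trace variables, in the style of Pfister and Gerstner) might look as follows. The amplitudes and time constants are illustrative defaults in the ballpark of published fits, not the values used in this study.

```python
import numpy as np

def triplet_stdp(pre_spikes, post_spikes, dt=0.1, T=1000.0,
                 tau_plus=16.8, tau_x=101.0, tau_minus=33.7, tau_y=125.0,
                 A2p=5e-10, A3p=6.2e-3, A2m=7e-3, A3m=2.3e-4, w0=0.5):
    """Euler simulation of a triplet STDP rule with all-to-all interactions.
    Times in ms; amplitudes and time constants are illustrative defaults."""
    r1 = r2 = o1 = o2 = 0.0                    # pre (r) and post (o) spike traces
    w = w0
    pre = set(np.round(np.asarray(pre_spikes) / dt).astype(int))
    post = set(np.round(np.asarray(post_spikes) / dt).astype(int))
    for k in range(int(T / dt)):
        r1 -= dt * r1 / tau_plus; r2 -= dt * r2 / tau_x
        o1 -= dt * o1 / tau_minus; o2 -= dt * o2 / tau_y
        if k in pre:                           # pre spike: pair + triplet depression
            w -= o1 * (A2m + A3m * r2)         # r2 evaluated just before its update
            r1 += 1.0; r2 += 1.0
        if k in post:                          # post spike: pair + triplet potentiation
            w += r1 * (A2p + A3p * o2)         # o2 evaluated just before its update
            o1 += 1.0; o2 += 1.0
    return w
```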

Relevance: 30.00%

Abstract:

The Simulation Automation Framework for Experiments (SAFE) is a project created to raise the level of abstraction in network simulation tools and thereby address issues that undermine credibility. SAFE incorporates best practices in network simulation to automate the experimental process and to guide users in the development of sound scientific studies using the popular ns-3 network simulator. My contributions to the SAFE project are the design of two XML-based languages called NEDL (ns-3 Experiment Description Language) and NSTL (ns-3 Script Templating Language), which facilitate the description of experiments and of network simulation models, respectively. The languages provide a foundation for the construction of better interfaces between the user and the ns-3 simulator. They also provide input to a mechanism which automates the execution of network simulation experiments. Additionally, this thesis demonstrates that one can develop tools to generate ns-3 scripts in Python or C++ automatically from NSTL model descriptions.
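To give a flavour of template-driven script generation, the toy example below expands a made-up, NSTL-flavoured XML description into ns-3 style C++ boilerplate. The XML schema shown is invented for illustration and is not the actual NSTL syntax; the emitted C++ lines follow the standard ns-3 point-to-point tutorial API.

```python
import xml.etree.ElementTree as ET

# Made-up, NSTL-flavoured description (not the real NSTL schema).
description = """
<model name="two-node-p2p">
  <nodes count="2"/>
  <pointToPoint dataRate="5Mbps" delay="2ms"/>
</model>
"""

CPP_TEMPLATE = """\
NodeContainer nodes;
nodes.Create ({count});
PointToPointHelper p2p;
p2p.SetDeviceAttribute ("DataRate", StringValue ("{rate}"));
p2p.SetChannelAttribute ("Delay", StringValue ("{delay}"));
NetDeviceContainer devices = p2p.Install (nodes);
"""

root = ET.fromstring(description)
print(CPP_TEMPLATE.format(count=root.find("nodes").get("count"),
                          rate=root.find("pointToPoint").get("dataRate"),
                          delay=root.find("pointToPoint").get("delay")))
```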

Relevance: 30.00%

Abstract:

Stimulation of human epileptic tissue can induce rhythmic, self-terminating responses on the EEG or ECoG. These responses play a potentially important role in localising tissue involved in the generation of seizure activity, yet the underlying mechanisms are unknown. However, in vitro evidence suggests that self-terminating oscillations in nervous tissue are underpinned by non-trivial spatio-temporal dynamics in an excitable medium. In this study, we investigate this hypothesis in spatial extensions to a neural mass model for epileptiform dynamics. We demonstrate that spatial extensions to this model in one and two dimensions display propagating travelling waves but also more complex transient dynamics in response to local perturbations. The neural mass formulation, with local excitatory and inhibitory circuits, allows the direct incorporation of spatially distributed, functional heterogeneities into the model. We show that such heterogeneities can lead to prolonged reverberating responses to a single pulse perturbation, depending upon the location at which the stimulus is delivered. This leads to the hypothesis that prolonged rhythmic responses to local stimulation in epileptogenic tissue result from repeated self-excitation of regions of tissue with diminished inhibitory capabilities. Combined with previous models of the dynamics of focal seizures, this macroscopic framework is a first step towards an explicit spatial formulation of the concept of the epileptogenic zone. Ultimately, an improved understanding of the pathophysiologic mechanisms of the epileptogenic zone will help to improve diagnostic and therapeutic measures for treating epilepsy.
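A minimal example of the kind of spatially extended excitable model discussed here is sketched below: a 1D field of coupled excitatory and inhibitory populations with nearest-neighbour coupling, a patch of weakened inhibition, and a single local pulse perturbation. It is a generic Wilson-Cowan-style toy with made-up parameters, not the specific neural mass model of the paper.

```python
import numpy as np

def sigmoid(x, theta=2.0):
    return 1.0 / (1.0 + np.exp(-(x - theta)))

N, dt, steps = 200, 0.05, 4000
tau_e, tau_i = 1.0, 2.0
w_ee, w_ei, w_ie, w_ii, w_space = 12.0, 10.0, 9.0, 3.0, 2.0

E, I = np.zeros(N), np.zeros(N)
inh_scale = np.ones(N)
inh_scale[120:140] = 0.6        # heterogeneity: a patch of weakened inhibition
E[95:105] = 0.8                 # single local pulse perturbation

for _ in range(steps):
    lap = np.roll(E, 1) + np.roll(E, -1) - 2.0 * E   # nearest-neighbour coupling
    dE = (-E + sigmoid(w_ee * E - w_ei * inh_scale * I + w_space * lap)) / tau_e
    dI = (-I + sigmoid(w_ie * E - w_ii * I)) / tau_i
    E += dt * dE
    I += dt * dI
```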

Relevance: 30.00%

Abstract:

The group analysed some syntactic and phonological phenomena that presuppose the existence of interrelated components within the lexicon, motivating the assumption that there are sublexicons within the global lexicon of a speaker. This result is confirmed by experimental findings in neurolinguistics. Hungarian-speaking agrammatic aphasics were tested in several ways, and the results show that the sublexicon of closed-class lexical items provides a highly automated, complex device for processing surface sentence structure. Analysing Hungarian ellipsis data from a semantic-syntactic perspective, the group established that the lexicon is best conceived of as being split into at least two main sublexicons: the store of semantic-syntactic feature bundles and a separate store of sound forms. On this basis they proposed a format for representing open-class lexical items whose meanings are connected via certain semantic relations. They also proposed a new classification of verbs to account for the contribution to the aspectual reading of the sentence depending on the referential type of the argument, and a new account of the syntactic and semantic behaviour of aspectual prefixes. The partitioned sets of lexical items are sublexicons on phonological grounds. These sublexicons differ in terms of phonotactic grammaticality. The degrees of phonotactic grammaticality are tied up with the problem of psychological reality, i.e. how many such degrees native speakers are sensitive to. The group developed a hierarchical construction network as an extension of the original General Inheritance Network formalism, and this framework was then used as a platform for the implementation of the grammar fragments.

Relevance: 30.00%

Abstract:

BACKGROUND: Drugs are routinely combined in anesthesia and pain management to obtain an enhancement of the desired effects. However, a parallel enhancement of the undesired effects may take place as well, resulting in limited therapeutic usefulness. Therefore, when addressing the question of optimal drug combinations, side effects must be taken into account. METHODS: By extending a previously published interaction model, the authors propose a method to study drug interactions that also takes their side effects into account. A general outcome parameter, identified as the patient's well-being, is defined by superposition of positive and negative effects. Well-being response surfaces are computed and analyzed for varying drug pharmacodynamics and interaction types. In particular, the existence of multiple maxima and of optimal drug combinations is investigated for the combination of two drugs. RESULTS: Both the drug pharmacodynamics and the interaction type affect the well-being surface and the resulting optimal combinations. The effect of the interaction parameters can be explained in terms of synergy and antagonism and remains unchanged for varying pharmacodynamics. In all simulations performed for the combination of two drugs, the presence of more than one maximum was never observed. CONCLUSIONS: The model is consistent with clinical knowledge and supports previously published experimental results on optimal drug combinations. This new framework improves understanding of the characteristics of drug combinations used in clinical practice and can be used in clinical research to identify optimal drug dosing.
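A simplified sketch of the response-surface idea is given below: desired and undesired effects are each modelled with a Greco-style interaction term, the well-being surface is their difference, and the optimum is located on a grid. The functional form and all parameter values are assumptions for illustration, not the published model.

```python
import numpy as np

def effect(ca, cb, c50a, c50b, alpha, gamma=2.0, emax=1.0):
    """Greco-style response surface: alpha > 0 synergy, alpha < 0 antagonism."""
    u = ca / c50a + cb / c50b + alpha * (ca / c50a) * (cb / c50b)
    return emax * u ** gamma / (1.0 + u ** gamma)

ca, cb = np.meshgrid(np.linspace(0, 4, 200), np.linspace(0, 4, 200))
desired = effect(ca, cb, c50a=1.0, c50b=1.5, alpha=1.0)    # synergistic benefit
adverse = effect(ca, cb, c50a=2.0, c50b=2.5, alpha=0.5)    # slower-rising harm
wellbeing = desired - adverse                              # superposition

i, j = np.unravel_index(np.argmax(wellbeing), wellbeing.shape)
print("optimal combination:", ca[i, j], cb[i, j], "well-being:", wellbeing[i, j])
```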

Relevance: 30.00%

Abstract:

Data visualization is the process of representing data as pictures to support reasoning about the underlying data. For the interpretation to be as easy as possible, we need to stay as close as possible to the original data. Because most visualization tools have an internal meta-model different from the one of the data being presented, they usually need to duplicate the original data to make it conform to their meta-model. This leads to an increase in the resources needed, an increase which is not always justified. In this work we argue for the need for an engine that is as close as possible to the data, and we present our solution of moving the visualization tool to the data instead of moving the data to the visualization tool. Our solution also emphasizes the necessity of reusing basic blocks to express complex visualizations and of allowing the programmer to script the visualization using his preferred tools rather than a third-party format. As a validation of the expressiveness of our framework, we show how we express several already published visualizations and describe the pros and cons of the approach.
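The toy sketch below illustrates the general idea of moving the engine to the data: rendering attributes are expressed as small reusable blocks of functions evaluated directly on the programmer's own objects, so no copy into a separate meta-model is needed. The class names and API are hypothetical, not the framework's actual interface.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, List, Tuple

@dataclass
class ShapeBlock:
    """A reusable block: rendering attributes as functions of a model object."""
    width: Callable[[Any], float]
    height: Callable[[Any], float]
    label: Callable[[Any], str]

@dataclass
class Canvas:
    shapes: List[Tuple[str, float, float]] = field(default_factory=list)
    def nodes(self, items, block: ShapeBlock):
        for it in items:   # the engine walks the original objects; no copy is made
            self.shapes.append((block.label(it), block.width(it), block.height(it)))
        return self

# Scripted directly against arbitrary domain objects:
classes = [{"name": "Parser", "methods": 12, "lines": 300},
           {"name": "Lexer", "methods": 5, "lines": 90}]
view = Canvas().nodes(classes, ShapeBlock(width=lambda c: c["methods"],
                                          height=lambda c: c["lines"] / 10,
                                          label=lambda c: c["name"]))
print(view.shapes)
```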

Relevance: 30.00%

Abstract:

We present a framework for statistical finite element analysis combining shape and material properties, which allows statistical statements about biomechanical performance to be made across a given population. In this paper, we focus on the design of orthopaedic implants that fit a maximum percentage of the target population, both in terms of geometry and of biomechanical stability. CT scans of the bone under consideration are registered non-rigidly to obtain correspondences in position and intensity between them. A statistical model of shape and intensity (bone density) is computed by means of principal component analysis. Afterwards, finite element analysis (FEA) is performed to analyse the biomechanical performance of the bones. Realistic forces are applied to the bones, and the resulting displacement and bone stress distribution are calculated. The mechanical behaviour of different PCA bone instances is compared.
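The statistical model construction step can be sketched with plain PCA on stacked shape-plus-intensity vectors, as below. The data array, its dimensions and the mode weights are placeholders, and the remainder of the pipeline (non-rigid registration, FEA) is not reproduced here.

```python
import numpy as np

def build_statistical_model(samples):
    """PCA of concatenated shape + intensity vectors (one row per subject)."""
    X = np.asarray(samples, dtype=float)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    eigvals = s ** 2 / (len(X) - 1)           # variance captured by each mode
    return mean, Vt, eigvals

def instance(mean, modes, eigvals, b):
    """New instance from standardized mode weights b (in standard deviations)."""
    k = len(b)
    return mean + (b * np.sqrt(eigvals[:k])) @ modes[:k]

# Placeholder data: 20 subjects, each a stacked vector of coordinates and intensities.
data = np.random.rand(20, 3000)
mean, modes, lam = build_statistical_model(data)
new_bone = instance(mean, modes, lam, b=np.array([1.5, -0.5, 0.0]))
```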

Relevance: 30.00%

Abstract:

Despite the widespread popularity of linear models for correlated outcomes (e.g. linear mixed models and time series models), distribution diagnostic methodology remains relatively underdeveloped in this context. In this paper we present an easy-to-implement approach that lends itself to graphical displays of model fit. Our approach involves multiplying the estimated marginal residual vector by the Cholesky decomposition of the inverse of the estimated marginal variance matrix. The resulting "rotated" residuals are used to construct an empirical cumulative distribution function and pointwise standard errors. The theoretical framework, including conditions and asymptotic properties, involves technical details that are motivated by Lange and Ryan (1989), Pierce (1982), and Randles (1982). Our method appears to work well in a variety of circumstances, including models having independent units of sampling (clustered data) and models for which all observations are correlated (e.g., a single time series). Our methods can produce satisfactory results even for models that do not satisfy all of the technical conditions stated in our theory.
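A minimal sketch of the rotation step, assuming the marginal residual vector and the estimated marginal covariance matrix are already available, might look as follows; plotting the empirical CDF against the standard normal CDF is left to the reader's preferred tool.

```python
import numpy as np
from scipy.linalg import cholesky

def rotated_residuals(residuals, V_hat):
    """Rotate marginal residuals with the Cholesky factor of the inverse of the
    estimated marginal covariance; under a correct model the result is
    approximately iid standard normal."""
    C = cholesky(np.linalg.inv(V_hat), lower=True)   # V_hat^{-1} = C C^T
    return C.T @ residuals                           # Cov(C^T r) = I

def empirical_cdf(x):
    xs = np.sort(x)
    return xs, np.arange(1, len(xs) + 1) / len(xs)

# The ECDF of the rotated residuals is then compared with the standard normal
# CDF, with pointwise standard errors, as a graphical check of model fit.
```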