749 results for Consortial Implementations


Relevance:

10.00%

Publisher:

Abstract:

In this thesis I present a new coarse-grained model suitable for investigating the phase behavior of rod-coil block copolymers on mesoscopic length scales. In this model the rods are represented by hard spherocylinders, whereas the coil block consists of interconnected beads. The interactions between the constituents are based on local densities, which facilitates efficient Monte Carlo sampling of the phase space. I verify the applicability of the model and the simulation approach by means of several examples. I treat pure rod systems and mixtures of rod and coil polymers. Then I append coils to the rods and investigate the role of the different model parameters. Furthermore, I compare different implementations of the model. I show that the rod-coil block copolymers in our model exhibit typical microphase-separated configurations as well as extraordinary phases, such as the wavy lamellar state, percolating structures and clusters. Additionally, I demonstrate the metastability of the observed zigzag phase in our model. A central point of this thesis is the examination of the phase behavior of the rod-coil block copolymers as a function of chain length and of the interaction strength between rods and coils. The observations of these studies are summarized in a phase diagram for rod-coil block copolymers. Furthermore, I confirm a stabilization of the smectic phase with increasing coil fraction. In the second part of this work I present a side project in which I derive a model permitting the simulation of tetrapods with and without grafted semiconducting block copolymers. The effect of these polymers is added implicitly through effective interactions between the tetrapods. While the depletion interaction is described in an approximate manner within the Asakura-Oosawa model, the free energy penalty for the brush compression is calculated within the Alexander-de Gennes model.
Recent experiments with CdSe tetrapods show that grafted tetrapods are much better dispersed in the polymer matrix than bare tetrapods. My simulations confirm that bare tetrapods tend to aggregate in the matrix of excess polymers, while clustering is significantly reduced after grafting polymer chains to the tetrapods. Finally, I propose a possible extension enabling the simulation of a system with fluctuating volume and demonstrate its basic functionality. This study originated in a cooperation with an experimental group, with the goal of analyzing the morphology of these systems in order to find the ideal morphology for hybrid solar cells.
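The sampling scheme described above rests on the standard Metropolis criterion. A toy sketch illustrates the acceptance step with a hypothetical density-based energy: beads in a one-dimensional periodic box, with the energy a made-up quadratic functional of per-cell occupancies (none of the numbers below come from the thesis).

```python
import math, random

random.seed(7)

# Toy Metropolis sketch with density-based interactions (illustrative
# energy functional and parameters, not the thesis model).
L, n_cells, n_beads, kT = 10.0, 10, 40, 1.0
beads = [random.uniform(0, L) for _ in range(n_beads)]

def energy(pos):
    counts = [0] * n_cells
    for x in pos:
        counts[int(x / L * n_cells) % n_cells] += 1
    return sum(0.5 * c * c for c in counts)   # E = sum_c rho_c^2 / 2

E = energy(beads)
accepted = 0
for _ in range(5000):
    i = random.randrange(n_beads)
    old = beads[i]
    beads[i] = (old + random.uniform(-0.5, 0.5)) % L   # trial displacement
    E_new = energy(beads)
    dE = E_new - E
    if dE <= 0 or random.random() < math.exp(-dE / kT):
        E = E_new                  # accept the trial move
        accepted += 1
    else:
        beads[i] = old             # reject: restore the old position
```

The point of local-density interactions is that a move only changes the occupancy of the two cells involved, so in practice the energy difference is computed locally; the full recompute here is for clarity only.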


Topographically organized neurons represent multiple stimuli within complex visual scenes and compete for subsequent processing in higher visual centers. The underlying neural mechanisms of this process have long been elusive. We investigate an experimentally constrained model of a midbrain network: the optic tectum and the reciprocally connected nucleus isthmi. We show that a recurrent antitopographic inhibition mediates the competitive stimulus selection between distant sensory inputs in this visual pathway. This recurrent antitopographic inhibition is fundamentally different from surround inhibition in that it projects to all locations of its input layer except the locus from which it receives input. At a larger scale, the model shows how a focal top-down input from a forebrain region, the arcopallial gaze field, biases the competitive stimulus selection via the combined activation of a local excitation and the recurrent antitopographic inhibition. Our findings reveal circuit mechanisms of competitive stimulus selection and should motivate a search for anatomical implementations of these mechanisms in a range of vertebrate attentional systems.
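The selection mechanism can be conveyed with a toy rate model (made-up sizes, weights, and time constants, not the paper's fitted model): every location inhibits all other locations but not itself, so of two distant stimuli the stronger one suppresses the weaker.

```python
import numpy as np

# Toy sketch of recurrent antitopographic inhibition: the weight matrix
# has a zero diagonal, i.e. each locus inhibits everywhere except itself.
# All numbers are illustrative, not fitted to data.
n = 40
inp = np.zeros(n)
inp[10], inp[30] = 1.0, 1.2          # two distant stimuli, unequal strength

W = 1.2 * (1.0 - np.eye(n))          # antitopographic weights, zero diagonal

r = np.zeros(n)
dt = 0.2
for _ in range(500):                  # relax rectified rate dynamics
    r = r + dt * (-r + np.maximum(inp - W @ r, 0.0))
```

At the fixed point only the location receiving the stronger stimulus remains active, a winner-take-all outcome; with weaker inhibition the same circuit merely biases, rather than fully suppresses, the losing stimulus.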


An alternative definition of the discrete Pascal transform is given in terms of difference operators, revealing the fundamental concept behind the transform in both the one- and two-dimensional cases and extending it to non-square two-dimensional applications. Efficient modularised implementations are proposed.
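The one-dimensional case can be sketched directly from the difference-operator idea; the matrix definition (entries (-1)^j C(i, j)) serves as a reference, and the transform's involution property gives a built-in check. This is an illustrative reconstruction, not the paper's code.

```python
from math import comb

def pascal_transform(x):
    """Discrete Pascal transform of a 1-D sequence via repeated
    difference operations on a copy, with no explicit matrix multiply."""
    y = list(x)
    n = len(y)
    for step in range(1, n):
        # apply the difference operator to the tail of the sequence
        for j in range(n - 1, step - 1, -1):
            y[j] = y[j - 1] - y[j]
    return y

def pascal_matrix(n):
    """Reference definition: lower-triangular matrix with entries
    (-1)**j * C(i, j)."""
    return [[(-1) ** j * comb(i, j) if j <= i else 0 for j in range(n)]
            for i in range(n)]
```

The transform is self-inverse (applying it twice recovers the input), and the two-dimensional case, square or not, follows by applying the same one-dimensional pass along each dimension of the array.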


Cell therapies have gained increasing interest and have been developed in several approaches related to the treatment of damaged myocardium. The results of multiple clinical trials have already been reported, almost exclusively involving the direct injection of stem cells. It has, however, been postulated that the efficiency of injected cells may be hindered by the mechanical trauma of injection and by their low survival in the hostile environment. It has indeed been demonstrated that cell mortality due to the injection approaches 90%. Major issues still need to be resolved, and bed-to-bench follow-up is paramount to foster clinical implementations. The tissue engineering approach thus constitutes an attractive alternative, since it provides the opportunity to deliver a large number of cells that are already organized in an extracellular matrix. Recent laboratory reports have confirmed the promise of this approach and have already encouraged a few groups to investigate it in clinical studies. We discuss current knowledge regarding engineered tissue for myocardial repair or replacement, and in particular the recent implementation of nanotechnological approaches.


Several practical obstacles in data handling and evaluation complicate the use of quantitative localized magnetic resonance spectroscopy (qMRS) in clinical routine MR examinations. To overcome these obstacles, a clinically feasible MR pulse sequence protocol based on standard available MR pulse sequences for qMRS has been implemented, along with newly added functionalities in the free software package jMRUI-v5.0, to make qMRS attractive for clinical routine. This enables (a) easy and fast DICOM data transfer between the MR console and the qMRS computer, (b) visualization of combined MR spectroscopy and imaging, (c) creation and network transfer of spectroscopy reports in DICOM format, (d) integration of advanced water reference models for absolute quantification, and (e) setup of databases containing normal metabolite concentrations of healthy subjects. To demonstrate the workflow of qMRS using these implementations, databases of normal metabolite concentrations in different regions of brain tissue were created using spectroscopic data acquired in 55 normal subjects (age range 6-61 years) on 1.5T and 3T MR systems, and illustrated in one clinical case of a typical brain tumor (primitive neuroectodermal tumor). The MR pulse sequence protocol and newly implemented software functionalities facilitate the incorporation of qMRS, with reference to normal metabolite concentration data, in daily clinical routine. Magn Reson Med, 2013. © 2012 Wiley Periodicals, Inc.
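The water-referenced absolute quantification in (d) follows, in textbook form, from the ratio of relaxation-corrected signals scaled by the proton ratio. The sketch below uses a generic spin-echo attenuation factor; the sequence model and all numbers are illustrative assumptions, not the paper's models or values.

```python
import math

def attenuation(TE, TR, T1, T2):
    """T1/T2 signal attenuation for a simple spin-echo acquisition
    (illustrative sequence model)."""
    return math.exp(-TE / T2) * (1.0 - math.exp(-TR / T1))

def metabolite_conc(S_met, S_wat, n_protons, water_conc,
                    TE, TR, T1_met, T2_met, T1_wat, T2_wat):
    """Concentration in the units of water_conc: both signals are
    corrected for relaxation, then scaled by the proton ratio
    (water contributes 2 protons per molecule)."""
    S_met_c = S_met / attenuation(TE, TR, T1_met, T2_met)
    S_wat_c = S_wat / attenuation(TE, TR, T1_wat, T2_wat)
    return (S_met_c / S_wat_c) * (2.0 / n_protons) * water_conc
```

For example, a metabolite observed via three equivalent protons against a brain water concentration of roughly 36 M would be quantified with `n_protons=3` and `water_conc=36000.0` mM (hypothetical numbers).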


Traits are fine-grained components that can be used to compose classes while avoiding many of the problems of multiple inheritance and mixin-based approaches. Since most implementations of traits have focused on dynamically typed languages, the question naturally arises: how can one best introduce traits into statically typed languages such as Java and C#?
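Trait-style composition is easy to demonstrate in a dynamically typed language such as Python (the open question above concerns carrying this over to static typing). The sketch below, with hypothetical trait and class names, composes two small stateless method bundles into one class, each trait deriving behavior from a method the composing class must supply.

```python
class TComparable:
    """Trait: derives comparison operators from one required method."""
    def less_than(self, other):        # required from the composing class
        raise NotImplementedError
    def __lt__(self, other):
        return self.less_than(other)
    def __ge__(self, other):
        return not self.less_than(other)

class TPrintable:
    """Trait: derives a label from a required name() method."""
    def name(self):                    # required from the composing class
        raise NotImplementedError
    def label(self):
        return f"<{self.name()}>"

class Money(TComparable, TPrintable):  # compose the two traits
    def __init__(self, amount):
        self.amount = amount
    def less_than(self, other):
        return self.amount < other.amount
    def name(self):
        return f"{self.amount} CHF"
```

The traits carry behavior but no state, and the composing class resolves the required methods, which is the essence of the trait model as opposed to ordinary multiple inheritance.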


Latent class regression models are useful tools for assessing associations between covariates and latent variables. However, evaluation of key model assumptions cannot be performed using methods from standard regression models, owing to the unobserved nature of latent outcome variables. This paper presents graphical diagnostic tools to evaluate whether latent class regression models adhere to the standard assumptions of the model: conditional independence and non-differential measurement. An integral part of these methods is the use of a Markov chain Monte Carlo (MCMC) estimation procedure. Unlike standard maximum likelihood implementations of latent class regression model estimation, the MCMC approach allows us to calculate posterior distributions and point estimates of any functions of the parameters, and it is this convenience that makes the proposed diagnostics possible. As a motivating example we present an analysis of the association between depression and socioeconomic status, using data from the Epidemiologic Catchment Area study: the latent variable depression is regressed on education and income indicators, in addition to age, gender, and marital status variables. While the fitted latent class regression model yields interesting results, the model parameters are found to be invalid due to the violation of model assumptions, a violation clearly identified by the presented diagnostic plots. These methods can be applied to standard latent class and latent class regression models, and the general principle can be extended to evaluate model assumptions in other types of models.
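The practical payoff of MCMC estimation, posterior distributions for arbitrary functions of the parameters, can be shown with hypothetical draws standing in for real sampler output (the Beta draws below are illustrative stand-ins for latent class model parameters, not output from the paper's analysis).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical posterior draws: a class-membership probability and a
# conditional response probability (in practice these come from the
# MCMC sampler for the latent class regression model).
pi_draws = rng.beta(20, 30, size=5000)   # P(class = 1)
p_draws = rng.beta(15, 5, size=5000)     # P(symptom | class = 1)

# Any function of the parameters gets a posterior for free, e.g. the
# log-odds of class membership:
log_odds = np.log(pi_draws / (1 - pi_draws))
point_est = log_odds.mean()
ci_lo, ci_hi = np.quantile(log_odds, [0.025, 0.975])
```

A maximum likelihood fit would require a delta-method or bootstrap step for the same derived quantity; with posterior draws it is a one-line transformation, which is what the diagnostic plots exploit.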


We develop fast fitting methods for generalized functional linear models. The functional predictor is deliberately undersmoothed by projecting it on a large number of smooth eigenvectors, and the coefficient function is estimated using penalized spline regression. Our method can be applied to many functional data designs, including functions measured with or without error and sampled sparsely or densely. The methods also extend to the case of multiple functional predictors or functional predictors with a natural multilevel structure. Our approach can be implemented using standard mixed-effects software and is computationally fast. Our methodology is motivated by a diffusion tensor imaging (DTI) study whose aim is to analyze differences between various cerebral white matter tract property measurements of multiple sclerosis (MS) patients and controls. While the statistical developments proposed here were motivated by the DTI study, the methodology is designed and presented in generality and is applicable to many other areas of scientific research. An online appendix provides R implementations of all simulations.
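A minimal numpy sketch of the estimator's skeleton on simulated data: project a densely observed functional predictor on its leading eigenvectors, then fit a penalized regression on the scores. A plain ridge penalty stands in for the penalized spline, and all simulation settings are made up; this only illustrates the projection-plus-penalization structure.

```python
import numpy as np

rng = np.random.default_rng(1)

n, p, K = 300, 50, 10                 # curves, grid points, eigenvectors
t = np.linspace(0, 1, p)

# smooth random curves built from a few sinusoids (toy data generator)
basis = np.array([np.sin((k + 1) * np.pi * t) for k in range(6)])
X = rng.normal(size=(n, 6)) @ basis

beta_true = np.sin(2 * np.pi * t)     # true coefficient function
y = X @ beta_true / p + rng.normal(scale=0.05, size=n)

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
S = Xc @ Vt[:K].T                     # scores on leading eigenvectors

lam = 1e-3                            # light penalty: undersmoothing
coef = np.linalg.solve(S.T @ S + lam * np.eye(K), S.T @ (y - y.mean()))
beta_hat = Vt[:K].T @ coef * p        # back to the function scale
```

Projecting on more eigenvectors than strictly needed (here K exceeds the effective rank) is the undersmoothing step; the penalty, not the truncation, controls the smoothness of the estimate.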


Permutation tests are useful for drawing inferences from imaging data because of their flexibility and their ability to capture features of the brain that are difficult to capture parametrically. However, most implementations of permutation tests ignore important confounding covariates. To employ covariate control in a nonparametric setting, we have developed a Markov chain Monte Carlo (MCMC) algorithm for conditional permutation testing using propensity scores. We present the first use of this methodology for imaging data. Our MCMC algorithm is an extension of algorithms developed to approximate exact conditional probabilities in contingency tables and in logit and log-linear models. An application of our nonparametric method to remove potential bias due to the observed covariates is presented.
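The idea of conditioning the permutation distribution on covariates can be conveyed with a simplified stand-in for the MCMC algorithm: permute group labels only within propensity-score strata, so the permutation null respects the confounder. Everything below (the data generator, the five-stratum split) is an illustrative assumption, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 400
x = rng.normal(size=n)                    # confounding covariate
propensity = 1 / (1 + np.exp(-x))         # P(group = 1 | x)
g = rng.random(n) < propensity            # confounded group assignment
y = 0.5 * x + rng.normal(size=n)          # outcome driven by x only

def diff_means(y, g):
    return y[g].mean() - y[~g].mean()

# stratify on the propensity score, then shuffle labels within strata
strata = np.digitize(propensity,
                     np.quantile(propensity, [0.2, 0.4, 0.6, 0.8]))
obs = diff_means(y, g)
null = []
for _ in range(2000):
    gp = g.copy()
    for s in np.unique(strata):
        idx = np.where(strata == s)[0]
        gp[idx] = rng.permutation(gp[idx])
    null.append(diff_means(y, gp))
p_value = (np.sum(np.abs(null) >= abs(obs)) + 1) / (len(null) + 1)
```

An unconditional permutation would shuffle labels across strata and mistake the confounding by `x` for a group effect; the stratified shuffle keeps the strata composition of each group fixed under the null.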


Multiple-input multiple-output (MIMO) technology is an emerging solution for high data rate wireless communications. We develop soft-decision based equalization techniques for frequency selective MIMO channels in the quest for low-complexity equalizers with BER performance competitive with that of maximum-likelihood (ML) sequence detection. We first propose soft decision equalization (SDE), and demonstrate that decision feedback equalization (DFE) based on soft decisions, expressed via the posterior probabilities associated with feedback symbols, is able to outperform hard-decision DFE, with a low computational cost that is polynomial in the number of symbols to be recovered and linear in the signal constellation size. Building upon the probabilistic data association (PDA) multiuser detector, we present two new MIMO equalization solutions to handle the distinctive channel memory. With their low complexity, simple implementations, and the impressive near-optimum performance offered by iterative soft-decision processing, the proposed SDE methods are attractive candidates to deliver efficient reception solutions to practical high-capacity MIMO systems. Motivated by the need for low-complexity receiver processing, we further present an alternative low-complexity soft-decision equalization approach for frequency selective MIMO communication systems. With the help of iterative processing, two detection and estimation schemes based on second-order statistics are combined to yield a two-part receiver structure: local multiuser detection (MUD) using soft-decision PDA detection, and dynamic noise-interference tracking using Kalman filtering. The proposed Kalman-PDA detector performs local MUD within a sub-block of the received data instead of over the entire data set, to reduce the computational load.
At the same time, all the interference affecting the local sub-block, including both multiple-access and inter-symbol interference, is properly modeled as the state vector of a linear system and dynamically tracked by Kalman filtering. Two types of Kalman filters are designed, both of which are able to track a finite impulse response (FIR) MIMO channel of any memory length. The overall algorithms enjoy low complexity that is only polynomial in the number of information-bearing bits to be detected, regardless of the data block size. Furthermore, we introduce two optional performance-enhancing techniques: cross-layer automatic repeat request (ARQ) for uncoded systems and a code-aided method for coded systems. We take Kalman-PDA as an example and show via simulations that both techniques can render error performance that is better than Kalman-PDA alone and competitive with sphere decoding. Finally, we consider the case in which channel state information (CSI) is not perfectly known to the receiver, and present an iterative channel estimation algorithm. Simulations show that the performance of SDE with channel estimation approaches that of SDE with perfect CSI.
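The soft decisions at the heart of SDE replace a hard sign decision with the posterior mean of the feedback symbol. For BPSK in additive Gaussian noise this has a standard closed form (a textbook result, not code from the dissertation):

```python
import math

def soft_symbol_bpsk(y, noise_var):
    """Posterior-mean soft decision for a BPSK symbol s in {-1, +1}
    observed as y = s + n, n ~ N(0, noise_var), with equal priors:
        E[s | y] = tanh(y / noise_var).
    Used in place of sign(y) when feeding decisions back."""
    return math.tanh(y / noise_var)

def hard_symbol(y):
    """Conventional hard decision, for comparison."""
    return 1.0 if y >= 0 else -1.0
```

A reliable observation (|y| large relative to the noise variance) yields a soft symbol near ±1 and behaves like a hard decision, while an unreliable one shrinks toward zero, which is what limits error propagation in soft-decision DFE.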


Simulations of forest stand dynamics in a modelling framework such as the Forest Vegetation Simulator (FVS) are diameter driven; thus the diameter or basal area increment model needs special attention. This dissertation critically evaluates diameter and basal area increment models and modelling approaches in the context of the Great Lakes region of the United States and Canada. A set of related studies is presented that critically evaluates the sub-model for change in individual tree basal diameter used in FVS, a dominant forestry model in the region. Various historical implementations of the STEMS (Stand and Tree Evaluation and Modeling System) family of diameter increment models, including the current public release of the Lake States variant of FVS (LS-FVS), were tested for the 30 most common tree species using data from the Michigan Forest Inventory and Analysis (FIA) program. The results showed that the current public release of the LS-FVS diameter increment model over-predicts 10-year diameter increment by 17% on average. The study also shows that a simple adjustment factor as a function of a single predictor, dbh (diameter at breast height), as used in past versions, provides an inadequate correction of model prediction bias. In order to re-engineer the basal diameter increment model, the historical, conceptual and philosophical differences among the individual tree increment model families and their modelling approaches were analyzed and discussed. Two underlying conceptual approaches toward diameter or basal area increment modelling have often been used: the potential-modifier (POTMOD) and composite (COMP) approaches, exemplified by the STEMS/TWIGS and Prognosis models, respectively. It is argued that both approaches essentially use a similar base function and that neither is conceptually different from a biological perspective, even though their model forms look different.
No matter which modelling approach is used, the base function is the foundation of an increment model. Two base functions, gamma and Box-Lucas, were identified as candidates for forestry applications. A comparative analysis of empirical fits showed that the quality of fit is essentially similar, and that both are sufficiently detailed and flexible for forestry applications. The choice of base function for modelling diameter or basal area increment is therefore a matter of personal preference; however, the gamma base function may be preferred over the Box-Lucas, as it fits the periodic increment data in both linear and nonlinear composite model forms. Finally, the utility of site index as a predictor variable is criticized: it has been widely used in models for complex, mixed-species forest stands even though it is not well suited for this purpose. An alternative to site index in an increment model was explored by comparing site index with a combination of climate variables and Forest Ecosystem Classification (FEC) ecosites, using data from the Province of Ontario, Canada. The results showed that a combination of climate and FEC ecosite variables can replace site index in the diameter increment model.
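For illustration, one common parameterization of a gamma-type base function, combined with a potential-modifier (POTMOD) composition; all coefficients below are made up, and the modifier form is a generic stand-in rather than the STEMS/TWIGS formulation.

```python
import math

def gamma_base(d, a=0.8, b=1.1, c=0.05):
    """Gamma-type base function for periodic diameter increment:
    increment rises with dbh, peaks, then declines for large trees."""
    return a * d ** b * math.exp(-c * d)

def potmod_increment(d, competition, a=0.8, b=1.1, c=0.05):
    """POTMOD composition: potential increment from the base function,
    scaled down by a modifier in (0, 1] under competition
    (hypothetical exponential modifier)."""
    modifier = math.exp(-0.5 * competition)
    return gamma_base(d, a, b, c) * modifier
```

With these toy coefficients the base function peaks near dbh = b/c, capturing the biologically expected unimodal increment-over-size pattern that both the POTMOD and COMP families build on.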


Bluetooth wireless technology is a robust short-range communications system designed for low power (10-meter range) and low cost. It operates in the 2.4 GHz Industrial Scientific Medical (ISM) band and employs two techniques for minimizing interference: a frequency hopping scheme, which nominally splits the 2.400-2.485 GHz band into 79 frequency channels, and a time division duplex (TDD) scheme, which is used to switch to a new frequency channel on 625 μs boundaries. During normal operation a Bluetooth device is active on a different frequency channel every 625 μs, thus minimizing the chance of continuous interference degrading the performance of the system. The smallest unit of a Bluetooth network is called a piconet and can have a maximum of eight nodes. Bluetooth devices must assume one of two roles within a piconet, master or slave: the master governs quality of service and the frequency hopping schedule within the piconet, and the slave follows the master's schedule. A piconet must have a single master and up to 7 active slaves. By allowing devices to hold roles in multiple piconets through time multiplexing, i.e. slave/slave or master/slave, the Bluetooth technology allows multiple piconets to be interconnected into larger networks called scatternets. The Bluetooth technology is explored here in the context of enabling ad-hoc networks. The Bluetooth specification provides flexibility in the scatternet formation protocol, outlining only the mechanisms necessary for future protocol implementations. A new protocol for scatternet formation and maintenance, mscat, is presented and its performance is evaluated using a Bluetooth simulator. The free variables manipulated in this study include device activity and the probabilities of devices performing discovery procedures. The relationship between the role a device has in the scatternet and its probability of performing discovery was examined and related to the scatternet topology formed.
The results show that mscat creates dense network topologies for networks of 30, 50 and 70 nodes. The mscat protocol yields approximately a 33% increase in slaves per piconet and a reduction of approximately 12.5% in average roles per node. For the 50-node scenarios, the parameter set producing the best outcome is an unconnected node inquiry probability (UP) of 10%, a master node inquiry probability (MP) of 80% and a slave inquiry probability (SP) of 40%. The mscat protocol extends the Bluetooth specification for the formation and maintenance of scatternets in an ad-hoc network.
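The slot and role timing described above can be sketched as follows. The real Bluetooth hop-selection kernel derives the channel from the master's clock and address through a considerably more involved procedure; the channel function here is a deliberately simplified toy stand-in, as are the default address and mixing constant.

```python
SLOT_US = 625          # Bluetooth slot length in microseconds
NUM_CHANNELS = 79      # nominal channels in the 2.4 GHz ISM band

def slot_index(time_us):
    """Which TDD slot a given time (in microseconds) falls into."""
    return time_us // SLOT_US

def channel_for_slot(slot, address=0x2A):
    """Toy stand-in for the hop-selection kernel: a deterministic
    pseudo-random mix of the slot counter and a device address."""
    return (slot * 17 + address) % NUM_CHANNELS

def may_transmit(slot, role):
    """TDD rule: the master starts transmissions in even slots,
    slaves respond in odd slots."""
    return slot % 2 == 0 if role == "master" else slot % 2 == 1
```

Because the channel is a function of the slot counter, a device lands on a different channel every 625 μs, which is the interference-avoidance behavior the abstract describes.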


As one of the largest and most complex organizations in the world, the Department of Defense (DoD) faces many challenges in solving its well-documented financial and related business operations and system problems. The DoD is in the process of implementing modern multifunction enterprise resource planning (ERP) systems to replace many of its outdated legacy systems. This paper explores the ERP implementations of the DoD and seeks to determine their impact on the alignment of the DoD's business and IT strategy. A brief overview of the alignment literature and background on ERP are followed by a case study analysis of the DoD's ERP development and current implementation status. Lastly, the paper explores the current successes and failures of the ERP implementation and their impact on the DoD's goal of strategic alignment.


Tomorrow's eternal software systems will co-evolve with their context: their metamodels must adapt at runtime to ever-changing external requirements. In this paper we present FAME, a polyglot library that keeps metamodels accessible and adaptable at runtime. Special care is taken to establish a causal connection between fame-classes and host-classes. As some host languages offer only limited reflection features, not all implementations achieve the same degree of causal connection. We present and discuss three scenarios: 1) full causal connection, 2) no causal connection, and 3) emulated causal connection. Of these, Scenarios 1 and 3 are suitable for deploying fully metamodel-driven applications.
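What "causal connection" buys can be illustrated with a minimal hypothetical mini-framework (not FAME's actual API): the class description is an ordinary runtime object, and editing it immediately changes what existing instances accept.

```python
class MetaClass:
    """A runtime class description: just a name and a set of fields."""
    def __init__(self, name, fields):
        self.name = name
        self.fields = set(fields)

class Instance:
    """An object whose behavior is checked against its metamodel."""
    def __init__(self, meta):
        self._meta = meta
        self._values = {}
    def set(self, field, value):
        if field not in self._meta.fields:   # consult the metamodel
            raise AttributeError(
                f"{self._meta.name} has no field {field!r}")
        self._values[field] = value
    def get(self, field):
        return self._values[field]

point = MetaClass("Point", ["x", "y"])
p = Instance(point)
p.set("x", 3)

point.fields.add("z")   # adapt the metamodel at run time...
p.set("z", 9)           # ...and existing instances pick it up at once
```

Without such a link (Scenario 2 above), the description and the running objects can drift apart; the emulated variant (Scenario 3) re-establishes the link through interception rather than native reflection.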


Code executed in a fully reflective system switches back and forth between application and interpreter code. These two states can be seen as contexts in which an expression is evaluated. Current language implementations obtain reflective capabilities by exposing objects to the interpreter; however, in doing so these systems break the encapsulation of the application objects. In this paper we propose safe reflection through polymorphism, i.e., by unifying the interface and ensuring the encapsulation of objects from both the interpreter and the application context. We demonstrate a homogeneous system that defines the execution semantics in terms of itself, thus enforcing that encapsulation is not broken.