981 results for Eclipse modeling framework (EMF)


Relevance:

100.00%

Abstract:

We present a new sparse shape modeling framework based on the Laplace-Beltrami (LB) eigenfunctions. Traditionally, the LB eigenfunctions are used as a basis for intrinsically representing surface shapes by forming a Fourier series expansion. To reduce high-frequency noise, only the first few terms are used in the expansion and higher-frequency terms are simply discarded. However, some lower-frequency terms may not contribute significantly to reconstructing the surfaces. Motivated by this observation, we propose to retain only the significant eigenfunctions by imposing an l1-penalty. The new sparse framework can further avoid the additional surface-based smoothing often used in the field. The proposed approach is applied to investigating the influence of age (38-79 years) and gender on amygdala and hippocampus shapes in the normal population. In addition, we show how emotional response is related to the anatomy of these subcortical structures.
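
A minimal sketch of the selection idea, assuming precomputed LB eigenfunctions and using an off-the-shelf lasso solver; the `eigenfunctions` and `coords` arrays below are synthetic stand-ins for real surface data:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Hypothetical inputs: `coords` holds (n_vertices x 3) surface coordinates and
# `eigenfunctions` the first k Laplace-Beltrami basis functions evaluated at
# the same vertices (n_vertices x k), e.g. precomputed with a FEM solver.
rng = np.random.default_rng(0)
n_vertices, k = 2000, 100
eigenfunctions = rng.standard_normal((n_vertices, k))
coords = rng.standard_normal((n_vertices, 3))

# Classical truncated expansion: keep only the first m low-frequency terms.
m = 20
beta_trunc, *_ = np.linalg.lstsq(eigenfunctions[:, :m], coords, rcond=None)

# Sparse alternative: an l1 penalty selects significant terms at any frequency,
# zeroing out low-frequency terms that contribute little to the reconstruction.
lasso = Lasso(alpha=0.05, fit_intercept=False)
lasso.fit(eigenfunctions, coords)
selected = np.flatnonzero(np.any(lasso.coef_ != 0.0, axis=0))
print(f"{selected.size} of {k} eigenfunctions retained")

# Sparse reconstruction of the surface from the selected terms only.
coords_hat = eigenfunctions @ lasso.coef_.T
```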

Relevance:

100.00%

Abstract:

It is well known that there is a dynamic relationship between cerebral blood flow (CBF) and cerebral blood volume (CBV). With increasing applications of functional MRI, where blood oxygen-level-dependent signals are recorded, understanding and accurately modeling the hemodynamic relationship between CBF and CBV becomes increasingly important. This study presents an empirical, data-based modeling framework for model identification from experimental CBF and CBV data. It is shown that the relationship between changes in CBF and CBV can be described using a parsimonious autoregressive-with-exogenous-input (ARX) model structure. It is observed that neither the ordinary least-squares (LS) method nor the classical total least-squares (TLS) method can produce accurate estimates from the original noisy CBF and CBV data. A regularized total least-squares (RTLS) method is therefore introduced and extended to solve this errors-in-variables problem. Quantitative results show that the RTLS method works very well on the noisy CBF and CBV data. Finally, combining RTLS with a filtering method leads to a parsimonious but very effective model that characterizes the relationship between changes in CBF and CBV.
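
A minimal sketch of the LS-versus-TLS contrast on a toy ARX relation; the model orders, noise levels, and the plain SVD-based TLS here are illustrative, and the paper's RTLS adds regularization to stabilize exactly this kind of estimate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fractional changes obeying a first-order ARX relation
# dCBV[t] = a1*dCBV[t-1] + b0*dCBF[t] + b1*dCBF[t-1] (orders illustrative).
T = 500
dcbf = np.convolve(rng.standard_normal(T), np.ones(10) / 10, mode="same")
dcbv = np.zeros(T)
for t in range(1, T):
    dcbv[t] = 0.6 * dcbv[t - 1] + 0.3 * dcbf[t] + 0.1 * dcbf[t - 1]

# Both regressors and output are observed with noise (errors-in-variables).
dcbf_n = dcbf + 0.05 * rng.standard_normal(T)
dcbv_n = dcbv + 0.05 * rng.standard_normal(T)

A = np.column_stack([dcbv_n[1:-1], dcbf_n[2:], dcbf_n[1:-1]])  # regressors
b = dcbv_n[2:]                                                 # output

# Ordinary least squares: biased when the regressor matrix A itself is noisy.
theta_ls, *_ = np.linalg.lstsq(A, b, rcond=None)

# Classical total least squares via the SVD of the augmented matrix [A | b].
_, _, Vt = np.linalg.svd(np.column_stack([A, b]))
v = Vt[-1]
theta_tls = -v[:-1] / v[-1]

print("LS :", theta_ls)
print("TLS:", theta_tls)
```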

Relevance:

100.00%

Abstract:

Existing distributed hydrologic models are complex and computationally demanding, which limits their use as rapid-forecasting policy-decision tools or even as classroom educational tools. In addition, platform dependence, rigid input/output data structures, and the lack of dynamic data interaction with pluggable software components inside existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the widely used R software within the HUBzero cyberinfrastructure at Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation, along with subsequent parameter optimization and visualization. RWater provides a platform-independent web-based interface, flexible data integration, grid-based simulation, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster data obtained through conventional GIS pre-processing, and it integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater will be demonstrated on two watersheds in Indiana for multiple rainfall events.
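
As a rough illustration of the calibration step only (not of RWater itself, which runs R within HUBzero), the sketch below fits a toy linear-reservoir rainfall-runoff model, with SciPy's differential evolution standing in for the SCE algorithm:

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(2)
rain = rng.gamma(0.5, 4.0, size=365)  # synthetic daily rainfall (mm)

def linear_reservoir(params, rain):
    """Toy rainfall-runoff model: a single linear reservoir.
    params = (k, c): recession coefficient and runoff coefficient."""
    k, c = params
    storage, flow = 0.0, np.zeros_like(rain)
    for t, p in enumerate(rain):
        storage += c * p
        flow[t] = k * storage
        storage -= flow[t]
    return flow

# "Observed" flow generated with known parameters plus measurement noise.
observed = linear_reservoir((0.3, 0.6), rain) + 0.1 * rng.standard_normal(365)

# Objective: sum of squared errors between simulated and observed flow.
def sse(params):
    return np.sum((linear_reservoir(params, rain) - observed) ** 2)

# Global search over parameter bounds; differential evolution stands in here
# for the Shuffled Complex Evolution algorithm that RWater integrates.
result = differential_evolution(sse, bounds=[(0.01, 0.99), (0.0, 1.0)])
print("calibrated (k, c):", result.x)
```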

Relevance:

100.00%

Abstract:

The purpose of this study was to determine whether the activity modeling framework supports problem analysis and provides a traceable, tangible connection from problem identification to solution modeling. Validation of the methodology relied on a real problem from a Portuguese teaching syndicate (ASPE) concerning course development and management. The study was carried out with a view to producing a complete tutorial on applying the activity modeling framework to a real-world problem. For each step of activity modeling, we provide a summary of the relevant elements required to perform it, point out some improvements, and apply it to ASPE's real problem. It was found that activity modeling enables well-structured problem analysis and provides a guiding thread between problem and solution modeling. It was concluded that activity-based task modeling is key to shortening the gap between problem and solution. The results revealed that the solution obtained using the activity modeling framework addressed the core concerns of our customer and allowed them to enhance the quality of their course development and management. The principal conclusion was that activity modeling is a well-defined methodology that supports software engineers in problem analysis, keeping a traceable guide between problem and solution.

Relevance:

100.00%

Abstract:

This dissertation presents a model-driven, integrated approach to the variability management, customization, and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering supports the specification of software processes and their transformation into workflow specifications, while software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. To evaluate the feasibility of the approach, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
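
A toy model-to-text sketch of the final transformation step, with a plain dict standing in for an EPF process model and approximate jPDL element names; the actual pipeline uses ATL for the model-to-model and Acceleo for the model-to-text transformations:

```python
# Hypothetical process model: a named process with an ordered task list,
# standing in for an EPF delivery process.
process = {
    "name": "project_management",
    "tasks": ["plan_iteration", "monitor_progress", "close_iteration"],
}

def to_jpdl(model):
    """Emit a jPDL-like XML process definition (element names approximate)."""
    lines = [f'<process-definition name="{model["name"]}">']
    lines.append(f'  <start-state><transition to="{model["tasks"][0]}"/></start-state>')
    # Chain each task to its successor; the last task flows to the end state.
    for task, nxt in zip(model["tasks"], model["tasks"][1:] + ["end"]):
        lines.append(f'  <task-node name="{task}"><transition to="{nxt}"/></task-node>')
    lines.append('  <end-state name="end"/>')
    lines.append('</process-definition>')
    return "\n".join(lines)

print(to_jpdl(process))
```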

Relevance:

100.00%

Abstract:

We propose a general framework for the analysis of animal telemetry data through the use of weighted distributions. It is shown that several interpretations of resource selection functions arise when they are constructed from the ratio of a use distribution to an availability distribution. Within the proposed general framework, several popular resource selection models are shown to be special cases of the general model under particular assumptions about animal movement and behavior. The weighted distribution framework is easily extended to account for telemetry data that are highly autocorrelated, as is typical of animal relocations collected with newer technologies such as global positioning systems (GPS). An analysis of simulated data using several models constructed within the proposed framework is also presented to illustrate the possible gains from the flexible modeling framework. The proposed model is applied to a brown bear data set from southeast Alaska.
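
In standard weighted-distribution notation (the notation is ours, not necessarily the paper's), the ratio construction reads:

```latex
% The use distribution f_u is the availability distribution f_a reweighted
% by a selection function w with parameters \theta; hence w is proportional
% to the use/availability ratio.
\[
  f_u(\mathbf{x}) \;=\;
  \frac{w(\mathbf{x};\boldsymbol{\theta})\, f_a(\mathbf{x})}
       {\int w(\mathbf{s};\boldsymbol{\theta})\, f_a(\mathbf{s})\, d\mathbf{s}}
  \qquad\Longrightarrow\qquad
  w(\mathbf{x};\boldsymbol{\theta}) \;\propto\; \frac{f_u(\mathbf{x})}{f_a(\mathbf{x})}.
\]
```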

Relevance:

100.00%

Abstract:

In recent years, systems engineering has become one of the major research domains. The complexity of systems has increased constantly, and Cyber-Physical Systems (CPS) are nowadays a category of particular interest: these are systems in which a cyber part (computer-based algorithms) monitors and controls physical processes. Their development and simulation are both complex because of the importance of the interaction between the cyber and physical entities: many models written in different languages need to exchange information with each other. Normally, an orchestrator takes care of simulating the models and exchanging information among them. This orchestrator is developed manually, which is tedious and time-consuming work. Our proposal is to generate the orchestrator automatically through co-modeling, i.e., by modeling the coordination itself. Before reaching this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied a technology employed for co-simulation in industry: the Functional Mock-up Interface (FMI). To better understand the FMI standard, I implemented an automatic export, in the FMI format, of models created in an existing tool for discrete modeling, TimeSquare. I also developed a simple physical model in the open-source OpenModelica tool. Finally, I began studying how an orchestrator works by developing a simple one; this will be useful in the future for generating an orchestrator automatically.
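
A minimal sketch of what such an orchestrator does, with a hypothetical FMU stand-in class; a real master would load FMUs through the FMI C API or a library such as FMPy, and `do_step`/`get_output` below only mirror the roles of fmi2DoStep/fmi2GetReal:

```python
# Fixed-step co-simulation master: advance each model one communication
# step, then exchange outputs at the communication point.
class FMU:
    """Hypothetical stand-in for a co-simulation FMU with one scalar state."""
    def __init__(self, dynamics, state=0.0):
        self.dynamics, self.state, self.inputs = dynamics, state, {}

    def do_step(self, t, dt):                      # cf. fmi2DoStep
        self.state += dt * self.dynamics(t, self.state, self.inputs)

    def get_output(self):                          # cf. fmi2GetReal
        return self.state

# Two coupled toy models: a "physical" plant and a "cyber" controller.
plant = FMU(lambda t, x, u: -x + u.get("u", 0.0))
controller = FMU(lambda t, x, u: 5.0 * (1.0 - u.get("y", 0.0)) - x)

t, dt = 0.0, 0.01
while t < 2.0:
    # Exchange information between the models, then advance both.
    plant.inputs["u"] = controller.get_output()
    controller.inputs["y"] = plant.get_output()
    for fmu in (plant, controller):
        fmu.do_step(t, dt)
    t += dt

print("plant output at t=2:", round(plant.get_output(), 3))
```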

Relevance:

100.00%

Abstract:

Materials are inherently multi-scale in nature, exhibiting distinct characteristics at length scales ranging from atoms to the bulk material. There are no widely accepted predictive multi-scale modeling techniques that span from the atomic level to the bulk and relate the effects of structure at the nanometer scale (10^-9 m) to macro-scale properties. Traditional engineering treats matter as continuous, with no internal structure. In contrast, physicists have dealt with matter in its discrete structure at small length scales to understand the fundamental behavior of materials. Multiscale modeling is of great scientific and technical importance, as it can aid in designing novel materials with properties tailored to a specific application, such as multi-functional materials. Polymer nanocomposite materials have the potential to provide significant increases in mechanical properties relative to current polymers used for structural applications. For a given reinforcement volume fraction, nanoscale reinforcements can increase the effective interface between reinforcement and matrix by orders of magnitude relative to traditional micro- or macro-scale reinforcements. To facilitate the development of polymer nanocomposite materials, constitutive relationships must be established that predict the bulk mechanical properties of the materials as a function of their molecular structure. A computational hierarchical multiscale modeling technique is developed to study the bulk-level constitutive behavior of polymeric materials as a function of their molecular chemistry. Parameters and modeling techniques from computational chemistry to continuum mechanics are utilized in the current modeling method, and the cause-and-effect relationships among the parameters are studied to establish an efficient modeling framework. The proposed methodology is applied to three different polymers and validated using experimental data available in the literature.

Relevance:

100.00%

Abstract:

Point Distribution Models (PDMs) are among the most popular shape description techniques, and their usefulness has been demonstrated in a wide variety of medical imaging applications. However, to adequately characterize the underlying modeled population, a representative number of training samples is essential, which is not always available. This problem becomes especially relevant as the complexity of the modeled structure increases, with the modeling of ensembles of multiple 3D organs being one of the most challenging cases. In this paper, we introduce a new GEneralized Multi-resolution PDM (GEM-PDM) in the context of multi-organ analysis, able to efficiently characterize the different inter-object relations as well as the particular locality of each object separately. Importantly, unlike previous approaches, the configuration of the algorithm is automated thanks to a new agglomerative landmark clustering method proposed here, which also allows us to identify smaller anatomically significant regions within organs. The significant advantage of GEM-PDM over two previous approaches (PDM and hierarchical PDM), in terms of shape modeling accuracy and robustness to noise, has been verified on two databases of sets of multiple organs: six subcortical brain structures and seven abdominal organs. Finally, we propose the integration of the new shape modeling framework into an active-shape-model-based segmentation algorithm. The resulting algorithm, named GEMA, provides better overall performance than the two classical approaches tested, ASM and hierarchical ASM, when applied to the segmentation of 3D brain MRI.
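
For reference, a minimal sketch of the classical PDM that GEM-PDM generalizes: PCA over pre-aligned landmark configurations. All data below is synthetic; GEM-PDM additionally clusters the landmarks into multi-resolution groups and models each group and their relations separately:

```python
import numpy as np

# Shapes are assumed already aligned (e.g. by Procrustes analysis); each row
# is one training shape with 3D landmark coordinates flattened to a vector.
rng = np.random.default_rng(3)
n_shapes, n_landmarks = 40, 50
base = rng.standard_normal(3 * n_landmarks)
shapes = base + 0.1 * rng.standard_normal((n_shapes, 3 * n_landmarks))

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape

# SVD of the centered data gives the modes of variation and their variances.
_, s, Vt = np.linalg.svd(centered, full_matrices=False)
eigenvalues = s**2 / (n_shapes - 1)

# Keep the modes covering 95% of the variance; a shape is then generated as
# x ~ mean + P @ b, with |b_i| typically bounded by 3*sqrt(eigenvalue_i).
keep = np.searchsorted(np.cumsum(eigenvalues) / eigenvalues.sum(), 0.95) + 1
P = Vt[:keep].T
b = rng.uniform(-1, 1, keep) * 3 * np.sqrt(eigenvalues[:keep])
new_shape = mean_shape + P @ b
print(f"{keep} modes retained out of {n_shapes}")
```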

Relevance:

100.00%

Abstract:

Sustaining irrigated agriculture to meet food production needs while maintaining aquatic ecosystems is at the heart of many policy debates in various parts of the world, especially in arid and semi-arid areas. Researchers and practitioners are increasingly calling for integrated approaches, and policy-makers are progressively supporting the inclusion of ecological and social aspects in water management programs. This paper contributes to this policy debate by providing an integrated economic-hydrologic modeling framework that captures the socio-economic and environmental effects of various policy initiatives and of climate variability. This integration couples a risk-based economic optimization model with a hydrologic water management simulation model, both specified for the Middle Guadiana basin, a vulnerable drought-prone agro-ecological area with highly regulated river systems in southwest Spain. Two key water policy interventions were investigated: the implementation of minimum environmental flows (supported by the European Water Framework Directive, EU WFD), and a reduction in the legal amount of water delivered for irrigation (a planned measure included in the new Guadiana River Basin Management Plan, GRBMP, still under discussion). Results indicate that current patterns of excessive water use for irrigation in the basin may put environmental flow demands at risk, jeopardizing the WFD's goal of restoring the "good ecological status" of water bodies by 2015. Conflicts between environmental and agricultural water uses will intensify during prolonged dry episodes, particularly in summer low-flow periods, when crop irrigation water requirements increase substantially. Securing minimum stream flows would entail a substantial reduction in irrigation water use for rice cultivation, which might affect the profitability and economic viability of small rice-growing farms located upstream. The new GRBMP could help balance competing water demands in the basin and increase economic water productivity, but it might not be sufficient to ensure the provision of environmental flows as required by the WFD. A thorough revision of the basin's water-use concession system for irrigation seems necessary to bring the GRBMP in line with the WFD objectives. Furthermore, the study illustrates that social, economic, institutional, and technological factors, in addition to bio-physical conditions, are important issues to consider when designing and developing water management strategies. The research presented in this paper demonstrates that hydro-economic models can explicitly integrate all of these issues, constituting a valuable tool to assist policy makers in implementing sustainable irrigation policies.

Relevance:

100.00%

Abstract:

Geographic knowledge discovery (GKD) is the process of extracting information and knowledge from massive georeferenced databases. Usually the process is accomplished by two different systems: Geographic Information Systems (GIS) and data mining engines. However, the development of such systems is a complex task because it does not follow a systematic, integrated, and standard methodology. To overcome these pitfalls, in this paper we propose a modeling framework that addresses the development of the different parts of a multilayer GKD process. The main advantages of our framework are that: (i) it reduces the design effort; (ii) it improves the quality of the systems obtained; (iii) it is platform-independent; (iv) it facilitates the use of data mining techniques on georeferenced data; and (v) it improves communication among different users.

Relevance:

100.00%

Abstract:

The logical (or logic) formalism is increasingly used to model regulatory and signaling networks. Complementing these applications, several groups contributed various methods and tools to support the definition and analysis of logical models. After an introduction to the logical modeling framework and to several of its variants, we review here a number of recent methodological advances to ease the analysis of large and intricate networks. In particular, we survey approaches to determine model attractors and their reachability properties, to assess the dynamical impact of variations of external signals, and to consistently reduce large models. To illustrate these developments, we further consider several published logical models for two important biological processes, namely the differentiation of T helper cells and the control of mammalian cell cycle.
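
A minimal sketch of attractor identification for a toy three-node Boolean network under the synchronous updating scheme; the network and its rules are invented for illustration, and dedicated tools additionally handle multi-valued logic, asynchronous updating, and the model-reduction methods surveyed:

```python
from itertools import product

# Toy regulatory rules: A is activated by C, B needs A and is inhibited by C,
# C is repressed by B.
rules = {
    "A": lambda s: s["C"],
    "B": lambda s: s["A"] and not s["C"],
    "C": lambda s: not s["B"],
}
nodes = sorted(rules)

def step(state):
    """Synchronous update: all nodes evaluated on the current state."""
    return tuple(int(rules[n](dict(zip(nodes, state)))) for n in nodes)

attractors = set()
for init in product((0, 1), repeat=len(nodes)):
    seen, state = [], init
    while state not in seen:                 # iterate until a cycle closes
        seen.append(state)
        state = step(state)
    cycle = tuple(seen[seen.index(state):])  # the attractor reached from init
    # Canonicalize the cycle by its minimal rotation so duplicates collapse.
    attractors.add(min(cycle[i:] + cycle[:i] for i in range(len(cycle))))

for a in sorted(attractors):
    print("attractor (states in A,B,C order):", a)
```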

Relevance:

100.00%

Abstract:

The water stored in and flowing through the subsurface is fundamental for sustaining human activities and needs, feeding water and its constituents to surface water bodies, and supporting the functioning of their ecosystems. Quantifying the changes that affect subsurface water is crucial for our understanding of its dynamics under climate change and other changes in the landscape, such as in land use and water use. It is inherently difficult to directly measure soil moisture and groundwater levels over large spatial scales and long times; models are therefore needed to capture the soil moisture and groundwater level dynamics over such large spatiotemporal scales. This thesis develops a modeling framework that allows for long-term, catchment-scale screening of soil moisture and groundwater level changes. The novelty of this development resides in an explicit link drawn between catchment-scale hydroclimatic and soil hydraulic conditions, using observed runoff data as an approximation of the soil water flux and accounting for the effects of snow storage and melting dynamics on that flux. Both past and future relative changes can be assessed with this modeling framework, with future change projections based on common climate model outputs. By direct model-observation comparison, the thesis shows that the developed modeling framework can reproduce the temporal variability of large-scale changes in soil water storage, as obtained from the GRACE satellite product, for most of the 25 large study catchments around the world. Compared with locally measured soil water content and groundwater levels in 10 U.S. catchments, the modeling approach also reproduces reasonably well the relative seasonal fluctuations around long-term average values. The developed modeling framework is further used to project soil moisture changes under expected future climate change for 81 catchments around the world. The projected changes depend on the considered radiative forcing scenario (RCP) but are overall large for the occurrence frequency of dry and wet events and for the inter-annual variability of seasonal soil moisture. These changes tend to be larger for dry events and the dry season than for the corresponding wet quantities, indicating increased drought risk for some parts of the world.
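
As a rough illustration of the kind of storage bookkeeping involved (not the thesis's actual framework, which infers the soil water flux from observed runoff), a toy degree-day snow store feeding a soil-moisture bucket; all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
days = 365
precip = rng.gamma(0.4, 5.0, days)                                # mm/day
temp = 10 * np.sin(2 * np.pi * (np.arange(days) - 80) / 365) + 5  # deg C

ddf, capacity, k = 3.0, 150.0, 0.05  # degree-day factor, bucket size, drainage
snow, soil = 0.0, 75.0
soil_series = np.zeros(days)

for t in range(days):
    if temp[t] <= 0:                 # precipitation accumulates as snow
        snow += precip[t]
        melt = 0.0
    else:                            # degree-day melt releases stored water
        melt = min(snow, ddf * temp[t])
        snow -= melt
        soil += precip[t]            # rain infiltrates directly
    soil += melt
    drainage = k * soil              # crude proxy for the soil water flux
    soil = min(capacity, soil - drainage)
    soil_series[t] = soil

print("seasonal soil moisture range: %.1f-%.1f mm"
      % (soil_series.min(), soil_series.max()))
```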

Relevance:

100.00%

Abstract:

The analysis of steel and composite frames has traditionally been carried out by idealizing beam-to-column connections as either rigid or pinned. Although some advanced analysis methods have been proposed to account for semi-rigid connections, the performance of these methods depends strongly on the proper modeling of connection behavior. The primary challenge in modeling beam-to-column connections is their inelastic response and continuously varying stiffness, strength, and ductility. In this dissertation, two distinct approaches, mathematical models and informational models, are proposed to account for the complex hysteretic behavior of beam-to-column connections. The performance of the two approaches is examined, followed by a discussion of their merits and deficiencies. To capitalize on the merits of both mathematical and informational representations, a new approach, a hybrid modeling framework, is developed and demonstrated through the modeling of beam-to-column connections. Component-based modeling is a compromise between two extremes in the field of mathematical modeling: simplified global models and finite element models. In the component-based modeling of angle connections, five critical components of excessive deformation are identified. Constitutive relationships for angles, column panel zones, and the contact between angles and column flanges are derived using only material and geometric properties and theoretical mechanics considerations. Those for slip and bolt-hole ovalization are simplified using empirically suggested mathematical representations and expert opinion. A mathematical model is then assembled as a macro-element by combining rigid bars and springs that represent the constitutive relationships of the components. Lastly, the moment-rotation curves of the mathematical models are compared with those of experimental tests. In the case of a top-and-seat angle connection with double web angles, the pinched hysteretic response is predicted quite well by complete mechanical models, which rely only on material and geometric properties. On the other hand, to exhibit the highly pinched behavior of a top-and-seat angle connection without web angles, a mathematical model requires the slip and bolt-hole ovalization components, which are more amenable to informational modeling. An alternative method is informational modeling, which constitutes a fundamental shift from mathematical equations to data that contain the required information about the underlying mechanics. The information is extracted from observed data and stored in neural networks. Two different training data sets, analytically generated data and experimental data, are tested to examine the performance of informational models. Both informational models show acceptable agreement with the moment-rotation curves of the experiments, and adding a degradation parameter improves them when modeling highly pinched hysteretic behavior. However, informational models cannot represent the contribution of individual components and therefore do not provide insight into the underlying mechanics of the components. In this study, a new hybrid modeling framework is proposed, in which a conventional mathematical model is complemented by informational methods. The basic premise of the proposed hybrid methodology is that not all features of system response are amenable to mathematical modeling, hence the consideration of informational alternatives.
This may be because (i) the underlying theory is not available or not sufficiently developed, or (ii) the existing theory is too complex and therefore not suitable for modeling within building frame analysis. The role of the informational methods is to model the aspects that the mathematical model leaves out; the autoprogressive algorithm and self-learning simulation extract these missing aspects from the system response. In the hybrid framework, experimental data is an integral part of modeling rather than being used strictly for validation. The potential of the hybrid methodology is illustrated through modeling the complex hysteretic behavior of beam-to-column connections. Mechanics-based components of deformation, such as angles, flange plates, and the column panel zone, are idealized in a mathematical model using a complete mechanical approach. Although the mathematical model represents the envelope curves in terms of initial stiffness and yield strength, it is not capable of capturing the pinching effects. Pinching is caused mainly by separation between angles and column flanges, as well as slip between angles or flange plates and beam flanges; these components of deformation are suitable for informational modeling. Finally, the moment-rotation curves of the hybrid models are validated against those of the experimental tests. The comparison shows that the hybrid models are capable of representing the highly pinched hysteretic behavior of beam-to-column connections. In addition, the developed hybrid model is successfully used to predict the behavior of a newly designed connection.
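
A schematic sketch of the hybrid idea under invented data and model forms: a simple bilinear moment-rotation backbone (the mathematical component) plus a small neural network trained on the residual the backbone misses (the informational component):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)

def bilinear(rotation, k0=50.0, ky=5.0, yield_rot=0.01):
    """Mathematical component: elastic stiffness k0, post-yield stiffness ky."""
    elastic = k0 * rotation
    plastic = np.sign(rotation) * k0 * yield_rot \
        + ky * (rotation - np.sign(rotation) * yield_rot)
    return np.where(np.abs(rotation) < yield_rot, elastic, plastic)

rotation = np.linspace(-0.04, 0.04, 400)
# Pretend these are experimental moments, including a pinching-like softening
# near zero rotation that the bilinear law cannot represent.
moment_exp = bilinear(rotation) * (1 - 0.4 * np.exp(-(rotation / 0.005) ** 2)) \
    + 0.02 * rng.standard_normal(rotation.size)

# Informational component: learn the residual from the data.
residual = moment_exp - bilinear(rotation)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
net.fit(rotation.reshape(-1, 1), residual)

# Hybrid prediction = mechanics-based backbone + learned correction.
moment_hybrid = bilinear(rotation) + net.predict(rotation.reshape(-1, 1))
print("max abs error:", np.max(np.abs(moment_hybrid - moment_exp)).round(3))
```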

Relevance:

100.00%

Abstract:

Background & Aims: The pharmacokinetics and pharmacodynamics of pegylated interferon-alpha-2a (PEG-IFN) have not been described in HCV/HIV co-infected patients. We sought to estimate the pharmacokinetics and pharmacodynamics of PEG-IFN and to determine whether these parameters predict treatment outcome. Methods: Twenty-six HCV/human immunodeficiency virus (HIV) co-infected patients were treated with a 48-week regimen of PEG-IFN (180 μg/week) plus ribavirin (11 mg/kg/day). HCV RNA and PEG-IFN concentrations were obtained from samples collected until week 12. A modeling framework that includes pharmacokinetic and pharmacodynamic parameters was developed. Results: Five patients discontinued treatment. Seven patients achieved a sustained virological response (SVR). PEG-IFN concentrations at day 8 were similar to steady-state levels (p = 0.15), and overall pharmacokinetic parameters were similar in SVRs and non-SVRs. The maximum PEG-IFN effectiveness during the first PEG-IFN dose and the HCV-infected cell loss rate (delta) were significantly higher in SVRs than in non-SVRs (median 95% vs. 86% [p = 0.013] and 0.27 vs. 0.11 day^-1 [p = 0.006], respectively). Patients infected with HCV genotype 1 had a significantly lower average first-week PEG-IFN effectiveness (median 70% vs. 88% [p = 0.043]); however, 4- to 12-week PEG-IFN effectiveness was not significantly different from that of patients with genotype 3 (p = 0.114). Genotype 1 also had a significantly lower delta than genotype 3 (median 0.14 vs. 0.23 day^-1 [p = 0.021]). The PEG-IFN concentration that decreased HCV production by 50% (EC50) was lower in genotype 3 than in genotype 1 (median 1.3 vs. 3.4 [p = 0.034]). Conclusions: Both the HCV-infected cell loss rate (delta) and the maximum effectiveness of the first dose of PEG-IFN-alpha-2a characterised HIV co-infected patients and were highly predictive of SVR. Further studies are needed to validate these viral kinetic parameters as early on-treatment prognosticators of response in patients with HCV and HIV.
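
For context, the parameters reported here (effectiveness, delta, EC50) correspond to the standard HCV viral kinetic model with a pharmacodynamic effectiveness term; a sketch in its usual form, with notation that may differ from the paper's:

```latex
% Standard HCV viral kinetic model (after Neumann et al.):
% T = target cells, I = infected cells, V = virions.
\begin{align*}
  \frac{dI}{dt} &= \beta V T - \delta I
    && \delta:\ \text{infected-cell loss rate} \\
  \frac{dV}{dt} &= \bigl(1-\varepsilon(t)\bigr)\, p I - c V
    && \varepsilon:\ \text{effectiveness in blocking virion production} \\
  \varepsilon(t) &= \frac{C(t)^{n}}{C(t)^{n} + \mathrm{EC}_{50}^{\,n}}
    && C(t):\ \text{PEG-IFN concentration from the pharmacokinetic model}
\end{align*}
```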