818 results for Model driven development
Abstract:
The growing scale and complexity of modern software development projects require software organizations to adopt more effective development methods. Academia and industry have proposed a range of software engineering methods whose main goals are to improve product quality, keep projects on schedule, lower project cost, and reduce maintenance expense. Test-Driven Development (TDD), a very popular practice in agile development, has accumulated many successful applications in both industry and academia over roughly the past decade, and its underlying ideas are being accepted by more and more software organizations and developers. Although TDD can improve product quality and developer productivity, the difficulty of putting it into practice deters many organizations. Moreover, under real-world schedule pressure, strict TDD often cannot be followed from start to finish, and as long as the quality requirements are met there is no need to apply TDD to every module in pursuit of zero defects. It is therefore necessary to evaluate, in a sound and effective way, whether an organization should adopt TDD and which modules should be developed with TDD, so as to give software project managers a basis for decision making. Software process simulation is a low-cost and comparatively rigorous way to provide such decision support from the information already available. Building on the large body of empirical findings about TDD, this thesis proposes a TDD module-selection method based on stochastic simulation of process models. The method uses stochastic process algebra as the modeling tool, measures module complexity from use cases to obtain the simulation parameters, runs the simulation, and then analyzes the results with a TDD module-selection algorithm to derive the best TDD implementation strategy, which can be offered to project managers as a reasonable adoption plan. The main contributions are as follows. First, a simple method for measuring software module complexity is proposed; it starts from the module's internal and external complexity and introduces the concept of structural entropy, using structural entropy to measure external complexity and the event flows in use cases to measure internal complexity, finally yielding the module complexity and the corresponding simulation parameters. Second, stochastic process algebra simulation models are built for both the TDD process and the traditional development process; after comparing several simulation methods, the Gibson-Bruck stochastic simulation algorithm is selected for simulating the software process, and the rationale for this choice is analyzed. Third, a TDD module-selection algorithm based on stochastic simulation of process models is proposed, which starts from the simulation results and provides project managers with reasonable decision support. To make the method easier to apply, a TDD module-selection system based on stochastic process-model simulation is also designed and implemented.
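As a rough, hedged sketch of the kind of complexity metric the first contribution describes (structural entropy for external complexity, use-case event flows for internal complexity), the following Python snippet combines the two into one score; the equal weighting, the coupling-based entropy, and the step-count measure are assumptions for illustration, not the thesis's actual definitions.

```python
import math

def structural_entropy(coupling_counts):
    """External complexity: Shannon entropy of how a module's interactions
    are distributed over the other modules it couples to (illustrative)."""
    total = sum(coupling_counts)
    if total == 0:
        return 0.0
    probs = [c / total for c in coupling_counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def internal_complexity(event_flows):
    """Internal complexity: total number of steps across the use-case
    event flows (basic and alternative) that exercise the module."""
    return sum(len(flow) for flow in event_flows)

def module_complexity(coupling_counts, event_flows, w_ext=0.5, w_int=0.5):
    """Combine external and internal complexity into one score that could be
    mapped onto simulation parameters (e.g. expected effort rates)."""
    return w_ext * structural_entropy(coupling_counts) + w_int * internal_complexity(event_flows)

# Example: a module coupled to three others, exercised by two event flows.
print(module_complexity([4, 2, 1], [["login", "validate", "redirect"],
                                    ["login", "fail", "retry"]]))
```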
Abstract:
This paper first analyzes enterprise requirements for data integration and identifies the main problems that data integration faces. To address these problems, it proposes a data integration method driven by an enterprise core data model and presents one implementation of the method, called the enterprise data integration platform. Taking a concrete application in a refining and petrochemical enterprise as an example, it discusses the effectiveness of the method in solving enterprise data integration problems.
Abstract:
Model Driven Architecture (MDA) is an approach to describing IT systems proposed by the OMG and a further development of interoperability standards. This article introduces the basic concepts of MDA, its core foundational constructs and model hierarchy, and surveys its current applications.
Abstract:
The production process of a process-industry enterprise reflects its process routes, resource allocation, and manufacturing capability, and its stable operation affects every stage of the product life cycle, so real-time monitoring of the production process is essential. The static data and real-time dynamic data contained in the many devices involved in production form the data foundation of process monitoring, so real-time monitoring depends on rapid, device-object-based configuration of the production process. Against the background of the National High-Tech R&D Program (863 Program) project "Integrated automation application platform for the whole process at multiple scales," and aiming at the specific requirements of production process monitoring in process enterprises, this thesis studies and develops a model-driven real-time data publishing system that supports rapid configuration of the production process based on device objects. The main work is as follows: 1. A device-object description model based on a data-driven mechanism is given, and a device resource library oriented to the production processes of process enterprises is built; on this basis, a method for rapid configuration of the production process based on the device-object description model is presented. 2. To meet the specific requirements of production process monitoring, XML-QL and a temporal-logic model of composite events are introduced, a publish/subscribe model based on XML and composite events is proposed, and a subscription-tree-based composite event matching algorithm is given; finally, a real-time data publishing framework is designed on top of the production process model. 3. On the .NET platform, a real-time data publishing system driven by the device object model is designed and implemented, and it is validated with the core data of a petrochemical enterprise, with good results. Keywords: production process monitoring; model-driven; publish/subscribe; composite event
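As a loose illustration of the publish/subscribe idea behind the second contribution (matching incoming events against subscriptions), the sketch below filters events by type and attribute predicates; the data structures and predicate form are assumptions for illustration, not the system's actual XML/composite-event design.

```python
from collections import defaultdict

class Broker:
    """Minimal content-based publish/subscribe broker: subscriptions are
    keyed by event type and filtered by attribute predicates."""

    def __init__(self):
        self.subscriptions = defaultdict(list)  # event type -> [(predicate, callback)]

    def subscribe(self, event_type, predicate, callback):
        self.subscriptions[event_type].append((predicate, callback))

    def publish(self, event_type, attrs):
        for predicate, callback in self.subscriptions[event_type]:
            if predicate(attrs):
                callback(event_type, attrs)

broker = Broker()
# Subscribe to temperature readings from device "reactor-1" above a threshold.
broker.subscribe("temperature",
                 lambda a: a["device"] == "reactor-1" and a["value"] > 350,
                 lambda t, a: print("alert:", t, a))
broker.publish("temperature", {"device": "reactor-1", "value": 362})
```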
Abstract:
China's cultivated land has been undergoing dramatic changes along with its rapidly growing economy and population. The impacts of land use transformation on food production at the national scale, however, have been poorly understood due to the lack of detailed, spatially explicit information on cropland change and crop productivity. This study evaluates the effect of cropland transformation on agricultural productivity by combining land use data for China for the period 1990-2000 from TM images with a satellite-based NPP (net primary production) model driven by NOAA/AVHRR data. The cropland area of China had a net increase of 2.79 Mha in the study period, which produced a slight increase in agricultural productivity (6.96 Mt C) at the national level. Although the newly cultivated lands compensated for the area lost to urban expansion, their contribution to production is insignificant because of their low productivity. The decrease in crop production resulting from urban expansion is about twice that from the conversion of arable lands back to forests and grasslands. The productivity of arable lands occupied by urban expansion was 80% higher than that of the newly cultivated lands in regions with unfavorable natural conditions. The significance of cropland transformation impacts varies spatially with differences in land use change intensity and land productivity across China. The increase in arable land area coupled with the decline in land quality may reduce the production potential and sustainability of China's agro-ecosystems. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Five diagnostic experiments were conducted with the 3D baroclinic hydrodynamic and sediment transport model ECOMSED, coupled with the third-generation wave model SWAN and the Grant-Madsen bottom boundary layer model and driven by the monthly sediment load of the Yellow River, to separately diagnose the effects of different hydrodynamic factors on the transport of suspended sediment discharged from the Yellow River into the Bohai Sea. Both the transport and the spatio-temporal distribution of suspended sediment concentration in the Bohai Sea were numerically simulated. It can be concluded that suspended sediment discharged from the Yellow River cannot be transported over long distances under tidal currents alone. Under wind-driven currents, almost all of the sediment from the Yellow River is deposited outside the delta, and only a very small portion is transported far away. Under wind forcing, sediment from the Yellow River is mainly transported north-northwestward, while the portion first delivered to Laizhou Bay continues to move northward. A pronounced three-dimensional structure of sediment transport appears under the combined wind-driven and tide-induced residual circulation. Transport patterns at all layers are generally consistent with the circulation structure, but there is an apparent deviation between the depth-averaged sediment flux and the circulation structure. The phase of the temporal variation of sediment concentration is consistent with that of the bottom shear stress, and both are shown to have a ten-day cycle under combined wave and current conditions.
Abstract:
We detected the responses of summertime extreme wave heights (H_top10, the average of the highest 10% of significant wave heights in June, July, and August) to local climate variations in the East China Sea by applying an empirical orthogonal function analysis to H_top10 derived from the WAVEWATCH-III wave model driven by 6-hourly sea surface wind fields from the ERA-40 reanalysis over the period 1958-2002. Decreases in H_top10 in the northern East China Sea (Yellow Sea) correspond to attenuation of the East Asian Summer Monsoon, while increases in the south are primarily due to enhanced tropical cyclone activity in the western North Pacific.
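As a small illustrative sketch (not the authors' code), a "highest 10% average" statistic such as H_top10 can be computed from a summer series of significant wave heights as follows; the synthetic input series is an assumption for illustration.

```python
import numpy as np

def top10_mean(hs):
    """Average of the highest 10% of significant wave heights."""
    hs = np.sort(np.asarray(hs, dtype=float))
    n = max(1, int(np.ceil(0.1 * hs.size)))  # at least one value
    return hs[-n:].mean()

# Hypothetical June-August significant wave heights (m) at one grid point,
# sampled 6-hourly (92 days x 4 samples per day).
rng = np.random.default_rng(0)
summer_hs = rng.gamma(shape=2.0, scale=1.2, size=368)
print(f"H_top10 = {top10_mean(summer_hs):.2f} m")
```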
Abstract:
This work is an important part of the national "863" project "Reservoir dynamic model, the development environment and the forecast of remaining oil". It synthesizes multiple theories, methods, and technologies and makes full use of computer-based methods. The channel-sand reservoir of the Gudao oilfield is studied by combining qualitative and quantitative description, macroscopic and microscopic description, dynamic and static description, comprehensive reservoir research with physical and mathematical simulation, and three-dimensional with four-dimensional reservoir description. We also study the last 10 years of the more than 30 years of high-pressure water injection and polymer flooding development, the dynamic changes of the reservoir fluid field, and the associated geologic hazards, revealing their distribution, genesis, and controlling factors. The main innovative achievements and findings are as follows. We built the stratigraphic and structural framework and determined the genetic types, spatial distribution, and anisotropy of the upper Guantao member. We established macroscopic and microscopic models of the reservoir's dynamic evolution, revealing the character and distribution of the macroscopic and microscopic parameters and their relationship with remaining oil. We built a model of the hydrosialite and identified the styles, style assemblages, formation mechanisms, and controlling factors of the reservoir, revealing the effects of the hydrosialite on remaining oil, on pollution of the oilfield production environment, and on geologic hazards. The geologic hazards are classified into 8 styles for the first time, and their character, distribution, formation mechanisms, and controlling factors are revealed. We built models of the distribution of remaining oil in different periods of the Gudao oilfield, revealed the macroscopic and microscopic formation mechanisms of remaining oil in each period, forecast the distribution of mobile remaining oil, and found that the main cause of the dynamic evolution of all sub-models of the reservoir fluid field is the geologic process driven by the hydrodynamic forces of reservoir development. We also developed the supporting theory, methods, and technology for describing the reservoir fluid field and researching environmental hazards. Application of this work in the Gudao oilfield has achieved very good economic benefits and has deepened and developed the development geology of continental fault-trough basins and the theory of geologic hazards.
Abstract:
The Global Positioning System (GPS) not only provides precise navigation and timing services but can also be used to investigate ionospheric variations. From GPS observations we can obtain the total electron content (TEC), the so-called GPS TEC, which is used to characterize ionospheric structure. This thesis mainly concerns GPS TEC data processing and ionospheric climatological analysis, as follows. First, we develop an algorithm for high-resolution global ionospheric TEC mapping. Building on current global TEC mapping algorithms, we propose a practical way to calibrate the original GPS TEC against existing GIM results. We also perform global/local TEC mapping by model fitting to the processed GPS TEC data; in practice, we apply it to local TEC mapping over southeastern China and obtain some initial results. Next, we suggest a new method to calculate an equivalent ionospheric global electron content (GEC), which we compute from the TEC data along geographic longitude 120°E. The climatological analysis shows that the GEC variation is mainly composed of three components: solar-cycle, annual, and semiannual variations. The solar-cycle variation is dominant, indicating the most prominent influence; the annual and semiannual variations play a secondary role and are modulated by solar activity. We construct an empirical GEC model driven by solar activity and seasonal factors on the basis of partial correlation analysis. Generally speaking, our research shows not only that GPS is an important observation advantageous for nowcasting ionospheric TEC, but also that GEC may become a new index for describing the solar influence on the global ionosphere, since the strong correlation between GEC and the solar activity factor indicates a close relationship between the ionosphere and solar activity.
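As a hedged sketch of how an empirical model "driven by solar activity and seasonal factors" might be fit (the choice of F10.7 as the solar proxy, the harmonic regressors, and the least-squares form are assumptions for illustration, not the thesis's actual model):

```python
import numpy as np

def fit_gec_model(day_of_year, f107, gec):
    """Least-squares fit of GEC to a solar-activity proxy plus annual and
    semiannual harmonics: GEC ~ a0 + a1*F10.7 + annual + semiannual terms."""
    t = 2.0 * np.pi * np.asarray(day_of_year) / 365.25
    X = np.column_stack([
        np.ones_like(t), np.asarray(f107),
        np.cos(t), np.sin(t),          # annual harmonic
        np.cos(2 * t), np.sin(2 * t),  # semiannual harmonic
    ])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(gec), rcond=None)
    return coeffs, X @ coeffs  # fitted coefficients and modeled GEC

# Hypothetical daily series: F10.7 solar index and an observed GEC series.
days = np.arange(1, 366)
f107 = 120 + 30 * np.sin(2 * np.pi * days / 365.25)
gec_obs = 2.5 + 0.01 * f107 + 0.3 * np.cos(2 * np.pi * days / 365.25)
coeffs, gec_fit = fit_gec_model(days, f107, gec_obs)
print(coeffs)
```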
Abstract:
Ecological concern prompts poor and indigenous people of India to consider how a society can ensure both protection of nature and their rightful claim for a just and sustainable future. Previous discussions defended the environment while ignoring the struggles of the poor for sustenance and their religious traditions and ethical values. Mohandas Karamchand Gandhi addressed similar socio-ecological concerns by adopting and adapting traditional religious and ethical notions to develop strategies for constructive, engaged resistance. The dissertation research and analysis verifies the continued relevance of the Gandhian understanding of dharma (ethics) in contemporary India as a basis for developing eco-dharma (eco-ethics) to link closely development, ecology, and religious values. The method of this study is interpretive, analytical, and critical. Françoise Houtart’s social analytical method is used to make visible and to suggest how to overcome social tensions from the perspective of marginalized and exploited peoples in India. The Indian government's development initiatives create a nexus between the eco-crisis and economic injustice, and communities’ responses. The Chipko movement seeks to protect the Himalayan forests from commercial logging. The Narmada Bachao Andolan strives to preserve the Narmada River and its forests and communities, where dam construction causes displacement. The use of Gandhian approaches by these movements provides a framework for integrating ecological concerns with people's struggles for survival. For Gandhi, dharma is a harmony of satya (truth), ahimsa (nonviolence), and sarvodaya (welfare of all). Eco-dharma is an integral, communitarian, and ecologically sensitive ethical paradigm. The study demonstrates that the Gandhian notion of dharma, implemented through nonviolent satyagraha (firmness in promoting truth), can direct community action that promotes responsible economic structures and the well-being of the biotic community and the environment. Eco-dharma calls for solidarity, constructive resistance, and ecologically and economically viable communities. The dissertation recommends that for a sustainable future, India must combine indigenous, appropriate, and small- or medium-scale industries as an alternative model of development in order to help reduce systemic poverty while enhancing ecological well-being.
Abstract:
Preclinical toxicity testing in animal models is a cornerstone of the drug development process, yet it is often unable to predict adverse effects and tolerability issues in human subjects. Species-specific responses to investigational drugs have led researchers to utilize human tissues and cells to better estimate human toxicity. Unfortunately, human cell-derived models are imperfect because toxicity is assessed in isolation, removed from the normal physiologic microenvironment. Microphysiological modeling, often referred to as 'organ-on-a-chip' or 'human-on-a-chip', places human tissue into a microfluidic system that mimics the complexity of human in vivo physiology, thereby allowing for toxicity testing on several cell types, tissues, and organs within a more biologically relevant environment. Here we describe important concepts when developing a repro-on-a-chip model. The development of female and male reproductive microfluidic systems is critical to sex-based in vitro toxicity and drug testing. This review addresses the biological and physiological aspects of the male and female reproductive systems in vivo and what should be considered when designing a microphysiological human-on-a-chip model. Additionally, interactions between the reproductive tract and other systems are explored, focusing on the impact of factors and hormones produced by the reproductive tract and on disease pathophysiology.
Abstract:
In this paper, we describe a study of the abstract thinking skills of a group of students studying object-oriented modelling as part of a Masters course. Abstract thinking has long been considered a core skill for computer scientists. This study is part of attempts to gather evidence about the link between abstract thinking skills and success in the Computer Science discipline. The results show a positive correlation between the students' scores on the abstract thinking test and the marks achieved in the module. However, the small number of participants in the study means that wider research is needed.
Abstract:
The objective of this dissertation was to implement Lean Management methodologies and evaluate their impact on the Product Development process. The approach consisted of a literature review and a survey of the state of the art to obtain the theoretical grounding needed to implement Lean methodologies. It proceeded with an assessment of the initial situation of the organization under study regarding product development activities, document management and operational practices, and support activities, through surveys and experimental measurements. This knowledge made it possible to create a reference model for implementing Lean Management in this specific area of product development. Once implemented, the model was validated through practical experimentation and the collection of indicators. Implementing this reference model introduced the foundations of Lean thinking in the Product and Systems Development Unit (DPS) of the INEGI organization, contributing to the creation of an environment of Respect for People and Continuous Improvement. In this environment it was possible to obtain qualitative and quantitative gains in the various areas under study, contributing overall to an increase in the efficiency and effectiveness of the DPS. This increase in efficiency is expected to represent an increase in the organization's installed capacity, through an annual reduction of 2290 hours of waste (6.5% of the unit's total capacity) and a significant reduction in operating costs. Some of the improvements proposed in the course of this work, once their success had been verified, went beyond the unit under study and were applied across the organization. Qualitative gains were also obtained, such as the standardization of document management practices and the centralization and streamlining of information flows, which increased the quality of the services provided by reducing corrections and rework. In addition, a new tool was developed that monitors the current state of projects in terms of percentage of execution (fulfilment of objectives), deadlines, and costs, and estimates project completion dates, enabling project replanning and the timely detection of deviations (a simple sketch of this kind of computation follows below). The tool also builds a history of the effort, in hours, associated with carrying out the activities and tasks of the various Product Development areas, and can therefore be used to support the future budgeting of similar activities. During the project, mechanisms were also created to compute indicators of the technical competences and intrinsic motivations of the individual members of the DPS team. These indicators can be used by project managers when defining the composition of work teams, assigning individual project tasks, and selecting participants for training actions. With this information, it is expected that human potential will be better exploited and, as a consequence, that the performance and personal satisfaction of the organization's human resources will increase. This case study demonstrated that the potential for improving product development processes through Lean Management methodologies is very significant, and that these improvements translate into visible gains for the organization as well as for its members individually.
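As a loose, hedged sketch of the kind of project-monitoring computation such a tool might perform (extrapolating a completion date and a cost deviation from the percentage of execution; the linear extrapolation and field names are assumptions for illustration, not the dissertation's actual tool):

```python
from datetime import date, timedelta

def project_status(start, planned_end, budget, spent, pct_complete, today):
    """Extrapolate the completion date and cost at completion from progress so far,
    assuming execution continues at its average pace (a simplifying assumption)."""
    elapsed = (today - start).days
    if pct_complete <= 0:
        return None  # no progress yet, nothing to extrapolate
    total_days = elapsed / pct_complete
    est_end = start + timedelta(days=round(total_days))
    est_cost = spent / pct_complete
    return {
        "estimated_end": est_end,
        "schedule_deviation_days": (est_end - planned_end).days,
        "estimated_cost_at_completion": est_cost,
        "cost_deviation": est_cost - budget,
    }

print(project_status(start=date(2024, 1, 8), planned_end=date(2024, 6, 28),
                     budget=50_000.0, spent=31_000.0,
                     pct_complete=0.55, today=date(2024, 4, 15)))
```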
Abstract:
We present a new simulation approach for the joint density function of the surplus prior to ruin and the deficit at ruin, for risk models driven by Lévy subordinators. The approach is inspired by the ladder-height decomposition of the ruin probability in the classical model. The classical model, driven by a compound Poisson process, is a special case of the more general model driven by a subordinator, to which the ladder-height decomposition of the ruin probability also applies. The Expected Discounted Penalty Function, also called the Gerber-Shiu function (GS function), was introduced as a unifying approach to the study of quantities related to the event of ruin. The ruin probability and the joint density of the surplus prior to ruin and the deficit at ruin are special cases of the GS function. Expressions for these two quantities can be found in the literature, but they are difficult to use because they take the form of infinite series of convolutions with no closed analytical form. However, since they are derived from the GS function, the expressions for the two quantities share a certain resemblance, which allows us to draw on the ladder-height decomposition of the ruin probability to derive a simulation approach for this joint density. We give a detailed introduction to the risk models studied in this thesis and for which the simulation can be carried out. To motivate the work, we briefly introduce the broad field of risk measures and compute a few of them for these risk models. This work contributes to a better understanding of the behaviour of risk models driven by subordinators with respect to the eventuality of ruin, since it provides a numerical point of view that is absent from the literature.
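As a rough numerical sketch in the spirit of the problem (not the thesis's actual algorithm), the following Monte Carlo simulation of the classical compound Poisson risk model records, on each path that ruins, the surplus just before ruin and the deficit at ruin; the parameter values and exponential claim sizes are assumptions for illustration.

```python
import numpy as np

def simulate_ruin_pairs(u=5.0, c=1.2, lam=1.0, claim_mean=1.0,
                        horizon=200.0, n_paths=50_000, seed=1):
    """Classical compound Poisson risk model U(t) = u + c*t - S(t).
    For every path that ruins before `horizon`, record the surplus just
    before ruin and the deficit at ruin (ruin can only occur at claim times)."""
    rng = np.random.default_rng(seed)
    pairs = []
    for _ in range(n_paths):
        t, level = 0.0, u
        while t < horizon:
            wait = rng.exponential(1.0 / lam)        # time to next claim
            t += wait
            if t >= horizon:
                break
            before = level + c * wait                # surplus just before the claim
            level = before - rng.exponential(claim_mean)
            if level < 0:
                pairs.append((before, -level))       # (surplus before ruin, deficit)
                break
    return np.array(pairs)

pairs = simulate_ruin_pairs()
print(f"ruin frequency: {len(pairs) / 50_000:.3f}")
# A 2D histogram of `pairs` approximates the joint density of the two quantities.
```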
Abstract:
Model transformation consists of transforming a source model into a target model in conformance with source and target metamodels. We distinguish two types of transformation. The first is exogenous, where the source and target metamodels represent different formalisms and all elements of the source model are transformed. When a single formalism is involved, the transformation is endogenous. This type of transformation generally requires two steps: identifying the source-model elements to transform, then transforming those elements. In this thesis, we propose three main contributions related to these transformation problems. The first contribution is the automation of model transformations. We propose to treat the transformation problem as a combinatorial optimization problem, in which a target model can be automatically generated from a small number of transformation examples. This first contribution can be applied to exogenous or endogenous transformations (after the elements to transform have been detected). The second contribution relates to endogenous transformation, where the source-model elements to transform must be detected. We propose an approach for detecting design defects as a preliminary step to refactoring. The approach is inspired by the way the human immune system detects viruses, known as negative selection; the idea is to use good implementation practices to detect the risky parts of the code. The third contribution aims to test a transformation mechanism using an oracle function to detect errors. We adapted the negative selection mechanism, which treats as an error any deviation between the transformation traces under evaluation and a base of examples containing high-quality transformation traces. The oracle function computes this dissimilarity, and the errors are ranked by this score. The different contributions were evaluated on large projects, and the results obtained show their effectiveness.
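As a hedged sketch of the third contribution's idea (an oracle that scores transformation traces by their deviation from a base of good traces; the trace representation as sets of pairs and the Jaccard distance are assumptions for illustration, not the thesis's actual encoding):

```python
def trace_distance(trace, reference):
    """Dissimilarity between two transformation traces, represented here as
    sets of (source element, applied rule) pairs, via the Jaccard distance."""
    a, b = set(trace), set(reference)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def oracle_scores(candidate_traces, good_trace_base):
    """Score each candidate trace by its distance to the closest good trace;
    higher scores flag likelier transformation errors."""
    scores = {
        name: min(trace_distance(trace, good) for good in good_trace_base)
        for name, trace in candidate_traces.items()
    }
    # Rank candidates from most to least suspicious.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

good_base = [
    {("Class", "class2table"), ("Attribute", "attr2column")},
    {("Class", "class2table"), ("Association", "assoc2fkey")},
]
candidates = {
    "t1": {("Class", "class2table"), ("Attribute", "attr2column")},
    "t2": {("Class", "class2type"), ("Attribute", "attr2element")},
}
print(oracle_scores(candidates, good_base))  # t2 ranks as more suspicious
```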