933 results for: model driven system, semantic representation, semantic modeling, enterprise system development


Relevance:

100.00%

Publisher:

Abstract:

Dissertation submitted for the degree of Doctor in Electrical and Computer Engineering

Relevance:

100.00%

Publisher:

Abstract:

Interest in fully automated homes has grown over the years, as technology evolves and our lives become more stressful and overloaded. An automation system provides a way to simplify daily tasks, leaving us more spare time for the activities where we are really needed. Some systems in this domain try to deliver these characteristics, but the technology is still at an early stage of evolution and remains far from giving users the desired control over a home. The reason is that these systems lack important features such as adaptability, extensibility and evolvability. Developed with a bottom-up approach, they are often tailored to programmers and domain experts and usually disregard end users, who are left with unfinished interfaces or products they find difficult to control. Moreover, complex behaviors are avoided, since they are extremely difficult to implement, mostly because of the need to handle priorities, conflicts and device calibration. In addition, these solutions come only at very high cost, yet remain difficult for non-technical people to configure once in operation. A tool is therefore needed that allows several automated actions to be executed, with an interface that is easy to use but still supports all the main features of the domain. It is also desirable that this tool be hardware-independent so it can be reused; a Model Driven Development (MDD) approach is the ideal option, as it follows exactly those principles. Since the automation domain has very specific concepts, the use of models should be combined with a Domain Specific Language (DSL). Together, these two methods make it possible to create a solution adapted to end users as well as to domain experts and programmers, thanks to the several levels of abstraction that can be added to reduce the complexity of use. The aim of this thesis is to design a DSL, following the MDD approach, that supports Home Automation (HA) concepts. Supporting the development of both simple and complex scenarios is one of the most important concerns of this implementation. The DSL should also support other significant features of the domain, such as task scheduling, which is limited in existing solutions.
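The abstract does not show the DSL's concrete syntax. As a purely illustrative sketch of the kind of scenario, scheduling and priority abstraction described above, an internal DSL embedded in Python might look like the following; every name here (Scenario, Action, resolve_conflicts, the cron-like schedule string) is a hypothetical stand-in, not taken from the thesis.

```python
# Hypothetical internal DSL sketch for scheduled home-automation scenarios.
# Illustrates domain-level abstraction with explicit priorities for conflicts.
from dataclasses import dataclass, field


@dataclass
class Action:
    device: str
    command: str
    priority: int = 0  # higher priority wins when two actions conflict


@dataclass
class Scenario:
    name: str
    schedule: str                              # e.g. a cron-like expression
    actions: list[Action] = field(default_factory=list)

    def do(self, device: str, command: str, priority: int = 0) -> "Scenario":
        self.actions.append(Action(device, command, priority))
        return self                            # enable fluent chaining

    def resolve_conflicts(self) -> list[Action]:
        # Keep only the highest-priority command per device.
        best: dict[str, Action] = {}
        for a in self.actions:
            if a.device not in best or a.priority > best[a.device].priority:
                best[a.device] = a
        return list(best.values())


# Usage: a "good night" scenario scheduled every day at 23:00.
night = (Scenario("good night", schedule="0 23 * * *")
         .do("living_room.lights", "off")
         .do("thermostat", "eco", priority=1)
         .do("thermostat", "comfort", priority=0))  # loses the conflict

for action in night.resolve_conflicts():
    print(action.device, "->", action.command)
```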

Relevance:

100.00%

Publisher:

Abstract:

In this work, two of the most significant model-driven development methodologies are introduced: Model Driven Architecture (MDA) and Domain Specific Modeling (DSM). A comparative study of some of the tools currently on the market that support them is also presented.

Relevance:

100.00%

Publisher:

Abstract:

Fractal mathematics has been used to characterize water and solute transport in porous media and also to characterize and simulate porous media properties. The objective of this study was to evaluate the correlation between the soil infiltration parameters sorptivity (S) and time exponent (n) and the fractal parameters dimension (D) and Hurst exponent (H). For this purpose, ten horizontal columns with pure (either clay or loam) and heterogeneous porous media (clay and loam distributed in layers in the column) were simulated following the distribution of a deterministic Cantor bar with a fractal dimension of approximately 0.63. Horizontal water infiltration experiments were then simulated using the Hydrus 2D software. The sorptivity (S) and time exponent (n) parameters of the Philip equation were estimated for each simulation, using the nonlinear regression procedure of the statistical software package SAS®. Sorptivity increased with the loam content of the columns, which was attributed to the relation of S to the capillary radius. The time exponent estimated by nonlinear regression was found to be less than the traditional value of 0.5. The fractal dimension estimated from the Hurst exponent was 17.5 % lower than the fractal dimension of the Cantor bar used to generate the columns.
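For reference, the Philip equation for horizontal infiltration expresses cumulative infiltration as I = S·t^n, with n traditionally fixed at 0.5. The study fitted S and n with the nonlinear regression procedure of SAS®; the sketch below reproduces the same kind of fit in Python on synthetic data (all values are made up for illustration).

```python
# Nonlinear fit of Philip's horizontal-infiltration equation I(t) = S * t**n.
# The study used SAS; synthetic data stand in for the simulated experiments.
import numpy as np
from scipy.optimize import curve_fit

def philip(t, S, n):
    # Cumulative horizontal infiltration
    return S * t**n

rng = np.random.default_rng(0)
t = np.linspace(0.1, 10.0, 50)                              # time (arbitrary units)
I_obs = philip(t, 1.2, 0.45) + rng.normal(0, 0.02, t.size)  # noisy synthetic data

(S_hat, n_hat), _ = curve_fit(philip, t, I_obs, p0=[1.0, 0.5])
print(f"S = {S_hat:.3f}, n = {n_hat:.3f}")                  # n comes out below 0.5
```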

Relevance:

100.00%

Publisher:

Abstract:

The critical behavior of a system of molecules with a preferred symmetry axis is studied by means of a Monte Carlo simulation of a simplified two-dimensional model. The system exhibits two phase transitions, associated with the vanishing of the positional order of the molecular centers of mass and of the orientational order of the symmetry axes. The evolution of the order parameters and of the specific heat is also studied. The transition associated with the positional degrees of freedom is found to change from second-order to first-order behavior when the two phase transitions are close enough, due to the coupling with the orientational degrees of freedom. This result is qualitatively compared with similar findings in pure liquid crystals and liquid-crystal mixtures.
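The abstract does not specify the model's Hamiltonian or move set. As a generic sketch of the Metropolis machinery that such a simulation rests on (the energy and proposal functions here are placeholders, not the paper's model):

```python
# Generic Metropolis step for a 2D system with positional and orientational
# degrees of freedom; energy() and propose() are placeholders to be supplied.
import math, random

def metropolis_step(state, energy, propose, T):
    """Propose a move on one molecule and accept it with the Metropolis rule."""
    candidate = propose(state)               # e.g. displace a center of mass
    dE = energy(candidate) - energy(state)   # or rotate a symmetry axis
    if dE <= 0 or random.random() < math.exp(-dE / T):
        return candidate                     # accept the move
    return state                             # reject and keep the old state
```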

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: Ocular anatomy and radiation-associated toxicities pose unique challenges for external beam radiation therapy. For treatment planning, precise modeling of organs at risk and of the tumor volume is crucial. Developing a precise eye model and automatically adapting it to a patient's anatomy remain problematic because of organ shape variability. This work introduces the application of a 3-dimensional (3D) statistical shape model as a novel method for precise eye modeling for external beam radiation therapy of intraocular tumors. METHODS AND MATERIALS: Manual and automatic segmentations were compared for 17 patients, based on head computed tomography (CT) volume scans. A 3D statistical shape model of the cornea, lens, and sclera, as well as of the optic disc position, was developed. Furthermore, an active shape model was built to enable automatic fitting of the eye model to CT slice stacks. Cross-validation was performed with leave-one-out tests for all training shapes, measuring Dice coefficients and mean segmentation errors between automatic segmentation and manual segmentation by an expert. RESULTS: Cross-validation revealed a Dice similarity of 95% ± 2% for the sclera and cornea and 91% ± 2% for the lens. The overall mean segmentation error was 0.3 ± 0.1 mm. Average segmentation time was 14 ± 2 s on a standard personal computer. CONCLUSIONS: Our results show that the presented solution outperforms state-of-the-art methods in terms of accuracy, reliability, and robustness. Moreover, the eye model shape as well as its variability is learned from a training set rather than assumed (e.g., as with a spherical or elliptical model). The model therefore appears capable of representing nonspherically and nonelliptically shaped eyes.
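For reference, the Dice similarity coefficient used in the cross-validation is 2|A∩B| / (|A| + |B|) for two segmentation masks A and B. A minimal NumPy sketch, with toy masks standing in for the automatic and manual segmentations:

```python
# Dice similarity coefficient between two binary segmentation masks.
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice = 2|A intersect B| / (|A| + |B|) for boolean masks a and b."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Usage with two toy 3D masks (e.g. automatic vs. manual segmentation):
auto = np.zeros((4, 4, 4), bool); auto[1:3, 1:3, 1:3] = True
manual = np.zeros((4, 4, 4), bool); manual[1:3, 1:3, :3] = True
print(f"Dice = {dice(auto, manual):.2f}")   # 0.80 for these toy masks
```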

Relevance:

100.00%

Publisher:

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely the industrial automation industry behind PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments produced the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets formed with new rules and new key players in the ICT industry. Today more RISC computer systems run Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, now further extended by tablets. An underlying element of this transition is the increasing role of open source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact through the SoC- and software-platform-driven disruptions in the ICT industries.
Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability on a very narrow customer base thanks to strong technology-enabled customer lock-in and customers' high risk exposure, as production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels previously occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation and the competition among its incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, considering the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, created the new markets of personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.

Relevance:

100.00%

Publisher:

Abstract:

MANET security has many open issues. Due to its characteristics, this kind of network needs both preventive and corrective protection. In this paper, we focus on corrective protection, proposing an anomaly-based IDS model for MANETs. The design and development of the IDS are considered in three main stages: normal-behavior construction, anomaly detection and model update. A parametric mixture model is used to model behavior from reference data, and the associated Bayesian classification leads to the detection algorithm. MIB variables are used to provide the information the IDS needs. Experiments with DoS and scanner attacks validating the model are also presented.
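The paper's mixture model is built on MIB variables; as a hedged sketch of the general technique (a parametric mixture fitted to normal behavior, with low-likelihood samples flagged as anomalies), using a Gaussian mixture and synthetic data in place of the paper's features:

```python
# Anomaly detection with a parametric (Gaussian) mixture model:
# fit normal behavior, then flag samples whose likelihood is too low.
# Random vectors stand in for the paper's MIB-variable features.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(1000, 4))    # reference (normal) traffic
gmm = GaussianMixture(n_components=3, random_state=0).fit(normal)

# Threshold at, e.g., the 1st percentile of the training log-likelihood.
threshold = np.percentile(gmm.score_samples(normal), 1)

new = rng.normal(4.0, 1.0, size=(5, 4))          # shifted, attack-like samples
is_anomaly = gmm.score_samples(new) < threshold
print(is_anomaly)                                 # expect mostly True
```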

Relevance:

100.00%

Publisher:

Abstract:

The solvability of the problem of fair exchange in a synchronous system subject to Byzantine failures is investigated in this work. The fair exchange problem arises when a group of processes are required to exchange digital items in a fair manner, meaning that either each process obtains the item it was expecting or no process obtains any information on the inputs of others. After introducing a novel specification of fair exchange that clearly separates safety and liveness, we give an overview of the difficulty of solving such a problem in the context of a fully connected topology. On the one hand, we show that no solution to fair exchange exists in the absence of an identified process that every process can trust a priori; on the other, a well-known solution to fair exchange relying on a trusted third party is recalled. These two results lead us to complete our system model with a flexible representation of the notion of trust. We then show that fair exchange is solvable if and only if a connectivity condition, named the reachable majority condition, is satisfied. The necessity of the condition is proven by an impossibility result and its sufficiency by presenting a general solution to fair exchange relying on a set of trusted processes. The focus is then turned towards a specific network topology in order to provide a fully decentralized, yet realistic, solution to fair exchange. The general solution mentioned above is optimized by reducing the computational load assumed by trusted processes as far as possible. Accordingly, our fair exchange protocol relies on trusted tamper-proof modules that have limited communication abilities and are only required in key steps of the algorithm. This modular solution is then implemented in the context of a pedagogical application developed to illustrate and convey the complexity of fair exchange. This application, which also includes the implementation of a wide range of Byzantine behaviors, allows executions of the algorithm to be set up and monitored through a graphical display. Surprisingly, some of our results on fair exchange seem to contradict those found in the literature on secure multiparty computation, a problem from the field of modern cryptography, although the two problems have much in common. Both problems are closely related to the notion of a trusted third party, but their approaches and descriptions differ greatly. By introducing a common specification framework, a comparison is proposed in order to clarify their differences and the possible origins of the confusion between them. This leads us to introduce the problem of generalized fair computation, a generalization of fair exchange. Finally, a solution to this new problem is given by generalizing our modular solution to fair exchange.
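The well-known trusted-third-party solution recalled above can be caricatured as follows: each party deposits its item with the TTP, which releases the items only once it holds both, so no party can obtain the other's item alone. This toy sketch is schematic and is not the thesis protocol:

```python
# Toy sketch of fair exchange through a trusted third party (TTP):
# the TTP releases the items only when it holds both, so neither party
# can walk away with the other's item alone. Not the thesis protocol.
class TrustedThirdParty:
    def __init__(self):
        self.deposits: dict[str, object] = {}

    def deposit(self, party: str, item: object) -> None:
        self.deposits[party] = item

    def exchange(self, a: str, b: str):
        if a in self.deposits and b in self.deposits:
            return self.deposits[b], self.deposits[a]  # each gets the other's item
        raise RuntimeError("abort: exchange is not yet safe to complete")

ttp = TrustedThirdParty()
ttp.deposit("alice", "item-A")
ttp.deposit("bob", "item-B")
print(ttp.exchange("alice", "bob"))   # ('item-B', 'item-A')
```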

Relevance:

100.00%

Publisher:

Abstract:

The goal of requirements specification is to produce a complete, consistent list of requirements for the desired system, defined at the conceptual level. Business process modeling is quite useful in the early phases of requirements specification. This work studies business process modeling for the development of information systems. Various techniques for modeling business processes exist today. This work reviews the principles and aspects of business process modeling as well as different modeling techniques. A new method, designed especially for small and medium-sized software projects, was developed on the basis of process aspects and UML diagrams.

Relevance:

100.00%

Publisher:

Abstract:

The Unified Modeling Language (UML) has achieved the position of a de facto standard modeling language in the software industry. UML's primary use has been modeling software systems, but it has also been applied in other problem areas, such as modeling various kinds of processes. In this master's thesis, the control system of a concrete batching plant is modeled using UML. Based on the literature, the thesis examines how industry has exploited UML in modeling process control systems, and that knowledge is applied in modeling the concrete plant's control system. The resulting model is analyzed for correctness and usability. UML was found to be well suited to modeling the control of an industrial process such as the control system of a concrete batching plant: a UML model can describe the structure and behavior of the system comprehensively, and the model can be used directly in further development of the control system. However, little public research on the topic is available, so there is a need for further public research.

Relevance:

100.00%

Publisher:

Abstract:

The main objective of the study was to develop a performance analysis system for an SME subcontractor in the metal industry. In addition, practices that contribute to building a successful analysis system were studied, and the benefits and drawbacks of a measurement system for an SME were also addressed. The theoretical part of the study discusses performance in general, introduces different performance analysis systems and examines the differences between them. It also presents different process models with which a company can build a performance analysis system. The empirical part presents the process followed in the company to build the performance analysis system. This process was based on the SAKE process model, but ideas were also drawn from Toivanen's model. The study resulted in a theoretical package on performance analysis and a model of a performance analysis system. The theoretical package served as a good foundation and provided background information for the people involved in the project. The resulting model is best suited for a company performing mechanical machining of metal, but other companies can also take it as an example. Its most useful aspect is the process itself, which makes it possible to examine the factors behind the company's success.

Relevance:

100.00%

Publisher:

Abstract:

Wheat (Triticum aestivum L.) blast, caused by Pyricularia grisea, is a new disease in Brazil, and no resistant cultivars are available. Interactions between temperature and wetness duration have been used in many early-warning systems. Hence, growth chamber experiments were carried out to assess the effect of different temperatures (10, 15, 20, 25, 30 and 35 °C) and durations of spike wetness (0, 5, 10, 15, 20, 25, 30, 35 and 40 hours) on the intensity of blast in the cultivar BR23. Each temperature formed an experiment and the durations of wetness the treatments. The highest blast intensity was observed at 30 °C and increased as the wetting period lengthened, while the lowest occurred at 25 °C and 10 hours of spike wetness. Regardless of temperature, no symptoms occurred when the wetting period was less than 10 hours, but at 25 °C and a 40 h wetting period blast intensity exceeded 85%. The variation in blast intensity as a function of temperature is described by a generalized beta model, and as a function of the duration of spike wetness by the Gompertz model. Disease intensity was then modeled as a function of both temperature and duration of spike wetness, and the resulting equation provided a precise description of the response of P. grisea to both variables. This model was used to construct tables for predicting the intensity of P. grisea wheat blast from the temperatures and durations of wheat spike wetness observed in the field.
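The combined equation itself is not given in the abstract. Schematically, intensity can be written as the product of a generalized beta response to temperature T and a Gompertz response to wetness duration W; the sketch below uses placeholder parameters chosen only to mimic the reported behavior (a peak near 30 °C; the no-symptom threshold below 10 h of wetness is only approximated, not enforced).

```python
# Schematic response surface: generalized beta in temperature times
# Gompertz in wetness duration. All parameters are placeholders; the
# paper's fitted coefficients are not given in the abstract.
import numpy as np

def beta_gen(T, Tmin=10.0, Tmax=35.0, b=1.5, c=1.0):
    """Generalized beta response, scaled so its peak equals 1."""
    T = np.clip(np.asarray(T, float), Tmin, Tmax)
    Tpk = (b * Tmax + c * Tmin) / (b + c)     # analytic peak location
    return ((T - Tmin)**b * (Tmax - T)**c) / ((Tpk - Tmin)**b * (Tmax - Tpk)**c)

def gompertz(W, A=1.0, B=5.0, k=0.15):
    """Gompertz growth of intensity with wetness duration W (hours)."""
    return A * np.exp(-B * np.exp(-k * np.asarray(W, float)))

def blast_intensity(T, W):
    return 100.0 * beta_gen(T) * gompertz(W)  # percent

# A warm temperature and a long wetting period give high intensity:
print(f"{blast_intensity(30.0, 40.0):.1f} %")
```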

Relevance:

100.00%

Publisher:

Abstract:

Today's software industry faces increasingly complex challenges in a world where software is nearly ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, yet affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one reason why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the field of process improvement there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight approaches, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with a specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurements. Third, knowledge sharing within large companies is improved through methods that put collaboration at the center. The movement behind agile methods emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that most of the daily work consists of creating something new that did not exist before. Every software developer must be an expert in her field and spends a large part of her working day creating solutions to problems she has never solved before. Although this has been a well-known fact for decades, many software projects are still managed as if they were factory production lines. One of the goals of the agile movement is to highlight precisely this discrepancy between the inner nature of software development and the way software projects are managed. Agile methods have been shown to work well in the contexts they were created for, i.e. small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached that challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interaction with the context in question can be gathered incrementally, making it easier to develop and adapt the methods to the specific demands of that context. Second, resistance to change can be overcome more easily by introducing cultural changes gently and by giving the target group direct first-hand contact with the new methods. Relevant product measurements can help software development teams improve their working methods.
For teams working with agile and Lean methods, a good set of measurements can be decisive for decision-making when prioritizing the backlog of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product measurements for decision support concerning refactoring, i.e. continuous quality improvement of the program's code and design. The decision to refactor can be difficult to make, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way of measuring the design quality of systems developed using the model-driven paradigm, and we construct a way of integrating this measurement into agile and Lean working methods. An important part of any process improvement initiative is spreading knowledge of the new software process. This holds regardless of the kind of process being introduced, whether plan-driven or agile. We propose that collaborative methods for creating and evolving the process are a good way to support knowledge sharing, and we survey the process-authoring tools on the market with that proposal in mind.

Relevance:

100.00%

Publisher:

Abstract:

This work evaluated eight hypsometric models for representing the tree height-diameter relationship, using data obtained from the scaling of 118 trees and from 25 inventory plots. The adopted criteria were graphical residual analysis, mean percent deviation, the chi-square precision test, the residual standard error between observed and estimated heights, and the Graybill F test. The identity of the hypsometric models was also verified by applying the F(Ho) test to the plot data grouped with the scaling data. It was concluded that better accuracy can be obtained with the Prodan model, using h and d1.3 measured in 10 trees per plot grouped with the scaling measurements, in even-aged forest stands.
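For reference, the Prodan model expresses total height from diameter at breast height as h = 1.3 + d² / (b0 + b1·d + b2·d²). A hedged fitting sketch on synthetic data follows; the study's coefficients are not given in the abstract.

```python
# Fitting the Prodan hypsometric model h = 1.3 + d**2 / (b0 + b1*d + b2*d**2),
# where d is the diameter at breast height (d1.3, cm) and h the height (m).
# Synthetic data; the study estimated coefficients from its own plot data.
import numpy as np
from scipy.optimize import curve_fit

def prodan(d, b0, b1, b2):
    return 1.3 + d**2 / (b0 + b1 * d + b2 * d**2)

rng = np.random.default_rng(1)
d = rng.uniform(8, 40, 60)                              # DBH (cm)
h = prodan(d, 2.0, 0.6, 0.03) + rng.normal(0, 0.5, 60)  # noisy heights (m)

(b0, b1, b2), _ = curve_fit(prodan, d, h, p0=[1.0, 0.5, 0.05])
print(f"b0={b0:.2f}  b1={b1:.2f}  b2={b2:.2f}")
```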