906 results for model-based reasoning processes


Relevance: 100.00%

Abstract:

Software development guidelines are rules that can help improve the quality of software; they are defined on the basis of experience gained by the software development community over time. This paper discusses a set of design guidelines for model-based development of complex real-time embedded software systems. Specifically, we propose nine design conventions, three design patterns, and thirteen antipatterns for developing UML-RT models. These guidelines were identified through our analysis of around 100 UML-RT models from industry and academia. Most of the guidelines are explained with examples, and standard templates from the current state of the art are used to document the design rules.

Relevance: 100.00%

Abstract:

Emerging cybersecurity vulnerabilities in supervisory control and data acquisition (SCADA) systems are becoming urgent engineering issues for modern substations. This paper proposes a novel intrusion detection system (IDS) tailored to the cybersecurity of IEC 61850-based substations. The proposed IDS integrates physical knowledge, protocol specifications, and logical behaviours to provide a comprehensive and effective solution able to mitigate various cyberattacks. The approach comprises access-control detection, protocol whitelisting, model-based detection, and multi-parameter-based detection. This SCADA-specific IDS is implemented and validated using a comprehensive, realistic cyber-physical test bed and data from a real 500 kV smart substation.
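The protocol-whitelisting component mentioned above can be sketched as a simple allowlist check. The protocol names and port numbers below are illustrative assumptions for the sketch, not the paper's actual rule set:

```python
# Minimal protocol-whitelisting sketch: traffic whose (protocol, port)
# pair is on the allowlist passes; anything else raises an alert.
# Entries are illustrative, not the paper's actual whitelist.
ALLOWED = {
    ("mms", 102),      # IEC 61850 MMS commonly runs over TCP port 102
    ("goose", None),   # GOOSE is layer-2, so no TCP/UDP port (None as placeholder)
}

def check_packet(protocol: str, port=None) -> str:
    """Return 'allow' if the packet matches the whitelist, else 'alert'."""
    if (protocol.lower(), port) in ALLOWED:
        return "allow"
    return "alert"
```

In a real IDS this check would sit alongside the access-control, model-based, and multi-parameter detectors rather than stand alone.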

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

This paper proposes a novel demand response model using a fuzzy subtractive clustering approach. The model supports domestic consumers' decisions on managing controllable loads, considering their consumption needs and appropriate load shaping or rescheduling to achieve possible economic benefits. Based on the fuzzy subtractive clustering method, the model considers clusters of domestic consumption covering an adequate consumption range. An analysis of different scenarios is presented, considering available electric power and electricity prices. Simulation results are presented and conclusions about the proposed demand response model are discussed.
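The subtractive clustering step can be sketched as follows: a minimal NumPy implementation of Chiu-style subtractive clustering on toy two-cluster consumption data. The radii and stopping threshold are generic defaults, not the paper's settings:

```python
import numpy as np

def subtractive_clustering(X, ra=1.0, eps=0.15, max_centers=10):
    """Chiu-style subtractive clustering (sketch).
    Each point's potential is a sum of Gaussian kernels over all points;
    the highest-potential point becomes a center, and its influence is
    subtracted before selecting the next center."""
    alpha = 4.0 / ra**2
    beta = 4.0 / (1.5 * ra) ** 2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    P = np.exp(-alpha * d2).sum(axis=1)                  # initial potentials
    first = P.max()
    centers = []
    while len(centers) < max_centers:
        i = int(P.argmax())
        if P[i] < eps * first:          # remaining potential too low: stop
            break
        centers.append(X[i])
        P = P - P[i] * np.exp(-beta * ((X - X[i]) ** 2).sum(-1))
    return np.array(centers)

# Toy data: two well-separated consumption clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(3, 0.1, (20, 2))])
centers = subtractive_clustering(X, ra=1.0)
```

With this data the method recovers one center per cluster without having to specify the number of clusters in advance, which is the property that makes it attractive for grouping consumption profiles.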

Relevance: 100.00%

Abstract:

Through Law 12.715/2012, the Brazilian government instituted guidelines for a program named Inovar-Auto. In this context, energy efficiency has been a survival requirement for the Brazilian automotive industry since September 2016. Under the law, energy efficiency is not calculated per model only; it is calculated over the whole universe of new vehicles registered. The composition of vehicles sold in the market will therefore be a key factor in each automaker's profits, and energy efficiency and its consequences must be considered in all their aspects. A question then emerges: what long-term efficiency curve allows an automaker to comply with the rules and balance investment in technologies, increasing energy efficiency without hurting the competitiveness of its product lineup? Among the many variables to consider, manufacturing costs, customer value perception, and market share stand out, which characterizes this as a multi-criteria decision-making problem. To tackle the energy efficiency requirements of the legislation, this paper proposes a multi-criteria decision-making framework that combines a Delphi group with the Analytic Hierarchy Process to identify suitable alternatives for automakers in the main Brazilian vehicle segments. A forecast model based on artificial neural networks was used to estimate vehicle sales demand and validate the expected results. The approach is demonstrated in a real case study using public vehicle sales data from Brazilian automakers and public energy efficiency data.
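The Analytic Hierarchy Process step mentioned above derives criterion weights from a pairwise comparison matrix via its principal eigenvector. A minimal sketch with a hypothetical 3x3 matrix; the criteria and judgment values are assumptions for illustration, not the study's data:

```python
import numpy as np

# Hypothetical pairwise comparisons (Saaty 1-9 scale) for three criteria:
# manufacturing cost, customer value perception, market share.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

# AHP priority weights = normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency ratio CR = ((lambda_max - n) / (n - 1)) / RI, with RI = 0.58
# for n = 3; CR < 0.1 indicates acceptably consistent judgments.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
CR = CI / 0.58
```

For this matrix the weights come out roughly (0.64, 0.26, 0.10) with CR about 0.03, i.e. the (hypothetical) judgments are internally consistent.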

Relevance: 100.00%

Abstract:

A variety of physical and biomedical imaging techniques, such as digital holography, interferometric synthetic aperture radar (InSAR), and magnetic resonance imaging (MRI), measure the phase of a physical quantity in addition to its amplitude. However, the phase can usually only be measured modulo 2π, as a so-called wrapped phase map. Phase unwrapping is the process of recovering the underlying physical phase map from the wrapped phase. Tile-based phase unwrapping algorithms operate by first tessellating the phase map, then unwrapping individual tiles, and finally merging them into a continuous phase map. They can be implemented efficiently and are robust to noise, but they are prone to failure in the presence of phase residues or erroneous unwraps of single tiles. We address these shortcomings with novel tile unwrapping and merging algorithms, together with a framework that allows them to be combined in a modular fashion. To increase the robustness of the tile unwrapping step, we implemented a model-based algorithm that makes efficient use of linear algebra to unwrap individual tiles. Furthermore, we adapted an established pixel-based unwrapping algorithm to create a quality-guided tile merger. These original algorithms, as well as previously existing ones, were implemented in a modular phase unwrapping C++ framework. By examining different combinations of unwrapping and merging algorithms, we compared our method to existing approaches and showed that the appropriate choice of unwrapping and merging algorithms can significantly improve the unwrapped result in the presence of phase residues and noise. Beyond that, our modular framework allows for efficient design and testing of new tile-based phase unwrapping algorithms. The software developed in this study is freely available.
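The idea of a phase measured modulo 2π can be illustrated in one dimension. This uses NumPy's classic 1-D unwrap (fixing jumps larger than π by adding multiples of 2π), not the tile-based algorithms of the abstract:

```python
import numpy as np

# A smooth "true" phase ramp that exceeds 2*pi, so wrapping loses full turns.
true_phase = np.linspace(0, 6 * np.pi, 200)

# Measurement gives only the wrapped phase in (-pi, pi].
wrapped = np.angle(np.exp(1j * true_phase))

# 1-D unwrapping: remove the artificial 2*pi jumps between neighbors.
recovered = np.unwrap(wrapped)
```

Because the true phase changes by less than π between samples, `recovered` matches `true_phase` to numerical precision; tile-based methods generalize this idea to 2-D maps with noise and residues, where the 1-D approach breaks down.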

Relevance: 100.00%

Abstract:

This thesis addresses tool support for the combined disciplines of model-based testing and performance testing. Model-based testing (MBT) uses abstract behavioral models to automate test generation, decreasing the time and cost of test creation. MBT is a functional testing technique, focusing on output, behavior, and functionality. Performance testing, by contrast, is non-functional and is concerned with responsiveness and stability under various load conditions. MBPeT (Model-Based Performance evaluation Tool) uses probabilistic models, representing dynamic real-world user behavior patterns, to generate synthetic workload against a system under test (SUT) and carry out performance analysis based on key performance indicators (KPIs). Developed at Åbo Akademi University, MBPeT currently comprises a downloadable command-line tool and a graphical user interface. The goal of this thesis project is two-fold: 1) to extend the existing MBPeT tool by deploying it as a web-based application, removing the requirement of local installation, and 2) to design a user interface for this web application that adds new user interaction paradigms to the tool's existing feature set. All phases of the MBPeT process are realized through this single web deployment, including probabilistic model creation, test configuration, test session execution against a SUT with real-time monitoring of user-configurable metrics, and final test report generation and display. The web application (MBPeT Dashboard) is implemented in Java on top of the Vaadin framework for rich internet application development. Vaadin handles the complicated web communication processes and front-end technologies, freeing developers to implement the business logic as well as the user interface in pure Java.
A number of experiments were run in a case-study environment to validate the functionality of the newly developed Dashboard application as well as the scalability of the implemented solution in handling multiple concurrent users. The results indicate a successful solution with regard to the functional and performance criteria defined, and improvements and optimizations are suggested to further strengthen both.
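The idea of a probabilistic user model driving synthetic workload can be sketched as a small Markov chain. The action names and transition probabilities below are hypothetical, not MBPeT's actual model format:

```python
import random

# Hypothetical user actions with transition probabilities (each row sums to 1).
TRANSITIONS = {
    "start":  [("browse", 0.8), ("search", 0.2)],
    "browse": [("browse", 0.5), ("search", 0.3), ("exit", 0.2)],
    "search": [("browse", 0.6), ("exit", 0.4)],
}

def generate_session(rng, max_steps=50):
    """Walk the probabilistic model, emitting one synthetic user session."""
    state, trace = "start", []
    for _ in range(max_steps):
        if state == "exit":
            break
        choices, weights = zip(*TRANSITIONS[state])
        state = rng.choices(choices, weights=weights)[0]
        trace.append(state)
    return trace

rng = random.Random(42)
session = generate_session(rng)
```

A load generator would replay many such sampled sessions concurrently against the SUT, mapping each abstract action to a concrete request, and record response times for the KPI analysis.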

Relevance: 100.00%

Abstract:

The recent advent of new technologies has led to huge amounts of genomic data. With these data come new opportunities to understand the biological cellular processes underlying hidden regulation mechanisms and to identify disease-related biomarkers for informative diagnostics. However, extracting biological insights from such immense amounts of genomic data is a challenging task, so effective and efficient computational techniques are needed to analyze and interpret them. In this thesis, novel computational methods are proposed to address these challenges: a Bayesian mixture model, an extended Bayesian mixture model, and an Eigen-brain approach. The Bayesian mixture framework integrates the Bayesian network and the Gaussian mixture model. From this framework, in conjunction with K-means clustering and principal component analysis (PCA), biological insights are derived, such as context-specific/dependent relationships and nested structures within microarrays where biological replicates are encapsulated. The Bayesian mixture framework is then extended to explore posterior distributions over network space by incorporating Markov chain Monte Carlo (MCMC) sampling; the extended model summarizes the sampled network structures by extracting biologically meaningful features. Finally, an Eigen-brain approach is proposed to analyze in situ hybridization data for the identification of cell-type-specific genes, which can be useful for informative blood diagnostics. Computational results with region-based clustering reveal critical evidence of consistency with brain anatomical structure.
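As one small ingredient of the pipeline above, PCA can be sketched in a few lines of NumPy. This is the generic technique on toy data, not the thesis's specific models:

```python
import numpy as np

def pca(X, k):
    """Project X onto its top-k principal components (NumPy-only sketch)."""
    Xc = X - X.mean(axis=0)                  # center each feature
    cov = Xc.T @ Xc / (len(X) - 1)           # sample covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: symmetric, ascending order
    order = np.argsort(eigvals)[::-1][:k]    # top-k eigenvectors by variance
    return Xc @ eigvecs[:, order]

# Toy "expression matrix": variance deliberately concentrated on feature 0.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
X[:, 0] *= 10
Z = pca(X, 2)
```

The first projected coordinate captures the dominant variance direction, which is why PCA is a common preprocessing step before clustering high-dimensional genomic profiles.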

Relevance: 100.00%

Abstract:

This thesis considers a three-dimensional numerical model based on the 3-D Navier-Stokes and continuity equations, involving various wind speeds (northwest), water surface levels, horizontal shear stresses, eddy viscosity, and densities of oil and gas-condensate/water mixture flows. The model is used to predict the surface movement of oil and gas-condensate slicks from spill accidents on the northern coast of the Persian Gulf.

Relevance: 100.00%

Abstract:

Doctoral thesis, Environmental Sciences (Spatial Planning), 5 April 2013, Universidade dos Açores.

Relevance: 100.00%

Abstract:

Introduction: A persistent infection with human papillomaviruses (HPV) is a necessary precondition for the development of cervical cancer. HPV types 16 and 18 cause roughly 70% of cervical cancers. Two vaccines against HPV 16 and 18 have been available since 2006/2007. Research questions: How effective is HPV vaccination in reducing cervical cancer and its precursors (CIN)? Is HPV vaccination a cost-effective complement to current screening practice? Are there differences in cost-effectiveness between the two available vaccines? Should HPV vaccination be recommended from a health-economics perspective? If so, what recommendations can be derived for the design of a vaccination strategy? Which ethical, social, and legal implications need to be considered? Methods: Based on a systematic literature search, randomized controlled trials on the efficacy of HPV vaccination for the prevention of cervical cancer and its precursors, cervical intraepithelial neoplasias, are identified. Health-economic models are used to answer the economic questions. The quality of the medical and economic studies is assessed using recognized standards for the systematic appraisal of scientific studies. Results: In women who were HPV 16/18-negative at study entry and received all vaccine doses, vaccine efficacy against HPV 16/18-induced CIN 2 or higher is 98% to 100%. Adverse effects of vaccination are mainly injection-associated complaints (redness, swelling, pain). There are no significant differences in serious adverse events between vaccine and placebo groups.
Considering only direct cost components, the base-case results of the health-economic models range from about EUR 3,000 to EUR 40,000 per QALY (quality-adjusted life year), and from about EUR 9,000 to EUR 65,000 per LYG (life year gained). Discussion: According to the included studies, the available HPV vaccines are effective in preventing premalignant cervical lesions caused by HPV 16/18. The duration of vaccine protection is still unclear. With regard to side effects, the vaccination can be classified as safe; however, the trials are not large enough to reliably detect very rare adverse events. The extent to which HPV vaccination will reduce the incidence and mortality of cervical cancer in Germany depends not only on the clinical efficacy of the vaccines but also on a number of other factors, such as vaccination coverage and the effect of vaccination on participation in existing screening programs. Owing to the heterogeneity of methodological frameworks and input parameters, the results of the health-economic models vary considerably. Nevertheless, almost all model analyses support the conclusion that introducing vaccination with lifelong protection, while continuing current screening practice, can be considered cost-effective. A comparison of the two vaccines showed that, when QALYs are used as the outcome parameter, modeling the quadrivalent vaccine generally yields a lower (better) cost-effectiveness ratio than modeling the bivalent vaccine, because genital warts are also taken into account.
In sensitivity analyses, both the duration of vaccine protection and the discount rate emerged as key drivers of cost-effectiveness. Conclusion: Introducing HPV vaccination can reduce the incidence of cervical cancer in vaccinated women. However, vaccination programs should be accompanied by further evaluations in order to assess long-term efficacy and safety and to optimize the implementation of the programs. High participation rates, both in vaccination programs and, including for vaccinated women, in screening, are of central importance. Because cost-effectiveness depends critically on the duration of protection, which is still uncertain, a final judgment on the cost-effectiveness of HPV vaccination is not possible. Long-lasting protection is an important prerequisite for the vaccination's cost-effectiveness. A risk-sharing agreement between payers and manufacturers is one option to limit the impact of this uncertainty on cost-effectiveness.
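The cost-effectiveness figures above are incremental cost-effectiveness ratios (additional cost divided by additional QALYs gained). A minimal worked example with assumed numbers, not values taken from the reviewed models:

```python
# Incremental cost-effectiveness ratio (ICER) in EUR per QALY gained.
# Both inputs are illustrative assumptions.
cost_vaccination = 450.0   # assumed additional lifetime cost per woman (EUR)
qaly_gain = 0.015          # assumed additional QALYs per woman

icer = cost_vaccination / qaly_gain  # 450 / 0.015 = 30000 EUR per QALY
```

This illustrates why the duration of protection dominates the results: if waning protection halves the QALY gain, the ICER doubles, which can push a vaccination strategy past common willingness-to-pay thresholds.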

Relevance: 100.00%

Abstract:

Deep-sea hydrothermal-vent habitats are typically linear, discontinuous, and short-lived. Some vent fauna, such as the endemic polychaete family Alvinellidae, are thought to lack a planktotrophic larval stage and therefore not to broadcast-release their offspring. The genetic evidence, however, points to exchanges on a scale that seems to contradict this reproductive pattern. The rift valley may topographically rectify the bottom currents, facilitating the dispersal of propagules between active vent sites separated, in some cases, by tens of kilometers or more along the ridge axis. A propagule flux model based on a matrix of intersite distances, long-term current-meter data, and information on the biology and ecology of Alvinellidae was developed to test this hypothesis. Calculations of the number of migrants exchanged between two populations per generation (N-m) allowed comparisons with estimates obtained from genetic studies. N-m displays a logarithmic decrease with increasing dispersal duration and reaches the critical value of 1 after 8 d when the propagule flux model is run under standard conditions. At most, propagule traveling time cannot reasonably exceed 15-30 d according to the model, whereas reported distances between sites would require longer-lasting dispersal abilities. Two nonexclusive explanations are proposed. First, some aspects of the biology of Alvinellidae have been overlooked, and long-distance dispersal does occur. Second, such dispersal never occurs in Alvinellidae, but the spatial-temporal dynamics of vent sites over geological timescales allow short-range dispersal processes to maintain gene flow.

Relevance: 100.00%

Abstract:

A simple model based on the maximum energy that an athlete can produce in a small time interval is used to describe the high jump and long jump. Conservation of angular momentum is used to explain why an athlete should run horizontally to perform a vertical jump. Our results agree with world records.
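The projectile part of such a jump model reduces to elementary kinematics. A short sketch with assumed takeoff speeds (illustrative numbers, not the paper's full energy-based model):

```python
g = 9.81  # gravitational acceleration, m/s^2

def rise(v_z):
    """Center-of-mass rise for vertical takeoff speed v_z: h = v_z^2 / (2 g)."""
    return v_z ** 2 / (2 * g)

def flight_range(v_x, v_z):
    """Horizontal flight distance of the center of mass for a projectile
    takeoff at speeds (v_x, v_z): R = v_x * (2 v_z / g)."""
    return v_x * 2 * v_z / g

h = rise(4.0)                # rise for an assumed v_z = 4 m/s
R = flight_range(9.0, 3.0)   # range for assumed v_x = 9, v_z = 3 m/s
```

With these assumed speeds the model gives a center-of-mass rise of about 0.8 m and a flight distance of about 5.5 m; the paper's energy argument constrains which (v_x, v_z) combinations an athlete can actually produce.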

Relevance: 100.00%

Abstract:

The size of online image datasets is constantly increasing. For an image dataset with millions of images, retrieval becomes a seemingly intractable problem for exhaustive similarity-search algorithms. Hashing methods, which encode high-dimensional descriptors into compact binary strings, have become very popular because of their high search efficiency and low storage cost. In the first part, we propose a multimodal retrieval method based on latent feature models. The procedure consists of a nonparametric Bayesian framework for learning semantically meaningful abstract features underlying a multimodal dataset, a probabilistic retrieval model that allows cross-modal queries, and an extension model for relevance feedback. In the second part, we focus on supervised hashing with kernels. We describe a flexible hashing procedure that treats binary codes and pairwise semantic similarity as latent and observed variables, respectively, in a probabilistic model based on Gaussian processes for binary classification, and we present a scalable inference algorithm using the sparse pseudo-input Gaussian process (SPGP) model and distributed computing. In the last part, we define an incremental hashing strategy for dynamic databases to which new images are added frequently. The method is based on a two-stage classification framework using binary and multi-class SVMs, and it enforces balance in the binary codes through an imbalance penalty to obtain higher-quality codes. We learn hash functions with an efficient algorithm in which the NP-hard problem of finding optimal binary codes is solved via cyclic coordinate descent and the SVMs are trained in a parallelized, incremental manner. For modifications such as adding images from an unseen class, we propose an incremental procedure for effective and efficient updates of the previous hash functions.
Experiments on three large-scale image datasets demonstrate that the incremental strategy efficiently updates hash functions to the same retrieval performance as hashing from scratch.
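Once descriptors are hashed to compact binary strings, retrieval reduces to ranking by Hamming distance. A minimal sketch with toy 8-bit codes (real systems use far longer codes, typically 32-256 bits):

```python
# Hamming distance via integer XOR: count the bits where two codes differ.
def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def rank(query: int, codes: dict) -> list:
    """Return database ids sorted by Hamming distance to the query code."""
    return sorted(codes, key=lambda k: hamming(query, codes[k]))

# Toy database: image id -> 8-bit binary code (illustrative values).
db = {"img1": 0b10110100, "img2": 0b10110101, "img3": 0b01001011}
order = rank(0b10110100, db)
```

Because the distance is a bit count over XOR, a single machine word comparison replaces a high-dimensional float computation, which is the efficiency win that motivates hashing in the first place.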

Relevance: 100.00%

Abstract:

The main aim of this study was to determine the impact of innovation on productivity in service-sector companies, especially those in the hospitality sector, that regard the reduction of environmental impact as relevant to the innovation process. We used a structural analysis model based on the one developed by Crépon, Duguet, and Mairesse (1998), known as the CDM model (an acronym of the authors' surnames). These authors built on seminal studies of the relationship between innovation and productivity (see Griliches 1979; Pakes and Griliches 1980). The main advantage of the CDM model is its ability to integrate the innovation process and business productivity from an empirical perspective.