995 results for Object Modeling
Abstract:
The number of security violations is increasing, and a security breach could have irreversible impacts on a business. There are several ways to improve organizational security, but some of them may be difficult to comprehend. This thesis demystifies threat modeling as part of secure system development. Threat modeling enables developers to reveal previously undetected security issues in computer systems. It offers a structured approach for organizations to find and address threats against vulnerabilities. When implemented correctly, threat modeling reduces the number of defects in, and malicious attempts against, the target environment. In this thesis, the Microsoft Security Development Lifecycle (SDL) is introduced as an effective methodology for reducing defects in the target system. SDL is traditionally meant to be used in software development; its principles can, however, be partially adapted to IT infrastructure development. The Microsoft threat modeling methodology is an important part of SDL, and it is used in this thesis to find threats in the Acme Corporation’s factory environment. Acme Corporation is used as a pseudonym for a company providing high-technology consumer electronics. The target for threat modeling is the IT infrastructure of the factory’s manufacturing execution system. The Microsoft threat modeling methodology uses the STRIDE mnemonic and data flow diagrams to find threats. Threat modeling in this thesis returned results that were important for the organization. Acme Corporation now has a more comprehensive understanding of the IT infrastructure of its manufacturing execution system. In addition to the vulnerability-related results, threat modeling provided coherent views of the target system. Subject matter experts from different areas can now agree on the functions and dependencies of the target system. Threat modeling was recognized as a useful activity for improving security.
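To make the STRIDE-over-DFD step concrete, here is a minimal sketch (not from the thesis): each STRIDE category is checked against the data-flow-diagram element types it typically applies to. The element names and the applicability mapping below are illustrative assumptions, not the Acme Corporation model.

```python
# Illustrative sketch: enumerate STRIDE threat categories against data flow
# diagram (DFD) elements. The mapping and the example elements are generic
# placeholders, not the thesis's actual threat model.
STRIDE = {
    "Spoofing": {"external entity", "process"},
    "Tampering": {"process", "data store", "data flow"},
    "Repudiation": {"external entity", "process", "data store"},
    "Information disclosure": {"process", "data store", "data flow"},
    "Denial of service": {"process", "data store", "data flow"},
    "Elevation of privilege": {"process"},
}

def enumerate_threats(elements):
    """Yield (element, threat) pairs worth reviewing for each DFD element."""
    for name, kind in elements:
        for threat, applies_to in STRIDE.items():
            if kind in applies_to:
                yield name, threat

# Hypothetical fragment of a manufacturing execution system DFD.
dfd = [
    ("MES server", "process"),
    ("Production database", "data store"),
    ("Operator workstation", "external entity"),
]

for element, threat in enumerate_threats(dfd):
    print(f"{element}: consider {threat}")
```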
Abstract:
The goal of this thesis is to define and validate a software engineering approach for the development of a distributed system for the modeling of composite materials, based on an analysis of various existing software development methods. We reviewed the main features of: (1) software engineering methodologies; (2) distributed system characteristics and their effect on software development; and (3) composite materials modeling activities and the requirements they impose on software development. Using design science as the research methodology, the distributed system for creating models of composite materials is built and evaluated. The empirical experiments we conducted showed good convergence between modeled and real processes. Throughout the study, we paid attention to the complexity and importance of the distributed system and to the need for a deep understanding of modern software engineering methods and tools.
Abstract:
The objective of this work is to study the flow behavior and to support the design of an air cleaner by dynamic simulation. In the paper printing industry, it is necessary to monitor the quality of the paper while it is being produced. During production, the quality of the paper can be monitored by a camera. It is therefore necessary to keep the camera lens clean, as wood particles may fall from the paper and settle on the lens. In this work, the behavior of the airflow and its effect on the particles at different inlet angles are simulated. Geometries for the different inlet angles of the single-channel and double-channel cases were constructed using ANSYS CFD software. All the simulations were performed in ANSYS Fluent. The simulation results of the single-channel and double-channel cases revealed significant differences in the behavior of the flow and the particle velocity. The main conclusions from this work are as follows. 1) For the single-channel case, the best angle was 0 degrees, because the airflow then keeps away from the lens 60% of the particles that would otherwise settle on it. 2) For the double-channel case, the best solution was found when the angle of the first inlet was 0 degrees and the angle of the second inlet was 45 degrees; the airflow then keeps away from the lens 91% of the particles that would otherwise settle on it.
Abstract:
This study presents an agile tool set for business modeling in companies entering a turbulent environment. The study’s aim is to explore business modeling techniques and their tooling through a case study of a Finnish media monitoring company expanding to the Russian market. This work proposes a tailored “two-approach” to business modeling development that analyzes both the past and future conditions of the two key factors of business modeling for companies: the internal and external environments. The study explores the case company by investigating the benefits and disadvantages of the firm’s present business modeling tools, developing new tooling, and applying it to the case company. In addition to primary data, such as interviews with media monitoring industry analysts, representatives of competing companies, and academic experts, the study leans on a comprehensive analysis of the Russian media monitoring niche and its players. This study contributes to the business modeling research area by combining traditional analysis tools, such as market, PESTLE, and competitor analyses, with business modeling techniques in a systemic manner. This transformation proceeds by applying integrated scenario, heat map, and critical design issue analyses in the societal, industrial, and competitive context of turbulent environments. The practical outcome of this approach is an agile business modeling tool set customizable to a company’s requirements.
Abstract:
The perovskite crystal structure is host to many different materials, from insulating to superconducting, providing a diverse range of intrinsic character and complexity. A better fundamental description of these materials in terms of their electronic, optical and magnetic properties undoubtedly precedes an effective realization of their application potential. SmTiO3, a distorted perovskite, has a strongly localized electronic structure and undergoes an antiferromagnetic transition at 50 K in its nominally stoichiometric form. Sr2RuO4 is a layered perovskite superconductor (Tc ≈ 1 K) bearing the same structure as the high-temperature superconductor La2-xSrxCuO4. Polarized reflectance measurements were carried out on both of these materials, revealing several interesting features in the far-infrared range of the spectrum. In the case of SmTiO3, although insulating, evidence indicates the presence of a finite background optical conductivity. As the temperature is lowered through the ordering temperature, a resonance feature appears to narrow and strengthen near 120 cm⁻¹. A nearby phonon mode also appears to couple to this magnetic transition, as revealed by a growing asymmetry in the optical conductivity. Experiments on a doped sample with a greater itinerant character and a lower Néel temperature of 40 K also indicate the presence of this strongly temperature-dependent mode, even at twice the ordering temperature. Although the mode appears to be sensitive to the magnetic transition, it is unclear whether a magnon assignment is appropriate. At the very least, the evidence suggests an interesting interaction between magnetic and electronic excitations. Although Sr2RuO4 is highly anisotropic, it is metallic in three dimensions at low temperatures and reveals its coherent transport in an inter-plane Drude-like component up to the highest temperatures measured (90 K). An extended Drude analysis is used to probe the frequency-dependent scattering character, revealing a peak in both the mass enhancement and the scattering rate near 80 cm⁻¹ and 100 cm⁻¹ respectively. All of these experimental observations appear relatively consistent with a Fermi-liquid picture of charge transport. To supplement the optical measurements, a resistivity station was set up with an event-driven, object-oriented user interface. The program controls a Keithley current source, an HP nano-voltmeter and switching unit, and a LakeShore temperature controller in order to obtain a plot of the resistivity as a function of temperature. The system allows for resistivity measurements from 4 K to 290 K using an external probe, or from 0.4 K to 295 K using a Helium-3 cryostat. Several materials of known resistivity have confirmed the system to be robust and capable of measuring metallic samples, distinguishing features of a few µΩ·cm.
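For reference, the extended Drude analysis mentioned above extracts a frequency-dependent scattering rate and mass enhancement from the complex optical conductivity σ(ω). A standard Gaussian-units form, with ω_p the plasma frequency and the notation assumed here (the thesis's exact conventions may differ), is:

```latex
\frac{1}{\tau(\omega)} = \frac{\omega_p^{2}}{4\pi}\,
  \operatorname{Re}\!\left[\frac{1}{\sigma(\omega)}\right],
\qquad
\frac{m^{*}(\omega)}{m_b} = -\,\frac{\omega_p^{2}}{4\pi\,\omega}\,
  \operatorname{Im}\!\left[\frac{1}{\sigma(\omega)}\right].
```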
Abstract:
This thesis will introduce a new strongly typed programming language utilizing Self types, named Win--*Foy, along with a suitable user interface designed specifically to highlight language features. The need for such a programming language is based on deficiencies found in programming languages that support both Self types and subtyping. Subtyping is a concept that is taken for granted by most software engineers programming in object-oriented languages. Subtyping supports subsumption, but it does not support the inheritance of binary methods. Binary methods contain an argument of type Self, the same type as the object itself, in a contravariant position, i.e. as a parameter. There are several arguments in favour of introducing Self types into a programming language [1]. This rationale led to the development of a relation that has become known as matching [4, 5]. The matching relation does not support subsumption; however, it does support the inheritance of binary methods. Two forms of matching have been proposed [1]. Specifically, these relations are known as higher-order matching and f-bound matching. Previous research on these relations indicates that the higher-order matching relation is both reflexive and transitive, whereas f-bound matching is reflexive but not transitive [7]. The higher-order matching relation provides significant flexibility regarding inheritance of methods that utilize or return values of the same type. This flexibility, in certain situations, can prevent the programmer from defining specific classes and methods which are based on constant values [21]. For this reason, the type This is used as a second reference to the type of the object that cannot, contrary to Self, be specialized in subclasses. F-bound matching allows a programmer to define a function that will work for all types A', subtypes of an upper bound of type A, with the result type being dependent on A'. The use of parametric polymorphism in f-bound matching provides a connection to subtyping in object-oriented languages. This thesis will contain two main sections. Firstly, significant details concerning deficiencies of the subtype relation and the need to introduce the higher-order and f-bound matching relations into programming languages will be explored. Secondly, a new programming language named Win--*Foy (Functional Object-Oriented Programming Language) has been created, along with a suitable user interface, in order to facilitate experimentation by programmers regarding the matching relation. The construction of the programming language and the user interface will be explained in detail.
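To illustrate the binary-method problem only (this is not the Win--*Foy language), here is a small Python sketch using typing.Self (Python 3.11+). The Point/ColorPoint classes are hypothetical examples showing why a method with a Self-typed parameter breaks down under naive subsumption.

```python
# Illustrative only: a binary method takes an argument of the same type as the
# receiver (Self). The subclass's override demands more of its argument than
# the superclass did, so substituting the subclass where the superclass is
# expected (subsumption) can fail.
from typing import Self

class Point:
    def __init__(self, x: float, y: float) -> None:
        self.x, self.y = x, y

    def equals(self, other: Self) -> bool:   # binary method: Self as a parameter
        return self.x == other.x and self.y == other.y

class ColorPoint(Point):
    def __init__(self, x: float, y: float, color: str) -> None:
        super().__init__(x, y)
        self.color = color

    def equals(self, other: Self) -> bool:   # now effectively requires a ColorPoint
        return super().equals(other) and self.color == other.color

# Under subsumption, a ColorPoint may be used wherever a Point is expected,
# but ColorPoint.equals needs a colour on its argument, so ColorPoint is not
# a subtype of Point -- it only *matches* Point.
p: Point = ColorPoint(0.0, 0.0, "red")
try:
    print(p.equals(Point(0.0, 0.0)))
except AttributeError as exc:
    print(f"binary method broke under subsumption: {exc}")
```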
Abstract:
Formal verification of software can be an enormous task. This fact has led some software engineers to claim that formal verification is not feasible in practice. One possible way of supporting the verification process is a programming language that provides powerful abstraction mechanisms combined with intensive reuse of code. In this thesis we present a strongly typed, functional, object-oriented programming language. This language features type operators of arbitrary kind corresponding to so-called type protocols. Subclassing and inheritance are based on higher-order matching, i.e., they utilize type protocols as the basic tool for code reuse. We formally define the operational and axiomatic semantics of this language. The latter is the basis of the interactive proof assistant VOOP (Verified Object-Oriented Programs), which allows the user to prove equational properties of programs interactively.
Abstract:
Genetic Programming (GP) is a widely used methodology for solving various computational problems. GP's problem-solving ability is usually hindered by its long execution times. In this thesis, GP is applied to real-time computer vision. In particular, object classification and tracking using a parallel GP system are discussed. First, a study of suitable GP languages for object classification is presented. Two main GP approaches to visual pattern classification, namely block-classifiers and pixel-classifiers, were studied. Results showed that the pixel-classifiers generally performed better. Using these results, a suitable language was selected for the real-time implementation. Synthetic video data was used in the experiments. The goal of the experiments was to evolve a unique classifier for each texture pattern that existed in the video. The experiments revealed that the system was capable of correctly tracking the textures in the video. The performance of the system was on par with real-time requirements.
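As a rough illustration of what a pixel-classifier is (not the thesis's system), the sketch below applies a classifier expression to every pixel over a small neighbourhood and thresholds the result. The expression is hand-written as a stand-in for one that GP would evolve, and the feature names are assumptions.

```python
# Illustrative sketch of a pixel-classifier: an evolved expression is evaluated
# on each pixel's 3x3 neighbourhood and its sign decides class membership.
import numpy as np

def neighbourhood_features(img: np.ndarray, r: int, c: int) -> dict:
    patch = img[r - 1:r + 2, c - 1:c + 2]
    return {"centre": float(img[r, c]),
            "mean": float(patch.mean()),
            "std": float(patch.std())}

def evolved_classifier(f: dict) -> float:
    # Stand-in for a GP-evolved expression tree.
    return (f["centre"] - f["mean"]) + 0.5 * f["std"] - 0.1

def classify(img: np.ndarray) -> np.ndarray:
    out = np.zeros(img.shape, dtype=bool)
    for r in range(1, img.shape[0] - 1):
        for c in range(1, img.shape[1] - 1):
            out[r, c] = evolved_classifier(neighbourhood_features(img, r, c)) > 0
    return out

frame = np.random.rand(32, 32)   # placeholder for a video frame
mask = classify(frame)           # True where the texture is detected
print(mask.sum(), "pixels flagged")
```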
Abstract:
Experimental Extended X-ray Absorption Fine Structure (EXAFS) spectra carry information about the chemical structure of metal protein complexes. However, predicting the structure of such complexes from EXAFS spectra is not a simple task. Currently, methods such as Monte Carlo optimization or simulated annealing are used in structure refinement of EXAFS. These methods have proven somewhat successful in structure refinement but have not been successful in finding the global minimum. Multiple population-based algorithms, including a genetic algorithm, a restarting genetic algorithm, differential evolution, and particle swarm optimization, are studied for their effectiveness in structure refinement of EXAFS. The oxygen-evolving complex in the S1 state is used as a benchmark for comparing the algorithms. These algorithms were successful in finding new atomic structures that produced improved calculated EXAFS spectra over atomic structures previously found.
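To show the shape of such a refinement loop, here is a minimal sketch using SciPy's differential evolution. The one-shell spectrum model and synthetic "data" below are placeholders; an actual refinement would compute the EXAFS signal from the trial atomic coordinates (e.g. with an ab initio EXAFS code) and compare it to the measured spectrum.

```python
# Sketch of EXAFS-style structure refinement by differential evolution.
import numpy as np
from scipy.optimize import differential_evolution

k = np.linspace(3.0, 12.0, 200)                               # wavenumber grid
measured = np.sin(2 * k * 2.45) * np.exp(-2 * 0.004 * k**2)   # synthetic "data"

def model_spectrum(params: np.ndarray) -> np.ndarray:
    distance, sigma2 = params          # one shell: bond length and disorder
    return np.sin(2 * k * distance) * np.exp(-2 * sigma2 * k**2)

def misfit(params: np.ndarray) -> float:
    return float(np.sum((model_spectrum(params) - measured) ** 2))

bounds = [(1.5, 3.5), (0.001, 0.02)]   # physically reasonable search ranges
result = differential_evolution(misfit, bounds, seed=1, tol=1e-8)
print("refined distance, sigma^2:", result.x)
```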
Object-Oriented Genetic Programming for the Automatic Inference of Graph Models for Complex Networks
Abstract:
Complex networks are systems of entities that are interconnected through meaningful relationships. The relations between entities form a structure whose statistical complexity is not the product of random chance. In the study of complex networks, many graph models have been proposed to model the behaviours observed. However, constructing graph models manually is tedious and problematic. Many of the models proposed in the literature have been cited as having inaccuracies with respect to the complex networks they represent. Recently, however, an approach that automates the inference of graph models was proposed by Bailey [10]. The proposed methodology employs genetic programming (GP) to produce graph models that approximate various properties of an exemplary graph of a targeted complex network. However, a great deal is already known about complex networks in general, and often specific knowledge is held about the network being modelled. This knowledge, albeit incomplete, is important in constructing a graph model, yet it is difficult to incorporate using existing GP techniques. Thus, this thesis proposes a novel GP system which can incorporate incomplete expert knowledge to assist in the evolution of a graph model. Inspired by existing graph models, an abstract graph model was developed to serve as an embryo for inferring graph models of some complex networks. The GP system and abstract model were used to reproduce well-known graph models. The results indicated that the system was able to evolve models that produced networks with structural similarities to the networks generated by the respective target models.
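As a rough sketch of what "approximating properties of an exemplary graph" can mean in a GP fitness function (the specific properties, weights, and target models here are illustrative, not the thesis's measure):

```python
# Illustrative fitness sketch: score a candidate graph model by how closely
# the networks it generates match structural properties of a target network.
import networkx as nx

def properties(g: nx.Graph) -> dict:
    degrees = [d for _, d in g.degree()]
    return {"avg_degree": sum(degrees) / len(degrees),
            "clustering": nx.average_clustering(g),
            "components": nx.number_connected_components(g)}

def fitness(candidate: nx.Graph, target: nx.Graph) -> float:
    cp, tp = properties(candidate), properties(target)
    return sum(abs(cp[k] - tp[k]) for k in tp)   # lower is better

target = nx.barabasi_albert_graph(200, 3, seed=0)      # exemplary network
candidate = nx.erdos_renyi_graph(200, 0.03, seed=0)    # network from a trial model
print("fitness:", fitness(candidate, target))
```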
Abstract:
This paper develops a model of short-range ballistic missile defense and uses it to study the performance of Israel’s Iron Dome system. The deterministic base model allows for inaccurate missiles, unsuccessful interceptions, and civil defense. Model enhancements consider the trade-offs in attacking the interception system, the difficulties faced by militants in assembling large salvos, and the effects of imperfect missile classification by the defender. A stochastic model is also developed. Analysis shows that system performance can be highly sensitive to the missile salvo size, and that systems with higher interception rates are more “fragile” when overloaded. The model is calibrated using publicly available data about Iron Dome’s use during Operation Pillar of Defense in November 2012. If the systems performed as claimed, they saved Israel an estimated 1778 casualties and $80 million in property damage, and thereby made preemptive strikes on Gaza about 8 times less valuable to Israel. Gaza militants could have inflicted far more damage by grouping their rockets into large salvos, but this may have been difficult given Israel’s suppression efforts. Counter-battery fire by the militants is unlikely to be worthwhile unless they can obtain much more accurate missiles.
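The "fragile when overloaded" observation can be illustrated with a toy saturation sketch (this is not the paper's model; the capacity parameter and probabilities are assumptions): below the engagement capacity a high interception rate leaks almost nothing, but once the salvo exceeds capacity every extra missile gets through regardless of that rate.

```python
# Toy sketch (not the paper's model): expected missiles that land when a salvo
# of size `salvo` meets a defense that can engage at most `capacity` missiles,
# each engagement succeeding with probability p.
def expected_leakers(salvo: int, capacity: int, p: float) -> float:
    engaged = min(salvo, capacity)
    return (salvo - engaged) + engaged * (1 - p)

for salvo in (5, 10, 20, 40):
    # The higher-p system is near-perfect below capacity but degrades sharply
    # once the salvo exceeds it.
    print(salvo,
          round(expected_leakers(salvo, capacity=10, p=0.9), 1),
          round(expected_leakers(salvo, capacity=10, p=0.6), 1))
```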
Abstract:
In this paper, we introduce a new approach for volatility modeling in discrete and continuous time. We follow the stochastic volatility literature by assuming that the variance is a function of a state variable. However, instead of assuming that the loading function is ad hoc (e.g., exponential or affine), we assume that it is a linear combination of the eigenfunctions of the conditional expectation (resp. infinitesimal generator) operator associated to the state variable in discrete (resp. continuous) time. Special examples are the popular log-normal and square-root models, where the eigenfunctions are the Hermite and Laguerre polynomials respectively. The eigenfunction approach has at least six advantages: i) it is general since any square integrable function may be written as a linear combination of the eigenfunctions; ii) the orthogonality of the eigenfunctions leads to the traditional interpretations of the linear principal components analysis; iii) the implied dynamics of the variance and squared return processes are ARMA and, hence, simple for forecasting and inference purposes; iv) more importantly, this generates fat tails for the variance and returns processes; v) in contrast to popular models, the variance of the variance is a flexible function of the variance; vi) these models are closed under temporal aggregation.
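For reference, the discrete-time construction described above can be summarized as follows, with notation assumed here for illustration (x_t the state variable, φ_i its eigenfunctions with eigenvalues λ_i, and a_i the loadings):

```latex
E\!\left[\varphi_i(x_{t+1}) \mid x_t\right] = \lambda_i\,\varphi_i(x_t),
\qquad
\sigma_t^{2} = \sum_{i} a_i\,\varphi_i(x_t),
```

so the log-normal and square-root cases mentioned in the abstract correspond to Hermite and Laguerre eigenfunctions, respectively.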
Abstract:
Affiliation: Institut de recherche en immunologie et en cancérologie, Université de Montréal