903 results for Algorithm-oriented design
Abstract:
The purpose of this thesis was to identify the optimal design parameters for a jet nozzle that achieves a local maximum shear stress while maximizing the average shear stress on the floor of a fluid-filled system. This research examined how geometric parameters of a jet nozzle, such as the nozzle's angle, height, and orifice, influence the shear stress created on the bottom surface of a tank. Simulations were run using a Computational Fluid Dynamics (CFD) software package to determine shear stress values for a parameterized geometric domain including the jet nozzle. A response surface was created from the shear stress values obtained from 112 simulated designs. Multi-objective optimization software used the response surface to generate designs with the best combination of parameters for achieving both maximum shear stress and maximum average shear stress. The optimal configuration of parameters achieved larger shear stress values than a commercially available design.
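The abstract does not give the thesis code, but the response-surface workflow it describes can be sketched briefly. The following is a minimal illustration, not the thesis implementation: the parameter bounds, the stand-in shear-stress function, and the weighted-sum scalarization of the two objectives are all assumptions made for the example.

```python
# Sketch of response-surface-based design optimization (illustrative only).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def simulated_cfd(x):
    """Stand-in for one CFD run: returns (max_shear, avg_shear)."""
    angle, height, orifice = x
    max_shear = np.exp(-(angle - 45) ** 2 / 400) / (height * orifice)
    avg_shear = max_shear * orifice / (1 + height)
    return np.array([max_shear, avg_shear])

# Sample the parameterized domain (the thesis used 112 CFD designs).
X = rng.uniform([10, 0.5, 0.1], [80, 3.0, 1.0], size=(112, 3))
Y = np.array([simulated_cfd(x) for x in X])

# Quadratic response surface fit by least squares.
def features(x):
    a, h, o = x
    return np.array([1, a, h, o, a*a, h*h, o*o, a*h, a*o, h*o])

F = np.array([features(x) for x in X])
coef, *_ = np.linalg.lstsq(F, Y, rcond=None)  # one column per objective

# Weighted-sum scalarization of the two objectives on the surrogate.
def neg_weighted(x, w=0.5):
    y = features(x) @ coef
    return -(w * y[0] + (1 - w) * y[1])

best = minimize(neg_weighted, x0=[45, 1.5, 0.5],
                bounds=[(10, 80), (0.5, 3.0), (0.1, 1.0)])
print("surrogate-optimal design:", best.x)
```

A dedicated multi-objective optimizer would return a Pareto front rather than a single weighted optimum; the weighted sum is used here only to keep the sketch short.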
Abstract:
Automated information system design and implementation is one of the fastest-changing aspects of the hospitality industry. During the past several years, nothing has increased professionalism or improved productivity within the industry more than the application of computer technology. Intuitive software applications, deemed the first step toward making computers more people-literate; object-oriented programming, intended to model reality more accurately; and wireless communications are expected to play a significant role in future technological advancement.
Abstract:
Large read-only or read-write transactions with a large read set and a small write set constitute an important class of transactions used in applications such as data mining, data warehousing, statistical applications, and report generators. Such transactions are best supported with optimistic concurrency, because locking large amounts of data for extended periods of time is not an acceptable solution. The abort rate in regular optimistic concurrency algorithms increases exponentially with the size of the transaction. The algorithm proposed in this dissertation solves this problem with a new transaction scheduling technique that allows a large transaction to commit safely with a probability that can be several orders of magnitude greater than under regular optimistic concurrency algorithms. A performance simulation study and a formal proof of serializability and external consistency of the proposed algorithm are also presented.

This dissertation also proposes a new query optimization technique (lazy queries). Lazy Queries is an adaptive query execution scheme that optimizes itself as the query runs. Lazy queries can be used to find an intersection of sub-queries very efficiently, without requiring full execution of large sub-queries or any statistical knowledge about the data.

An efficient optimistic concurrency control algorithm used in a massively parallel B-tree with variable-length keys is introduced. B-trees with variable-length keys can be used effectively in a variety of database types. In particular, we show how such a B-tree was used in our implementation of a semantic object-oriented DBMS. The concurrency control algorithm uses semantically safe optimistic virtual "locks" that achieve very fine granularity in conflict detection. This algorithm ensures serializability and external consistency by using logical clocks and backward validation of transactional queries. A formal proof of correctness of the proposed algorithm is also presented.
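For readers unfamiliar with the backward-validation scheme the abstract mentions, here is a minimal sketch of optimistic concurrency with backward validation and a logical commit clock. It is not the dissertation's algorithm (which adds transaction scheduling and fine-grained virtual locks); the class and field names are illustrative.

```python
# Sketch of optimistic concurrency control with backward validation.
class Validator:
    def __init__(self):
        self.clock = 0                 # logical commit clock
        self.history = []              # (commit_ts, write_set) of committed txns

    def begin(self):
        return {"start": self.clock, "reads": set(), "writes": set()}

    def commit(self, txn):
        # Backward validation: conflict if any transaction that committed
        # after we started wrote an item we read.
        for ts, writes in self.history:
            if ts > txn["start"] and writes & txn["reads"]:
                return False           # abort: stale read detected
        self.clock += 1
        self.history.append((self.clock, frozenset(txn["writes"])))
        return True

v = Validator()
t1, t2 = v.begin(), v.begin()
t1["reads"] |= {"x"}; t1["writes"] |= {"x"}
t2["reads"] |= {"x"}
assert v.commit(t1)       # commits first
assert not v.commit(t2)   # aborts: t1 wrote x after t2 started reading it
```

The exponential abort-rate problem the abstract describes is visible here: the larger a transaction's read set, the more likely some later committer's write set intersects it.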
Abstract:
Software Engineering is one of the most widely researched areas of Computer Science. The ability to reuse software, much like the reuse of hardware components, is one of the key issues in software development. The object-oriented programming methodology is revolutionary in that it promotes software reusability. This thesis describes the development of a tool that helps programmers design and implement software from within the Smalltalk environment (an object-oriented programming environment). The ASDN tool is part of the PEREAM (Programming Environment for the Reuse and Evolution of Abstract Models) system, which advocates incremental development of software. The ASDN tool, together with the PEREAM system, seeks to enhance the Smalltalk programming environment by providing facilities for the structured development of abstractions (concepts). It produces a document that describes the abstractions developed using the tool. The features of the ASDN tool are illustrated by an example.
Abstract:
The electronics industry is experiencing two trends, one of which is the drive toward miniaturization of electronic products. In-circuit testing, predominantly used for continuity testing of printed circuit boards (PCBs), can no longer meet the demands of smaller circuits. This has led to the development of moving-probe test equipment. Moving-probe testing opens up the opportunity to test PCBs whose test points are on a small pitch (distance between points). However, because the test uses probes that move sequentially, the total test time is much greater than for traditional in-circuit testing. While significant effort has concentrated on equipment design and development, little work has examined algorithms for efficient test sequencing. The test sequence has the greatest impact on total test time, which determines the production cycle time of the product. Minimizing total test time is an NP-hard problem similar to the traveling salesman problem, except with two traveling salesmen that must coordinate their movements. The main goal of this thesis was to develop a heuristic algorithm to minimize the Flying Probe test time and to evaluate it against a "Nearest Neighbor" algorithm. The algorithm was implemented with Visual Basic and an MS Access database, and it was evaluated with actual PCB test data taken from industry. A statistical analysis at the 95% confidence level tested the hypothesis that the proposed algorithm finds a sequence whose total test time is less than that found by the "Nearest Neighbor" approach. The findings demonstrated that the proposed heuristic algorithm reduces total test time, so production cycle time can be reduced through proper sequencing.
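The "Nearest Neighbor" baseline the thesis compares against can be sketched for the two-probe case. This is an illustration of the baseline only, not the proposed heuristic: the starting positions, the parallel-move cost model (step cost equals the slower probe's move), and the omission of probe-collision checks are all simplifying assumptions.

```python
# Sketch of a Nearest-Neighbor sequencing baseline for two flying probes.
import math

def nn_two_probe(points, start_a=(0, 0), start_b=(100, 0)):
    remaining = set(points)
    pos = [start_a, start_b]
    total_time = 0.0
    while remaining:
        step = 0.0
        for i in (0, 1):
            if not remaining:
                break
            # Send each probe to its nearest unvisited test point.
            nxt = min(remaining, key=lambda p: math.dist(pos[i], p))
            step = max(step, math.dist(pos[i], nxt))  # probes move in parallel
            pos[i] = nxt
            remaining.remove(nxt)
        total_time += step
    return total_time

pads = [(10, 5), (12, 40), (55, 8), (60, 42), (90, 20)]
print("estimated test time:", round(nn_two_probe(pads), 2))
```

A real flying-probe sequencer must also coordinate the two probes so they never occupy intersecting paths, which is exactly the coordination constraint that makes the problem harder than two independent traveling salesmen.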
Abstract:
With the rapid growth of the Internet, computer attacks are increasing at a fast pace and can easily cause millions of dollars in damage to an organization. Detecting these attacks is an important issue in computer security. Attacks fall into four main categories: Denial of Service (DoS) attacks, Probe attacks, User to Root (U2R) attacks, and Remote to Local (R2L) attacks. Of these, DoS and Probe attacks appear with great frequency over a short period of time when they strike a system; they differ from normal traffic data and can easily be separated from normal activities. In contrast, U2R and R2L attacks are embedded in the data portions of packets and normally involve only a single connection, which makes it difficult to achieve satisfactory detection accuracy for these two attack types. We therefore focus on the ambiguity problem between normal activities and U2R/R2L attacks. The goal is to build a detection system that can accurately and quickly detect these two attacks. In this dissertation, we design a two-phase intrusion detection approach. In the first phase, a correlation-based feature selection algorithm is proposed to increase the speed of detection. Features with poor ability to predict attack signatures, and features inter-correlated with one or more other features, are considered redundant; such features are removed so that only indispensable information about the original feature space remains. In the second phase, we develop an ensemble intrusion detection system to achieve accurate detection performance. The proposed method includes multiple feature-selecting intrusion detectors and a data mining intrusion detector. The former consist of a set of detectors, each of which uses a fuzzy clustering technique and belief theory to solve the ambiguity problem. The latter applies data mining techniques to automatically extract computer users' normal behavior from training network traffic data. The final decision is a combination of the outputs of the feature-selecting and data mining detectors. The experimental results indicate that our ensemble approach not only significantly reduces detection time but also effectively detects U2R and R2L attacks that contain ambiguous information.
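The first-phase idea, dropping features that predict poorly and features inter-correlated with kept ones, can be illustrated compactly. This is a generic correlation-based selection sketch, not the dissertation's exact algorithm; the thresholds and the synthetic data are assumptions.

```python
# Sketch of correlation-based feature selection (illustrative thresholds).
import numpy as np

def cfs(X, y, label_thresh=0.1, redund_thresh=0.9):
    n_features = X.shape[1]
    relevance = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_features)]
    order = np.argsort(relevance)[::-1]          # most predictive first
    kept = []
    for j in order:
        if relevance[j] < label_thresh:
            break                                 # poor predictors dropped
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < redund_thresh
               for k in kept):
            kept.append(j)                        # not redundant with kept set
    return kept

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
X[:, 3] = X[:, 0] + 0.01 * rng.normal(size=200)  # near-duplicate of feature 0
y = (X[:, 0] + X[:, 1] > 0).astype(float)
print("selected:", cfs(X, y))   # the duplicated pair contributes only once
```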
Abstract:
A nuclear waste stream is the complete flow of waste material from origin to treatment facility to final disposal. The objective of this study was to design and develop a Geographic Information Systems (GIS) module using the Google Application Programming Interface (API) for better visualization of nuclear waste streams, identifying and displaying various waste stream parameters. A proper display of parameters would enable managers at Department of Energy waste sites to visualize information for proper planning of waste transport. The study also developed an algorithm using quadratic Bézier curves to make the map more understandable and usable. Microsoft Visual Studio 2012 and Microsoft SQL Server 2012 were used for the implementation of the project. The study has shown that the combination of several technologies can successfully provide dynamic mapping functionality. Future work should explore various Google Maps API functionalities to further enhance the visualization of nuclear waste streams.
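The quadratic Bézier construction mentioned above is standard: B(t) = (1-t)²·P0 + 2(1-t)t·C + t²·P1. Below is a minimal sketch of how such an arc might be sampled into a polyline between two sites; the coordinates, the midpoint-offset choice of control point, and the function names are illustrative, not the thesis code or the Google Maps API.

```python
# Sketch: sample a quadratic Bézier arc between two map points.
def quad_bezier(p0, p1, control, steps=20):
    """B(t) = (1-t)^2 p0 + 2(1-t)t c + t^2 p1, sampled as a polyline."""
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * control[0] + t ** 2 * p1[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * control[1] + t ** 2 * p1[1]
        pts.append((x, y))
    return pts

origin, disposal = (36.0, -84.3), (33.4, -103.8)   # illustrative lat/lng pairs
mid = ((origin[0] + disposal[0]) / 2, (origin[1] + disposal[1]) / 2)
ctrl = (mid[0] + 2.0, mid[1])                      # bow the arc northward
polyline = quad_bezier(origin, disposal, ctrl)
print(len(polyline), "points, e.g.", polyline[10])
```

Bowing each route away from the straight line keeps overlapping waste-stream routes visually distinct on the map, which appears to be the usability motivation the abstract describes.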
Abstract:
Major factors influencing food development and food marketing strategies in global marketplaces are attributable to the changing age structure of the population. Significant shifts in global age structure will inevitably lead the number of people aged 60 or over to reach an all-time high of one billion by the year 2020. This rapidly growing population of ageing people represents a large, neglected, and very much under-developed category within the food industry. The primary focus of this study was the integration of knowledge-creation techniques at the early stages of new product development (NPD) for market-oriented, health-promoting foods for the ageing population. The methodology was centered on an exploratory sequential mixed-methods strategy. Stage one involved in-depth semi-structured interviews with 16 stakeholders to facilitate the need-identification stage of the NPD process. The main outputs identified were the need for the fortification of foods for a preventative nutrition approach; the development of foods targeting age-related conditions such as cognitive, heart, gut, and bone health; the integration of ageing-compensatory packaging adaptations; and the creation of marketing messages with an active-lifestyle message. Stage two consisted of a market-oriented, computer-assisted NPD technique, a user-centered design (UCD) interaction, to integrate consumers as co-creators throughout the idea-generation stage of the NPD process. The most important product attributes identified in this stage included products targeted at brain and cognitive health; liquid-based beverages; easy-to-use packaging with environmentally friendly elements; simple marketing with a clear focus on health, not age; and realistic health claims constructed with consumer-friendly terminology. Finally, stage three used an abbreviated means-end chain (MEC) analysis to complete the concept-development stage of the NPD process. This stage identified commercial information that food firms could use to develop positioning and communication strategies; equally, the information generated could be of high strategic importance to governments, policy makers, and health and medical professionals. The values and goals listed in this stage included better overall health, an active lifestyle, optimum nutrition, and feelings of wellbeing. Overall, this research illustrated that knowledge-creation techniques can assist firms in the development of market-oriented, health-promoting foods for the ageing population.
Abstract:
The effectiveness of an optimization algorithm can be reduced to its ability to navigate an objective function's topology. Hybrid optimization algorithms combine multiple optimization algorithms under a single meta-heuristic so that the hybrid is more robust, computationally efficient, and/or accurate than its constituent algorithms. This thesis proposes a novel meta-heuristic that uses search vectors to select the constituent algorithm appropriate for a given objective function. The hybrid is shown to perform competitively against several existing hybrid and non-hybrid optimization algorithms over a set of three hundred test cases. This thesis also proposes a general framework for evaluating the effectiveness of hybrid optimization algorithms. Finally, it presents an improved Method of Characteristics code with novel boundary conditions, which characterizes pipelines better than previous codes. This code is coupled with the hybrid optimization algorithm to optimize the operation of real-world piston pumps.
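The selection idea, letting constituent algorithms propose search vectors and keeping the best proposal, can be sketched generically. This is not the thesis meta-heuristic; the two toy constituents (a random perturbation and a finite-difference coordinate step), the step sizes, and the greedy acceptance rule are all assumptions of the example.

```python
# Sketch of a selection-based hybrid optimizer with two constituents.
import numpy as np

rng = np.random.default_rng(2)

def random_step(x, scale=0.5):
    return x + rng.normal(scale=scale, size=x.shape)

def coordinate_step(x, f, h=0.1):
    # Gradient-like proposal from central finite differences.
    g = np.array([(f(x + h*e) - f(x - h*e)) / (2*h) for e in np.eye(x.size)])
    return x - 0.5 * g

def hybrid(f, x0, iters=200):
    x = np.asarray(x0, float)
    fx = f(x)
    for _ in range(iters):
        candidates = [random_step(x), coordinate_step(x, f)]
        cand = min(candidates, key=f)       # select the better constituent
        if f(cand) < fx:                    # greedy acceptance
            x, fx = cand, f(cand)
    return x, fx

rosenbrock = lambda v: (1 - v[0])**2 + 100*(v[1] - v[0]**2)**2
print(hybrid(rosenbrock, [-1.2, 1.0]))
```

The appeal of the scheme is that different constituents win on different topologies, which is the robustness argument the abstract makes.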
Abstract:
One of the leading motivations behind the multilingual Semantic Web is to make resources digitally accessible in an online, global, multilingual context. Consequently, it is fundamental for knowledge bases to find a way to manage multilingualism and thus be equipped with procedures for its conceptual modelling. In this context, the goal of this paper is to discuss how common-sense knowledge and cultural knowledge are modelled in a multilingual framework. More specifically, multilingualism and conceptual modelling are addressed from the perspective of FunGramKB, a lexico-conceptual knowledge base for natural language understanding. This project argues for a clear division between the lexical and conceptual dimensions of knowledge. Moreover, the conceptual layer is organized into three modules, which result from a strong commitment to capturing semantic knowledge (Ontology), procedural knowledge (Cognicon), and episodic knowledge (Onomasticon). Cultural mismatches are discussed and formally represented at the three conceptual levels of FunGramKB.
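The lexical/conceptual split the paper argues for can be made concrete with a small data-structure sketch. This is not FunGramKB's actual schema: the class names, fields, and the concept identifier format are illustrative assumptions; only the three-module organization (Ontology, Cognicon, Onomasticon) comes from the abstract.

```python
# Sketch of a lexical layer mapping many languages onto one conceptual layer.
from dataclasses import dataclass, field

@dataclass
class Concept:                      # conceptual layer: language-independent
    ident: str
    properties: list = field(default_factory=list)

@dataclass
class LexicalEntry:                 # lexical layer: one entry per language
    lemma: str
    language: str
    concept: Concept                # words in any language point to concepts

class KnowledgeBase:
    def __init__(self):
        self.ontology = {}          # semantic knowledge (concepts)
        self.cognicon = {}          # procedural knowledge (scripts)
        self.onomasticon = {}       # episodic knowledge (entities/events)
        self.lexica = {}            # per-language lexicons

    def add_concept(self, c):
        self.ontology[c.ident] = c

    def add_word(self, lemma, language, concept):
        self.lexica.setdefault(language, {})[lemma] = LexicalEntry(
            lemma, language, concept)

kb = KnowledgeBase()
breakfast = Concept("BREAKFAST")    # hypothetical concept identifier
kb.add_concept(breakfast)
# Two languages, one shared concept: cultural mismatches are then handled
# at the conceptual level rather than duplicated in every lexicon.
kb.add_word("breakfast", "en", breakfast)
kb.add_word("desayuno", "es", breakfast)
print(sorted(kb.lexica))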
Abstract:
We consider linear precoder design for an underlay cognitive radio multiple-input multiple-output broadcast channel, where the secondary system, consisting of a secondary base station (BS) and a group of secondary users (SUs), is allowed to share the same spectrum with the primary system. All the transceivers are equipped with multiple antennas, and each antenna at the secondary BS has its own maximum power constraint. Assuming zero-forcing precoding to eliminate the multiuser interference, we study the sum-rate maximization problem for the secondary system subject to both the per-antenna power constraints at the secondary BS and the interference power constraints at the primary users. The problem of interest differs from those studied previously, which often assumed a sum power constraint and/or a single antenna at the primary receivers, or at both the primary and secondary receivers. To develop an efficient numerical algorithm, we first invoke the rank relaxation method to transform the considered problem into a convex-concave problem based on a downlink-uplink result. We then propose a barrier interior-point method to solve the resulting saddle-point problem. In particular, in each iteration of the proposed method we find the Newton step by solving a system of discrete-time Sylvester equations, which significantly reduces the complexity compared to the conventional method. Simulation results demonstrate the fast convergence and effectiveness of the proposed algorithm.
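The zero-forcing building block the paper assumes can be shown in a few lines. This sketch is not the paper's optimization algorithm (which allocates power optimally via the barrier interior-point method); it only illustrates interference cancellation by pseudo-inverse precoding, with single-antenna users, a random channel, and a uniform scaling to meet per-antenna power limits as simplifying assumptions.

```python
# Sketch of zero-forcing precoding with a per-antenna power check.
import numpy as np

rng = np.random.default_rng(3)
n_tx, n_users = 6, 3                       # BS antennas, single-antenna SUs
H = (rng.normal(size=(n_users, n_tx)) +
     1j * rng.normal(size=(n_users, n_tx))) / np.sqrt(2)

W = np.linalg.pinv(H)                      # ZF: H @ W = I, no multiuser interference

# Enforce per-antenna power: scale the precoder so the largest per-antenna
# power (squared row norm of W) meets its limit p_max.
p_max = 1.0
per_antenna = np.sum(np.abs(W) ** 2, axis=1)
W *= np.sqrt(p_max / per_antenna.max())

print("residual interference:", np.max(np.abs(H @ W - np.eye(n_users))))
print("per-antenna powers:", np.round(np.sum(np.abs(W)**2, axis=1), 3))
```

Uniform scaling is conservative under per-antenna constraints, which is precisely why the per-antenna problem needs the more sophisticated machinery the paper develops.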
Abstract:
VALENTIM, R. A. M.; SOUZA NETO, Plácido Antônio de. The impact of using design patterns on software project metrics and estimates: does the use of patterns have any influence on estimates? [O impacto da utilização de design patterns nas métricas e estimativas de projetos de software: a utilização de padrões tem alguma influência nas estimativas?]. Revista da FARN, Natal, v. 4, p. 63-74, 2006.
Abstract:
The current rapid decline of biodiversity is worrying, and human activities are its direct cause. Numerous protected areas have been established to counter this loss of biodiversity. To maximize their effectiveness, the functional connectivity between them must be improved. Climate change is currently disturbing environmental conditions on a global scale. It is a threat to biodiversity that, until recently, was seldom considered when protected areas were established. Species movement, and therefore the functional connectivity of the landscape, is affected by climate change, and studies have shown that improving functional connectivity between protected areas would help species cope with the impacts of climate change. My thesis presents a method for designing protected-area networks while accounting for climate change and functional connectivity. My study area is the Gaspésie region of Québec (Canada). The endangered Atlantic-Gaspésie caribou population (Rangifer tarandus caribou) was used as the focal species to define functional connectivity. This small population is in continuous decline due to predation and habitat modification, and climate change could become an additional threat. I first built a spatially explicit individual-based model to explain and simulate caribou movement. I used sparse VHF data from the caribou population and a pattern-oriented modelling strategy to parameterize the model and select the best movement hypothesis. My best model reproduced most of the movement patterns defined from the observed data. This model provides a better understanding of the drivers of Atlantic-Gaspésie caribou movement, as well as a spatial estimate of its landscape use in the region. I concluded that sparse data are sufficient to fit an individual-based model when combined with pattern-oriented modelling. Next, I estimated the impact of climate change and of different conservation actions on caribou movement potential. I used the individual-based model to simulate caribou movement in hypothetical landscapes representing different climate change and conservation action scenarios. The conservation actions represented the establishment of new protected areas in Gaspésie, as defined by the scenario proposed by the Québec government, as well as the restoration of secondary roads inside protected areas. The impacts of climate change on vegetation, as defined in my scenarios, reduced caribou movement potential. Road restoration mitigated these negative effects, whereas the establishment of new protected areas did not. Finally, I presented a method for designing effective protected-area networks and proposed new protected areas to establish in Gaspésie to protect biodiversity over the long term. I created numerous protected-area network scenarios by extending the current network to protect 12% of the territory. For each network, I computed its ecological representativeness and two measures of long-term functional connectivity.

The functional connectivity measures represented the overall access to protected areas for the Atlantic-Gaspésie caribou and its movement potential within them. I used movement-potential estimates for the current period and for the future under different climate change scenarios to represent long-term functional connectivity. The protected-area network I proposed was the scenario that maximized the trade-off among the three computed network characteristics. In this thesis, I explained and predicted the movement of the Atlantic-Gaspésie caribou under different environmental conditions, including landscapes affected by climate change. These results helped me define a protected-area network to establish in Gaspésie to protect the caribou over time. I believe this thesis provides new knowledge about the movement behaviour of the Atlantic-Gaspésie caribou, as well as about the conservation actions that can be taken in Gaspésie to improve the protection of the caribou and of other species. I believe the method presented is applicable to other ecosystems with similar characteristics and needs.
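A spatially explicit individual-based movement model of the general kind described above can be sketched as a biased correlated random walk on a habitat grid. This is emphatically not the thesis model: the synthetic habitat layer, the eight-candidate heading rule, and every parameter value are illustrative assumptions.

```python
# Sketch of an individual-based movement model (biased correlated random walk).
import numpy as np

rng = np.random.default_rng(4)
habitat = rng.random((100, 100))            # stand-in for mapped habitat quality

def step(pos, heading, persistence=0.7, look=3.0):
    # Candidate headings: keep direction (correlation) but prefer better habitat.
    candidates = heading + rng.normal(scale=1.0, size=8)
    def score(h):
        x = int(np.clip(pos[0] + look * np.cos(h), 0, 99))
        y = int(np.clip(pos[1] + look * np.sin(h), 0, 99))
        return persistence * np.cos(h - heading) + (1 - persistence) * habitat[x, y]
    best = max(candidates, key=score)
    new = (np.clip(pos[0] + np.cos(best), 0, 99),
           np.clip(pos[1] + np.sin(best), 0, 99))
    return new, best

pos, heading, track = (50.0, 50.0), 0.0, []
for _ in range(500):
    pos, heading = step(pos, heading)
    track.append(pos)

# Summary statistics such as net displacement are the kind of movement
# patterns a pattern-oriented calibration would compare against VHF data.
print("net displacement:", round(np.hypot(track[-1][0] - 50, track[-1][1] - 50), 1))
```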
Abstract:
Inverse heat conduction problems (IHCPs) appear in many important scientific and technological fields. Hence the analysis, design, implementation, and testing of inverse algorithms are also of great scientific and technological interest. The numerical simulation of 2-D and 3-D inverse (or even direct) problems involves a considerable amount of computation. Therefore, the investigation and exploitation of the parallel properties of such algorithms are becoming equally important. Domain decomposition (DD) methods are widely used to solve large-scale engineering problems and to exploit the parallelism inherent in the solution of such problems.
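The parallelism DD exposes can be seen in a one-dimensional toy problem. The sketch below is a generic overlapping Schwarz iteration for steady 1-D heat conduction, not any algorithm from this work; the grid size, overlap width, and Jacobi sweep counts are arbitrary choices for the example. The two subdomain solves within each Schwarz iteration are independent, which is what makes the method parallelizable.

```python
# Sketch of an overlapping Schwarz iteration for steady 1-D heat conduction.
import numpy as np

n = 101                                   # grid points on [0, 1]
u = np.zeros(n); u[0], u[-1] = 1.0, 0.0   # fixed boundary temperatures
mid, overlap = n // 2, 5

def solve_subdomain(u, lo, hi, sweeps=200):
    # Jacobi sweeps for u'' = 0 with Dirichlet data taken from u[lo], u[hi].
    for _ in range(sweeps):
        u[lo+1:hi] = 0.5 * (u[lo:hi-1] + u[lo+2:hi+1])

for _ in range(20):                        # Schwarz iterations
    solve_subdomain(u, 0, mid + overlap)   # left subdomain  (parallelizable)
    solve_subdomain(u, mid - overlap, n-1) # right subdomain (parallelizable)

# The exact steady solution is linear: u(x) = 1 - x.
x = np.linspace(0, 1, n)
print("max error:", np.abs(u - (1 - x)).max())
```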
Abstract:
Thesis (Master's)--University of Washington, 2016-06