436 results for ANSYS Workbench
Abstract:
As the commoditization of sensing, actuation and communication hardware increases, so does the potential for dynamically tasked sense-and-respond networked systems (i.e., Sensor Networks or SNs) to replace existing disjoint and inflexible special-purpose deployments (closed-circuit security video, anti-theft sensors, etc.). While various solutions have emerged for many individual SN-centric challenges (e.g., power management, communication protocols, role assignment), perhaps the largest remaining obstacle to widespread SN deployment is that those who wish to deploy, utilize, and maintain a programmable Sensor Network lack the programming and systems expertise to do so. The contributions of this thesis center on the design, development and deployment of the SN Workbench (snBench). snBench embodies an accessible, modular programming platform coupled with a flexible and extensible run-time system that, together, support the entire life-cycle of distributed sensory services. As it is impossible to find a one-size-fits-all programming interface, this work advocates the use of tiered layers of abstraction that enable a variety of high-level, domain-specific languages to be compiled to a common (thin-waist) tasking language; this common tasking language is statically verified and can be subsequently re-translated, if needed, for execution on a wide variety of hardware platforms. snBench provides: (1) a common sensory tasking language (Instruction Set Architecture) powerful enough to express complex SN services, yet simple enough to be executed by highly constrained resources with soft real-time constraints, (2) a prototype high-level language (and corresponding compiler) to illustrate the utility of the common tasking language and the tiered programming approach in this domain, (3) an execution environment and a run-time support infrastructure that abstract a collection of heterogeneous resources into a single virtual Sensor Network, tasked via this common tasking language, and (4) novel formal methods (i.e., static analysis techniques) that verify safety properties and infer implicit resource constraints to facilitate resource allocation for new services. This thesis presents these components in detail, as well as two specific case studies: the use of snBench to integrate physical and wireless network security, and the use of snBench as the foundation for semester-long student projects in a graduate-level Software Engineering course.
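A minimal sketch of the tiered-abstraction idea the abstract describes, in Python: a toy high-level service description is lowered to a flat tasking program and then statically checked. All names, opcodes and the verification rule are invented for illustration; they are not snBench's actual languages or checks.

```python
# Hypothetical sketch: lower a toy high-level service spec to a flat tasking
# program, then run a stand-in "static verification" pass over it.
from dataclasses import dataclass

@dataclass
class TaskOp:
    opcode: str    # e.g. "sense", "filter", "notify" (illustrative opcodes)
    args: tuple    # operands resolved at deployment time

def compile_service(spec: dict) -> list:
    """Lower a toy high-level spec into a linear tasking program."""
    program = [TaskOp("sense", (spec["sensor"], spec["period_s"]))]
    for predicate in spec.get("filters", []):
        program.append(TaskOp("filter", (predicate,)))
    program.append(TaskOp("notify", (spec["sink"],)))
    return program

def verify(program: list) -> None:
    """Stand-in static check: known opcodes only, sense first, notify last."""
    known = {"sense", "filter", "notify"}
    assert all(op.opcode in known for op in program), "unknown opcode"
    assert program[0].opcode == "sense" and program[-1].opcode == "notify"

service = {"sensor": "camera-3", "period_s": 5,
           "filters": ["motion > 0.8"], "sink": "security-desk"}
verify(compile_service(service))
```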
Abstract:
The main goal of a cell stability MHD model like MHD-Valdis is to help locate the busbars around the cell in a way that generates a magnetic field inside the cell conducive to stable cell operation. Yet as far as cell stability is concerned, the uniformity of the current density in the metal pad is also extremely important and can only be achieved with correct busbar network sizing. This work compares the use of a detailed ANSYS-based 3D thermo-electric model with that of the versatile 1D part of MHD-Valdis to help design a well-balanced busbar network.
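As a toy illustration of why busbar sizing governs current distribution, the sketch below splits a line current among parallel branches in inverse proportion to branch resistance. This is elementary circuit theory only, not the MHD-Valdis or ANSYS formulation, and the resistances are invented.

```python
# Current division among parallel busbar branches: branches at the same
# voltage drop carry current proportional to their conductance.
def branch_currents(total_current_a, resistances_ohm):
    """Split a total current among parallel branches by conductance."""
    conductances = [1.0 / r for r in resistances_ohm]
    g_total = sum(conductances)
    return [total_current_a * g / g_total for g in conductances]

# Hypothetical riser resistances (ohms); the lower-resistance third branch
# draws a disproportionate share of the 350 kA line current.
currents = branch_currents(350_000, [2.0e-6, 2.0e-6, 1.2e-6, 2.0e-6])
print([round(i) for i in currents])
```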
Abstract:
eScience is an umbrella concept that covers internet technologies, such as web service orchestration involving the manipulation and processing of high volumes of data using simple and efficient methodologies. The concept is normally associated with bioinformatics, but nothing prevents an identical approach being used for geoinformatics and OGC (Open Geospatial Consortium) web services like WPS (Web Processing Service). In this paper we present an extended WPS implementation based on the PyWPS framework using an automatically generated WSDL (Web Service Description Language) XML document that replicates the WPS input/output document structure used during an Execute request to a server. Services are accessed using a modified SOAP (Simple Object Access Protocol) interface provided by PyWPS that uses service and input/output identifiers as element names. The WSDL XML document is dynamically generated by applying an XSLT (Extensible Stylesheet Language Transformation) to the getCapabilities XML document generated by PyWPS. The availability of the SOAP interface and WSDL description makes WPS instances accessible to workflow development software like Taverna, enabling users to build complex workflows from web services represented as interconnected graphical elements. Taverna transforms the visual representation of the workflow into a SCUFL (Simple Conceptual Unified Flow Language) XML document that can be run internally or sent to a Taverna orchestration server. SCUFL uses a dataflow-centric orchestration model, as opposed to the more commonly used orchestration language BPEL (Business Process Execution Language), which is process-centric.
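A minimal sketch of the WSDL-generation step described above: applying an XSLT stylesheet to a WPS getCapabilities response using lxml. The file names and the stylesheet itself are placeholders, not the actual PyWPS artifacts.

```python
# Transform a WPS getCapabilities document into a WSDL description via XSLT.
from lxml import etree

capabilities = etree.parse("getcapabilities.xml")   # WPS getCapabilities response
stylesheet = etree.parse("wps_to_wsdl.xslt")         # hypothetical XSLT mapping
transform = etree.XSLT(stylesheet)
wsdl_doc = transform(capabilities)

with open("service.wsdl", "wb") as f:
    f.write(etree.tostring(wsdl_doc, pretty_print=True,
                           xml_declaration=True, encoding="UTF-8"))
```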
Abstract:
This study reports the details of the finite element analysis of eleven shear-critical partially prestressed concrete T-beams having steel fibers over partial or full depth. Prestressed T-beams with shear span to depth ratios of 2.65 and 1.59 that failed in shear have been analyzed using the ANSYS program. The ANSYS model accounts for nonlinearities such as bond-slip of the longitudinal reinforcement, post-cracking tensile stiffness of the concrete, stress transfer across the cracked blocks of the concrete, and load sustenance through the bridging action of steel fibers at the crack interface. The concrete is modeled using SOLID65, an eight-node brick element capable of simulating the cracking and crushing behavior of brittle materials. The reinforcement, such as deformed bars, prestressing wires and steel fibers, has been modeled discretely using LINK8, a 3D spar element. The slip between the reinforcement (rebars, fibers) and the concrete has been modeled using COMBIN39, a nonlinear spring element connecting the nodes of the LINK8 elements representing the reinforcement and the nodes of the SOLID65 elements representing the concrete. The ANSYS model correctly predicted the diagonal tension failure and shear compression failure of the prestressed concrete beams observed in the experiments. The capability of the model to capture the critical crack regions, loads and deflections for various types of shear failures in prestressed concrete beams has been illustrated.
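A hedged sketch of how the element types named above could be declared through PyMAPDL (the ansys-mapdl-core package). SOLID65, LINK8 and COMBIN39 are legacy elements that may not be available in recent ANSYS releases, and the material values below are placeholders rather than the properties used in the study.

```python
# Declare the concrete, reinforcement and bond-slip element types via PyMAPDL.
from ansys.mapdl.core import launch_mapdl

mapdl = launch_mapdl()        # requires a local ANSYS MAPDL installation
mapdl.prep7()
mapdl.et(1, "SOLID65")        # 8-node brick for cracking/crushing concrete
mapdl.et(2, "LINK8")          # 3D spar for rebars, prestressing wires, fibers
mapdl.et(3, "COMBIN39")       # nonlinear spring for bond-slip between them
mapdl.mp("EX", 1, 30e3)       # placeholder concrete modulus (MPa)
mapdl.mp("PRXY", 1, 0.2)      # placeholder Poisson's ratio
mapdl.exit()
```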
Abstract:
The Kineticist's Workbench is a program that simulates chemical reaction mechanisms by predicting, generating, and interpreting numerical data. Prior to simulation, it analyzes a given mechanism to predict its behavior; it then simulates the mechanism numerically; and afterward, it interprets and summarizes the data it has generated. In performing these tasks, the Workbench uses a variety of techniques: graph-theoretic algorithms (for analyzing mechanisms), traditional numerical simulation methods, and algorithms that examine simulation results and reinterpret them in qualitative terms. The Workbench thus serves as a prototype for a new class of scientific computational tools: tools that provide symbiotic collaborations between qualitative and quantitative methods.
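A toy illustration of the two ingredients the Workbench combines: a graph-structured view of a mechanism and a conventional numerical simulation, followed by a crude qualitative summary. The mechanism (A → B → C) and its rate constants are invented; this is not the Workbench's own representation.

```python
# Simulate a toy A -> B -> C mechanism and report a qualitative feature.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1.0, 0.3                          # rate constants for A->B and B->C
edges = [("A", "B", k1), ("B", "C", k2)]   # graph view of the mechanism

def rates(t, y):
    a, b, c = y
    return [-k1 * a, k1 * a - k2 * b, k2 * b]

sol = solve_ivp(rates, (0.0, 20.0), [1.0, 0.0, 0.0])
# A crude qualitative summary of the kind the Workbench automates:
b_traj = sol.y[1]
print("Intermediate B peaks near t =", round(float(sol.t[int(np.argmax(b_traj))]), 2))
```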
Abstract:
Software corpora facilitate reproducibility of analyses; however, static analysis of an entire corpus still requires considerable effort, often duplicated unnecessarily by multiple users. Moreover, most corpora are designed for a single language, increasing the effort required for cross-language analysis. To address these aspects we propose Pangea, an infrastructure allowing fast development of static analyses on multi-language corpora. Pangea uses language-independent meta-models stored as object model snapshots that can be directly loaded into memory and queried without any parsing overhead. To reduce the effort of performing static analyses, Pangea provides out-of-the-box support for creating and refining analyses in a dedicated environment, deploying an analysis on an entire corpus, using a runner that supports parallel execution, and exporting results in various formats. In this tool demonstration we introduce Pangea and provide several usage scenarios that illustrate how it reduces the cost of analysis.
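A rough sketch of the snapshot-and-query idea in Python terms: pre-built model snapshots are loaded without re-parsing source code and queried across a corpus in parallel. Pangea itself works on Moose/Smalltalk object snapshots; the file layout, snapshot structure and analysis below are illustrative only.

```python
# Load serialized corpus snapshots (no parsing) and run one analysis per
# project in parallel.
import pickle
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def long_method_count(snapshot_path, threshold=50):
    """Load one pre-built project snapshot and run a small analysis on it."""
    with open(snapshot_path, "rb") as f:
        model = pickle.load(f)                      # no parsing overhead
    return snapshot_path.name, sum(
        1 for m in model["methods"] if m["loc"] > threshold
    )

if __name__ == "__main__":
    snapshots = sorted(Path("corpus-snapshots").glob("*.pkl"))
    with ProcessPoolExecutor() as pool:             # parallel corpus runner
        for name, count in pool.map(long_method_count, snapshots):
            print(f"{name}\t{count} long methods")
```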
Abstract:
CIAO is an advanced programming environment supporting Logic and Constraint programming. It offers a simple concurrent kernel on top of which declarative and non-declarative extensions are added via libraries. Libraries are available for supporting the ISO-Prolog standard, several constraint domains, functional and higher-order programming, concurrent and distributed programming, internet programming, and others. The source language allows declaring properties of predicates via assertions, including types and modes. Such properties are checked at compile time or at run time. The compiler and system architecture are designed to natively support modular global analysis, with the two objectives of proving the properties stated in assertions and performing program optimizations, including transparently exploiting parallelism in programs. The purpose of this paper is to report on recent progress made in the context of the CIAO system, with special emphasis on the capabilities of the compiler, the techniques used for supporting such capabilities, and the results already obtained with the system in the areas of program analysis and transformation.
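CIAO's assertion language is Prolog-based; purely as a loose analogy, the sketch below shows the general idea of declared properties being checked at run time when they cannot be discharged statically. The decorator, predicates and example are invented and do not reflect CIAO's actual syntax or semantics.

```python
# Analogy only: attach declared pre/postconditions to a function and check
# them at run time, in the spirit of run-time-checked assertions.
from functools import wraps

def assertion(pre, post):
    """Attach a precondition and a postcondition to a function."""
    def decorate(fn):
        @wraps(fn)
        def checked(*args):
            assert pre(*args), f"precondition of {fn.__name__} violated"
            result = fn(*args)
            assert post(*args, result), f"postcondition of {fn.__name__} violated"
            return result
        return checked
    return decorate

@assertion(pre=lambda xs: all(isinstance(x, int) for x in xs),
           post=lambda xs, r: r == len(xs))
def count_items(xs):
    return len(xs)

print(count_items([1, 2, 3]))   # passes both checks
```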
Abstract:
Aiming to address requirements concerning integration of services in the context of "big data", this paper presents an innovative approach that (i) ensures a flexible, adaptable and scalable information and computation infrastructure, and (ii) exploits the competences of stakeholders and information workers to meaningfully confront information management issues such as information characterization, classification and interpretation, thus incorporating the underlying collective intelligence. Our approach pays much attention to the issues of usability and ease-of-use, not requiring any particular programming expertise from the end users. We report on a series of technical issues concerning the desired flexibility of the proposed integration framework and we provide related recommendations to developers of such solutions. Evaluation results are also discussed.
Abstract:
A 16S rRNA gene database (http://greengenes.lbl.gov) addresses limitations of public repositories by providing chimera screening, standard alignment, and taxonomic classification using multiple published taxonomies. It was found that there is incongruent taxonomic nomenclature among curators even at the phylum level. Putative chimeras were identified in 3% of environmental sequences and in 0.2% of records derived from isolates. Environmental sequences were classified into 100 phylum-level lineages in the Archaea and Bacteria.
Abstract:
SQL (Structured Query Language) is one of the essential topics in foundation database courses in higher education. Due to its apparently simple syntax, learning to use the full power of SQL can be a very difficult activity. In this paper, we introduce SQLator, a web-based interactive tool for learning SQL. SQLator's key function is the evaluate function, which allows a user to evaluate the correctness of his/her query formulation. The evaluation engine is based on complex heuristic algorithms. The tool also provides instructors with the facility to create and populate database schemas with an associated pool of SQL queries. Currently it hosts two databases with a combined pool of more than 300 queries, divided into three categories according to query complexity. The SQLator user can perform unlimited executions and evaluations of query formulations and/or view the solutions. The SQLator evaluate function has a high rate of success in evaluating the user's statement as correct (or incorrect) with respect to the question. In this paper we present the basic architecture and functions of SQLator, discuss its value as an educational technology, and report on educational outcomes based on studies conducted at the School of Information Technology and Electrical Engineering, The University of Queensland.
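SQLator's heuristic evaluation engine is not described in detail here; as a naive baseline only, the sketch below runs a student query and a reference query against a sample SQLite database and compares the result multisets. The database file, schema and queries are hypothetical.

```python
# Naive query-correctness check: compare result multisets of two queries.
import sqlite3
from collections import Counter

def same_result(db_path, student_sql, reference_sql):
    """Return True when both queries yield the same rows (order-insensitive)."""
    with sqlite3.connect(db_path) as conn:
        student_rows = Counter(conn.execute(student_sql).fetchall())
        reference_rows = Counter(conn.execute(reference_sql).fetchall())
    return student_rows == reference_rows

# Hypothetical usage against a sample schema shipped with an exercise pool:
# same_result("movies.db",
#             "SELECT title FROM film WHERE year > 2000",
#             "SELECT title FROM film WHERE year >= 2001")
```

Comparing outputs on one database cannot prove equivalence in general, which is presumably why SQLator relies on richer heuristics.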
Abstract:
During the MEMORIAL project, an international consortium developed a software solution called DDW (Digital Document Workbench). It provides a set of tools to support the process of digitisation of documents, from scanning to the retrievable presentation of the content. Attention is focused on machine-typed archival documents. One of the important features is the evaluation of quality at each step of the process. The workbench consists of automatic components as well as components that require human activity. A measurable improvement of 20% shows that the approach is successful.
Abstract:
This thesis investigated the risk of accidental release of hydrocarbons during transportation and storage. Transportation of hydrocarbons from an offshore platform to processing units through subsea pipelines involves risk of release due to pipeline leakage resulting from corrosion, plastic deformation caused by seabed shakedown, or damage from contact with a drifting iceberg. The environmental impacts of hydrocarbon dispersion can be severe. The overall safety and economic concerns of pipeline leakage in a subsea environment are immense. A large leak can be detected with conventional technology such as radar, intelligent pigging or chemical tracers, but in a remote location such as subsea or the Arctic, a small chronic leak may go undetected for a period of time. In the case of storage, an accidental release of hydrocarbon from a storage tank could lead to a pool fire, which could further escalate into domino effects; such a chain of accidents may have extremely severe consequences. Analysis of past accident scenarios shows that more than half of industrial domino accidents involved fire as the primary event, with other contributing factors including wind speed and direction, fuel type, and engulfment of the compound. In this thesis, a computational fluid dynamics (CFD) approach is taken to model the subsea pipeline leak and the pool fire from a storage tank. The commercial software package ANSYS FLUENT Workbench 15 is used to model the subsea pipeline leakage. The CFD simulation results for four different fluids showed that the static pressure and pressure gradient along the axial length of the pipeline have a sharp signature variation near the leak orifice at steady-state conditions. A transient simulation is performed to obtain the acoustic signature of the pipe near the leak orifice. The power spectral density (PSD) of the acoustic signal is strong near the leak orifice and dissipates as the distance and orientation from the leak orifice increase. High-pressure fluid flow generates more noise than low-pressure fluid flow. To model the pool fire from the storage tank, ANSYS CFX Workbench 14 is used. The CFD results show that wind speed has a significant effect on the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing, which can be applied for risk analysis. The outcome of this study will be helpful for a better understanding of the domino effects of pool fires in complex geometrical settings of process industries. Measures to reduce and prevent these risks are discussed based on the results obtained from the numerical simulations.
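A small example of the signal post-processing step mentioned above: estimating the power spectral density of a leak-induced acoustic signal with Welch's method. The synthetic trace below merely stands in for a CFD-extracted pressure signal; the sampling rate and tonal component are assumed.

```python
# Estimate the PSD of a synthetic leak-noise signal with Welch's method.
import numpy as np
from scipy.signal import welch

fs = 10_000                                   # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic trace: broadband leak noise plus a tonal component near 1.2 kHz.
signal = rng.normal(scale=0.5, size=t.size) + 0.8 * np.sin(2 * np.pi * 1200 * t)

freqs, psd = welch(signal, fs=fs, nperseg=2048)
print("dominant frequency ≈", freqs[np.argmax(psd)], "Hz")
```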