Abstract:
In the theory part, membrane emulsification was studied. Emulsions are used in many industrial areas. Traditionally, emulsions are prepared by applying high shear in rotor-stator systems or in high-pressure homogenizer systems. In membrane emulsification, two immiscible liquids are mixed by pressing one liquid through a membrane into the other liquid. With this technique, energy can be saved, more homogeneous droplets can be formed, and the amount of surfactant can be decreased. Ziegler-Natta and single-site catalysts are used in olefin polymerization processes. Nowadays, these catalysts are prepared by traditional mixing emulsification. More homogeneous catalyst particles with a narrower particle size distribution might be prepared with membrane emulsification. The aim of the experimental part was to examine the possibility of preparing a single-site polypropylene catalyst using the membrane emulsification technique. Different membrane materials and solidification techniques for the emulsion were examined. The toluene-PFC phase diagram was also successfully measured during this thesis work and was used for process optimization. The polytetrafluoroethylene (PTFE) membranes had the largest contact angles with toluene and also the biggest difference between the contact angles measured with PFC and with toluene. Despite the contact angle measurement results, no significant difference was noticed between particles prepared using a PTFE membrane and those prepared using a metal sinter. The particle size distributions of the catalysts prepared in these tests were quite wide. This could probably be remedied by using a membrane with a more homogeneous pore size distribution. It is also possible that the solidification rate has an effect on particle size and morphology. Among the polymeric membranes compared, PTFE is probably still the best material for the process, as it had the best chemical durability.
Abstract:
The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving after each addition that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be efficiently detected. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem with the aid of the tool into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula.
Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in CS education and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
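The stepwise add-code-then-prove workflow can be sketched with runtime-checked assertions, which is roughly how the invariant discipline looks outside a proof tool. The following Python fragment is only an illustration, not Socos notation or PVS output; it asserts the invariant of a summation loop around every step.

```python
def array_sum(a: list[int]) -> int:
    """Sum a[0..n) while maintaining the loop invariant s == sum(a[:i])."""
    i, s = 0, 0
    # Invariant established initially: s == sum(a[:0]) == 0.
    while i < len(a):
        assert s == sum(a[:i]), "invariant violated before the step"
        s += a[i]
        i += 1
        assert s == sum(a[:i]), "invariant violated after the step"
    # Postcondition: i == len(a) together with the invariant gives s == sum(a).
    return s

assert array_sum([3, 1, 4, 1, 5]) == 14
```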
Abstract:
This work investigated the damage tolerance and fracture behaviour of welded beam structures made of Ruukki's high-strength Optim 700 MC Plus and Optim 700 QL steels by means of full-scale laboratory tests. The test structures were designed so that the effects of cold forming and welding, together with the effect of the structural geometry, reduce the fracture toughness of the structure in the plane of a sharp-tipped initial crack introduced into the structure. In the planned test series, the welding heat input and the size of the initial crack made in the test structure are varied in addition to the test temperature. The general aim of this work was to present a procedure for assessing the structural integrity of a welded steel structure containing a crack. The behaviour of the test structure made of Optim 700 MC Plus steel was studied computationally by means of fracture mechanics. Using the FEM models created, the values of the fracture parameters of the structure were calculated for both steels under study. Fracture toughness tests were carried out on Optim 700 MC Plus specimens corresponding to the structural detail of the crack region of the test structure. The results of the fracture toughness tests made it possible to describe, in the design methods, the fracture behaviour of the crack in the full-scale test structure. In this work, the values of the critical crack size and the corresponding load were calculated for the test structure based on assumed brittle, unstable ductile, and plastic fracture behaviour. Full-scale test structures made of both materials under study were tested at -40 °C. Based on the measured displacement results, both tested structures behaved rather brittlely. Based on the tested specimen, the calculation results obtained for the Optim 700 MC Plus material can be considered suitable for the design of a structure exhibiting brittle behaviour.
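For readers unfamiliar with the critical crack size calculation mentioned above, a minimal linear-elastic fracture mechanics sketch follows. The toughness, stress, and geometry factor values are hypothetical placeholders, not measurements from the thesis.

```python
from math import pi

def critical_crack_size(K_IC: float, sigma: float, Y: float = 1.0) -> float:
    """Critical crack size a_c (m) from K_IC = Y * sigma * sqrt(pi * a_c).

    K_IC in MPa*sqrt(m), sigma in MPa, Y dimensionless geometry factor.
    """
    return (K_IC / (Y * sigma)) ** 2 / pi

# Hypothetical numbers: K_IC = 50 MPa*sqrt(m) at low temperature,
# membrane stress 500 MPa, Y = 1.12 for an edge crack.
a_c = critical_crack_size(50.0, 500.0, Y=1.12)
print(f"critical crack size ~ {a_c * 1000:.1f} mm")  # ~2.5 mm
```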
Abstract:
The aim of this work was to create a tool for the fatigue analysis of permanent magnet machine rotors. The tool was implemented so that load data measured from a real machine, along with the necessary material data, can be fed into it. In the tool, the load data is converted into a stress history using a scaling factor calculated with the finite element method. To calculate the fatigue life, the analysis tool uses a stress-based method together with the rainflow method and the Palmgren-Miner cumulative damage rule. In addition, the tool produces a Smith fatigue strength diagram for the case under study. Besides the methods mentioned above, the theory part of the work also presented the local strain-based method and fracture mechanics as fatigue analysis methods; these were not implemented in the tool because of their complexity. The fatigue analysis tool was used to calculate the fatigue lives of two example cases. In both cases the result was an infinite fatigue life, but the dynamic safety factor of the axial-flux machine rotor was small. Although the results appear reasonable, they should still be verified, for example with commercial software, to gain full confidence.
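The stress-based life estimate described above (rainflow counting followed by the Palmgren-Miner rule) can be sketched compactly. The Basquin curve parameters and cycle counts below are hypothetical, and the rainflow counting step is assumed to have been done already.

```python
# Minimal sketch, assuming the stress history has already been reduced by
# rainflow counting to (amplitude, count) pairs; parameters are illustrative.
def basquin_life(sigma_a: float, sigma_f: float = 900.0, b: float = -0.09) -> float:
    """Cycles to failure N from Basquin's law sigma_a = sigma_f * (2N)**b."""
    return 0.5 * (sigma_a / sigma_f) ** (1.0 / b)

def miner_damage(cycles: list[tuple[float, float]]) -> float:
    """Palmgren-Miner cumulative damage D = sum(n_i / N_i); D >= 1 means failure."""
    return sum(n / basquin_life(sa) for sa, n in cycles)

# (stress amplitude in MPa, counted cycles), e.g. from rainflow counting
history = [(120.0, 2.0e5), (200.0, 5.0e3), (310.0, 40.0)]
D = miner_damage(history)
print(f"damage D = {D:.4f}, estimated life = {1.0 / D:.0f} repeats of the block")
```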
Abstract:
The main focus of this thesis is to define the field weakening point of a permanent magnet synchronous machine with embedded magnets in traction applications. As part of the thesis, a modelling program was written to help the designer define the field weakening point in practical applications. The thesis utilizes equations based on the current angle, which can be derived from the vector diagram of a permanent magnet synchronous machine. The design parameters of the machine are the maximum rotational speed, the saliency ratio, the maximum induced voltage, and the characteristic current. The main result of the thesis is the determination of the rated rotational speed, at which field weakening begins. The behaviour of the machine is evaluated over a wide speed range, and the changes in the machine parameters are examined.
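A small numerical sketch of how the field weakening point follows from the current-angle equations: at the maximum-torque-per-ampere (MTPA) current angle, the voltage limit fixes the speed at which field weakening must begin. All machine parameters below are invented for illustration; the thesis machine's data is not reproduced, and stator resistance is neglected.

```python
import math

# Hypothetical interior PMSM data (illustrative only)
psi_pm = 0.15                 # PM flux linkage (Vs)
L_d, L_q = 2.0e-3, 4.0e-3     # d/q inductances (H); saliency ratio L_q/L_d = 2
i_max, u_max = 200.0, 330.0   # current and voltage limits (peak, per phase)
p = 3                         # pole pairs

def torque_and_flux(beta: float) -> tuple[float, float]:
    """Torque (Nm) and stator flux linkage (Vs) at current angle beta
    (rad, measured from the q-axis), taken from the vector diagram."""
    i_d, i_q = -i_max * math.sin(beta), i_max * math.cos(beta)
    torque = 1.5 * p * (psi_pm * i_q + (L_d - L_q) * i_d * i_q)
    flux = math.hypot(psi_pm + L_d * i_d, L_q * i_q)
    return torque, flux

# MTPA: current angle giving maximum torque at full current; the voltage
# limit then gives the rated speed from which field weakening starts.
betas = [i * math.pi / 2000 for i in range(1000)]
beta_mtpa = max(betas, key=lambda b: torque_and_flux(b)[0])
_, flux = torque_and_flux(beta_mtpa)
w_base = u_max / flux                    # electrical rad/s at the voltage limit
rpm = w_base / p * 60 / (2 * math.pi)    # mechanical rpm
print(f"MTPA angle {math.degrees(beta_mtpa):.1f} deg, "
      f"field weakening from ~{rpm:.0f} rpm")
```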
Abstract:
Heat input has an essential effect on the properties of a welded joint, because it affects the cooling rate of the joint, which in turn has a great influence on the microstructures formed during cooling. The microstructures formed in a welded joint can be predicted from a continuous cooling transformation (CCT) diagram. CCT diagrams can be prepared according to the welding conditions, so that the phase transformation behaviour at the fusion line can be determined. The goal of this thesis was to develop a control method for a welding power source based on heat input and CCT diagrams; the two are connected, since heat-input-based adjustment of welding parameters determines the cooling rate. The work examined how the heat input leading to a desired cooling rate can be reliably determined from a CCT diagram. The work studied CCT diagrams and the microstructures formed in a welded joint at different cooling rates, as well as the control and regulation technology of welding inverters. After the theory part, different options were examined for how the composition variations of the material to be welded, and the factors affecting heat input, can be taken into account in heat-input-based control of the power source. Based on heat input values determined from a CCT diagram, two sets of test welds were made using three different material thicknesses. Based on the results, the practical validity of the heat input values was assessed and the microstructures formed in the joint were examined. Finally, further development measures were proposed for proceeding with the development project of a heat-input-based control system.
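The link between heat input and cooling rate that such a control method relies on can be illustrated with the three-dimensional heat flow formula commonly given in EN 1011-2, inverted to give the heat input for a desired t8/5 cooling time. This is a hedged sketch, not the thesis's control algorithm, and the coefficients should be checked against the standard.

```python
# Assumed EN 1011-2 three-dimensional heat flow relation:
# t8/5 = (6700 - 5*T0) * Q * (1/(500 - T0) - 1/(800 - T0)) * F3,
# with Q in kJ/mm, preheat T0 in deg C, F3 a joint-geometry factor.
def heat_input_for_t85(t85_s: float, T0: float = 20.0, F3: float = 1.0) -> float:
    """Heat input Q (kJ/mm) giving the desired cooling time t8/5 (s)."""
    k = (6700.0 - 5.0 * T0) * (1.0 / (500.0 - T0) - 1.0 / (800.0 - T0)) * F3
    return t85_s / k

# e.g. the CCT diagram indicates that cooling from 800 to 500 deg C in 10 s
# produces the desired microstructure:
Q = heat_input_for_t85(10.0)
print(f"target heat input ~ {Q:.2f} kJ/mm")  # ~1.9 kJ/mm at T0 = 20 deg C
```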
Abstract:
The purpose of this research is to draw up a clear construction of an anticipatory communicative decision-making process and a successful implementation of a Bayesian application that can be used as an anticipatory communicative decision-making support system. This study is a decision-oriented and constructive research project, and it includes examples of simulated situations. As a basis for further methodological discussion about different approaches to management research, a decision-oriented approach is used here; it is based on mathematics and logic and is intended to develop problem-solving methods. The approach is theoretical and characteristic of normative management science research. The approach of this study is also constructive. An essential part of the constructive approach is to tie the problem to its solution with theoretical knowledge. Firstly, the basic definitions and behaviours of anticipatory management and managerial communication are provided. These descriptions include discussions of the research environment and the management processes formed, which define and explain the background of the research. Secondly, the discussion proceeds to managerial communication and anticipatory decision-making based on preparation, problem solving, and solution search, which are also related to risk management analysis. After that, a solution for the decision-making support application is formed using four different Bayesian methods: the Bayesian network, the influence diagram, the qualitative probabilistic network, and the time-critical dynamic network. The purpose of the discussion is not to compare different theories but to explain the theories that are being implemented. Finally, an application of Bayesian networks to the research problem is presented, and the usefulness of the prepared model in examining the problem is shown together with the results of the research. The theoretical contribution includes definitions and a model of anticipatory decision-making. The main theoretical contribution of this study has been to develop a process for anticipatory decision-making that combines management with communication, problem solving, and the improvement of knowledge. The practical contribution includes a Bayesian decision support model, which is based on Bayesian influence diagrams. The main contributions of this research are two developed processes: one for anticipatory decision-making, and the other for producing a model of a Bayesian network for anticipatory decision-making. In summary, this research contributes to decision-making support by being one of the few publicly available academic descriptions of an anticipatory decision support system, by presenting a Bayesian model that is grounded in a firm theoretical discussion, by publishing algorithms suitable for decision-making support, and by defining the idea of anticipatory decision-making for a parallel version. Finally, based on the results of the research, an analysis of anticipatory management for planned decision-making is presented, built on observation of the environment, analysis of weak signals, and alternatives for creative problem solving and communication.
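As a flavour of the inference such a Bayesian support model performs, here is a minimal, self-contained posterior calculation for a threat given an observed weak signal. The structure and probabilities are invented for illustration and are not taken from the thesis model.

```python
# Two-node toy network: Threat -> WeakSignal, queried by direct enumeration.
p_threat = 0.05                   # prior P(threat), hypothetical
p_signal_given_threat = 0.70      # P(weak signal observed | threat)
p_signal_given_no_threat = 0.10   # false alarm rate

evidence = (p_signal_given_threat * p_threat
            + p_signal_given_no_threat * (1.0 - p_threat))
posterior = p_signal_given_threat * p_threat / evidence
print(f"P(threat | weak signal) = {posterior:.3f}")  # ~0.269
```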
Abstract:
Fraud is an increasing phenomenon, as shown in many surveys carried out by leading international consulting companies in recent years. Despite the evolution of electronic payments and hacking techniques, there is still a strong human component in fraud schemes. Conflict of interest in particular is the main contributing factor to the success of internal fraud. In such cases anomaly detection tools are not always the best instruments, since the fraud schemes are based on faking documents in a context dominated by a lack of controls, and the perpetrators are the very people who should check for possible irregularities. In the banking sector, audit team experts can count only on their experience, whistle-blowing, and the reports sent by their inspectors. The Fraud Interactive Decision Expert System (FIDES), which is the core of this research, is a multi-agent system built to support auditors in evaluating suspicious behaviours and to speed up the evaluation process in order to detect or prevent fraud schemes. The system combines Think-maps, the Delphi method, and attack trees, and it has been built around audit team experts and their needs. The output of FIDES is an attack tree, a tree-based diagram used to "systematically categorize the different ways in which a system can be attacked". Once the attack tree is built, auditors can choose the path they perceive as most suitable and decide whether or not to start the investigation. The system is meant to be used in the future to retrieve old cases and match them with new ones to find similarities. The retrieval features of the system will be useful for simplifying the risk management phase, since countermeasures adopted for similar past cases might be useful for present ones. Even though FIDES has been built with the banking sector in mind, it can be applied in all organisations, such as insurance companies or public organisations, where anti-fraud activity is based on a central anti-fraud unit and a reporting system.
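A toy version of attack tree evaluation makes the output format concrete: AND nodes require all sub-goals, OR nodes any one of them, and leaves carry an estimated feasibility. The tree below is illustrative only and is not produced by FIDES.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    kind: str = "leaf"              # "leaf", "and" or "or"
    feasibility: float = 0.0        # used only by leaves
    children: list["Node"] = field(default_factory=list)

def evaluate(n: Node) -> float:
    """Feasibility score of a (sub)goal in the attack tree."""
    if n.kind == "leaf":
        return n.feasibility
    scores = [evaluate(c) for c in n.children]
    if n.kind == "and":             # all sub-goals required
        p = 1.0
        for s in scores:
            p *= s
        return p
    return max(scores)              # "or": follow the easiest path

tree = Node("commit internal fraud", "or", children=[
    Node("fake supplier invoices", "and", children=[
        Node("create shell supplier", feasibility=0.4),
        Node("approve own payments (conflict of interest)", feasibility=0.7),
    ]),
    Node("tamper with loan collateral records", feasibility=0.2),
])
print(f"most feasible path score: {evaluate(tree):.2f}")  # 0.28
```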
Abstract:
This Master's thesis examines the hydrogen network of the Porvoo oil refinery and considers means by which hydrogen use at the refinery could be made more efficient and the amount of hydrogen going to the fuel gas network reduced. The starting point of the examination is a hydrogen pinch analysis based on the hydrogen balance. The literature part introduces the units belonging to the refinery's hydrogen network and briefly discusses their operation. In addition, the principle of hydrogen pinch analysis is described, as well as how real process constraints can be taken into account when carrying it out. The end of the literature part presents how stepwise optimization of a hydrogen network proceeds. In the applied part of the work, a flow diagram of the hydrogen network was drawn up, which provided a comprehensive picture of hydrogen distribution at the refinery. A simplified version of the flow diagram was made, on the basis of which a hydrogen balance was prepared. A hydrogen pinch analysis was carried out on the basis of the hydrogen balance; according to the analysis, the refinery produced excess hydrogen at the time of the balance. To make hydrogen use more efficient, the fuel gas stream of hydrogen sulfide recovery unit 2 should be minimized or utilized. In addition, the molecular masses at the design points of the flow meters should be changed to correspond better to the current operating situation, and they should be monitored regularly in the future. The calibration of the online analyzers measuring hydrogen content must also be taken care of, and sufficient field samples must be taken from the hydrogen network. It should be noted that minimizing hydrogen production at an oil refinery is not always automatically the most economical solution. In some cases, raising the hydrogen partial pressure in the reactor of a hydrogen-consuming unit can increase the productivity of the unit so much that it compensates for the costs arising from increased hydrogen production.
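The hydrogen surplus computation underlying a hydrogen pinch analysis (after Alves and Towler) can be sketched in a few lines; the stream flows and purities below are invented, not the Porvoo balance data.

```python
# Sources positive, consumer sinks negative: (gas flow, hydrogen purity).
streams = [
    (+120.0, 0.95),  # hydrogen production unit
    (+40.0, 0.80),   # recycle / off-gas source
    (-90.0, 0.88),   # hydrocracker make-up
    (-50.0, 0.70),   # hydrotreater make-up
]

def surplus(y: float) -> float:
    """Hydrogen surplus available at purity level y."""
    return sum(f * (p - y) for f, p in streams if p >= y)

for y in sorted({p for _, p in streams}, reverse=True):
    print(f"purity {y:.2f}: surplus {surplus(y):+.1f}")
# The pinch sits where the surplus falls to zero; a negative value at any
# purity means the network cannot run without extra high-purity hydrogen.
```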
Abstract:
It has been known since the 1970s that the laser beam is suitable for processing paper materials. In this thesis, the term paper materials covers all wood-fibre based materials, such as dried pulp, copy paper, newspaper, cardboard, corrugated board, and tissue paper. Accordingly, laser processing in this thesis means all laser treatments resulting in material removal, such as cutting, partial cutting, marking, creasing, and perforation, that can be used to process paper materials. Laser technology provides many advantages for the processing of paper materials: it is a non-contact method, allows freedom of processing geometry, and is a reliable technology for non-stop production. The packaging industry in particular is a very promising area for laser processing applications. However, there are only a few industrial laser processing applications worldwide, even at the beginning of the 2010s. One reason for the small-scale use of lasers in paper material manufacturing is a shortage of published research and scientific articles. Another problem restraining the use of lasers for processing paper materials is the colouration of the paper material, i.e., the yellowish and/or greyish colour of the cut edge appearing during or after cutting. These are the main reasons for selecting the topic of this thesis: the characterization of the interaction of a laser beam and paper materials. This study was carried out in the Laboratory of Laser Processing at Lappeenranta University of Technology (Finland). The laser equipment used was a TRUMPF TLF 2700 carbon dioxide laser producing a beam with a wavelength of 10.6 μm and a power range of 190-2500 W (laser power on the work piece). The laser beam and paper material interaction was studied by treating dried kraft pulp (grammage 67 g/m²) with different laser power levels, focal plane position settings, and interaction times. The interaction between the laser beam and the dried kraft pulp was monitored with several devices: a spectrometer, a pyrometer, and an active illumination imaging system. In this way it was possible to create an input and output parameter diagram and to study the effects of the input and output parameters in this thesis. When the interaction phenomena are understood, process development can be carried out and even new innovations developed. Filling the gap in information on the interaction phenomena can pave the way for wider use of laser technology in the paper making and converting industry. It was concluded in this thesis that the interaction of a laser beam and paper material has two mechanisms that depend on the focal plane position range. The assumed interaction mechanism B appears in the average focal plane position range of 3.4 mm to 2.4 mm, and the assumed interaction mechanism A in the average focal plane position range of 0.4 mm to -0.6 mm, both in the experimental setup used. A focal plane position of 1.4 mm represents the midzone of these two mechanisms. Holes are formed gradually during the laser beam and paper material interaction: first a small hole is formed in the interaction area at the centre of the laser beam cross-section, and after that, as a function of interaction time, the hole expands until the interaction between the laser beam and the dried kraft pulp ends. Image analysis shows that at the beginning of the interaction between the laser beam and the dried kraft pulp, small holes of very good quality are formed. It is evident that black colour and a heat affected zone appear as a function of interaction time. This reveals that there are still different interaction phases within interaction mechanisms A and B.
These interaction phases appear as a function of time and also as a function of the peak intensity of the laser beam. The limit peak intensity is the value that divides interaction mechanisms A and B from one-phase interaction into dual-phase interaction. All peak intensity values under the limit peak intensity belong to MAOM (interaction mechanism A, one-phase mode) or MBOM (interaction mechanism B, one-phase mode), and values over it belong to MADM (interaction mechanism A, dual-phase mode) or MBDM (interaction mechanism B, dual-phase mode). The decomposition process of cellulose is the evolution of hydrocarbons when the temperature is between 380 and 500 °C; in this range the long cellulose molecule is split into smaller volatile hydrocarbons. As the temperature increases, the decomposition process of the cellulose molecule changes. In the range of 700-900 °C, the cellulose molecule is mainly decomposed into H2 gas, which is why this range is called the evolution of hydrogen. Interaction in this range starts (as in the MAOM and MBOM ranges) when a small, good quality hole is formed. This is due to "direct evaporation" of the pulp via the decomposition process of the evolution of hydrogen, and it can be seen in the spectrometer as a high-intensity peak of yellow light (in the range of 588-589 nm), which corresponds to a temperature of ~1750 °C. The pyrometer does not detect this high-intensity peak, since it is not able to detect the physical phase change from solid kraft pulp to gaseous compounds. As the interaction between the laser beam and the dried kraft pulp continues, the hypothesis is that three autoignition processes occur. The autoignition temperature of a substance is the lowest temperature at which it will spontaneously ignite in a normal atmosphere without an external source of ignition, such as a flame or spark. Three autoignition processes appear in the MADM and MBDM ranges: 1. the autoignition temperature of hydrogen (H2) is 500 °C, 2. the autoignition temperature of carbon monoxide (CO) is 609 °C, and 3. the autoignition temperature of carbon (C) is 700 °C. These three autoignition processes lead to the formation of a plasma plume which has strong emission of radiation in the visible light range. The formation of this plasma plume can be seen as an increase of intensity in the wavelength range of ~475-652 nm. The pyrometer shows the maximum temperature just after this ignition. This plasma plume is assumed to scatter the laser beam so that it interacts with a larger area of the dried kraft pulp than the actual area of the beam cross-section. This assumed scattering also reduces the peak intensity. The results thus show that the presumably scattered light, with its low peak intensity, interacts with a large area of the hole edges, and due to the low peak intensity this interaction happens at a low temperature. The interaction between the laser beam and the dried kraft pulp therefore turns from the evolution of hydrogen to the evolution of hydrocarbons, which leads to the black colour of the hole edges.
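The mode taxonomy above (MAOM/MADM/MBOM/MBDM) can be restated as a small classifier. The focal plane position ranges come from the abstract, while the limit peak intensity threshold is a placeholder, since no single value is quoted here.

```python
LIMIT_PEAK_INTENSITY = 1.0e6  # W/cm^2, hypothetical threshold

def interaction_mode(focal_mm: float, peak_intensity: float) -> str:
    """Classify the interaction mode from focal plane position and intensity."""
    if -0.6 <= focal_mm <= 0.4:
        mech = "A"
    elif 2.4 <= focal_mm <= 3.4:
        mech = "B"
    else:
        return "midzone / outside the studied ranges"
    phase = "OM" if peak_intensity < LIMIT_PEAK_INTENSITY else "DM"
    return f"M{mech}{phase}"  # MAOM, MADM, MBOM or MBDM

print(interaction_mode(0.0, 5.0e5))  # MAOM
print(interaction_mode(3.0, 2.0e6))  # MBDM
```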
Abstract:
The nonlinear interaction between Görtler vortices (GV) and three-dimensional Tollmien-Schlichting (TS) waves is studied with a spatial, nonparallel model based on the Parabolized Stability Equations (PSE). In this investigation the effect of the TS wave frequency on the nonlinear interaction is studied. As verified in previous investigations using the same numerical model, the relative amplitudes and growth rates are the dominant parameters in the GV/TS wave interaction. In this sense, the wave frequency is important in defining the streamwise distance traveled by the disturbances in the unstable region of the stability diagram and in defining the amplification rates that they undergo.
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update, and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis, and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature, continuously evolving tools. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide the resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which helps in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; it is required for authenticating service requests by authorized actors, since not all types of users have access to all the resources. In addition, by following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the web ontology language OWL 2, which can be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. The timed automata of the UML-based service design models are generated with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability, and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata, and using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace unfulfilled service goals back to faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach, and the latter shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
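The generated skeletons with pre- and post-conditions can be pictured with a hand-written sketch of the hotel room booking example; the class and method names below are illustrative, not actual output of the code generation tool chain.

```python
class RoomBookingResource:
    """Toy stateful REST resource: state machine available -> booked."""

    def __init__(self) -> None:
        self.state = "available"

    def put_booking(self, guest: str) -> dict:
        # Precondition (from the behavioral interface): the room must be free.
        assert self.state == "available", "412 Precondition Failed"
        self.state = "booked"
        response = {"status": 201, "guest": guest, "state": self.state}
        # Postcondition: the transition promised by the state machine happened.
        assert self.state == "booked" and response["status"] == 201
        return response

room = RoomBookingResource()
print(room.put_booking("alice"))   # succeeds
# room.put_booking("bob")          # would violate the precondition
```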
Abstract:
The present study examined the floristic composition of three fragments of Araucaria Forest (AF) in the Planalto Catarinense region of southern Brazil, as well as the floristic contextualization of these areas in relation to other remnant AF sites. Three AF fragments at different altitudes were analyzed in the municipalities of Campos Novos, Lages, and Painel. Fifty 200 m² plots were examined in each fragment, and all trees with CBH (circumference at breast height) > 15.7 cm were identified. In order to floristically contextualize the study fragments, comparisons were made with other remnant AF sites by way of dendrograms and NMDS (non-metric multidimensional scaling). Environmental and spatial variables were plotted on the diagram produced by the NMDS to evaluate their influence on the floristic patterns encountered. The forest fragments studied demonstrated high floristic heterogeneity, indicating that AFs cannot be considered homogeneous formations; they could be classified into three phytogeographical categories: i) high-altitude areas influenced by cloud cover/fog, including the Painel region; ii) areas of lower altitude and higher mean annual temperatures situated in the Paraná River basin; and iii) areas situated in the Paraná and Upper-Uruguay river basins and the smaller basins draining directly into the southern Atlantic, near Campos Novos and Lages. The environmental variables most highly correlated with species substitutions among the sites were altitude, mean annual temperature, and the mean temperature of the most humid trimester.
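For readers unfamiliar with the comparison methods named above, a compact sketch of a dendrogram-plus-NMDS pipeline on an invented site-by-species abundance matrix follows; the study's actual data and settings are of course not reproduced.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# Invented abundance matrix: 3 sites x 4 species
X = np.array([[12, 0, 3, 7],
              [10, 1, 4, 5],
              [0, 14, 9, 0]])   # a floristically distinct site

d = pdist(X, metric="braycurtis")      # Bray-Curtis dissimilarities
tree = linkage(d, method="average")    # UPGMA dendrogram linkage matrix
nmds = MDS(n_components=2, metric=False,
           dissimilarity="precomputed", random_state=0)
coords = nmds.fit_transform(squareform(d))  # 2-D NMDS ordination
print(coords)
```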
Abstract:
Water and saline intake is controlled by several mechanisms activated during dehydration. Some mechanisms, such as the production of angiotensin II and the unloading of cardiovascular receptors, activate both behaviors, while others, such as an increase in blood osmolality or sodium concentration, activate water intake but inhibit saline intake. Aldosterone probably activates only saline intake. Clonidine, an α2-adrenergic agonist, inhibits water and saline intake induced by these mechanisms. One model to describe the interactions between these multiple mechanisms is a wire-block diagram, where the brain circuit that controls each intake is represented by a summing point of its respective inhibiting and activating factors. The α2-adrenoceptors constitute an inhibitory factor common to both summing points.
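The wire-block model can be rendered as a toy calculation in which each intake is a summing point of activating and inhibiting inputs, with the α2-adrenergic signal entering both sums with a negative sign; the weights are illustrative only.

```python
def intake_drive(activators: list[float], inhibitors: list[float]) -> float:
    """Summing point: net drive, clipped at zero."""
    return max(0.0, sum(activators) - sum(inhibitors))

# Hypothetical signal strengths
angiotensin_II, hypovolemia = 0.6, 0.4   # activate both behaviors
osmolality = 0.5                         # activates water, inhibits saline
aldosterone = 0.3                        # activates saline intake only
clonidine_a2 = 0.2                       # common inhibitory factor

water = intake_drive([angiotensin_II, hypovolemia, osmolality],
                     [clonidine_a2])
saline = intake_drive([angiotensin_II, hypovolemia, aldosterone],
                      [osmolality, clonidine_a2])
print(f"water drive {water:.2f}, saline drive {saline:.2f}")
```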
Abstract:
It is well known that the interaction of polyelectrolytes with oppositely charged surfactants leads to an associative phase separation; however, the phase behavior of DNA and oppositely charged surfactants is more strongly associative than observed in other systems. A precipitate is formed with very low amounts of surfactant and DNA. DNA compaction is a general phenomenon in the presence of multivalent ions and positively charged surfaces; because of the high charge density there are strong attractive ion correlation effects. Techniques like phase diagram determinations, fluorescence microscopy, and ellipsometry were used to study these systems. The interaction between DNA and catanionic mixtures (i.e., mixtures of cationic and anionic surfactants) was also investigated. We observed that DNA compacts and adsorbs onto the surface of positively charged vesicles, and that the addition of an anionic surfactant can release DNA back into solution from a compact globular complex between DNA and the cationic surfactant. Finally, DNA interactions with polycations, chitosans with different chain lengths, were studied by fluorescence microscopy, in vivo transfection assays and cryogenic transmission electron microscopy. The general conclusion is that a chitosan effective in promoting compaction is also efficient in transfection.