999 results for Process algebras
Abstract:
Graphene films were produced by chemical vapor deposition (CVD) of pyridine on copper substrates. Pyridine-CVD is expected to lead to doped graphene by the insertion of nitrogen atoms in the growing sp² carbon lattice, possibly improving the properties of graphene as a transparent conductive film. We here report on the influence that the CVD parameters (i.e., temperature and gas flow) have on the morphology, transmittance, and electrical conductivity of the graphene films grown with pyridine. A temperature range between 930 and 1070 °C was explored and the results were compared to those of pristine graphene grown by ethanol-CVD under the same process conditions. The films were characterized by atomic force microscopy, Raman and X-ray photoemission spectroscopy. The optical transmittance and electrical conductivity of the films were measured to evaluate their performance as transparent conductive electrodes. Graphene films grown by pyridine reached an electrical conductivity of 14.3 × 10⁵ S/m. Such a high conductivity seems to be associated with the electronic doping induced by substitutional nitrogen atoms. In particular, at 930 °C the nitrogen/carbon ratio of pyridine-grown graphene reaches 3%, and its electrical conductivity is 40% higher than that of pristine graphene grown from ethanol-CVD.
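For orientation (not part of the abstract), the reported bulk conductivity can be converted into the sheet resistance usually quoted for transparent electrodes. The sketch below assumes a monolayer thickness of 0.335 nm, a common convention for graphene that the abstract does not state:

```python
# Minimal sketch: bulk conductivity -> sheet resistance for a thin film.
# Assumption (not from the abstract): the film behaves as a monolayer of
# thickness 0.335 nm, a commonly used convention for graphene.

sigma = 14.3e5           # electrical conductivity reported above, S/m
thickness = 0.335e-9     # assumed film thickness, m

sheet_resistance = 1.0 / (sigma * thickness)   # ohms per square
print(f"Sheet resistance ≈ {sheet_resistance:.0f} Ω/sq")   # ≈ 2 kΩ/sq
```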
Abstract:
Modeling and analysis of wave propagation in elastic solids undergoing damage and growth processes are reported in this paper. Two types of diagnostic problems, (1) the propagation of waves in the presence of a slow growth process and (2) the propagation of waves in the presence of a fast growth process, are considered. The proposed model employs a slow and a fast time scale and a homogenization technique at the wavelength scale. A detailed analysis of wave dispersion is carried out. A spectral analysis reveals certain low-frequency bands, where the interaction between the wave and the growth process produces acoustic metamaterial-like behavior. Various practical issues in designing an efficient method of acousto-ultrasonic wave based diagnostics of the growth process are discussed. Diagnostics of isotropic damage in a ductile or quasi-brittle solid using a microsecond pulsating signal is considered in computer simulations to illustrate the practical application of the proposed modeling and analysis. The simulated results explain how an estimate of signal spreading can be effectively employed to detect the presence of steady-state damage or the saturation of a growth process.
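As a generic illustration of the two-time-scale idea described above (the notation is assumed here, not taken from the paper), the displacement field can be expanded in a fast wave time t and a slow growth time τ = εt, with the local stiffness degraded by a damage variable D(τ):

\[
u(x,t) = u_0(x,t,\tau) + \varepsilon\, u_1(x,t,\tau) + \cdots, \qquad \tau = \varepsilon t,
\]
\[
\rho\,\frac{\partial^2 u}{\partial t^2} = \frac{\partial}{\partial x}\!\left[E\bigl(D(\tau)\bigr)\,\frac{\partial u}{\partial x}\right], \qquad E(D) = E_0\,(1 - D).
\]

At leading order a harmonic wave then obeys \( \omega^{2} = \dfrac{E_0\,(1-D)}{\rho}\,k^{2} \), so the phase speed slowly decreases as damage accumulates, which is the kind of wave/growth interaction a dispersion analysis can exploit.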
Abstract:
Process view technology is attracting increasing attention in modern business process management, as it enables the customisation of business process representation. This capability helps improve privacy protection, authority control, flexible display, etc., in business process modelling. One approach to generating process views is to allow users to construct an aggregate on their underlying processes. However, most aggregation approaches rely on the strong assumption that business processes are always well-structured, which is overly strict for BPMN. Aiming to build process views for non-well-structured BPMN processes, this paper investigates the characteristics of BPMN structures, tasks, events, gateways, etc., and proposes a formal process view aggregation approach to facilitate BPMN process view creation. A set of consistency rules and construction rules is defined to regulate the aggregation and guarantee order preservation as well as structural and behavioural correctness, and a novel aggregation technique, called EP-Fragment, is developed to tackle non-well-structured BPMN processes.
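The consistency and construction rules themselves are defined in the paper; purely as a hypothetical illustration of the kind of order-preservation check such rules imply, the sketch below tests whether a candidate set of nodes can be collapsed into a single view node without a path leaving the set and re-entering it (the function names and this convexity criterion are assumptions, not the EP-Fragment definition):

```python
# Hypothetical sketch: a candidate aggregate is rejected if some path leaves
# the set and later re-enters it, since collapsing it would then break ordering.

from collections import defaultdict

def reachable(edges, start):
    """All nodes reachable from `start` by following directed edges."""
    graph = defaultdict(list)
    for src, dst in edges:
        graph[src].append(dst)
    seen, stack = set(), [start]
    while stack:
        for nxt in graph[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def is_valid_aggregate(edges, candidate):
    """True if no path leaves `candidate` and later re-enters it."""
    candidate = set(candidate)
    exits = {dst for src, dst in edges if src in candidate and dst not in candidate}
    return not any(reachable(edges, node) & candidate for node in exits)

# Toy process graph: A -> B -> C -> D with a parallel branch B -> E -> D
edges = [("A", "B"), ("B", "C"), ("C", "D"), ("B", "E"), ("E", "D")]
print(is_valid_aggregate(edges, {"B", "D"}))  # False: paths via C or E re-enter at D
print(is_valid_aggregate(edges, {"C", "D"}))  # True: the fragment is path-closed
```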
Abstract:
The process view concept deploys a partial and temporal representation to adjust the visible view of a business process according to the various perception constraints of users. Process view technology is of practical use for privacy protection and authorization control in process-oriented business management. Owing to complex organizational structures, it is challenging for large companies to accurately specify the diverse perception of different users over business processes. Aiming to tackle this issue, this article presents a role-based process view model that incorporates role dependencies into process view derivation. Compared to existing process view approaches, ours particularly supports runtime updates to the process view perceivable to a user through specific view merging operations, thereby enabling the dynamic tracing of process perception. A series of rules and theorems is established to guarantee the structural consistency and validity of process view transformation. A hypothetical case study is conducted to illustrate the feasibility of our approach, and a prototype is developed for proof-of-concept purposes.
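The article's own merging operations are richer than this, but as a rough, hypothetical sketch of role-based view derivation, a role's perceivable task set can be built by merging its own visible tasks with those of the roles it depends on (the union rule, names and data layout below are illustrative assumptions only):

```python
# Hypothetical sketch: derive a role's process view by merging its own visible
# tasks with those of dependent roles. Data and merge rule are made up.

role_tasks = {
    "clerk":      {"receive_order", "enter_order"},
    "accountant": {"check_credit", "issue_invoice"},
    "manager":    {"approve_order"},
}
role_depends_on = {            # role -> roles whose views it also perceives
    "manager": {"clerk", "accountant"},
    "accountant": set(),
    "clerk": set(),
}

def derive_view(role, tasks, depends_on, seen=None):
    """Union of the role's tasks and the views of its dependent roles."""
    seen = set() if seen is None else seen
    if role in seen:                       # guard against cyclic dependencies
        return set()
    seen.add(role)
    view = set(tasks.get(role, set()))
    for dep in depends_on.get(role, set()):
        view |= derive_view(dep, tasks, depends_on, seen)
    return view

print(sorted(derive_view("manager", role_tasks, role_depends_on)))
```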
Abstract:
The precipitation processes in dilute nitrogen alloys of titanium have been examined in detail by conventional transmission electron microscopy (CTEM) and high-resolution electron microscopy (HREM). The alloy Ti-2 at. pct N, on quenching from its high-temperature beta phase field, has been found to undergo early stages of decomposition. The supersaturated solid solution (alpha''-hcp) on decomposition gives rise to an intimately mixed, irresolvable product microstructure. The associated strong tweed contrast presents difficulties in understanding the characteristic features of the process. Therefore, HREM has been carried out with a view to obtaining a clear picture of the decomposition process. Studies on the quenched samples of the alloy suggest the formation of solute-rich zones a few atomic layers thick, randomly distributed throughout the matrix. On aging, these zones grow to a size beyond which the precipitate/matrix interfaces appear to become incoherent and the alpha' (tetragonal) product phase is seen distinctly. The structural details, the crystallography of the precipitation process, and the sequence of the precipitation reaction in the system are illustrated.
Abstract:
There are essentially two different phenomenological models available to describe the interdiffusion process in binary systems in the solid state. The first of these, which is used more frequently, is based on the theory of flux partitioning. The second model, developed much more recently, uses the theory of dissociation and reaction. Although the theory of flux partitioning has been widely used, we found that this theory does not account for the mobility of both species and therefore is not suitable for use in most interdiffusion systems. We have first modified this theory to take into account the mobility of both species and then further extended it to develop relations for the integrated diffusion coefficient and the ratio of the diffusivities of the species. The versatility of these two different models is examined in the Co-Si system with respect to different end-member compositions. From our analysis, we found that the applicability of the theory of flux partitioning is rather limited but the theory of dissociation and reaction can be used in any binary system.
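For background (not quoted from the paper), the integrated diffusion coefficient referred to above is normally Wagner's quantity, the interdiffusion coefficient integrated over the narrow homogeneity range of a phase; for a single product phase β of width Δx_β grown in time t between terminal phases with negligible mutual solubility it can be evaluated directly from the layer width:

\[
\tilde D_{\mathrm{int}}^{\,\beta}
\;=\; \int_{N_B^{\beta,1}}^{N_B^{\beta,2}} \tilde D \,\mathrm{d}N_B
\;\approx\; \frac{\bigl(N_B^{\beta}-N_B^{-}\bigr)\bigl(N_B^{+}-N_B^{\beta}\bigr)}{N_B^{+}-N_B^{-}}\,
\frac{(\Delta x_{\beta})^{2}}{2t},
\]

where \(N_B^{-}\) and \(N_B^{+}\) are the end-member compositions and \(N_B^{\beta}\) is the composition of the product phase.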
Abstract:
A method has been developed for the removal of chromium using ferrous sulphide generated in situ. The effects of experimental parameters such as pH, reagent dosages, and interference from cations and chelating agents have been investigated. Under optimum conditions, removal efficiencies of 99% and 97% have been obtained for synthetic and industrial samples, respectively. The method offers all the advantages of the sulphide precipitation process and can be adopted easily for industrial effluents.
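For completeness (not quoted from the paper), the removal efficiencies above follow the usual definition

\[
\eta \;=\; \frac{C_0 - C_f}{C_0}\times 100\,\%,
\]

where \(C_0\) and \(C_f\) are the chromium concentrations before and after treatment; the 99% and 97% figures correspond to the synthetic and industrial samples, respectively.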
Abstract:
A numerical model of the entire casting process, starting from the mould filling stage to complete solidification, is presented. The model takes into consideration any phase change taking place during the filling process. A volume of fluid method is used for tracking the metal–air interface during filling, and an enthalpy based macro-scale solidification model is used for the phase change process. The model is demonstrated for the case of filling and solidification of a Pb–15 wt%Sn alloy in a side-cooled two-dimensional rectangular cavity, and the resulting evolution of a mushy region and macrosegregation are studied. The effects of process parameters related to filling, namely the degree of melt superheat and the filling velocity, on macrosegregation in the cavity are also investigated. Results show significant differences in the progress of the mushy zone and the macrosegregation pattern between this analysis and a conventional analysis without the filling effect.
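As a rough sketch of the enthalpy-based phase-change idea mentioned above (a toy 1D isothermal-solidification example with invented material values, far simpler than the coupled VOF/macrosegregation model of the paper):

```python
# Toy 1D explicit enthalpy method: conduction updates the enthalpy field, and
# temperature plus liquid fraction are recovered from the enthalpy afterwards.
# All material values are invented for illustration.

import numpy as np

nx, dx, dt = 50, 1e-3, 0.01                    # cells, cell size (m), time step (s)
rho, cp, k, L = 8000.0, 400.0, 50.0, 3.0e4     # density, heat capacity, conductivity, latent heat
Tm, T_init, T_cold = 500.0, 510.0, 300.0       # melting temp, initial melt temp, chilled-wall temp (K)

T = np.full(nx, T_init)
H = rho * (cp * T + L)                         # start fully liquid: sensible + latent enthalpy
fl = np.ones(nx)                               # liquid fraction

for step in range(2000):
    T[0] = T_cold                              # side-cooled boundary
    lap = np.zeros(nx)
    lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    H[1:-1] += dt * k * lap[1:-1]              # explicit conduction update of enthalpy
    fl = np.clip((H - rho * cp * Tm) / (rho * L), 0.0, 1.0)
    T = (H - rho * L * fl) / (rho * cp)        # recover temperature from enthalpy

print(f"solidified cells: {(fl < 0.5).sum()} of {nx}")
```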
Abstract:
Boron carbide is produced in a heat resistance furnace using boric oxide and petroleum coke as the raw materials. The product yield is very low. Heat transfer plays an important role in the formation of boron carbide. The temperature at the core reaches up to 2600 K. No experimental study is available in the open literature for this high temperature process, particularly in terms of temperature measurement and heat transfer. Therefore, a laboratory scale hot model of the process has been set up to measure the temperatures under harsh conditions at different locations in the furnace using various temperature measurement devices such as pyrometers and several types of thermocouples. Particular attention was paid to the accuracy and reliability of the measured data. The recorded data were analysed to understand the heat transfer process inside the reactor and its effect on the formation of boron carbide.
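For context (not taken from the paper), the overall carbothermal reaction usually cited for this process is

\[
2\,\mathrm{B_2O_3} + 7\,\mathrm{C} \;\longrightarrow\; \mathrm{B_4C} + 6\,\mathrm{CO},
\]

a strongly endothermic reaction, which is consistent with the central role of heat transfer and the roughly 2600 K core temperature noted above.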
Abstract:
Fluid bed granulation is a key pharmaceutical process which improves many of the powder properties required for tablet compression. The fluid bed granulation process comprises dry mixing, wetting and drying phases. High-quality granules can be obtained by understanding and controlling the critical process parameters through timely measurements. Physical process measurements and particle size data from a fluid bed granulator, analysed in an integrated manner, form part of process analytical technology (PAT). Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to utilise PAT tools to increase the process understanding of fluid bed granulation and drying. Inlet air humidity levels and granulation liquid feed affect powder moisture during fluid bed granulation, and moisture influences many process, granule and tablet qualities. The approach in this thesis was to identify sources of variation that are mainly related to moisture. The aim was to determine correlations and relationships, and to utilise the PAT and design space concepts for fluid bed granulation and drying. Monitoring the material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator. There has been a lack of good criteria for characterising material behaviour during the spraying and drying phases, even though the entire performance of a process and the end product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effects of inlet air humidity and granulation liquid feed on the temperature measurements at different locations of the fluid bed granulator system were determined. This revealed dynamic changes in the measurements and enabled identification of the optimal sites for process control. The moisture originating from the granulation liquid and inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimise particle size. Various drying end-point indication techniques were compared. The ∆T method, which is based on thermodynamic principles, eliminated the effects of humidity variations and resulted in the most precise estimation of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was determined. The feasibility of the ∆T method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter that describes the behaviour of material in a fluid bed was developed. The flow rate of the process air and the turbine fan speed were used to calculate this parameter, which was compared to the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation, based on the fluidisation parameters, were determined. With this design space it is possible to avoid excessive fluidisation as well as improper fluidisation and bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser. Both rapid increases and decreases in granule size could be monitored in a timely manner. The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had formed. The various physical parameters evaluated in this thesis give valuable information on fluid bed process performance and increase process understanding.
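The ∆T method itself is specified in the thesis; purely as an illustration of the kind of end-point logic it implies, the sketch below flags the drying end-point once the inlet-to-bed temperature difference has stabilised within a tolerance over a trailing window of readings (the signal names, window and threshold are assumptions, not the thesis parameters):

```python
# Hypothetical sketch: flag a drying end-point when the difference between the
# inlet-air and bed temperatures stops changing. All values below are made up.

def drying_endpoint(inlet_temps, bed_temps, window=5, tol=0.2):
    """Return the first index at which ΔT = T_inlet - T_bed has varied by less
    than `tol` °C over the trailing window of readings, or None if never."""
    delta = [ti - tb for ti, tb in zip(inlet_temps, bed_temps)]
    for i in range(window, len(delta)):
        recent = delta[i - window:i + 1]
        if max(recent) - min(recent) < tol:
            return i
    return None

# Toy data: ΔT shrinks while water evaporates, then levels off as the bed dries.
inlet = [60.0] * 15
bed = [32, 34, 36, 38, 40, 42, 44, 46, 48, 51.3, 51.35, 51.4, 51.4, 51.45, 51.45]
print(drying_endpoint(inlet, bed))   # index where ΔT has stabilised
```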
Abstract:
In order to improve and continuously develop the quality of pharmaceutical products, the process analytical technology (PAT) framework has been adopted by the US Food and Drug Administration. One of the aims of PAT is to identify critical process parameters and their effect on the quality of the final product. Real-time analysis of the process data enables better control of the processes to obtain a high quality product. The main purpose of this work was to monitor crucial pharmaceutical unit operations (from blending to coating) and to examine the effect of processing on solid-state transformations and physical properties. The tools used were near-infrared (NIR) and Raman spectroscopy combined with multivariate data analysis, as well as X-ray powder diffraction (XRPD) and terahertz pulsed imaging (TPI). To detect process-induced transformations in active pharmaceutical ingredients (APIs), samples were taken after blending, granulation, extrusion, spheronisation, and drying. These samples were monitored by XRPD, Raman, and NIR spectroscopy, showing hydrate formation in the case of theophylline and nitrofurantoin. For erythromycin dihydrate, formation of the isomorphic dehydrate was critical. Thus, the main focus was on the drying process. NIR spectroscopy was applied in-line during a fluid-bed drying process. Multivariate data analysis (principal component analysis) enabled detection of dehydrate formation at temperatures above 45°C. Furthermore, a small-scale rotating plate device was tested to provide an insight into film coating. The process was monitored using NIR spectroscopy. A calibration model, using partial least squares regression, was set up and applied to data obtained by in-line NIR measurements of a coating drum process. The predicted coating thickness agreed with the measured coating thickness. For investigating the quality of film coatings, TPI was used to create a 3-D image of a coated tablet. With this technique it was possible to determine coating layer thickness, distribution, reproducibility, and uniformity. In addition, it was possible to localise defects in either the coating or the tablet. It can be concluded from this work that the applied techniques increased the understanding of the physico-chemical properties of drugs and drug products during and after processing. They additionally provided useful information to improve and verify the quality of pharmaceutical dosage forms.
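As a rough sketch of the calibration step described above (the spectra, noise level and component count below are invented, not the thesis setup), a partial least squares model can be fitted to NIR-like spectra against reference coating thicknesses and then used to predict the thickness of new samples:

```python
# Minimal PLS-calibration sketch for coating thickness from NIR-like spectra.
# The spectra are synthetic and all parameter choices are arbitrary.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200
thickness = rng.uniform(20, 120, n_samples)        # reference coating thickness, µm

# Fake spectra: a thickness-dependent absorption band plus random noise.
wl = np.linspace(0, 1, n_wavelengths)
band = np.exp(-((wl - 0.6) / 0.05) ** 2)
X = 0.01 * thickness[:, None] * band[None, :] + rng.normal(0, 0.02, (n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(X[:30], thickness[:30])                    # calibrate on 30 samples
pred = pls.predict(X[30:]).ravel()                 # predict the 10 held-out samples
print(np.round(np.c_[thickness[30:], pred], 1))    # reference vs. predicted, µm
```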
Abstract:
This thesis examines the professionalism of Finnish television subtitlers, the translation process, and the effects of digital subtitling software on the subtitling process from the perspective of professional subtitlers. The digitalisation of Finnish television has caused upheavals in the subtitling field as well, now that the video material to be subtitled is delivered to translation agencies and subtitlers in digital form. The theoretical part discusses translation and subtitling research and training in Finland, professional skill and professionalism, and translation aids. Subtitling is presented as a specialised form of translation; it should also be noted, however, that translation is only one stage of the subtitling process. The theoretical part concludes with a discussion of the everyday work and current situation of Finnish television subtitlers: subtitlers work under a wide variety of terms of employment, and quality criteria may have to be re-evaluated. At the beginning of the empirical part it is noted that surprisingly few Finnish television subtitlers have been interviewed and, drawing on Jääskeläinen's ideas, that much in the field of subtitling remains unstudied; the Finnish subtitling process in particular offers material for research. The object of study is translators who produce television subtitles professionally. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specialising in subtitling; using both multiple-choice and open questions, it surveyed their professionalism, working methods, translation and subtitling processes, professional pride and identity, time management, and the digital subtitling software they use. The study revealed that nearly a third of the respondents have a neutral or even negative view of their profession; what these subtitlers have in common is that all of them have less than five years of experience in the field. The majority of respondents are nevertheless proud to work as professionals of the Finnish language. In the questionnaire, the subtitling process was divided into a preview phase, a translation phase, a cueing (timing) phase, and a revision viewing phase. Among other things, the subtitlers were asked to estimate the total duration of their subtitling process. Large differences emerged in the durations, at least some of which correlate with experience. Slightly more than half of the respondents have acquired digital subtitling software of their own, while some still do the cueing at the translation agency, partly because of the high cost of the software. Digital software has brought changes to the subtitling process and to working practices, as work has shifted from video recorders and television sets to the computer alone. It is now possible to work remotely from distant locations, to translate and cue in alternation, or to pre-cue and then translate. Digital technology has thus enabled changes in the subtitling process and alternative working methods, but not all methods are necessarily of benefit to the subtitler. The traditional subtitling process (preview, marking subtitle divisions in the script, translating and drafting the subtitles, corrections and a final check viewing) still appears to be the most efficient. Although working practices differ, the overall impression is that after the initial difficulties of digitalisation, subtitlers' work has become more efficient.
Abstract:
In the thesis it is discussed in what ways concepts and methodology developed in evolutionary biology can be applied to the explanation and research of language change. The parallel nature of the mechanisms of biological evolution and language change is explored, along with the history of the exchange of ideas between these two disciplines. Against this background, computational methods developed in evolutionary biology are considered in terms of their applicability to the study of historical relationships between languages. Different phylogenetic methods are explained in common terminology, avoiding the technical language of statistics. The thesis is on one hand a synthesis of earlier scientific discussion, and on the other an attempt to map out the problems of earlier approaches and to find new guidelines for the study of language change on their basis. The source material consists primarily of literature on the connections between evolutionary biology and language change, along with research articles describing applications of phylogenetic methods to language change. The thesis starts out by describing the initial development of the disciplines of evolutionary biology and historical linguistics, a process which right from the beginning can be seen to have involved an exchange of ideas concerning the mechanisms of language change and biological evolution. The historical discussion lays the foundation for the treatment of the generalised account of selection developed during the recent decades. This account aims at creating a theoretical framework capable of explaining both biological evolution and cultural change as selection processes acting on self-replicating entities. This thesis focuses on the capacity of the generalised account of selection to describe language change as a process of this kind. In biology, the mechanisms of evolution are seen to form populations of genetically related organisms through time. One of the central questions explored in this thesis is whether selection theory makes it possible to picture languages as forming populations of a similar kind, and what such a perspective can offer to the understanding of language in general. In historical linguistics, the comparative method and other, complementary methods have traditionally been used to study the development of languages from a common ancestral language. Computational, quantitative methods have not become widely used as part of the central methodology of historical linguistics. After the limited popularity enjoyed by the lexicostatistical method since the 1950s faded, it is only in recent years that the computational methods of phylogenetic inference used in evolutionary biology have been applied to the study of early language history. In this thesis the possibilities offered by the traditional methodology of historical linguistics and by the new phylogenetic methods are compared. The methods are approached through the ways in which they have been applied to the Indo-European languages, the language family most thoroughly investigated with both the traditional and the phylogenetic methods. The problems of these applications, along with the optimal form of the linguistic data used in these methods, are explored in the thesis.
The mechanisms of biological evolution are seen in the thesis as parallel, in a limited sense, to the mechanisms of language change, yet sufficiently so that the development of a generalised account of selection is deemed potentially fruitful for understanding language change. These similarities are also seen to support the validity of using phylogenetic methods in the study of language history, although the use of linguistic data and the models of language change employed by these methods are seen to await further development.
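As a very small illustration of the kind of computation the phylogenetic methods discussed above perform (the cognate coding, distance measure and languages below are toy assumptions, not the data or models of the studies reviewed), languages coded as binary cognate-class vectors can be clustered into a tree from their pairwise distances:

```python
# Toy sketch: cluster languages from binary cognate-class data using Hamming
# distances and hierarchical clustering. Real phylogenetic inference (e.g.
# Bayesian methods) is far more sophisticated; the data here are invented.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

languages = ["LangA", "LangB", "LangC", "LangD"]
# Rows: languages; columns: presence (1) or absence (0) of hypothetical cognate classes.
cognates = np.array([
    [1, 1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1, 0],   # close to LangA
    [0, 1, 1, 0, 1, 1, 0, 1],
    [0, 1, 1, 0, 1, 1, 0, 0],   # close to LangC
])

distances = pdist(cognates, metric="hamming")   # pairwise proportion of differing traits
tree = linkage(distances, method="average")     # UPGMA-style agglomerative clustering
print(dendrogram(tree, labels=languages, no_plot=True)["ivl"])  # leaf order of the tree
```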
Abstract:
Space in musical semiosis is a study of musical meaning, spatiality and composition. Earlier studies on musical composition have not adequately treated the problems of musical signification. Here, composition is considered an epitomic process of musical signification; hence the core problems of composition theory are core problems of musical semiotics. The study employs a framework of naturalist pragmatism based on C. S. Peirce's philosophy. It operates on concepts such as subject, experience, mind and inquiry, and incorporates relevant ideas of Aristotle, Peirce and John Dewey into a synthetic view of esthetic, practic, and semiotic, for the benefit of grasping the musical signification process as a case of semiosis in general. Based on expert accounts, music is depicted as real, communicative, representational, useful, embodied and non-arbitrary. These qualities describe how music and the musical composition process are mental processes. Peirce's theories are combined with current morphological theories of cognition into a view of mind in which space is central. This requires an analysis of space and the acceptance of a relativist understanding of spatiality. This approach to signification suggests that mental processes are spatially embodied, by virtue of hard facts of the world, literal representations of objects, as well as primary and complex metaphors, each sharing identities of spatial structures. Consequently, music and the musical composition process are spatially embodied. Composing music appears as a process of constructing metaphors: a praxis of shaping and reshaping features of sound, representable from simple quality dimensions to complex domains. In principle, any conceptual space, metaphorical or literal, may set off and steer elaboration, depending on the practical bearings on the habits of feeling, thinking and action induced in musical communication. In this sense, it is evident that music helps us to reorganize our habits of feeling, thinking, and action. These habits, in turn, constitute our existence. The combination of Peircean and morphological approaches to cognition serves well for understanding musical and general signification. It appears both possible and worthwhile to address a variety of issues central to musicological inquiry within the framework of naturalist pragmatism. The study may also contribute to the development of Peircean semiotics.