981 results for Cutting process
Abstract:
Boron carbide is produced in a resistance-heated furnace using boric oxide and petroleum coke as the raw materials. The product yield is very low. Heat transfer plays an important role in the formation of boron carbide, and the temperature at the core reaches up to 2600 K. No experimental study of this high-temperature process, particularly of temperature measurement and heat transfer, is available in the open literature. Therefore, a laboratory-scale hot model of the process has been set up to measure the temperatures under harsh conditions at different locations in the furnace using various temperature measurement devices such as a pyrometer and several types of thermocouples. Particular attention was paid to the accuracy and reliability of the measured data. The recorded data were analysed to understand the heat transfer process inside the reactor and its effect on the formation of boron carbide.
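The abstract does not reproduce the heat-transfer analysis itself, but a purely illustrative sketch of the kind of estimate such temperature measurements allow is a steady-state radial conduction calculation around the resistance core; all values and the cylindrical-shell assumption below are hypothetical, not taken from the thesis.

import math

# Illustrative only: assumed geometry and an assumed effective thermal
# conductivity for the boric oxide / petroleum coke charge.
k_eff = 1.5              # W/(m K), assumed effective conductivity of the charge
L = 0.30                 # m, assumed heated length of the core
r1, r2 = 0.02, 0.10      # m, radii of two temperature-measurement points
T1, T2 = 2600.0, 1400.0  # K, hypothetical readings at r1 and r2

# Steady-state conduction through a cylindrical shell (Fourier's law):
# Q = 2*pi*k*L*(T1 - T2) / ln(r2/r1)
Q = 2 * math.pi * k_eff * L * (T1 - T2) / math.log(r2 / r1)
print(f"Estimated radial heat flow: {Q:.0f} W")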
Abstract:
Fluid bed granulation is a key pharmaceutical process that improves many of the powder properties required for tablet compression. The fluid bed granulation process comprises dry mixing, wetting and drying phases. High-quality granules can be obtained by understanding and controlling the critical process parameters through timely measurements. Process analytical technologies (PAT) include physical process measurements and particle size data of a fluid bed granulator, analysed in an integrated manner. Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to utilise PAT tools to increase the process understanding of fluid bed granulation and drying. Inlet air humidity levels and granulation liquid feed affect powder moisture during fluid bed granulation, and moisture in turn influences many process, granule and tablet properties. The approach in this thesis was to identify sources of variation that are mainly related to moisture. The aim was to determine correlations and relationships, and to utilise the PAT and design space concepts for fluid bed granulation and drying. Monitoring the material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator. There has been a lack of good criteria for characterising material behaviour during the spraying and drying phases, even though the entire performance of the process and the end-product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effects of inlet air humidity and granulation liquid feed on the temperature measurements at different locations of the fluid bed granulator system were determined. This revealed dynamic changes in the measurements and enabled identification of the optimal sites for process control. The moisture originating from the granulation liquid and the inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimise particle size. Various drying end-point indication techniques were compared. The ∆T method, which is based on thermodynamic principles, eliminated the effects of humidity variations and gave the most precise estimate of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was determined. The feasibility of the ∆T method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter that describes the behaviour of material in a fluid bed was developed. The flow rate of the process air and the turbine fan speed were used to calculate this parameter, which was compared with the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation were determined on the basis of the fluidisation parameters. With this design space it is possible to avoid both excessive fluidisation and improper fluidisation with bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser. Both a rapid increase and a decrease in granule size could be monitored in a timely manner.
The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had formed. The various physical parameters evaluated in this thesis give valuable information on fluid bed process performance and increase process understanding.
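Neither the exact form of the ∆T criterion nor the definition of the fluidisation parameter is given in the abstract; the sketch below is only a hedged illustration of the general idea, with hypothetical signal names, values and thresholds that are not from the thesis.

import numpy as np

# Hypothetical in-line process signals (illustrative values only).
t = np.arange(0, 60, 1.0)                        # min
T_inlet = np.full_like(t, 60.0)                  # degC, inlet air temperature
T_mass = 30.0 + 28.0 * (1 - np.exp(-t / 15.0))   # degC, product temperature rising as it dries

# ∆T-style end-point indication: the product temperature approaches the inlet
# temperature as free moisture disappears, so ∆T = T_inlet - T_mass shrinks.
# The 3 degC threshold is an assumed example, not the thesis's criterion.
dT = T_inlet - T_mass
end_idx = np.argmax(dT < 3.0)
print(f"Assumed drying end-point at ~{t[end_idx]:.0f} min (dT < 3 degC)")

# One plausible fluidisation parameter: process-air flow normalised by fan speed.
# The thesis's actual formula is not given in the abstract.
air_flow = 120.0      # m^3/h, assumed process air flow
fan_speed = 2400.0    # rpm, assumed turbine fan speed
fluidisation_parameter = air_flow / fan_speed
print(f"Illustrative fluidisation parameter: {fluidisation_parameter:.3f} (m^3/h)/rpm")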
Abstract:
The number of drug substances in formulation development in the pharmaceutical industry is increasing. Some of these are amorphous drugs with a glass transition below ambient temperature, and thus they are usually difficult to formulate and handle. One reason for this is their reduced viscosity, related to the stickiness of the drug, which makes them complicated to handle in unit operations. Thus, the aim of this thesis was to develop a new processing method for a sticky amorphous model material. Furthermore, the model materials were characterised before and after formulation, using several characterisation methods, to understand more precisely the prerequisites for the physical stability of the amorphous state against crystallisation. The model materials used were monoclinic paracetamol and citric acid anhydrate. Amorphous materials were prepared by melt quenching or by ethanol evaporation. The melt blends were found to have slightly higher viscosity than the ethanol-evaporated materials. However, the melt-produced materials crystallised more easily upon consecutive shearing than the ethanol-evaporated materials. The only material that did not crystallise during shearing was a 50/50 (w/w, %) blend, regardless of the preparation method, and it was physically stable for at least two years in dry conditions. Shearing at varying temperatures was established as a way to measure the physical stability of amorphous materials under processing and storage conditions. The actual physical stability of the blends was better than that of the pure amorphous materials at ambient temperature. Molecular mobility was not related to the physical stability of the amorphous blends, observed as crystallisation. The molecular mobility of the 50/50 blend, derived from the spectral linewidth as a function of temperature using solid-state NMR, correlated better with the molecular mobility derived from a rheometer than with that derived from differential scanning calorimetry data. Based on the results obtained, the effects of molecular interactions, the thermodynamic driving force and the miscibility of the blends are discussed as the key factors in stabilising the blends. The stickiness was found to be affected by glass transition and viscosity. Ultrasound extrusion and cutting were successfully tested to increase the processability of the sticky material. Furthermore, it was found to be possible to process the physically stable 50/50 blend in the supercooled liquid state instead of the glassy state. The method was not found to accelerate crystallisation. This may open up new possibilities to process amorphous materials that are otherwise impossible to manufacture into solid dosage forms.
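The abstract links stickiness to glass transition and viscosity but does not say which viscosity-temperature model was used; purely as an illustration of why handling becomes difficult a few tens of kelvin above Tg, the Williams-Landel-Ferry (WLF) equation with its commonly quoted "universal" constants gives the following viscosity drop (the Tg value is an arbitrary example, not data from the thesis).

import math

# Illustrative WLF calculation; C1 and C2 are the commonly quoted "universal"
# constants and Tg is an assumed example value.
C1, C2 = 17.44, 51.6
Tg = 300.0  # K, assumed glass-transition temperature

def wlf_log_shift(T, Tg=Tg, C1=C1, C2=C2):
    """log10(eta(T)/eta(Tg)) according to the WLF equation."""
    return -C1 * (T - Tg) / (C2 + (T - Tg))

for dT in (0, 10, 20, 40):
    factor_exponent = -wlf_log_shift(Tg + dT)   # positive exponent of the drop
    print(f"T = Tg + {dT:>2} K: viscosity lower by a factor of ~10^{factor_exponent:.1f}")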
Abstract:
In order to improve and continuously develop the quality of pharmaceutical products, the process analytical technology (PAT) framework has been adopted by the US Food and Drug Administration. One of the aims of PAT is to identify critical process parameters and their effect on the quality of the final product. Real-time analysis of the process data enables better control of the processes to obtain a high-quality product. The main purpose of this work was to monitor crucial pharmaceutical unit operations (from blending to coating) and to examine the effect of processing on solid-state transformations and physical properties. The tools used were near-infrared (NIR) and Raman spectroscopy combined with multivariate data analysis, as well as X-ray powder diffraction (XRPD) and terahertz pulsed imaging (TPI). To detect process-induced transformations in active pharmaceutical ingredients (APIs), samples were taken after blending, granulation, extrusion, spheronisation, and drying. These samples were monitored by XRPD, Raman, and NIR spectroscopy, which showed hydrate formation in the case of theophylline and nitrofurantoin. For erythromycin dihydrate, formation of the isomorphic dehydrate was critical; thus, the main focus was on the drying process. NIR spectroscopy was applied in-line during a fluid-bed drying process. Multivariate data analysis (principal component analysis) enabled detection of the dehydrate formation at temperatures above 45 °C. Furthermore, a small-scale rotating plate device was tested to provide insight into film coating. The process was monitored using NIR spectroscopy. A calibration model, using partial least squares regression, was set up and applied to data obtained by in-line NIR measurements of a coating drum process. The predicted coating thickness agreed with the measured coating thickness. To investigate the quality of film coatings, TPI was used to create a 3-D image of a coated tablet. With this technique it was possible to determine coating layer thickness, distribution, reproducibility, and uniformity. In addition, it was possible to localise defects in either the coating or the tablet. It can be concluded from this work that the applied techniques increased the understanding of the physico-chemical properties of drugs and drug products during and after processing. They additionally provided useful information to improve and verify the quality of pharmaceutical dosage forms.
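The abstract names principal component analysis for the in-line NIR drying data and partial least squares regression for the coating-thickness calibration; a minimal scikit-learn sketch of that general workflow (with random placeholder spectra, since none of the thesis data are reproduced here) might look like this.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Placeholder NIR spectra: 50 in-line measurements x 200 wavelength channels.
spectra = rng.normal(size=(50, 200))

# PCA to follow spectral changes (e.g. dehydrate formation) during drying.
scores = PCA(n_components=2).fit_transform(spectra)
print("PCA scores shape:", scores.shape)

# PLS calibration: spectra of coated tablets vs. known coating thickness (micrometres).
coating_thickness = rng.uniform(20, 80, size=50)
pls = PLSRegression(n_components=3).fit(spectra, coating_thickness)
predicted = pls.predict(spectra).ravel()
print("First predicted thicknesses:", predicted[:3].round(1))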
Abstract:
This thesis examines the professionalism of Finnish television subtitlers, the translation process, and the effects of digital subtitling software on the subtitling process from the perspective of professional subtitlers. The digitalisation of Finnish television has caused upheavals in the subtitling field as well, as the video material to be subtitled is now delivered to translation agencies and subtitlers in digital form. The theoretical part deals with translation and subtitling research and training in Finland, professional competence and professionalism, and translation aids. Subtitling is presented as a specialised form of translation; it should also be noted, however, that translation is only one stage in the subtitling process. The theoretical part concludes with a discussion of the everyday work of Finnish television subtitlers and the current state of their field: subtitlers work under a wide variety of employment terms, and the criteria for quality may have to be re-evaluated. At the beginning of the empirical part it is pointed out that surprisingly few Finnish television subtitlers have been interviewed and, drawing on Jääskeläinen's ideas, that much in the field of subtitling remains unstudied; the Finnish subtitling process in particular offers material for research. The subjects of the study are translators who produce television subtitles for a living. In early winter 2008, a questionnaire was sent to subtitlers working for a Finnish translation agency specialising in subtitling; using both multiple-choice and open questions, it surveyed their professionalism, working methods, translation and subtitling process, professional pride and identity, time management, and the digital subtitling software they use. The study revealed that nearly a third of the respondents have a neutral or even negative view of their profession. What these subtitlers have in common is that all of them have less than five years of experience in the field. The majority of the respondents, however, are proud to work as professionals of the Finnish language. In the questionnaire, the subtitling process was divided into a preview phase, a translation phase, a timing (cueing) phase and a correction-viewing phase. The subtitlers were asked, among other things, to estimate the total duration of their subtitling process. The durations showed large differences, at least some of which correlated with experience. A good half of the respondents have acquired digital subtitling software for their own use, while some still do the timing at the translation agency, partly because of the high cost of the software. With digital software, the subtitling process and working practices have changed as video recorders and television sets have been replaced by the computer alone. It is possible to work remotely from faraway places, to translate and cue in alternation, or to pre-cue and then translate. Digital technology has thus enabled changes in the subtitling process and alternative working methods, but not all of these methods are necessarily useful to the subtitler. The traditional subtitling process (preview, marking the subtitle divisions in the script, translating and drafting the subtitles, corrections and a final check) still appears to be the most efficient. Although working practices differ, the overall impression is that, after the initial teething problems of digitalisation, subtitlers' work has become more efficient.
Abstract:
The thesis discusses the ways in which concepts and methodology developed in evolutionary biology can be applied to the explanation and study of language change. The parallel nature of the mechanisms of biological evolution and language change is explored, along with the history of the exchange of ideas between these two disciplines. Against this background, computational methods developed in evolutionary biology are considered in terms of their applicability to the study of historical relationships between languages. Different phylogenetic methods are explained in common terminology, avoiding the technical language of statistics. The thesis is on one hand a synthesis of earlier scientific discussion, and on the other an attempt to map out the problems of earlier approaches and, on that basis, to find new guidelines for the study of language change. The source material consists primarily of literature on the connections between evolutionary biology and language change, along with research articles describing applications of phylogenetic methods to language change. The thesis starts out by describing the initial development of the disciplines of evolutionary biology and historical linguistics, a process which right from the beginning can be seen to have involved an exchange of ideas concerning the mechanisms of language change and biological evolution. The historical discussion lays the foundation for the treatment of the generalised account of selection developed during recent decades. This account aims at creating a theoretical framework capable of explaining both biological evolution and cultural change as selection processes acting on self-replicating entities. This thesis focuses on the capacity of the generalised account of selection to describe language change as a process of this kind. In biology, the mechanisms of evolution are seen to form populations of genetically related organisms through time. One of the central questions explored in this thesis is whether selection theory makes it possible to picture languages as forming populations of a similar kind, and what such a perspective can offer to the understanding of language in general. In historical linguistics, the comparative method and other, complementary methods have traditionally been used to study the development of languages from a common ancestral language. Computational, quantitative methods have not become widely used as part of the central methodology of historical linguistics. After the limited popularity enjoyed by the lexicostatistical method since the 1950s had faded, it is only in recent years that the computational methods of phylogenetic inference used in evolutionary biology have been applied to the study of early language history. In this thesis the possibilities offered by the traditional methodology of historical linguistics and the new phylogenetic methods are compared. The methods are approached through the ways in which they have been applied to the Indo-European languages, the language family most thoroughly investigated with both the traditional and the phylogenetic methods. The problems of these applications, along with the optimal form of the linguistic data used in these methods, are explored in the thesis.
The mechanisms of biological evolution are seen in the thesis as parallel, in a limited sense, to the mechanisms of language change, yet sufficiently so that the development of a generalised account of selection is deemed potentially fruitful for understanding language change. These similarities are also seen to support the validity of using phylogenetic methods in the study of language history, although the use of linguistic data and the models of language change employed by these methods are seen to await further development.
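The phylogenetic methods are discussed only at a conceptual level in the abstract; as a toy illustration of the distance-based end of that methodology (not the actual inference methods evaluated in the thesis), invented binary cognate-class codings can be clustered as follows.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import average

# Invented toy data: row = language, column = cognate class (1 = attested).
languages = ["LangA", "LangB", "LangC", "LangD"]
cognates = np.array([
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1, 0],
])

# Hamming distances between cognate profiles, then average-linkage (UPGMA-style)
# clustering as a crude stand-in for full phylogenetic inference.
distances = pdist(cognates, metric="hamming")
linkage = average(distances)
print(linkage)   # each row: merged clusters, distance, new cluster size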
Abstract:
Space in musical semiosis is a study of musical meaning, spatiality and composition. Earlier studies on musical composition have not adequately treated the problems of musical signification. Here, composition is considered an epitomic process of musical signification. Hence the core problems of composition theory are core problems of musical semiotics. The study employs a framework of naturalist pragmatism, based on C. S. Peirce's philosophy. It operates on concepts such as subject, experience, mind and inquiry, and incorporates relevant ideas of Aristotle, Peirce and John Dewey into a synthetic view of esthetic, practic, and semiotic for the benefit of grasping the musical signification process as a case of semiosis in general. Based on expert accounts, music is depicted as real, communicative, representational, useful, embodied and non-arbitrary. These describe how music and the musical composition process are mental processes. Peirce's theories are combined with current morphological theories of cognition into a view of mind in which space is central. This requires an analysis of space, and the acceptance of a relativist understanding of spatiality. This approach to signification suggests that mental processes are spatially embodied, by virtue of hard facts of the world, literal representations of objects, as well as primary and complex metaphors, each sharing identities of spatial structures. Consequently, music and the musical composition process are spatially embodied. Composing music appears as a process of constructing metaphors: a praxis of shaping and reshaping features of sound, representable from simple quality dimensions to complex domains. In principle, any conceptual space, metaphorical or literal, may set off and steer elaboration, depending on the practical bearings on the habits of feeling, thinking and action induced in musical communication. In this sense, it is evident that music helps us to reorganise our habits of feeling, thinking, and action. These habits, in turn, constitute our existence. The combination of Peirce and morphological approaches to cognition serves well for understanding musical and general signification. It appears both possible and worthwhile to address a variety of issues central to musicological inquiry in the framework of naturalist pragmatism. The study may also contribute to the development of Peircean semiotics.
Abstract:
Abstract is not available.
Abstract:
The purpose of this study is to find a framework for a holistic approach to, and to form a conceptual toolbox for, investigating changes in signs and in their interpretation. Charles S. Peirce's theory of signs in a communicative perspective is taken as the basis for the framework. The concern directing the study is the lack of a framework for analysing the signs of visual artefacts from a holistic perspective, as well as the lack of conceptual tools. To explore the possibility of such a holistic approach to semiosic processes and to form a conceptual toolbox, the following issues are discussed: i) how the many Objects with two aspects involved in Peirce's definition of sign-action promote multiple semioses arising from the same sign by the same Interpretant, depending on the domination of the Objects; ii) in what way the relation of the individual to society or the group can be made more apparent in the construction of the self, since this construction is intertwined with the process of meaning-creation and interpretation; iii) how to account for the fundamental role of emotions in semiosis, and the relation of emotions to the often neglected topic of embodiment; iv) how to take into account the dynamic, mediating and processual nature of sign-action in analysing and understanding changes in signs and in the interpretation of signs. An interdisciplinary approach is chosen for this dissertation. Concepts developed within social psychology, developmental psychology, the neurosciences and semiotics are discussed. The common aspect of these approaches is that they in one way or another concentrate on the mediation provided by signs in explaining human activity and cognition. The holistic approach and conceptual toolbox arrived at are employed in a case study, which consists of an analysis of beer brands, including a comparison of brands from two different cultures. It becomes clear that the different theories and approaches have mutual affinities and complement each other. In addition, the affinities across different disciplines lend some credence to the various views. From the combined approach it becomes apparent that the semiosic process can describe the emerging semiotic self, intertwined with the Umwelt and including emotions. Viewing interpretation and meaning-making through semiosis allows for the analysis of groups while taking into account the embodied and emotional component. It is concluded that emotions have a crucial role in all human activity, including so-called reflective thinking, and that emotions and embodiment should be consciously taken into account in analysing signs and their interpretation, and changes in both, at both the social and the individual level. The analysis of the beer labels expresses well the intertwined nature of the relationship between signs, individual consumers and society. Many direct influences from society on label design are found, as well as some indirect attitude changes that become apparent from magazines, company reports, etc. In addition, the analysis raises the issue of the unifying tendency of visual artefacts across different cultures, but also demonstrates that visual artefacts are able to retain local signs and meanings, and sometimes to represent local meanings even though the signs have changed in the unifying process.
Abstract:
The dissertation examines Roman provincial administration and the phenomenon of territorial reorganisations of provinces during the Imperial period, with special emphasis on the provinces of Arabia and Palaestina during the Later Roman period, i.e., from Diocletian (r. 284–305) to the accession of Phocas (602), in the light of imperial decision-making. Provinces were the basic unit of Roman rule, for centuries the only level of administration that existed between the emperor and the cities of the Empire. The significance of the territorial reorganisations to which the provinces were subjected during the Imperial period is thus of special interest. The approach to the phenomenon is threefold: firstly, attention is paid to the nature and constraints of the Roman system of provincial administration; secondly, the phenomenon of territorial reorganisations is analysed on the macro scale; and thirdly, a case study of the reorganisations of the provinces of Arabia and Palaestina is conducted. The study of the mechanisms of decision-making provides a foundation through which the collected data on all known major territorial reorganisations are interpreted. The data concerning reorganisations are also subjected to qualitative comparative analysis, which offers a new perspective on the data in the form of statistical analysis sensitive to the complexities of individual cases. This analysis of imperial decision-making is based on a timeframe stretching from Augustus (r. 30 BC–AD 14) to the accession of Phocas (602). The study identifies five distinct phases in the use of territorial reorganisations of the provinces. From Diocletian's reign onwards there is a clear normative change that made territorial reorganisations a regular administrative tool with which the decision-making elite addressed a wide variety of qualitatively different concerns. From the beginning of the fifth century the use of territorial reorganisations rapidly diminishes. The two primary reasons for this decline were the solidification of ecclesiastical power and interests connected to the extent of provinces, and the decline of the dioceses. The case study of Palaestina and Arabia identifies seven territorial reorganisations from Diocletian to Phocas. Their existence not only testifies to wider imperial policies but also shows sensitivity to local conditions and corresponds with the general picture of provincial reorganisations. The territorial reorganisations of the provinces reflect the proactive control of the Roman decision-making elite. Their importance should be recognised more clearly as part of the normal imperial administration of the provinces, and especially as reflecting the functioning of the dioceses.
Abstract:
This workshop introduces a range of process drama activities to develop students' critical literacy responses. Whilst children's picture books and process drama strategies have not traditionally been seen as sophisticated resources and strategies for developing students' critical literacy responses, this workshop shows teaching strategies that can be used in language instruction in primary classrooms with diverse student groups. The teaching activities include ‘attribute lists’, ‘sculptures’ and ‘freeze frames’.
Abstract:
The intent of this study was to design, document and implement a Quality Management System (QMS) in a laboratory that incorporated both research and development (R&D) and routine analytical activities. In addition, it was necessary for the QMS to be easily and efficiently maintained to: (a) provide documented evidence that would validate the system's compliance with a certifiable standard, (b) fit the purpose of the laboratory, (c) accommodate prevailing government policies and standards, and (d) promote positive outcomes for the laboratory through documentation and verification of the procedures and methodologies implemented. Initially, a matrix was developed that documented the standards' requirements and the steps necessary to meet those requirements. The matrix provided a check mechanism on the progress of the system's development. In addition, it was later used in the Quality Manual as a reference tool for locating the full procedures documented elsewhere in the system. The documentation needed to build and monitor the system consisted of a series of manuals, along with forms that provided auditable evidence of the workings of the QMS. Quality Management (QM), in one form or another, has been in existence since the early 1900s. However, the question still remains: is it a good thing or just a bugbear? Many of the older-style systems failed because they were designed by non-users and were fiercely regulatory, restrictive and generally deemed to be an imposition. It is now considered important to foster a sense of ownership of the system among the people who use it. The system's design must be tailored to best fit the purpose of the facility's operations if maximum benefits to the organisation are to be gained.
Abstract:
There are renewed calls for end-user participation and the integration of local knowledge in agricultural research. In Australia, the response has included an increased emphasis on participatory on-farm research with farmers and commercial agronomists that tests accepted principles to answer practical local farming questions. However, this pursuit of greater relevance has often led to compromises in research designs, unclear results and frustration amongst farmers, commercial agronomists and Research, Development and Extension (RDE) agency researchers. This paper reports on a series of pre-season planning workshops from 'Doing successful on-farm research', a workshop-based initiative that provides guidelines and a series of interactive activities for planning better participatory on-farm research. The workshop approach helps people design on-farm research that is appropriate to their own needs and local conditions. It assists them to clearly identify their issues, develop specific research questions and decide the best approach to answer those questions with the appropriate rigour for their own situations. These 'Doing successful on-farm research' workshops address four potential deficiencies in on-farm research, and in farming systems RDE more generally, in Australia: (1) variable participation of scientists and farmers in on-farm research; (2) the lack of clear guidelines for effective participatory practice and on-farm research; (3) limited support for on-farm research beyond the intensive investigations conducted by RDE agencies; and (4) limited support for industry and farmers to contextualise information and research outcomes for specific individual circumstances and faster adaptation of technology. This may be a valuable contribution to balancing the demands for both relevance and rigour in on-farm research in Australia. In 'Ground-breaking Stuff': Proceedings of the 13th Australian Society of Agronomy Conference, 10-14 September 2006, Perth, Western Australia.
Abstract:
The thesis studies the translation process for the laws of Finland as they are translated from Finnish into Swedish. The focus is on revision practices, norms and workplace procedures. The translation process studied covers three institutions and four revisions. In three separate studies the translation process is analysed from the perspective of the translations, the institutions and the actors. The general theoretical framework is Descriptive Translation Studies. For the analysis of revisions made in versions of the Swedish translation of Finnish laws, a model is developed covering five grammatical categories (textual revisions, syntactic revisions, lexical revisions, morphological revisions and content revisions) and four norms (legal adequacy, correct translation, correct language and readability). A separate questionnaire-based study was carried out with translators and revisers at the three institutions. The results show that the number of revisions does not decrease during the translation process, and no division of labour can be seen at the different stages. This is somewhat surprising if the revision process is regarded as one of quality control. Instead, all revisers make revisions at every level of the text. Further, the revisions do not necessarily imply errors in the translations but are often the result of revisers following different norms for legal translation. The informal structure of the institutions and its impact on communication, visibility and workplace practices was studied from the perspective of organisation theory. The results show weaknesses in the communicative situation, which affect co-operation both between institutions and between individuals. Individual attitudes towards norms and their relative authority also vary, in the sense that revisers largely prioritise legal adequacy whereas translators give linguistic norms a higher value. Further, multi-professional teamwork in the institutions studied takes the form of individuals and institutions carrying out specific tasks with little contact with others. This shows that the established definitions of teamwork, with people co-working in close contact with each other, cannot be applied directly to the workplace procedures in the translation process studied. Three new concepts are introduced: flerstegsrevidering (multi-stage revision), revideringskedja (revision chain) and normsyn (norm attitude). The study seeks to contribute to our knowledge of legal translation, translation processes, institutional translation, revision practices and translation norms for legal translation. Keywords: legal translation, translation of laws, institutional translation, revision, revision practices, norms, teamwork, organisational informal structure, translation process, translation sociology, multilingual.