32 results for Critical Initial Approximations

in Helda - Digital Repository of University of Helsinki


Relevance: 20.00%

Publisher:

Abstract:

Fluid bed granulation is a key pharmaceutical process which improves many of the powder properties required for tablet compression. The process comprises dry mixing, wetting and drying phases. Granules of high quality can be obtained by understanding and controlling the critical process parameters through timely measurements. Analysing the physical process measurements and particle size data of a fluid bed granulator in an integrated manner is part of process analytical technology (PAT). Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to utilise PAT tools to increase the process understanding of fluid bed granulation and drying. Inlet air humidity levels and granulation liquid feed affect powder moisture during fluid bed granulation, and moisture in turn influences many process, granule and tablet qualities. The approach in this thesis was to identify sources of variation that are mainly related to moisture. The aim was to determine correlations and relationships, and to utilise the PAT and design space concepts for fluid bed granulation and drying. Monitoring the material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator. There has been a lack of good criteria for characterising material behaviour during the spraying and drying phases, even though the entire performance of the process and the end product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effect of inlet air humidity and granulation liquid feed on the temperature measurements at different locations of the fluid bed granulator system was determined. This revealed dynamic changes in the measurements and enabled finding the optimal sites for process control.
The moisture originating from the granulation liquid and inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimize particle size. Various end-point indication techniques for drying were compared. The ∆T method, which is based on thermodynamic principles, eliminated the effects of humidity variations and resulted in the most precise estimation of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was determined. The feasibility of the ∆T method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter that describes the behaviour of material in a fluid bed was developed. The flow rate of the process air and the turbine fan speed were used to calculate this parameter, and it was compared to the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation, based on the fluidisation parameters, were determined. With this design space it is possible to avoid both excessive fluidisation and improper fluidisation with bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser. Both a rapid increase and a decrease in granule size could be monitored in a timely manner. The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had been formed. The various physical parameters evaluated in this thesis give valuable information about fluid bed process performance and increase process understanding.
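As a rough illustration of the two measurement-derived quantities described above, the sketch below computes a ratio-based fluidisation parameter from process air flow and fan speed, and detects a drying end-point from the stabilisation of a temperature-difference series. Both formulas are hypothetical assumptions for illustration only; the abstract does not give the actual definitions used in the thesis.

```python
# Hypothetical sketch: the exact definitions used in the thesis are not given
# in the abstract, so both quantities below are illustrative assumptions.

def fluidisation_parameter(air_flow, fan_speed):
    """Illustrative ratio relating process air flow rate to turbine fan speed."""
    return air_flow / fan_speed

def drying_endpoint(delta_t, tolerance=0.2, window=3):
    """Index at which the inlet-outlet temperature difference has stabilised:
    'window' consecutive step changes all smaller than 'tolerance'."""
    stable = 0
    for i in range(1, len(delta_t)):
        if abs(delta_t[i] - delta_t[i - 1]) < tolerance:
            stable += 1
            if stable >= window:
                return i
        else:
            stable = 0
    return None  # end-point not reached within the series

# Delta T falls while evaporative cooling dominates, then levels off when dry.
series = [18.0, 14.0, 10.0, 6.5, 4.1, 3.95, 3.9, 3.87, 3.85]
print(drying_endpoint(series))  # -> 7
```

A real implementation would tune the tolerance and window to the granulator's sensor noise and sampling interval.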

Relevance: 20.00%

Publisher:

Abstract:

My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925–1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition which Deleuze nominates as representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind the apparent reality and by explaining the manifest diversity of the given through such notions as essence, idea, God, or the totality of the world. In contrast to this, Deleuze states that abstractions such as these do not explain anything; rather, they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze names his philosophy transcendental empiricism, and he seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; rather, the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality.
As the virtual concerns becoming, or the continuous process of actualisation, time rather than space will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic, the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice, and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity with regard to the work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.

Relevance: 20.00%

Publisher:

Abstract:

The aim of this work was to assess the structure and use of the conceptual model of occlusion in operational weather forecasting. First, a survey was made of the conceptual model of occlusion as introduced to operational forecasters at the Finnish Meteorological Institute (FMI). In the same context, an overview was given of the use of the conceptual model in modern operational weather forecasting, especially in connection with the widespread use of numerical forecasts. In order to evaluate the features of occlusions in operational weather forecasting, all the occlusion processes occurring during the year 2003 over Europe and the Northern Atlantic were investigated using the conceptual model of occlusion and the methods suggested by the FMI. The investigation yielded a classification of the occluded cyclones on the basis of how well the conceptual model fitted the description of the observed thermal structure. The seasonal and geographical distribution of the classes was inspected. Some relevant cases belonging to different classes were collected and analyzed in detail; in this deeper investigation, tools and techniques which are not routinely used in operational weather forecasting were adopted. Both the statistical investigation of the occluded cyclones during 2003 and the case studies revealed that the traditional classification of occlusion types on the basis of thermal structure does not take into account the greater variety of occlusion structures which can be observed. Moreover, the conceptual model of occlusion turned out to be often inadequate in describing well developed cyclones. A deep and constructive revision of the conceptual model of occlusion is therefore suggested in light of the results obtained in this work.
The revision should take into account both the progress being made in building a theoretical footing for the occlusion process and the recent tools and meteorological quantities which are nowadays available.

Relevance: 20.00%

Publisher:

Abstract:

Long-term monitoring data collected from wild smolts of Atlantic salmon (Salmo salar) in the Simojoki river, northern Finland, were used to study the relationships between smolt size and age, smolt and postsmolt migration, environmental conditions and postsmolt survival. The onset of the smolt run was significantly dependent on the rising water temperature and decreasing discharge of the river in the spring. The mean length of smolts migrating early in the season was commonly greater, and the mean age always older, than among smolts migrating later. Many of the smolts migrating early in the season, and almost all smolts migrating later, had started their new spring growth in the river before their sea entry. Among postsmolts, the time required for emigration from the estuary was dependent on the sea surface temperature (SST) off the river, being significantly shorter in years with warm sea temperatures than in cold years. After leaving the estuary, the postsmolts migrated southwards along the eastern coast of the northern Gulf of Bothnia, the geographical distribution of the tag recoveries coinciding with the warm thermal zone in spring in the coastal area. After arriving in the southern Gulf of Bothnia in late summer, the postsmolts mostly migrated near the western coast, reaching the Baltic Main Basin in late autumn. Until the early 1990s there was only a weak positive association between smolt length and postsmolt survival. However, following a subsequent decrease in the mean smolt size, a significant positive dependence was observed between smolt size and the reported recapture rate of tagged salmon. The differences in recapture rates between smolts tagged during the first and second halves of the annual migration season were insignificant, indicating that the seasonal variation in smolt size and age seems to be too small to affect survival.
Among the climatic factors examined, the summer SST in the Gulf of Bothnia was most clearly related to the survival of the wild postsmolts. Postsmolt survival appeared to be highest in years when the SST in June in the Bothnian Bay varied between 9 and 12 °C. In addition, the survival of wild postsmolts showed a significant positive dependence on the SST in July in the Bothnian Sea, but not on the abundance of the prey fish (0+ herring, Clupea harengus, and sprat, Sprattus sprattus) in the Bothnian Sea and in the Baltic Main Basin. The results suggest that if the incidence of extreme weather conditions were to increase due to climatic changes, it would probably reduce the postsmolt survival of wild salmon populations. To improve the performance of hatchery-reared smolts, it could be useful to examine opportunities to produce smolts that are more similar in their smolt traits and abilities to the wild smolts described in this thesis.

Relevance: 20.00%

Publisher:

Abstract:

The continuous production of blood cells, a process termed hematopoiesis, is sustained throughout the lifetime of an individual by a relatively small population of cells known as hematopoietic stem cells (HSCs). HSCs are unique cells characterized by their ability to self-renew and give rise to all types of mature blood cells. Given their high proliferative potential, HSCs need to be tightly regulated at the cellular and molecular levels, or they could otherwise turn malignant. On the other hand, the tight regulatory control of HSC function also translates into difficulties in culturing and expanding HSCs in vitro. In fact, it is currently not possible to maintain or expand HSCs ex vivo without rapid loss of self-renewal. Increased knowledge of the unique features of important HSC niches and of the key transcriptional regulatory programs that govern HSC behavior is thus needed. Additional insight into the mechanisms of stem cell formation could enable us to recapitulate the processes of HSC formation and self-renewal/expansion ex vivo, with the ultimate goal of creating an unlimited supply of HSCs from e.g. human embryonic stem cells (hESCs) or induced pluripotent stem cells (iPS cells) for use in therapy. We thus asked: How are hematopoietic stem cells formed, and in what cellular niches does this happen (Papers I, II)? What are the molecular mechanisms that govern hematopoietic stem cell development and differentiation (Papers III, IV)? Importantly, we could show that the placenta is a major fetal hematopoietic niche that harbors a large number of HSCs during midgestation (Paper I) (Gekas et al., 2005). In order to address whether the HSCs found in the placenta were formed there, we utilized the Runx1-LacZ knock-in and Ncx1 knockout mouse models (Paper II). Importantly, we could show that HSCs emerge de novo in the placental vasculature in the absence of circulation (Rhodes et al., 2008).
Furthermore, we could identify defined microenvironmental niches within the placenta with distinct roles in hematopoiesis: the large vessels of the chorioallantoic mesenchyme serve as sites of HSC generation, whereas the placental labyrinth is a niche supporting HSC expansion (Rhodes et al., 2008). Overall, these studies illustrate the importance of distinct milieus in the emergence and subsequent maturation of HSCs. To ensure proper function of HSCs, several regulatory mechanisms are in place. The microenvironment in which HSCs reside provides soluble factors and cell-cell interactions. In the cell nucleus, these cell-extrinsic cues are interpreted in the context of cell-intrinsic developmental programs which are governed by transcription factors. An essential transcription factor for the initiation of hematopoiesis is Scl/Tal1 (stem cell leukemia gene/T-cell acute leukemia gene 1). Loss of Scl results in early embryonic death and a total lack of all blood cells, yet deactivation of Scl in the adult does not affect HSC function (Mikkola et al., 2003b). In order to define the temporal window of Scl requirement during fetal hematopoietic development, we deactivated Scl in all hematopoietic lineages shortly after hematopoietic specification in the embryo. Interestingly, maturation, expansion and function of fetal HSCs were unaffected, while, as in the adult, red blood cell and platelet differentiation was impaired (Paper III) (Schlaeger et al., 2005). These findings highlight that, once specified, the hematopoietic fate is stable even in the absence of Scl and is maintained through mechanisms that are distinct from those required for the initial fate choice. As the critical downstream targets of Scl remain unknown, we sought to identify and characterize target genes of Scl (Paper IV).
We could identify the transcription factor Mef2C (myocyte enhancer factor 2C) as a novel direct target gene of Scl specifically in the megakaryocyte lineage, which largely explains the megakaryocyte defect observed in Scl-deficient mice. In addition, we observed an Scl-independent requirement for Mef2C in the B-cell compartment, as loss of Mef2C leads to accelerated B-cell aging (Gekas et al., submitted). Taken together, these studies identify key extracellular microenvironments and intracellular transcriptional regulators that dictate different stages of HSC development, from emergence to lineage choice to aging.

Relevance: 20.00%

Publisher:

Abstract:

The Molecular Adsorbent Recirculating System (MARS) is an extracorporeal albumin dialysis device used in the treatment of liver failure patients. The treatment was first utilized in Finland in 2001, and since then over 200 patients have been treated. The aim of this thesis was to evaluate the impact of the MARS treatment on patient outcome, on clinical and biochemical variables, and on the psychological and economic aspects of the treatment in Finland. This thesis encompasses 195 MARS-treated patients (including patients with acute liver failure (ALF), acute-on-chronic liver failure (AOCLF) and graft failure) and a historical control group of 46 ALF patients who did not undergo MARS treatment. All patients received a similar standard medical therapy at the same intensive care unit. The baseline data (demographics, laboratory and clinical variables) and MARS treatment-related and health-related quality-of-life data were recorded before and after treatment. The direct medical costs were determined for a period of 3.5 years. Additionally, the outcome of patients (survival, native liver recovery and need for liver transplantation) and survival-predicting factors were investigated. In the outcome analysis, the MARS-treated ALF patients had a higher 6-month survival (75% vs. 61%, P=0.07) and native liver recovery rate (49% vs. 17%, P<0.001), and a lower need for transplantation (29% vs. 57%, P=0.001), than the historical controls. However, the etiological distribution of the ALF patients referred to our unit has changed considerably over the past decade, and the percentage of patients with a more favorable prognosis has increased. The etiology of liver failure was the most important predictor of the outcome. Other survival-predicting factors in ALF included hepatic encephalopathy, the coagulation factors and the liver enzyme levels prior to MARS treatment.
In terms of prognosis, MARS treatment of cirrhotic AOCLF patients seems meaningful only when the patient is eligible for transplantation. The MARS treatment appears to halt the progression of encephalopathy and to reduce the blood concentrations of neuroactive amino acids and of albumin-bound and water-soluble toxins. In general, the effects of the MARS treatment seem to stabilize the patients, thus allowing additional time either for the native liver to recover or for the patients to endure the prolonged wait for transplantation. Furthermore, for the ALF patients, the MARS treatment appeared to be less costly and more cost-efficient than standard medical therapy alone. In conclusion, the MARS treatment appears to have a beneficial effect on patient outcome in ALF and in those AOCLF patients who can be bridged to transplantation.

Relevance: 20.00%

Publisher:

Abstract:

Modern-day weather forecasting is highly dependent on Numerical Weather Prediction (NWP) models as the main data source. The evolving state of the atmosphere can be numerically predicted by solving a set of hydrodynamic equations, if the initial state is known. However, such a modelling approach always contains approximations that by and large depend on the purpose of use and the resolution of the models. Present-day NWP systems operate with horizontal model resolutions in the range from about 40 km to 10 km. Recently, the aim has been to reach, operationally, scales of 1–4 km. This requires fewer approximations in the model equations, more complex treatment of physical processes and, furthermore, more computing power. This thesis concentrates on the physical parameterization methods used in high-resolution NWP models. The main emphasis is on the validation of the grid-size-dependent convection parameterization in the High Resolution Limited Area Model (HIRLAM) and on a comprehensive intercomparison of radiative-flux parameterizations. In addition, the problems related to wind prediction near the coastline are addressed with high-resolution meso-scale models. The grid-size-dependent convection parameterization is clearly beneficial for NWP models operating with a dense grid. Results show that the current convection scheme in HIRLAM is still applicable down to a 5.6 km grid size. However, with further improved model resolution, the tendency of the model to overestimate strong precipitation intensities increases in all the experiment runs. For the clear-sky longwave radiation parameterization, the schemes used in NWP models provide much better results than simple empirical schemes. On the other hand, for the shortwave part of the spectrum, the empirical schemes are more competitive in producing fairly accurate surface fluxes.
Overall, even the complex radiation parameterization schemes used in NWP models seem to be slightly too transparent for both long- and shortwave radiation in clear-sky conditions. For cloudy conditions, simple cloud correction functions were tested. In the case of longwave radiation, the empirical cloud correction methods provide rather accurate results, whereas for shortwave radiation the benefit is only marginal. Idealised high-resolution two-dimensional meso-scale model experiments suggest that the observed formation of the afternoon low-level jet (LLJ) over the Gulf of Finland is caused by an inertial oscillation mechanism when the large-scale flow is from the south-east or west. The LLJ is further enhanced by the sea-breeze circulation. A three-dimensional HIRLAM experiment, with a 7.7 km grid size, is able to generate a similar LLJ flow structure to that suggested by the 2D experiments and observations. It is also pointed out that improved model resolution does not necessarily lead to better wind forecasts in the statistical sense. In nested systems, the quality of the large-scale host model is of great importance, especially if the inner meso-scale model domain is small.
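The "simple empirical schemes" for clear-sky longwave radiation referred to above are typified by Brunt-type formulas, which estimate the downward flux from screen-level temperature and vapour pressure alone. The sketch below uses generic textbook coefficients as an assumption; it is not the specific parameterization evaluated in the thesis.

```python
import math

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m-2 K-4

def brunt_downward_longwave(t_screen_k, vapour_pressure_hpa, a=0.52, b=0.065):
    """Brunt-type empirical clear-sky downward longwave flux (W m-2).

    Effective atmospheric emissivity a + b*sqrt(e), with generic textbook
    coefficients (an assumption, not values fitted in the thesis).
    """
    emissivity = a + b * math.sqrt(vapour_pressure_hpa)
    return emissivity * SIGMA * t_screen_k ** 4

# Example: screen temperature 283.15 K (10 degC), vapour pressure 9 hPa.
print(round(brunt_downward_longwave(283.15, 9.0), 1))  # -> 260.6
```

Full NWP radiation codes instead integrate band-by-band gaseous absorption through the model atmosphere, which is why they generally outperform such one-line fits for longwave fluxes.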

Relevance: 20.00%

Publisher:

Abstract:

Aerosols impact the planet and our daily lives through various effects, perhaps most notably those related to their climatic and health-related consequences. While there are several primary particle sources, secondary new particle formation from precursor vapors is also known to be a frequent, global phenomenon. Nevertheless, the formation mechanism of new particles, as well as the vapors participating in the process, remain a mystery. This thesis consists of studies on new particle formation specifically from the point of view of numerical modeling. A dependence of the formation rate of 3 nm particles on the sulphuric acid concentration to the power of 1-2 has been observed. This suggests that the nucleation mechanism is of first or second order with respect to the sulphuric acid concentration, in other words one of the mechanisms based on activation or on kinetic collision of clusters. However, model studies have had difficulties in replicating the small exponents observed in nature. The work done in this thesis indicates that the exponents may be lowered by the participation of a co-condensing (and potentially nucleating) low-volatility organic vapor, or by increasing the assumed size of the critical clusters. On the other hand, the presented new and more accurate method for determining the exponent indicates high diurnal variability. Additionally, these studies included several semi-empirical nucleation rate parameterizations as well as a detailed investigation of the analysis used to determine the apparent particle formation rate. Because they cover a high proportion of the Earth's surface area, oceans could potentially prove to be climatically significant sources of secondary particles. In the absence of marine observation data, new particle formation events in a coastal region were parameterized and studied. Since the formation mechanism is believed to be similar, the new parameterization was applied in a marine scenario.
The work showed that marine CCN production is feasible in the presence of additional vapors contributing to particle growth. Finally, a new method to estimate the concentrations of condensing organics was developed. The algorithm utilizes a Markov chain Monte Carlo method to determine the required combination of vapor concentrations by comparing a measured particle size distribution with one from an aerosol dynamics process model. The evaluation indicated excellent agreement with model data, and initial results with field data appear sound as well.
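The exponent analysis discussed above amounts to estimating n in a power law J = k·[H2SO4]^n. A minimal sketch of the conventional log-log least-squares estimate follows, with synthetic data; this is an illustration of the standard procedure, not the refined method developed in the thesis.

```python
import math

def fit_nucleation_exponent(h2so4, formation_rate):
    """Least-squares slope of log J versus log [H2SO4]: the apparent
    nucleation exponent n in J = k * [H2SO4]**n."""
    xs = [math.log(c) for c in h2so4]
    ys = [math.log(j) for j in formation_rate]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Synthetic concentrations (molecules cm-3) following an exact 1.5 power law;
# activation- and kinetic-type mechanisms would give exponents near 1 and 2.
conc = [1e6, 3e6, 1e7, 3e7]
rate = [2e-12 * c ** 1.5 for c in conc]
print(round(fit_nucleation_exponent(conc, rate), 3))  # -> 1.5
```

With real data the apparent exponent drifts with time of day, which is consistent with the high diurnal variability noted in the abstract.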

Relevance: 20.00%

Publisher:

Abstract:

In this thesis I examine one commonly used class of methods for the analytic approximation of cellular automata, the so-called local cluster approximations. This class subsumes the well-known mean-field and pair approximations, as well as higher-order generalizations of these. While a straightforward method known as Bayesian extension exists for constructing cluster approximations of arbitrary order on one-dimensional lattices (and in certain other cases), for higher-dimensional systems the construction of approximations beyond the pair level becomes more complicated due to the presence of loops. In this thesis I describe the one-dimensional construction as well as a number of approximations suggested for higher-dimensional lattices, comparing them against a number of consistency criteria that such approximations could be expected to satisfy. I also outline a general variational principle for constructing consistent cluster approximations of arbitrary order with minimal bias, and show that the one-dimensional construction indeed satisfies this principle. Finally, I apply this variational principle to derive a novel consistent expression for symmetric three-cell cluster frequencies as estimated from pair frequencies, and use this expression to construct a quantitatively improved pair approximation of the well-known lattice contact process on a hexagonal lattice.
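For concreteness, the lowest-order member of this class, the mean-field (one-cell cluster) approximation, can be written down for the contact process in a few lines. The improved pair approximation derived in the thesis additionally tracks pair frequencies and the hexagonal-lattice geometry, which this sketch deliberately omits.

```python
def mean_field_contact_process(lam, rho0=0.2, dt=0.01, steps=20000):
    """Mean-field (one-cell cluster) approximation of the contact process:
    d(rho)/dt = lam * rho * (1 - rho) - rho, with unit recovery rate.
    It predicts a critical point at lam = 1 and, above it, a steady-state
    occupied-site density rho = 1 - 1/lam."""
    rho = rho0
    for _ in range(steps):  # simple forward-Euler integration
        rho += dt * (lam * rho * (1.0 - rho) - rho)
    return rho

# Above threshold the density settles near 1 - 1/lam; below it the
# population dies out.
print(round(mean_field_contact_process(2.0), 3))  # -> 0.5
print(round(mean_field_contact_process(0.8), 3))  # -> 0.0
```

The pair approximation refines this by closing the hierarchy at two-cell frequencies instead of single-cell densities, which shifts the predicted critical point towards the true lattice value.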

Relevance: 20.00%

Publisher:

Abstract:

Thin films are the basis of much recent technological advance, ranging from coatings with mechanical or optical benefits to platforms for nanoscale electronics. In the latter, semiconductors have been the norm ever since silicon became the main construction material for a multitude of electronic components. The array of characteristics of silicon-based systems can be widened by manipulating the structure of the thin films at the nanoscale, for instance by making them porous. The characteristics of different films can then, to some extent, be combined by simple superposition. Thin films can be manufactured using many different methods. One emerging field is cluster beam deposition, where aggregates of hundreds or thousands of atoms are deposited one by one to form a layer, the characteristics of which depend on the parameters of deposition. One critical parameter is the deposition energy, which dictates how porous, if at all, the layer becomes. Other parameters, such as the sputtering rate and the aggregation conditions, affect the size and consistency of the individual clusters. Understanding nanoscale processes, which cannot be observed experimentally, is fundamental to optimizing experimental techniques and inventing new possibilities for advances at this scale. Atomistic computer simulations offer a window to the world of nanometers and nanoseconds in a way unparalleled by the most accurate of microscopes. Transmission electron microscope image simulations can then bridge the gap between the simulated and the experimental by providing a tangible link between the two. In this thesis, the entire process of cluster beam deposition is explored using molecular dynamics and image simulations. The process begins with the formation of the clusters, which is investigated for Si/Ge in an Ar atmosphere. The structure of the clusters is optimized to bring it as close to the experimental ideal as possible.
Then, clusters are deposited, one by one, onto a substrate, until a sufficiently thick layer has been produced. Finally, the concept is expanded by further deposition with different parameters, resulting in multiple superimposed layers of different porosities. This work demonstrates how the aggregation of clusters is not entirely understood within the scope of the approximations used in the simulations; yet, it is also shown how the continued deposition of clusters with a varying deposition energy can lead to a novel kind of nanostructured thin film: a multielemental porous multilayer. According to theory, these new structures have characteristics that can be tailored for a variety of applications, with precision heretofore unseen in conventional multilayer manufacture.

Relevance: 20.00%

Publisher:

Abstract:

We still know little of why strategy processes often involve participation problems. In this paper, we argue that this crucial issue is linked to fundamental assumptions about the nature of strategy work. Hence, we need to examine how strategy processes are typically made sense of and what roles are assigned to specific organizational members. For this purpose, we adopt a critical discursive perspective that allows us to discover how specific conceptions of strategy work are reproduced and legitimized in organizational strategizing. Our empirical analysis is based on an extensive research project on strategy work in 12 organizations. As a result of our analysis, we identify three central discourses that seem to be systematically associated with nonparticipatory approaches to strategy work: “mystification,” “disciplining,” and “technologization.” However, we also distinguish three strategy discourses that promote participation: “self-actualization,” “dialogization,” and “concretization.” Our analysis shows that strategy as practice involves alternative and even competing discourses that have fundamentally different kinds of implications for participation in strategy work. We argue from a critical perspective that it is important to be aware of the inherent problems associated with dominant discourses as well as to actively advance the use of alternative ones.