16 results for information content

in Helda - Digital Repository of the University of Helsinki


Relevance:

60.00%

Abstract:

Through this study I aim to portray connections between home and school through the patterns of thought and action shared in everyday life in a certain community. My observations are primarily based upon interviews, writings and artwork by people from home (N=32) and school (N=13) contexts. Through the stories told, I depict the characters and characteristic features of home-school interaction across generations. According to the material, in the school days of the grandparents the focus was on discipline and order. For the parents, the focus had shifted towards knowledge, while for the pupils of today the focus lies on evaluation, through which the upbringing of the child is steered towards favourable outcomes. Teachers and the people at home hold partially different understandings of home-school interaction, both of its manifested forms and of its potential. The forms of contact in use today are largely seen as one-sided. A yearning for openness and regularity is shared by both sides, yet understood differently. Common causes of failure are said to lie in plain human difficulties in communication and social interaction, but deeply rooted traditions regarding forms of contact also cast a shadow on the route to successful co-operation. This study started from the idea that home-school interaction should be steered towards the exchange of constructive ideas between the home and school environments. Combining the different views gives something to build upon. To test this idea, I drafted a practice period, which was implemented in a small pre-school environment in the fall of 1997. My focus of interest in this project was on the handling of ordinary life information in schools, so I combined individual views, patterns of knowledge and understandings of the world into the process of teaching. Works of art and writings by the informants served as tools for information processing and as practical forms of building home-school interaction. Experiences from the pre-school environment were later echoed in constructing home-school interaction in five other schools. In both of these projects, the teaching in the school was based on stories, thoughts and performances put together by the parents, grandparents and children at home. During these processes, the material used in this study, consisting of artwork, writings and interviews (N=501), was collected. The data shows that information originating from the home environments was both a motivating and an interesting addition to the teaching. There was even a sense of pride when assessing the seeds of knowledge from one's own roots. In most cases and subjects, the homegrown information content connected seamlessly to the functions of the school and the curriculum. The project initiated thought processes between pupils and teachers, adults and children, teachers and parents, and also between generations. It appeared that many of the subjects covered had not been raised before between the various participant groups. I have a special interest here in visual expression and its various contextual meanings. The art material portrays how the content matter and characteristic features of the adult and parent contexts are reflected in the works of the children. Another clearly noticeable factor in the art material is the impact of time-related traditions and functions on the means of visual expression. Comparing the visual material to the written material reveals variances of meaning and possibilities between these forms of expression.
The visual material appears to be related especially to portraying objects, action and usage. Processing through the making of images was noted to bring back memories of concrete structures, details and also emotions. This process offered the child an intensive social connection with the adults. In some cases, with children and adults alike, the project brought forth an ongoing relation to visual expression. During this study I end up moving away from the concept of 'home-school collaboration'. This widely used concept guides and outlines the interaction between schools and homes. In order to broaden the field of possibilities, I choose to use the concept 'school-home interconnection' instead. This concept forms better grounds for forming varying impressions and practices when building interactive contexts, and it places the responsibility for bridging the connection gap on the schools. Through the experiences and innovations of thought gained from these projects, I form a model of pedagogy that embraces the idea of school-home interconnection and builds on the various impressions and expressions contained in it. In this model, the school makes use of experiences, thoughts and conceptions from the home environment. Various forms of expression are used to portray and process this information. This joint evaluation and observation evolves thought patterns both in school and at home.
Keywords: perceiving, visuality, visual culture, art and text, visual expression, art education, growth in interaction, home-school collaboration, school-home interconnection, school-home interaction model.

Relevance:

60.00%

Abstract:

This thesis addresses the modeling of financial time series, especially stock market returns and daily price ranges. Data of this kind can be approached with so-called multiplicative error models (MEM). These models nest several well-known time series models such as the GARCH, ACD and CARR models. They are able to capture many well-established features of financial time series, including volatility clustering and leptokurtosis. In contrast to these phenomena, different kinds of asymmetries have received relatively little attention in the existing literature. In this thesis, asymmetries arise from various sources. They are observed in both conditional and unconditional distributions, for variables with non-negative values and for variables with values on the real line. In the multivariate context, asymmetries can be observed in the marginal distributions as well as in the relationships between the variables modeled. New methods for all these cases are proposed. Chapter 2 considers GARCH models and the modeling of the returns of two stock market indices. The chapter introduces the so-called generalized hyperbolic (GH) GARCH model to account for asymmetries in both the conditional and the unconditional distribution. In particular, two special cases of the GARCH-GH model which describe the data most accurately are proposed. They are found to improve the fit of the model when compared to symmetric GARCH models. The advantages of accounting for asymmetries are also observed through Value-at-Risk applications. Both theoretical and empirical contributions are provided in Chapter 3 of the thesis. In this chapter the so-called mixture conditional autoregressive range (MCARR) model is introduced, examined and applied to daily price ranges of the Hang Seng Index. The conditions for the strict and weak stationarity of the model, as well as an expression for the autocorrelation function, are obtained by writing the MCARR model as a first-order autoregressive process with random coefficients. The chapter also introduces the inverse gamma (IG) distribution to CARR models. The advantages of the CARR-IG and MCARR-IG specifications over conventional CARR models are demonstrated in an empirical application, both in- and out-of-sample. Chapter 4 discusses the simultaneous modeling of absolute returns and daily price ranges. In this part of the thesis a vector multiplicative error model (VMEM) with an asymmetric Gumbel copula is found to provide substantial benefits over the existing VMEM models based on elliptical copulas. The proposed specification is able to capture the highly asymmetric dependence of the modeled variables, thereby improving the performance of the model considerably. The economic significance of the results is established by examining the information content of the derived volatility forecasts.
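
For orientation, the first-order multiplicative error model underlying this family can be sketched in standard textbook notation (this is the generic form, not quoted from the thesis):

```latex
% First-order MEM for a non-negative variable x_t (e.g., a daily price
% range), with a non-negative i.i.d. error eps_t of unit mean:
\begin{align*}
x_t &= \mu_t \varepsilon_t, \qquad \varepsilon_t \ge 0,\ \mathbb{E}[\varepsilon_t] = 1,\\
\mu_t &= \omega + \alpha\, x_{t-1} + \beta\, \mu_{t-1}.
\end{align*}
% The nesting mentioned above: with x_t = r_t^2 and mu_t = sigma_t^2 this
% reduces to GARCH(1,1), sigma_t^2 = omega + alpha r_{t-1}^2 + beta sigma_{t-1}^2;
% CARR arises when x_t is the daily range, ACD when x_t is a trade duration.
```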

Relevance:

60.00%

Abstract:

Data assimilation provides the initial atmospheric state, called the analysis, for Numerical Weather Prediction (NWP). This analysis consists of pressure, temperature, wind and humidity on a three-dimensional NWP model grid. Data assimilation blends meteorological observations with the NWP model in a statistically optimal way. The objective of this thesis is to describe the methodological development carried out to allow the assimilation of ground-based measurements of the Global Positioning System (GPS) into the High Resolution Limited Area Model (HIRLAM) NWP system. Geodetic processing produces observations of tropospheric delay. These observations can be processed either for vertical columns at each GPS receiver station, or for the individual propagation paths of the microwave signals. These alternative processing methods result in Zenith Total Delay (ZTD) and Slant Delay (SD) observations, respectively. ZTD and SD observations are of use in the analysis of atmospheric humidity. A method is introduced for estimating the horizontal error covariance of ZTD observations. The method makes use of observation minus model background (OmB) sequences of ZTD and conventional observations. It is demonstrated that the ZTD observation error covariance is relatively large at station separations shorter than 200 km, but non-zero covariances also appear at considerably larger station separations. The relatively low density of radiosonde observing stations limits the ability of the proposed estimation method to resolve the shortest length-scales of the error covariance. SD observations are shown to contain a statistically significant signal on the asymmetry of the atmospheric humidity field. However, the asymmetric component of SD is found to be nearly always smaller than the standard deviation of the SD observation error. SD observation modelling is described in detail, and other issues relating to SD data assimilation are also discussed. These include the determination of error statistics, the tuning of observation quality control, and how local observation error correlations can be taken into account. The experiments show that the data assimilation system is able to retrieve the asymmetric information content of hypothetical SD observations at a single receiver station. Moreover, the impact of real SD observations on the humidity analysis is comparable to that of other observing systems.
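
For context, the geodetic convention behind the ZTD observable can be summarized as follows (the standard textbook decomposition, assumed here rather than taken from the thesis):

```latex
% The zenith total delay is the integrated refractivity N above the
% receiver, and splits into hydrostatic and wet parts:
\begin{align*}
\mathrm{ZTD} \;=\; 10^{-6} \int_{z_0}^{\infty} N(z)\,\mathrm{d}z
\;=\; \mathrm{ZHD} + \mathrm{ZWD}.
\end{align*}
% ZHD (hydrostatic) is modeled accurately from surface pressure, so the
% humidity signal exploited in the analysis resides mainly in ZWD; slant
% delays add directional (asymmetric) information along each signal path.
```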

Relevance:

60.00%

Abstract:

Herbivorous insects, their host plants and their natural enemies form the largest and most species-rich communities on Earth. But what forces structure such communities? Do they represent random collections of species, or are they assembled according to given rules? To address these questions, food webs offer excellent tools. As a result of their versatile information content, such webs have become the focus of intensive research over the last few decades. In this thesis, I study herbivore-parasitoid food webs from a new perspective: I construct multiple, quantitative food webs in a spatially explicit setting, at two different scales. Focusing on food webs consisting of specialist herbivores and their natural enemies on the pedunculate oak, Quercus robur, I examine consistency in food web structure across space and time, and how landscape context affects this structure. As an important methodological development, I use DNA barcoding to resolve potential cryptic species in the food webs, and to examine their effect on food web structure. I find that DNA barcoding changes our perception of species identity for as many as a third of the individuals, by reducing misidentifications and by resolving several cryptic species. In terms of the variation detected in food web structure, I find surprising consistency in both space and time. From a spatial perspective, landscape context leaves no detectable imprint on food web structure, while species richness declines significantly with decreasing connectivity. From a temporal perspective, food web structure remains predictable from year to year, despite considerable species turnover in local communities. The rate of such turnover varies between guilds and between species within guilds. These observations are best explained by the abundant and common species, which have a quantitatively dominant imprint on overall structure and suffer the lowest turnover. By contrast, rare species, with little impact on food web structure, exhibit the highest turnover rates. These patterns reveal important limitations of modern metrics of quantitative food web structure. While they accurately describe the overall topology of the web and its most significant interactions, they are disproportionately affected by species with given traits, and insensitive to the specific identity of species. As rare species have been shown to be important for food web stability, metrics depicting quantitative food web structure should not be used as the sole descriptors of communities in a changing world. To detect and resolve the versatile imprint of global environmental change, one should rather use these metrics as one tool among several.
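
The sensitivity of quantitative metrics to common species can be illustrated with a toy example. The sketch below is an illustrative assumption, not the metrics used in the thesis: it computes a Shannon-based link diversity from a small host-by-parasitoid interaction matrix and shows how little the rarest links contribute.

```python
import numpy as np

# Toy illustration (assumed, not the thesis's metrics): Shannon-based
# quantitative food-web measures weight each link by its interaction
# frequency, so abundant links dominate the result.

def link_shannon(web: np.ndarray) -> float:
    """Shannon diversity of link weights in a host-by-parasitoid matrix."""
    w = web[web > 0].astype(float)
    p = w / w.sum()                      # proportion of all interactions
    return float(-(p * np.log(p)).sum())

web = np.array([[100, 0, 1],             # two dominant links...
                [0,  90, 0],
                [1,   0, 1]])            # ...and three rare ones
no_rare = np.where(web == 1, 0, web)     # delete every rare link

print(link_shannon(web))      # ~0.78
print(link_shannon(no_rare))  # ~0.69: removing 3 of the 5 links barely
                              # moves the metric, as the text argues
```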

Relevance:

60.00%

Abstract:

The increased availability of high-frequency data sets has led to important new insights into the functioning of financial markets. The use of high-frequency data is interesting and persuasive, since it can reveal information that cannot be seen at lower levels of aggregation. This dissertation explores some of the many important issues connected with the use, analysis and application of high-frequency data. These include the effects of intraday seasonality, the behaviour of time-varying volatility, the information content of various market data, and the issue of inter-market linkages, utilizing high-frequency 5-minute observations from major European and U.S. stock indices, namely the DAX30 of Germany, the CAC40 of France, the SMI of Switzerland, the FTSE100 of the UK and the SP500 of the U.S. The first essay in the dissertation shows that there are remarkable similarities in the intraday behaviour of conditional volatility across European equity markets. Moreover, U.S. macroeconomic news announcements have a significant cross-border effect on both European equity returns and volatilities. The second essay reports substantial intraday return and volatility linkages across the European stock indices of the UK and Germany. This relationship appears virtually unchanged by the presence or absence of the U.S. stock market. However, the return correlation between the UK and German markets rises significantly following the U.S. stock market opening, which can largely be described as a contemporaneous effect. The third essay sheds light on market microstructure issues in which traders and market makers learn from watching market data, and it is this learning process that leads to price adjustments. This study concludes that trading volume plays an important role in explaining international return and volatility transmissions. The examination of asymmetry reveals that the impact of positive volume changes on foreign stock market volatility is larger than that of negative changes. The fourth and final essay documents a number of regularities in the patterns of intraday return volatility, trading volume and bid-ask spreads. This study also reports a contemporaneous and positive relationship between intraday return volatility, bid-ask spreads and unexpected trading volume. These results verify the role of trading volume and bid-ask quotes as proxies for information arrival in producing contemporaneous and subsequent intraday return volatility. Moreover, an asymmetric effect of trading volume on conditional volatility is also confirmed. Overall, this dissertation explores the role of information in explaining intraday return and volatility dynamics in international stock markets. The process through which information is incorporated into stock prices is central to all information-based models. Intraday data facilitate the investigation of how information is incorporated into security prices as a result of the trading behavior of informed and uninformed traders. High-frequency data thus appear critical in enhancing our understanding of the intraday behavior of stock market variables, with important implications for market participants, regulators and academic researchers.
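
As a minimal illustration of how an intraday seasonal volatility pattern is typically extracted from 5-minute observations (an assumed standard approach with synthetic data, not the dissertation's code):

```python
import numpy as np
import pandas as pd

# Sketch: average absolute 5-minute log returns within each time-of-day
# slot across all days, yielding the intraday seasonal pattern.

def intraday_seasonality(prices: pd.Series) -> pd.Series:
    """prices: 5-minute index levels with a DatetimeIndex.
    Returns the mean absolute log return per time-of-day slot."""
    r = np.log(prices).diff().dropna()
    return r.abs().groupby(r.index.time).mean()

# Usage with synthetic data standing in for, e.g., 5-minute DAX30 quotes:
idx = pd.date_range("2024-01-01 09:00", periods=12 * 8 * 20, freq="5min")
prices = pd.Series(100 * np.exp(np.random.normal(0, 1e-3, len(idx)).cumsum()),
                   index=idx)
print(intraday_seasonality(prices).head())
```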

Relevance:

60.00%

Abstract:

Modeling and forecasting implied volatility (IV) is important to both practitioners and academics, especially in trading, pricing, hedging and risk management activities, all of which require accurate volatility estimates. The task has, however, become challenging since the 1987 stock market crash, as implied volatilities (IVs) recovered from stock index options present two patterns: the volatility smirk (skew) and the volatility term structure. Examined together, the two form a rich implied volatility surface (IVS). This implies that the assumptions behind the Black-Scholes (1973) model do not hold empirically, as asset prices are influenced by many underlying risk factors. This thesis, consisting of four essays, models and forecasts implied volatility in the presence of these empirical regularities of the options markets. The first essay models the dynamics of the IVS by extending the Dumas, Fleming and Whaley (DFW) (1998) framework: using moneyness in the implied forward price and OTM put-call options on the FTSE100 index, nonlinear optimization is used to estimate different models and thereby produce rich, smooth IVSs. Here, the constant-volatility model fails to explain the variation in the rich IVS. Next, it is found that three factors can explain about 69-88% of the variance in the IVS. Of this, on average, 56% is explained by the level factor, 15% by the term-structure factor, and an additional 7% by the jump-fear factor. The second essay proposes a quantile regression model for the contemporaneous asymmetric return-volatility relationship, generalizing the model of Hibbert et al. (2008). The results show a strong negative asymmetric return-volatility relationship at various quantiles of the IV distributions; it is monotonically increasing when moving from the median quantile to the uppermost quantile (i.e., 95%), so OLS underestimates this relationship at the upper quantiles. Additionally, the asymmetric relationship is more pronounced with the smirk (skew) adjusted volatility index measure than with the old volatility index measure. The volatility indices are ranked in terms of asymmetric volatility as follows: VIX, VSTOXX, VDAX and VXN. The third essay examines the information content of the new-VDAX volatility index in forecasting daily Value-at-Risk (VaR) estimates and compares its VaR forecasts with those of Filtered Historical Simulation and RiskMetrics. All daily VaR models are backtested over 1992-2009 using unconditional, independence, conditional coverage and quadratic-score tests. It is found that the VDAX subsumes almost all the information required for the volatility of daily VaR forecasts for a portfolio of the DAX30 index; implied-VaR models outperform all other VaR models. The fourth essay models the risk factors driving swaption IVs. It is found that three factors can explain 94-97% of the variation in each of the EUR, USD and GBP swaption IVs. There are significant linkages across the factors, and bi-directional causality is at work between the factors implied by the EUR and USD swaption IVs. Furthermore, the factors implied by the EUR and USD IVs respond to each other's shocks; surprisingly, however, GBP does not affect them. Finally, calibration results for the string market model show that it can efficiently reproduce (or forecast) the volatility surface in each of the swaption markets.
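
A toy sketch of a DFW-style quadratic IVS specification may help fix ideas. The thesis estimates richer models by nonlinear optimization; the data, coefficients and the plain least-squares fit below are purely illustrative.

```python
import numpy as np

# DFW-style quadratic surface (illustrative form):
#   sigma(M, tau) = a0 + a1*M + a2*M^2 + a3*tau + a4*tau^2 + a5*M*tau
# where M is moneyness and tau is time to expiry in years.

rng = np.random.default_rng(0)
M = rng.uniform(0.8, 1.2, 500)           # moneyness
tau = rng.uniform(0.05, 1.0, 500)        # time to expiry
iv = (0.2 - 0.3 * (M - 1) + 0.8 * (M - 1) ** 2   # synthetic smirk...
      + 0.05 * tau                                # ...and term structure
      + rng.normal(0, 0.005, 500))                # noise

X = np.column_stack([np.ones_like(M), M, M**2, tau, tau**2, M * tau])
coef, *_ = np.linalg.lstsq(X, iv, rcond=None)
fitted = X @ coef                        # smooth IVS at the observed points
print(np.round(coef, 3))
```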

Relevance:

60.00%

Abstract:

The purpose of this thesis is to examine the role of trade durations in price discovery. The motivation for using trade durations in the study of price discovery is that durations are robust to many microstructure effects that bias the measurement of returns volatility. A further motivation is that it is difficult to think of economic variables that are genuinely useful in determining the source of volatility at arbitrarily high frequencies. The dissertation contains three essays. In the first essay, the role of trade durations in price discovery is examined with respect to the volatility pattern of stock returns. The theory of volatility is associated with the theory of the information content of a trade, central to market microstructure theory. The essay documents that volatility per transaction is related to the intensity of trade, and that there is a strong relationship between the stochastic process of trade durations and trading variables. In the second essay, the role of trade durations is examined with respect to the quantification of the risk due to a trading volume of a certain size. The theory of volume is intrinsically associated with the stock volatility pattern. The essay documents that volatility generally increases when traders choose to trade with large transactions. In the third essay, the role of trade durations is examined with respect to the information content of a trade, which is associated with the theory of the rate of price revisions in the market. The essay documents that short durations are associated with information; thus, traders are compensated for responding quickly to information.
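
As a rough illustration of relating volatility per transaction to trade intensity, the sketch below buckets trades by their preceding duration and compares mean squared returns. This is not the dissertation's method; the synthetic data simply has the documented effect (short durations carry information) built in.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
durations = rng.exponential(10.0, 5000)             # seconds between trades
# Synthetic effect: larger price revisions after short durations.
returns = rng.normal(0.0, 0.02 / np.sqrt(durations))

df = pd.DataFrame({"duration": durations, "r2": returns ** 2})
buckets = pd.qcut(df["duration"], 4,
                  labels=["shortest", "short", "long", "longest"])
# Mean squared return (volatility per trade) by duration quartile:
print(df.groupby(buckets, observed=True)["r2"].mean())
```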

Relevance:

60.00%

Abstract:

The objective of this paper is to improve option risk monitoring by examining the information content of implied volatility and by introducing the calculation of a single-sum expected risk exposure similar to Value-at-Risk. The figure is calculated in two steps: first, the value of a portfolio of options is estimated for a number of different market scenarios; second, the information content of the estimated scenarios is summarized into a single-sum risk measure. This involves the use of probability theory and return distributions, which confronts the user with the problem of non-normality in the return distribution of the underlying asset. Here, the hyperbolic distribution is used as one alternative for dealing with heavy tails. Results indicate that the information content of implied volatility is useful when predicting future large returns in the underlying asset. Further, the hyperbolic distribution provides a good fit to historical returns, enabling a more accurate definition of statistical intervals and extreme events.
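
A minimal sketch of the two-step calculation, assuming a single call option, Black-Scholes revaluation and normally distributed scenarios (the paper itself argues for hyperbolic return scenarios, which would replace the normal draws below):

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call (vectorized over S)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

S0, K, T, r, sigma = 100.0, 100.0, 0.25, 0.02, 0.2   # illustrative inputs
rng = np.random.default_rng(0)

# Step 1: portfolio value under simulated one-day market scenarios.
S_scen = S0 * np.exp(rng.normal(0, sigma * np.sqrt(1 / 252), 100_000))
pnl = bs_call(S_scen, K, T - 1 / 252, r, sigma) - bs_call(S0, K, T, r, sigma)

# Step 2: summarize the scenario distribution into a single-sum measure.
var_99 = -np.quantile(pnl, 0.01)       # 99% one-day Value-at-Risk
print(f"99% 1-day VaR: {var_99:.3f}")
```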

Relevance:

60.00%

Abstract:

Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories, which must analyze for them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate-mass-based analysis employing liquid chromatography - time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution. The hydrophilic interaction technique (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD < 10%) and accuracy (±10%) was obtained, showing performance comparable to or better than that of other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for the screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). The specificity of the assay was improved by removing interfering matrix compounds with size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for the comprehensive screening of chemically different low and high molecular weight compounds, for the quantification of threshold substances, and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. LC-TOFMS can therefore be exploited widely in doping control, reducing the need for several separate analysis techniques.
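
The reverse database search criteria can be illustrated with a small sketch. The tolerances below are hypothetical, the morphine m/z is the standard [M+H]+ value, and nothing here is the laboratory's actual software:

```python
# Sketch: an entry is flagged when retention time (RT) and accurate mass
# both fall inside tolerance windows; mass accuracy is reported in mDa.

def mass_error_mda(measured_mz: float, theoretical_mz: float) -> float:
    """Mass error in millidaltons (mDa) for singly charged ions."""
    return (measured_mz - theoretical_mz) * 1000.0

def matches(rt_obs, rt_ref, mz_obs, mz_ref,
            rt_tol=0.2, mz_tol_mda=1.0) -> bool:
    return (abs(rt_obs - rt_ref) <= rt_tol and
            abs(mass_error_mda(mz_obs, mz_ref)) <= mz_tol_mda)

# Protonated morphine has a theoretical m/z of 286.1438:
print(mass_error_mda(286.1445, 286.1438))     # ~0.7 mDa
print(matches(6.1, 6.0, 286.1445, 286.1438))  # True
```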

Relevance:

30.00%

Abstract:

Event-based systems are seen as good candidates for supporting distributed applications in dynamic and ubiquitous environments because they support decoupled and asynchronous many-to-many information dissemination. Event systems are widely used, because asynchronous messaging provides a flexible alternative to RPC (Remote Procedure Call). They are typically implemented using an overlay network of routers. A content-based router forwards event messages based on filters that are installed by subscribers and other routers. The filters are organized into a routing table in order to forward incoming events to proper subscribers and neighbouring routers. This thesis addresses the optimization of content-based routing tables organized using the covering relation and presents novel data structures and configurations for improving local and distributed operation. Data structures are needed for organizing filters into a routing table that supports efficient matching and runtime operation. We present novel results on dynamic filter merging and the integration of filter merging with content-based routing tables. In addition, the thesis examines the cost of client mobility using different protocols and routing topologies. We also present a new matching technique called temporal subspace matching. The technique combines two new features. The first feature, temporal operation, supports notifications, or content profiles, that persist in time. The second feature, subspace matching, allows more expressive semantics, because notifications may contain intervals and be defined as subspaces of the content space. We also present an application of temporal subspace matching pertaining to metadata-based continuous collection and object tracking.
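
A minimal sketch of the covering relation and a covering-optimized routing table may clarify the optimization being discussed. The interval-filter semantics below are an assumption for illustration; the thesis's data structures are more elaborate:

```python
# A filter maps attribute names to closed intervals; F1 covers F2 when
# every notification matching F2 also matches F1. A covering-based
# routing table then needs to store only the most general filters.

Filter = dict[str, tuple[float, float]]   # attribute -> (low, high)

def covers(f1: Filter, f2: Filter) -> bool:
    """True if f1 covers f2: every attribute constrained by f1 is
    constrained at least as tightly by f2."""
    return all(a in f2 and f1[a][0] <= f2[a][0] and f2[a][1] <= f1[a][1]
               for a in f1)

def insert(table: list[Filter], f: Filter) -> None:
    if any(covers(g, f) for g in table):               # f is redundant
        return
    table[:] = [g for g in table if not covers(f, g)]  # drop covered filters
    table.append(f)

table: list[Filter] = []
insert(table, {"price": (0, 100)})
insert(table, {"price": (10, 50)})   # covered -> not stored
print(table)                         # [{'price': (0, 100)}]
```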

Relevance:

30.00%

Abstract:

Online content services can benefit greatly from personalisation features that enable the delivery of content suited to each user's specific interests. This thesis presents a system that applies text analysis and user modeling techniques in an online news service for the purposes of personalisation and user interest analysis. The system creates a detailed thematic profile for each content item and observes users' actions on content items to learn their preferences. A handcrafted taxonomy of concepts, or ontology, is used in profile formation to extract relevant concepts from the text. User preference learning is automatic, and there is no need for explicit preference settings or ratings from the user. Learned user profiles are segmented into interest groups using clustering techniques, with the objective of providing a source of information for the service provider. Some theoretical background for the chosen techniques is presented, while the main focus is on finding practical solutions to current information needs that are not optimally served by traditional techniques.
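
A toy sketch of implicit profile learning over a concept taxonomy (the mechanics and the four-concept ontology below are assumptions for illustration, not the system's code):

```python
import numpy as np

# Each content item carries a concept-weight vector over the ontology;
# a user profile is learned implicitly as an exponentially weighted
# average of the items the user acts upon -- no explicit ratings needed.

CONCEPTS = ["politics", "economy", "sports", "culture"]   # toy ontology

def update_profile(profile: np.ndarray, item: np.ndarray,
                   rate: float = 0.1) -> np.ndarray:
    """Move the profile toward a clicked item's thematic profile."""
    return (1 - rate) * profile + rate * item

profile = np.zeros(len(CONCEPTS))
clicked = [np.array([0.7, 0.3, 0.0, 0.0]),   # mostly politics
           np.array([0.6, 0.4, 0.0, 0.0])]
for item in clicked:
    profile = update_profile(profile, item)
print(dict(zip(CONCEPTS, profile.round(3))))

# Learned profiles can then be clustered (e.g., with k-means) into the
# interest groups mentioned above.
```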

Relevance:

30.00%

Abstract:

Cord blood is a well-established alternative to bone marrow and peripheral blood stem cell transplantation. To this day, over 400 000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have been focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified. In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought to represent general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as a part of the evaluation of cord blood collections for banking. The quality of cord blood units has been the focus of several recent studies. However, hemostasis activation during cord blood collection is scarcely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), indicating platelet activation. Altogether three sample series were collected during the set-up of the cord blood bank as well as after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and was at a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized. The assessment of hemostasis activation could be implemented as a part of process control in cord blood banks. Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and the scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. 
This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.

Relevance:

30.00%

Abstract:

The research question of this thesis was how knowledge can be managed with information systems. Information systems can support, but not replace, knowledge management. Systems can mainly store the epistemic organisational knowledge included in content, and process data and information. Certain value can be achieved by adding communication technology to the systems. Not all communication, however, can be managed. A new layer between communication and manageable information was named knowformation. The knowledge management literature was surveyed, together with conceptions of information from philosophy, physics, communication theory and information systems science. Positivism, post-positivism and critical theory were studied, but knowformation in extended organisational memory appeared to be socially constructed. A memory management model of an extended enterprise (M3.exe) and the knowformation concept were the findings of iterative case studies covering data, information and knowledge management systems. The cases varied from groups to an extended organisation. The systems were investigated, and administrators, users (knowledge workers) and managers were interviewed. Building the model required alternative sets of data, information and knowledge, instead of the traditional pyramid. The explicit-tacit dichotomy was also reconsidered. As human knowledge is the final aim of all data and information in the systems, the distinction between the management of information and the management of people was harmonised. Information systems were classified as the core of organisational memory. In practice, the content of the systems lies between communication and presentation. Firstly, the epistemic criterion of knowledge is required neither in the knowledge management literature nor of the content of the systems. Secondly, systems deal mostly with containers, while the knowledge management literature deals with applied knowledge. The construction of reality based on system content and communication also supports the knowformation concept. Knowformation belongs to the memory management model of an extended enterprise (M3.exe), which is divided into horizontal and vertical key dimensions. Vertically, processes deal with content that can be managed, whereas communication can be supported, mainly by infrastructure. Horizontally, the right-hand side of the model contains systems and the left-hand side content, which should be independent of each other. A strategy based on the model was defined.

Relevance:

30.00%

Abstract:

Powders are essential materials in the pharmaceutical industry, being involved in the majority of all drug manufacturing. Powder flow and particle size are central particle properties addressed by means of particle engineering. The aim of the thesis was to gain knowledge on powder processing with restricted liquid addition, with a primary focus on particle coating and early granule growth, and to characterise processes of this kind. A thin coating layer of hydroxypropyl methylcellulose was applied to individual particles of ibuprofen in a fluidised bed top-spray process. The polymeric coating improved the flow properties of the powder. The improvement was strongly related to relative humidity, which can be seen as an indicator of a change in surface hydrophilicity caused by the coating. The ibuprofen used in the present study had a d50 of 40 μm and thus belongs to the Geldart group C powders, which can be considered challenging materials in top-spray coating processes. Ibuprofen was similarly coated using a novel ultrasound-assisted coating method. The results were in line with those obtained from powders coated in the fluidised bed process mentioned above. The ultrasound-assisted method was found to be capable of coating single particles with a simple and robust setup. Granule growth in a fluidised bed process was inhibited by feeding the liquid in pulses. The results showed that the length of the pulsing cycles is important and can be used to adjust granule growth. Moreover, pulsed liquid feed was found to be of greater significance to granule growth at high inlet air relative humidity. Liquid feed pulsing can thus be used as a tool for particle size targeting in fluidised bed processes and for compensating for changes in the relative humidity of the inlet air. The function of a two-fluid external mixing pneumatic nozzle, typical of small-scale pharmaceutical fluidised bed processes, was studied in situ in an ongoing fluidised bed process with particle tracking velocimetry. It was found that the liquid droplets undergo coalescence as they proceed away from the nozzle head. The coalescence was expected to increase droplet speed, which was confirmed in the study. The spray turbulence was studied, and the results showed turbulence caused by the event of atomisation and by the oppositely directed fluidising air. It was concluded that particle tracking velocimetry is a suitable tool for in situ spray characterisation. Light transmission through dense particulate systems was found to carry information on particle size and packing density, as expected based on the theory of light scattering by solids. It was possible to differentiate binary blends consisting of components with differences in optical properties. Light transmission showed potential as a rapid, simple and inexpensive tool for the characterisation of particulate systems, giving information on changes in particle systems that could be utilised in basic process diagnostics.