Abstract:
The coagulation system of newborn infants differs markedly from that of older children and adults. The activities of most coagulation factors and anticoagulants are low, leading to altered regulation in the formation of the key enzyme, thrombin. Timely and adequate generation of thrombin is essential, as thrombin activates platelets and many coagulation factors, cleaves fibrinogen into fibrin and activates the antithrombotic and anti-inflammatory protein C pathway. On the other hand, excess thrombin may promote thrombotic complications and exacerbate harmful inflammatory reactions. Despite these characteristic features, the newborn coagulation system can be considered physiological, since healthy newborns rarely show haemorrhagic or thrombotic complications. Sick newborns, however, often encounter clinical situations that challenge their coagulation system. The aim of this study was to clarify the behaviour of the neonatal coagulation system in selected clinical situations, with a special emphasis on the generation of thrombin. Thrombin was measured by in vivo thrombin generation markers and by thrombin generation potential in vitro. The patient groups included sick newborns undergoing intensive care and receiving fresh-frozen plasma (FFP), requiring exchange transfusions (ET) or presenting with a congenital heart defect requiring open heart surgery. Additionally, healthy newborns with an inherited heterozygous factor V Leiden (FVL) mutation were studied. Thrombin generation potential was also analysed in cord plasma of healthy infants and in adults. Healthy as well as sick newborn infants showed a lower total thrombin generation potential in vitro but faster initiation of thrombin generation than adults. These findings were qualitatively similar when plasma was supplemented with platelets. Platelets, however, significantly altered the effect of the major anticoagulant, activated protein C (APC), on thrombin generation potential. In accordance with previous studies, thrombin generation in healthy newborn platelet-poor plasma was resistant to the anticoagulant effects of APC, but when the plasma was supplemented with platelets, APC attenuated thrombin generation significantly more in newborns than in adults. In vivo generation of thrombin was elevated in nearly all of the sick newborn infants. The low-volume FFP transfusion and the replacement of neonatal with adult blood in ET exerted markedly different effects on neonatal thrombin generation. FFP reduced the in vivo generation of thrombin in those newborns with the highest pretransfusional thrombin generation, thus acting as an anticoagulant agent. In those infants with lower pretransfusional thrombin generation, the effect of FFP on thrombin generation was fairly neutral. On the other hand, the combination of red blood cells and FFP, used to perform ET, significantly increased in vivo thrombin formation and shifted the balance in the newborn coagulation system in the procoagulant direction. Cardiopulmonary bypass (CPB) also significantly increased in vivo thrombin generation, but the thrombin generation profile during CPB differed from that previously observed in adults. The escalation of thrombin at early reperfusion was not observed in newborns; in adults, its occurrence is associated with postoperative myocardial damage. Finally, in healthy newborns with FVL heterozygosity, faster initiation of thrombin generation was observed compared with controls. 
Interestingly, the FV level was lower in FVL-heterozygous infants, possibly to counteract the procoagulant effects induced by FVL. In conclusion, unique features of thrombin regulation in newborn infants were observed. These features included a novel platelet effect on the regulation of the protein C pathway. The clinical challenges mainly seemed to shift the balance in the coagulation system of newborns in the procoagulant direction. Blood component transfusions markedly affected coagulation in a manner that was specific to the product but could also be altered by the clinical situation. Overall, the results highlight the need to understand developmental haemostasis for both diagnostic and therapeutic purposes.
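As background to the in vitro measurements referred to above, thrombin generation curves are commonly summarised by a lag time, a peak value and the area under the curve (the endogenous thrombin potential). The sketch below only illustrates how such parameters can be read off a thrombin-versus-time curve, using invented numbers; it is not the assay or the analysis used in this study.

```python
import numpy as np

def thrombin_curve_parameters(t_min, thrombin_nM, lag_fraction=0.1):
    """Summarise a thrombin-vs-time curve (illustrative sketch only).

    t_min        -- time points (minutes)
    thrombin_nM  -- thrombin concentration at each time point (nM)
    lag_fraction -- fraction of the peak used here to define the lag time
                    (an assumption made for this sketch)
    """
    t = np.asarray(t_min, dtype=float)
    y = np.asarray(thrombin_nM, dtype=float)

    peak = y.max()                    # peak thrombin (nM)
    time_to_peak = t[y.argmax()]      # minutes
    above = np.nonzero(y >= lag_fraction * peak)[0]
    lag_time = t[above[0]] if above.size else float("nan")
    etp = np.trapz(y, t)              # endogenous thrombin potential (nM*min)
    return {"lag_time": lag_time, "peak": peak,
            "time_to_peak": time_to_peak, "ETP": etp}

# Invented curve: an early but modest thrombin burst gives a short lag time
# together with a low ETP, the qualitative pattern described for newborns.
t = np.linspace(0, 60, 121)
curve = 150 * np.exp(-((t - 12) / 6.0) ** 2)
print(thrombin_curve_parameters(t, curve))
```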
Abstract:
The geomagnetic field is one of the most fundamental geophysical properties of the Earth and has significantly contributed to our understanding of the Earth's internal structure and evolution. Paleomagnetic and paleointensity data have been crucial in shaping concepts like continental drift and magnetic reversals, as well as in estimating when the Earth's core and the associated geodynamo processes began. The work of this dissertation is based on reliable Proterozoic and Holocene geomagnetic field intensity data obtained from rocks and archeological artifacts. New archeomagnetic field intensity results are presented for Finland, Estonia, Bulgaria, Italy and Switzerland. The data were obtained using sophisticated laboratory setups as well as various reliability checks and corrections. Inter-laboratory comparisons between three laboratories (Helsinki, Sofia and Liverpool) were performed in order to check the reliability of different paleointensity methods. The new intensity results fill considerable gaps in the master curves for each region investigated. In order to interpret the paleointensity data of the Holocene period, a novel and user-friendly database (GEOMAGIA50) was constructed. This provided a new tool for independently testing the reliability of the various techniques and materials used in paleointensity determinations. The results show that archeological artifacts, if well fired, are the most suitable materials. Lavas also yield reliable paleointensity results, although these appear more scattered. This study also shows that reliable estimates are obtained using the Thellier methodology (and its modifications) with reliability checks. Global paleointensity curves for the Paleozoic and Proterozoic have several time gaps with few or no intensity data. To define the global intensity behavior of the Earth's magnetic field during these times, new rock types (meteorite impact rocks) were investigated. Two case histories are presented. The Ilyinets (Ukraine) impact melt rocks yielded a reliable paleointensity value at 440 Ma (Silurian), whereas the results from the Jänisjärvi impact melts (Russian Karelia, ca. 700 Ma) might be biased towards high intensity values because of non-ideal magnetic mineralogy. The features of the geomagnetic field at 1.1 Ga are not well defined due to problems related to the reversal asymmetries observed in Keweenawan data from the Lake Superior region. In this work, new paleomagnetic, paleosecular variation and paleointensity results are reported from coeval diabases from central Arizona and help in understanding the asymmetry. The results confirm earlier preliminary observations that the asymmetry is larger in Arizona than in the Lake Superior area. Two of the mechanisms proposed to explain the asymmetry remain plausible: plate motion and non-dipole influence.
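As a point of reference for the Thellier-type determinations mentioned above, the method rests on the proportionality between the natural remanence lost and the laboratory thermoremanence gained during stepwise heating: the ancient field strength equals the absolute slope of the NRM-TRM (Arai) plot multiplied by the known laboratory field. The sketch below illustrates only that basic relation with invented data; it is not the laboratory protocol or the reliability checks used in this work.

```python
import numpy as np

def thellier_paleointensity(nrm_remaining, trm_gained, b_lab_uT):
    """Estimate the ancient field from Arai-plot data (illustration only).

    nrm_remaining -- natural remanence left after each temperature step
    trm_gained    -- laboratory thermoremanence acquired at the same steps
    b_lab_uT      -- laboratory field in microtesla
    The ancient field is |slope of NRM vs TRM| * B_lab.
    """
    slope, _intercept = np.polyfit(trm_gained, nrm_remaining, 1)
    return abs(slope) * b_lab_uT

# Invented, ideally behaved specimen: NRM decays linearly as TRM grows.
trm = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
nrm = np.array([1.0, 0.84, 0.68, 0.52, 0.36, 0.20])
print(thellier_paleointensity(nrm, trm, b_lab_uT=50.0))  # about 40 microtesla
```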
Abstract:
Fusion energy is a clean and safe solution to the intricate question of how to produce non-polluting and sustainable energy for the constantly growing population. The fusion process does not result in any harmful waste or greenhouse gases, since a small amount of helium is the only by-product produced when using the hydrogen isotopes deuterium and tritium as fuel. Moreover, deuterium is abundant in seawater and tritium can be bred from lithium, a common metal in the Earth's crust, rendering the fuel reservoirs practically bottomless. Due to its enormous mass, the Sun has been able to utilize fusion as its main energy source ever since it was born, but here on Earth we must find other means to achieve the same. Inertial fusion involving powerful lasers and thermonuclear fusion employing extreme temperatures are examples of successful methods; however, these have yet to produce more energy than they consume. In thermonuclear fusion, the fuel is held inside a tokamak, which is a doughnut-shaped chamber with strong magnets wrapped around it. Once the fuel is heated up, it is controlled with the help of these magnets, since the required temperatures (over 100 million degrees C) will separate the electrons from the nuclei, forming a plasma. Once fusion reactions occur, excess binding energy is released as energetic neutrons, which are absorbed in water in order to produce steam that runs turbines. Keeping the power losses from the plasma low, thus allowing for a high number of reactions, is one challenge. Another challenge is related to the reactor materials, since the confinement of the plasma particles is not perfect, resulting in particle bombardment of the reactor walls and structures. Material erosion and activation as well as plasma contamination are expected. In addition, the high-energy neutrons will cause radiation damage in the materials, causing, for instance, swelling and embrittlement. In this thesis, the behaviour of materials situated in a fusion reactor was studied using molecular dynamics simulations. Simulations of processes in the next-generation fusion reactor ITER involve the reactor materials beryllium, carbon and tungsten, as well as the plasma hydrogen isotopes. This means that interaction models, i.e. interatomic potentials, for this complicated quaternary system are needed. The task of finding such potentials is nevertheless nearly complete, since models for the beryllium-carbon-hydrogen interactions were constructed in this thesis, and as a continuation of that work a beryllium-tungsten model is under development. These potentials are combinable with the earlier tungsten-carbon-hydrogen ones. The potentials were used to explain the chemical sputtering of beryllium under deuterium plasma exposure. In experiments, a large fraction of the sputtered beryllium atoms were observed to be released as BeD molecules, and the simulations identified the swift chemical sputtering mechanism, previously not believed to be important in metals, as the underlying mechanism. Radiation damage in the reactor structural materials vanadium, iron and iron-chromium, as well as in the wall material tungsten and the mixed alloy tungsten carbide, was also studied in this thesis. Interatomic potentials for vanadium, tungsten and iron were modified to be better suited for simulating the collision cascades that form during particle irradiation, and the potential features affecting the resulting primary damage were identified. 
Including the often-neglected electronic effects in the simulations was also shown to have an impact on the damage. With proper tuning of the electron-phonon interaction strength, experimentally measured quantities related to ion-beam mixing in iron could be reproduced. The damage in tungsten carbide alloys showed elemental asymmetry, as the major part of the damage consisted of carbon defects. On the other hand, modelling the damage in the iron-chromium alloy, essentially representing steel, showed that small additions of chromium do not noticeably affect the primary damage in iron. Since a complete assessment of the response of a material in a future full-scale fusion reactor is not achievable using experimental techniques alone, molecular dynamics simulations are of vital help. This thesis has not only provided insight into complicated reactor processes and improved current methods, but also offered tools for further simulations. It is therefore an important step towards making fusion energy more than a future goal.
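The simulations described above rely on analytical bond-order potentials for the beryllium-carbon-hydrogen-tungsten system and on large collision-cascade runs, which cannot be reproduced in a few lines. Purely as an illustration of the basic molecular dynamics machinery, the sketch below integrates a toy Lennard-Jones system with the velocity-Verlet algorithm; the potential and the parameter values are generic placeholders, not those of any reactor material.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Pairwise Lennard-Jones forces in reduced units (toy model)."""
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r_vec = pos[i] - pos[j]
            r = np.linalg.norm(r_vec)
            # -dU/dr of 4*eps*((sigma/r)**12 - (sigma/r)**6)
            f_mag = 24 * eps * (2 * (sigma / r) ** 12 - (sigma / r) ** 6) / r
            f = f_mag * r_vec / r
            forces[i] += f
            forces[j] -= f
    return forces

def velocity_verlet(pos, vel, mass=1.0, dt=1e-3, steps=1000):
    """Advance the toy system with the velocity-Verlet integrator."""
    f = lj_forces(pos)
    for _ in range(steps):
        pos = pos + vel * dt + 0.5 * (f / mass) * dt ** 2
        f_new = lj_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
    return pos, vel

# Three atoms in reduced units; in a cascade simulation one atom would
# instead be given a large initial recoil velocity.
pos = np.array([[0.0, 0.0, 0.0], [1.12, 0.0, 0.0], [0.0, 1.12, 0.0]])
vel = np.zeros_like(pos)
print(velocity_verlet(pos, vel)[0])
```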
Abstract:
Electronic document management (EDM) technology has the potential to enhance information management in construction projects considerably, without radical changes to current practice. Over the past fifteen years this topic has been overshadowed by building product modelling in the construction IT research world, but at present EDM is quickly being introduced in practice, in particular in bigger projects. Often this is done in the form of third-party services available over the World Wide Web. The paper presents a typology of research questions and methods, which is used to position the individual research efforts surveyed in the paper. Questions dealt with include: What features should EDM systems have? How much are they used? Are there benefits from use and how should these be measured? What are the barriers to widespread adoption? Which technical questions need to be solved? Is there scope for standardisation? How will the market for such systems evolve?
Abstract:
We all have fresh in our memory what happened to the IT sector only a few years ago when the IT bubble burst. The upswing of productivity in this sector slowed down, investors lost large investments, many found themselves looking for a new job, and countless dreams fell apart. Product developers in the IT sector have experienced a large number of organizational restructurings since the IT boom, including rapid growth, downsizing processes, and structural reforms. Organizational restructurings seem to be a complex and continuous phenomenon people in this sector have to deal with. How do software product developers retrospectively construct their work in relation to organizational restructurings? How do organizational restructurings bring about specific social processes in product development? This working paper focuses on these questions. The overall aim is to develop an understanding of how software product developers construct their work during organizational restructurings. The theoretical frame of reference is based on a social constructionist approach and discourse analysis. This approach offers more or less radical and critical alternatives to mainstream organizational theory. Writings from this perspective attempt to investigate and understand the sociocultural processes by which various realities are created. These studies therefore aim at showing how people participate in constituting the social world (Gergen & Thatchenkery, 1996); knowledge of the world is seen to be constructed between people in daily interaction, in which language plays a central role. This means that interaction, especially the ways of talking and writing about product development during organizational restructurings, becomes the target of concern. This study consists of 25 in-depth interviews following a pilot study based on 57 semi-structured interviews. In this working paper I analyze 9 of the in-depth interviews. The interviews were conducted in eight IT firms. The analysis explores how discourses are constructed and how they function, as well as the consequences that follow from different discourses. The analysis shows that even though the product developers have experienced many organizational restructurings, some of which have been far-reaching, their accounts build strongly on a stability discourse. According to this discourse, product development is, perhaps surprisingly, not influenced to a great extent by organizational restructurings. This does not mean that product development is static. According to the social constructionist approach, product development is constantly being reproduced and maintained in ongoing processes. In other words, stable effects are also ongoing achievements, and these are of particular interest in this study. The product developers maintain rather than change product development through ongoing processes of construction, even when they experience continuous extensive organizational restructurings. The discourse of stability exists alongside other discourses, some of which contradict each other. Together they direct product development and generate meanings. The product developers consequently take an active role in the construction of their work during organizational restructurings. When doing this, they also negotiate credible positions for themselves.
Abstract:
Many Finnish IT companies have gone through numerous organizational changes over the past decades. This book draws attention to how stability may be central to software product development experts and IT workers more generally, who continuously have to cope with such change in their workplaces. It does so by analyzing and theorizing change and stability as intertwined and co-existent, thus throwing light on how it is possible that, for example, even if "the walls fall down, the blokes just code" and maintain a sense of stability in their daily work. Rather than reproducing the picture of software product development as exciting cutting-edge activity and organizational change as dramatic episodes, the study takes the reader beyond the myths surrounding these phenomena to the mundane practices, routines and organizings in product development during organizational change. An analysis of these ordinary practices offers insights into how software product development experts actively engage in constructing stability during organizational change through a variety of practices, including solidarity, homosociality, close relations to products, instrumental or functional views on products, preoccupations with certain tasks and humble obedience. Consequently, the study shows that it may be more appropriate to talk about varieties of stability, characterized by a multitude of practices of stabilizing rather than states of stagnation. Looking at different practices of stability in depth shows the creation of software as an arena for micro-politics, power relations and increasing pressures for order and formalization. The thesis gives particular attention to power relations and processes of positioning following organizational change: how social actors come to understand themselves in the context of ongoing organizational change, how they comply with and/or contest dominant meanings, how they identify and dis-identify with formalization, and how power relations are often reproduced despite dis-identification. Related to processes of positioning, the reader is also given a glimpse into what being at work in a male-dominated and relatively homogeneous work environment looks like. It shows how the strong presence of men, or "blokes", of a particular age and education seems to become invisible in workplace talk that appears non-conscious of gender.
Abstract:
Both inherited genetic variations and somatically acquired mutations drive cancer development. The aim of this thesis was to gain insight into the molecular mechanisms underlying colorectal cancer (CRC) predisposition and tumor progression. Whereas one-third of CRCs may develop in the context of hereditary predisposition, the known highly penetrant syndromes explain only a small fraction of all cases. Genome-wide association studies have shown that ten common single nucleotide polymorphisms (SNPs) modestly predispose to CRC. Our population-based sample series of around a thousand CRC cases and healthy controls was genotyped for these SNPs. Tumors of heterozygous patients were analyzed for allelic imbalance, in an attempt to reveal the role of these SNPs in somatic tumor progression. The risk allele of rs6983267 at 8q24 was favored in the tumors significantly more often than the neutral allele, indicating that this germline variant is somatically selected for. No imbalance targeting the risk allele was observed at the remaining loci, suggesting that most of the low-penetrance CRC SNPs mainly play a role in the early stages of the neoplastic process. The ten SNPs were further analyzed in 788 CRC cases, 97 of which had a family history of CRC, to evaluate their combined contribution. A significant association appeared between the overall number of risk alleles and familial CRC, and these ten SNPs seem to explain around 9% of the familial clustering of CRC. Finding more CRC susceptibility alleles may facilitate individualized risk prediction and cancer prevention in the future. Microsatellite instability (MSI), resulting from defective mismatch repair function, is a hallmark of Lynch syndrome and is observed in a subset of all CRCs. Our aim was to identify microsatellite frameshift mutations that inactivate tumor suppressor genes in MSI CRCs. By sequencing microsatellite repeats of underexpressed genes we found six novel MSI target genes that were frequently mutated in 100 MSI CRCs: 51% in GLYR1, 47% in ABCC5, 43% in WDTC1, 33% in ROCK1, 30% in OR51E2, and 28% in TCEB3. Immunohistochemical staining of GLYR1 revealed defective protein expression in homozygously mutated tumors, providing further support for the loss-of-function hypothesis. Another mutation screening effort sought to identify MSI target genes with putative oncogenic functions. Microsatellites were similarly sequenced in genes that were overexpressed and, upon mutation, predicted to avoid nonsense-mediated mRNA decay. The mitotic checkpoint kinase TTK harbored protein-elongating mutations in 59% of MSI CRCs, and the mutant protein was detected in heterozygous MSI CRC cells. However, no checkpoint dysregulation or defective protein localization was observed, and the biological relevance of this mutation may hence be related to other mechanisms. In conclusion, these two large-scale and unbiased efforts identified frequently mutated genes that are likely to contribute to the development of this cancer type and may be utilized in developing diagnostic and therapeutic applications.
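The combined-risk analysis summarised above essentially counts risk alleles over the ten SNPs for each individual and compares the resulting burden between familial and sporadic cases. The sketch below is a deliberately simplified, hypothetical illustration of that idea (genotypes coded as 0, 1 or 2 copies of the risk allele, toy data); it is not the statistical model applied to the 788 cases in the study.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def risk_allele_burden(genotypes):
    """Sum risk-allele copies over SNPs for each individual.

    genotypes -- array of shape (n_individuals, n_snps) with values
                 0, 1 or 2 = copies of the risk allele at each SNP.
    """
    return np.asarray(genotypes).sum(axis=1)

# Hypothetical toy data: 5 familial and 5 sporadic cases, 10 SNPs each.
rng = np.random.default_rng(0)
familial = rng.integers(0, 3, size=(5, 10))
sporadic = rng.integers(0, 3, size=(5, 10))

burden_fam = risk_allele_burden(familial)
burden_spo = risk_allele_burden(sporadic)

# A simple two-sample test on the allele burden; the thesis itself would
# use a more appropriate association analysis on the full case series.
stat, p = mannwhitneyu(burden_fam, burden_spo, alternative="two-sided")
print(burden_fam, burden_spo, p)
```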
Abstract:
The aim of this master's thesis in consumer economics is to examine gambling advertisements in Finland over a period of 35 years, from 1970 to 2006. Veikkaus Oy (later Veikkaus) was founded in 1940 as one of the three licensed gambling organizations in Finland. The material for the current research comprised 1,494 advertisements published by Veikkaus in newspapers and magazines during that period. Veikkaus has the exclusive licence to organize lotto games, sport games, instant games and other draw games in Finland. The other two operators, the Finnish Slot Machine Association RAY and Fintoto (on-track horse betting), were not included in the current analysis. This study was completed under a research contract and grant from the Finnish Foundation for Gaming Research (Pelitoiminnan tutkimussäätiö). In general, advertisements reflect the surrounding culture and time, and their message is built on stratified meanings, symbols and codes. Advertising draws the viewer's attention, introduces the advertised subject and, finally, affects the individual's consumption habits. However, advertisements not only work on the individual level but also influence public perception of the advertised product. Firstly, in order to assess gambling as a phenomenon, this paper discusses gambling as consumer behaviour and reviews the history of gambling in Finland. Winning is a major feature of gambling, and dreaming about a positive change of life is at the centre of most gambling ads. However, perceived excitement through the risk of losing can also be featured in gambling ads. Secondly, this study utilizes Veikkaus's large advertising archives, whose advertising data are analyzed using content analysis and semiotic analysis. The two methods were employed to support each other in a synergistic way: content analysis helps to achieve accuracy and comprehensiveness, while semiotic analysis allows a deeper and more sensitive analysis of the emerging findings and occurrences. It is important to understand the advertised product, as advertising is bound to its culture and time; hence, to analyze advertising, it is important to understand the environment in which the ads appear. Content analysis of the Veikkaus data identified the main gambling game and the principal advertisement style for each period. Interestingly, nearly half of the Veikkaus advertisements promoted topics other than just winning the bet. Advertisements for games of chance, like Lotto, typically represented dreams about winning indirectly. In the category of skill gambling, betting was represented as an investment, and the excitement of sporting expertise was emphasized. In addition, a number of gambling ads emphasized the social responsibility of Veikkaus as a government-guided organization. Semiotic methods were employed to further elaborate on the findings of the content analysis. Dreaming was represented in the advertisements by product symbols (e.g. cars and homes) that were found to have a significant connection with each other. Thus, the advertising represents the change of life obtained by winning. Interestingly, gambling ads promoting jackpots often drew on religious symbolism. Ads promoting social responsibility were found to be most common during the economic depression of the 1990s. Deeper analysis showed that at that time advertisements frequently represented depression-related meanings, such as unemployment and bank loans. 
Skill gaming ads were often built around sports expertise; in the late 1990s their number started rising sharply and continued to increase until 2006 (when this study ended). One may conclude that sport betting draws its meanings from the relevant consumer culture and from the rules and features of the sport being bet on.
Abstract:
A significant share of originator pharmaceutical companies' research and development expenditure appears to be directed at the further development of existing medicines. This can presumably lead to interesting formulation development strategies. The purpose of this study was to determine whether trends in pharmaceutical product development can be detected from granted marketing authorisations. The study also examined the life-cycle management strategies used by the largest pharmaceutical companies to protect their best-selling products from generic competition and to secure market share. The focus of the study was on solid oral dosage forms. A combination of qualitative and quantitative methods was used to obtain a broad perspective on the topic. Interviews with Finnish marketing authorisation officials were used to gather background information for the quantitative part of the study. The quantitative part consisted of marketing authorisation databases covering authorisations granted in Finland through all procedures, authorisations granted in the EU through the centralised procedure, and the world's ten largest pharmaceutical companies in the USA. The results show a significant increase in the number of generic medicines among marketing authorisations granted in Finland through all procedures and in the EU through the centralised procedure in 2000-2010. This change may be at least partly due to legislative changes that created incentives for the use and manufacture of generics, such as generic substitution and the reference price system. The US data demonstrated the large manufacturers' interest in life-cycle management: the majority of marketing authorisations granted to the world's ten largest pharmaceutical companies in 2005-2010 served this purpose. The ratio of life-cycle management products to new active substances was nearly 4:1. The solid oral dosage form is indisputably the most popular way to administer a medicine, which was confirmed by both the assessor interviews and the marketing authorisation data. The role of solid oral forms was even more pronounced among generics. When innovativeness was measured by the number of atypical dosage forms, the US data on solid oral dosage forms showed strong innovativeness compared with the Finnish and EU data. This may reflect the innovative product portfolios of the large pharmaceutical companies. The share of atypical solid oral dosage forms was considerably smaller among generics than among originator products in all regions. The most commonly used life-cycle management strategies were a new formulation, a new strength and a new combination of an existing product. For solid oral dosage forms, two-thirds of the new life-cycle management formulations were modified-release products. Life-cycle management is an integral part of the business strategy of large pharmaceutical companies, and its importance was illustrated with a case example of Coreg tablets.
Abstract:
Generation of raw materials for dry powder inhalers by different size reduction methods can be expected to influence the physical and chemical properties of the powders. This can cause differences in particle size, size distribution, shape, crystalline properties, surface texture and surface energy. These physical properties of powders influence the behaviour of particles before and after inhalation. Materials with an amorphous surface have a different surface energy compared to materials with a crystalline surface. This can affect the adhesion and cohesion of particles. Changes in the surface nature of the drug particles result in a change in product performance. Through stabilization of the raw materials, the amorphous surfaces are converted into crystalline surfaces. The primary aim of the study was to investigate the influence of the surface properties of the inhalation particles on the quality of the product. The quality of the inhalation product is evaluated by measuring the fine particle dose (FPD). FPD is the total dose of particles with aerodynamic diameters smaller than 5.0 µm. The secondary aim of this study was to achieve the target level of the FPD and the stability of the FPD. The study was also used to evaluate the importance of the stabilization of the inhalation powders. The study included manufacturing and analysing 200 µg/dose drug substance inhalation powder batches using non-stabilized or stabilized raw materials. The inhaler formulation consisted of micronized drug substance, lactose <100 µm and micronized lactose <10 µm. The inhaler device was the Easyhaler. Stabilization of the raw materials was carried out at different relative humidities, temperatures and durations. The surface properties of the raw materials were studied by dynamic vapour sorption, scanning electron microscopy and the three-point nitrogen adsorption technique. Particle size was studied with a laser diffraction particle size analyzer. The aerodynamic particle size distribution from the inhalers was measured with a Next Generation Impactor. Stabilization of all three raw materials was successful. A clear difference between non-stabilized and stabilized raw materials was achieved for the drug substance and lactose <10 µm; however, for lactose <100 µm the difference was not as clear as desired. The surface of the non-stabilized drug substance was more irregular and the particles had more surface roughness compared to the surface of the stabilized drug substance particles. The surface of the stabilized drug particles was more regular and smoother than that of the non-stabilized particles. Even though a good difference between stabilized and non-stabilized raw materials was achieved, clear evidence of the effect of the surface properties of the inhalation particles on the quality of the product was not observed. Stabilization of the raw materials did not lead to a higher FPD. Possible explanations for this unexpected result might be too harsh conditions during stabilization of the drug substance, or a smaller-than-desired difference in the degree of stabilization of the product's main component, lactose <100 µm. Although positive effects on the quality of the product were not seen, there appears to be some evidence that a stabilized drug substance results in a smaller particle size in dry powder inhalers.
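For context, a fine particle dose of the kind reported above is conventionally derived from impactor data by interpolating the cumulative drug mass collected below each stage cut-off diameter at the 5.0 µm limit. The sketch below illustrates that calculation with invented stage cut-offs and masses; it uses a simplified linear interpolation and is not the exact data treatment applied to the Easyhaler batches in this study.

```python
import numpy as np

def fine_particle_dose(cutoff_um, mass_ug, limit_um=5.0):
    """Interpolate the cumulative mass below `limit_um` (illustrative sketch).

    cutoff_um -- impactor stage cut-off diameters, descending order (µm)
    mass_ug   -- drug mass collected on each corresponding stage (µg)
    """
    cutoffs = np.asarray(cutoff_um, dtype=float)
    masses = np.asarray(mass_ug, dtype=float)
    # Cumulative mass of particles finer than each cut-off: everything
    # collected on the stages below that stage.
    cum_below = np.array([masses[i + 1:].sum() for i in range(len(cutoffs))])
    # Interpolate at the size limit (np.interp needs ascending x values).
    order = np.argsort(cutoffs)
    return float(np.interp(limit_um, cutoffs[order], cum_below[order]))

# Invented stage cut-offs (µm) and drug masses (µg) for one actuation.
cutoffs = [11.7, 6.4, 3.9, 2.3, 1.4, 0.8]
masses  = [40.0, 30.0, 45.0, 35.0, 20.0, 10.0]
print(fine_particle_dose(cutoffs, masses))  # µg of particles < 5 µm
```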