28 results for Isomorphic coordinate projections

in Helda - Digital Repository of University of Helsinki


Relevance: 20.00%

Abstract:

The concept of an atomic decomposition was introduced by Coifman and Rochberg (1980) for weighted Bergman spaces on the unit disk. By the Riemann mapping theorem, functions in every simply connected domain in the complex plane have an atomic decomposition. However, a decomposition resulting from a conformal mapping of the unit disk tends to be very implicit and often lacks a clear connection to the geometry of the domain that it has been mapped into. The lattice of points, where the atoms of the decomposition are evaluated, usually follows the geometry of the original domain, but after mapping the domain into another this connection is easily lost and the layout of points becomes seemingly random. In the first article we construct an atomic decomposition directly on a weighted Bergman space on a class of regulated, simply connected domains. The construction uses the geometric properties of the regulated domain, but does not explicitly involve any conformal Riemann map from the unit disk. It is known that the Bergman projection is not bounded on the space L-infinity of bounded measurable functions. Taskinen (2004) introduced the locally convex spaces LV-infinity consisting of measurable and HV-infinity of analytic functions on the unit disk with the latter being a closed subspace of the former. They have the property that the Bergman projection is continuous from LV-infinity onto HV-infinity and, in some sense, the space HV-infinity is the smallest possible substitute to the space H-infinity of analytic functions. In the second article we extend the above result to a smoothly bounded strictly pseudoconvex domain. Here the related reproducing kernels are usually not known explicitly, and thus the proof of continuity of the Bergman projection is based on generalised Forelli-Rudin estimates instead of integral representations. The minimality of the space LV-infinity is shown by using peaking functions first constructed by Bell (1981). Taskinen (2003) showed that on the unit disk the space HV-infinity admits an atomic decomposition. This result is generalised in the third article by constructing an atomic decomposition for the space HV-infinity on a smoothly bounded strictly pseudoconvex domain. In this case every function can be presented as a linear combination of atoms such that the coefficient sequence belongs to a suitable Köthe co-echelon space.
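
For orientation, the classical Coifman-Rochberg decomposition on the unit disk is often written in the form sketched below; the lattice, the exponent b and the normalization of the weight are conventions of this sketch, not the formulation used in the articles.

```latex
% One common form of the atomic decomposition for the weighted Bergman space
% A^p_\alpha on the unit disk (after Coifman-Rochberg): (a_k) is a suitable
% lattice of points and b is taken sufficiently large.
\[
  f(z) \;=\; \sum_{k=1}^{\infty} c_k\,
    \frac{\bigl(1-|a_k|^{2}\bigr)^{(pb-2-\alpha)/p}}{\bigl(1-\bar{a}_k z\bigr)^{b}},
  \qquad (c_k)_{k} \in \ell^{p}.
\]
```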

Relevance: 10.00%

Abstract:

In order to improve and continuously develop the quality of pharmaceutical products, the process analytical technology (PAT) framework has been adopted by the US Food and Drug Administration. One of the aims of PAT is to identify critical process parameters and their effect on the quality of the final product. Real-time analysis of the process data enables better control of the processes to obtain a high-quality product. The main purpose of this work was to monitor crucial pharmaceutical unit operations (from blending to coating) and to examine the effect of processing on solid-state transformations and physical properties. The tools used were near-infrared (NIR) and Raman spectroscopy combined with multivariate data analysis, as well as X-ray powder diffraction (XRPD) and terahertz pulsed imaging (TPI). To detect process-induced transformations in active pharmaceutical ingredients (APIs), samples were taken after blending, granulation, extrusion, spheronisation, and drying. These samples were monitored by XRPD, Raman, and NIR spectroscopy, showing hydrate formation in the case of theophylline and nitrofurantoin. For erythromycin dihydrate, formation of the isomorphic dehydrate was critical. Thus, the main focus was on the drying process. NIR spectroscopy was applied in-line during a fluid-bed drying process. Multivariate data analysis (principal component analysis) enabled detection of the dehydrate formation at temperatures above 45°C. Furthermore, a small-scale rotating plate device was tested to provide an insight into film coating. The process was monitored using NIR spectroscopy. A calibration model, using partial least squares regression, was set up and applied to data obtained by in-line NIR measurements of a coating drum process. The predicted coating thickness agreed with the measured coating thickness. For investigating the quality of film coatings, TPI was used to create a 3-D image of a coated tablet. With this technique it was possible to determine coating layer thickness, distribution, reproducibility, and uniformity. In addition, it was possible to localise defects of either the coating or the tablet. It can be concluded from this work that the applied techniques increased the understanding of physico-chemical properties of drugs and drug products during and after processing. They additionally provided useful information to improve and verify the quality of pharmaceutical dosage forms.
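
A minimal sketch of the PLS calibration step mentioned above, using synthetic stand-in data; the real NIR spectra, preprocessing and reference thickness measurements of the thesis are not reproduced here.

```python
# Sketch: cross-validated PLS calibration of coating thickness from NIR spectra.
# All data below are synthetic and only illustrate the workflow.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical "spectra": 60 tablets x 200 wavelength channels, where one broad
# absorbance band grows linearly with coating thickness (invented relationship).
thickness = rng.uniform(20.0, 120.0, size=60)               # micrometres
band = np.exp(-0.5 * ((np.arange(200) - 120) / 15.0) ** 2)   # Gaussian band shape
spectra = thickness[:, None] * band[None, :] * 1e-3 + rng.normal(0, 0.002, (60, 200))

# Cross-validated PLS calibration: spectra -> coating thickness.
pls = PLSRegression(n_components=3)
pred_cv = cross_val_predict(pls, spectra, thickness, cv=10)
rmsecv = float(np.sqrt(np.mean((thickness - pred_cv.ravel()) ** 2)))
print(f"RMSECV = {rmsecv:.1f} um")

# Fit on the full calibration set; pls.predict() would then be applied to
# new in-line spectra acquired during coating.
pls.fit(spectra, thickness)
```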

Relevance: 10.00%

Abstract:

Coordination and juxtaposed sentences. The object of this study is the examination of the relations between juxtaposed clauses in contemporary French. The matter in question is sentences which are composed of several clauses adjoined without a conjunction or other connector, as in: Je détournai les yeux, mon cœur se mit à battre. The aim of the study is to determine what the nature of the relation in these sentences is and, on the other hand, what part coordination plays in them. A further question is what this relation of coordination is, which, according to some grammars, manifests itself through a coordinating conjunction, but which, according to others, is marked in juxtaposed sentences through different features. The study is based on a corpus of written French from literary and journalistic text sources. Syntactic, semantic and textual properties of the clauses are discussed. The analysis considers distinguishing features: it has been noted, in each case, whether one of the clauses is affirmative and the other negative, and whether the subject is repeated in the second clause. An analysis has also been made on the basis of tense, mood, phrase structure type and thematic structure, taking into account, in each case, whether the clauses are identical or different. Punctuation has been one of the properties considered. The final aim has been to eliminate gradually, on the basis of the distribution of these properties, subordinating sentences, so that only the hard core of coordinating sentences remains. In this way, coordination could be defined in the same way as the phoneme is defined: as a bundle of distinctive features. The quantitative analyses have led to the conclusion that the sentences which, from a semantic point of view, are interpreted as coordinating contain the fewest of these differences, while the sentences which can be considered subordinating present the most. The conditions of coordination are, in that sense, hierarchical, so that the syntactic constraints have to make room for semantic, textual and cognitive factors. It is interesting to note that everyone has the ability to produce correct coordinating structures and to recognize incorrect ones. This can be explained by the human ability to categorize, which has been widely researched in prototype semantics. The study suggests that coordination and subordination could be considered prototypical cognitive categories based on different linguistic and pragmatic features.

Relevance: 10.00%

Abstract:

From Steely Nation-State Superman to Conciliator of Economical Global Empire – A Psychohistory of Finnish Police Culture 1930-1997. My study concerns the way police culture changed alongside the broader changes in Finnish society between 1930 and 1997. The method of my study was psycho-historical and post-structural analysis. The research was conducted by examining the psycho-historical plateaus traceable within Finnish police culture. I made a social diagnosis of the autopoietic relationship between the power-holders of Finnish society and the police (at various levels of hierarchical organization). According to police researcher John P. Crank, police culture should be understood as the cognitive processes behind the actions of the police. Among these processes are the values, beliefs, rituals, customs and advice which standardize their work and the common sense of policemen. According to Crank, police culture is defined by a mindset which thinks, judges and acts according to its evaluations filtered by its own preliminary comprehension. Police culture consists of all the unsaid assumptions of being a policeman, the organizational structures of the police, official policies, unofficial ways of behaviour, forms of arrest, procedures of practice and different kinds of training habits, attitudes towards suspects and citizens, and also possible corruption. Police culture channels its members’ feelings and emotions. Crank says that police culture can be seen in how policemen express their feelings. He advises police researchers to ask themselves how it feels to be a member of the police. Ethos has been described as a communal frame of thought that guides one’s actions. According to sociologist Martti Grönfors, the Finnish mentality of the Protestant ethic is accentuated among Finnish policemen. The concept of ethos captures well the self-made mentality as an ethical tension, prevailing in police work, between communal belonging and individual freedom of choice. However, it is significant that it is a matter of the quality of relationships, and that the relationship is always tied to the context of the cultural history of dealing with one’s anxiety. According to criminologist Clifford Shearing, the values of police culture act as subterranean processes in the maintenance of social power in society. Policemen have been called microcosmic mediators, or street corner politicians. Robert Reiner argues that at the level of self-comprehension, policemen disparage the dimension of politics in their work. Reiner points out that all relationships which hold a dimension of power are political. Police culture has also been called a canteen culture. This idea expresses the day-to-day mentality of taking care of business, which policing produces out of the necessity of dealing with everyday hardships. According to police researcher Timo Korander, this figurative expression embodies the nature of police culture as a crew culture which is partly hidden from police chiefs who are at a different level. This multitude of standpoints depicts the diversity of police cultures. According to Reiner, one should not see police culture as one monolithic whole; instead one should assess it as the interplay of individuals negotiating with their environment and societal power networks. The cases analyzed formed different plateaus of study. The first plateau was the so-called ‘Rovaniemi arson’ case in the summer of 1930.
The second plateau consisted of the examinations of alleged police assaults on Communists during the Finnish Continuation War of 1941 to 1944, and the threats that societal change after the war posed to Finnish society. The third plateau was thematic: here I investigated how the use of force towards police clients changed culturally from the 1930s to the 1980s. The fourth plateau, based on material produced by Security Police detectives, traced the interaction between Soviet KGB agents and Finnish politicians during the long 1970s. The fifth plateau comprised the larger changes in Finnish police culture that occurred during the 1980s as an aftermath of the previous decade. The last, sixth plateau, the changing relationship between policing and the national logic of action, can be seen in the murder of two policemen in the autumn of 1997. My study shows that police culture has transformed from a “stone cold” steely fixed identity towards a more relational identity that tries to solve problems by negotiating with clients instead of using excessive force. However, in this process of change there is a traceable paradox in Finnish policing and police culture. On the one hand, policemen have, at the practical level, constructed their policing identity by protecting their inner self in their organizational role at work against the projections of anger and fear in society. On the other hand, however, they have had to safeguard themselves at the emotional level against the predominance of this same organizational role. Because of this dilemma they must simultaneously construct a distance both from their own role as police officers and from the role of the police itself. This makes the task of policing susceptible to the political pressures of society. In an era of globalization, and after the heyday of the welfare state, this can produce heightened challenges for Finnish police culture.

Relevance: 10.00%

Abstract:

From Arithmetic to Algebra. Changes in skills in the comprehensive school over 20 years. In recent decades we have emphasized the understanding of calculation in mathematics teaching. Many studies have found that better understanding helps to apply skills in new conditions and that the ability to think on an abstract level increases the transfer to new contexts. In my research I treat competence as a matrix, with content on the horizontal axis and levels of thinking on the vertical axis. Know-how here means intellectual and strategic flexibility and understanding. The resources and limitations of memory affect learning in different ways in different phases. Therefore both flexible conceptual thinking and automatization must be considered in learning. The research questions that I examine are what kind of changes have occurred in mathematical skills in comprehensive school over the last 20 years and what kind of conceptual thinking is demonstrated by students in this decade. The study consists of two parts. The first part is a statistical analysis of the mathematical skills and their changes over the last 20 years in comprehensive school. In the test the pupils did not use calculators. The second part is a qualitative analysis of the conceptual thinking of pupils in comprehensive school in this decade. The study shows significant differences in algebra and in some parts of arithmetic. The largest differences were detected in calculation skills with fractions. In the 1980s two out of three pupils were able to complete tasks with fractions, but in the 2000s only one out of three pupils was able to do the same tasks. It is also remarkable that, of the students who could complete the tasks with fractions, only one out of three was on the conceptual level in his or her thinking. This means that about 10% of pupils are able to understand an algebraic expression which has the same isomorphic structure as the corresponding arithmetical expression. This finding is important because the ability to think innovatively is created when learning the basic concepts. Keywords: arithmetic, algebra, competence
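
As a purely illustrative example of what "the same isomorphic structure" means here (the example is ours, not taken from the test items of the study), an arithmetical computation and its algebraic counterpart share the same form:

```latex
% Illustrative example (not from the thesis test items): the distributive law
% computed with numbers and written with variables has the same structure.
\[
  3\cdot(4+5) \;=\; 3\cdot 4 + 3\cdot 5
  \qquad\longleftrightarrow\qquad
  a\,(b+c) \;=\; a b + a c .
\]
```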

Relevance: 10.00%

Abstract:

The United States is the world's single biggest market area, where the demand for graphic papers has increased by 80% during the last three decades. However, during the last two decades there have been very large, unpredictable changes in the graphic paper markets. For example, the consumption of newsprint started to decline from the late 1980s, which was surprising compared to historical consumption and projections. Consumption has declined ever since. The aim of this study was to see how magazine paper consumption will develop in the United States up to 2030. The long-term consumption projection was made using two main methods. The first method was trend analysis, to see whether and how consumption has changed since 1980. The second method was qualitative estimation. These estimates are then compared with the so-called classical model projections that are usually cited and used in the forestry literature. The purpose of the qualitative analysis is to study magazine paper end-use purposes and to analyze how and with what intensity changes in society will affect magazine paper consumption in the long term. The framework of this study covers theories such as technology adaptation, electronic substitution, electronic publishing and Porter's threat of substitution. Because this study deals with markets that have shown signs of structural change, a substantial part of it covers recent developments and the newest available studies and statistics. The following were among the key findings of this study. Different end-uses have very different kinds of futures. Electronic substitution is very likely in some end-use purposes, but not in all. Young people, i.e. future consumers, have very different manners, habits and technological opportunities than their parents did. These will have substantial effects on magazine paper consumption in the long term. This study concludes that the change in magazine paper consumption is more likely to be gradual (evolutionary) than a sudden collapse (revolutionary). It is also probable that the years of fast-growing consumption of magazine papers are behind us. Beyond the decelerating growth, the consumption of magazine papers will decline slowly in the long term. The decline becomes faster the further into the future the projection is extended.
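
A minimal sketch of the first method (trend analysis), using an invented annual consumption series; the actual data and model specification of the study are not reproduced here.

```python
# Toy trend analysis: fit a linear trend to a hypothetical annual consumption
# series and extrapolate it to 2030. Data and functional form are invented.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1980, 2006)
t = years - 1980
# Hypothetical consumption (million tonnes): growth that levels off, plus noise.
consumption = 5.0 + 0.12 * t - 0.002 * t**2 + rng.normal(0, 0.1, t.size)

# Simplest trend model: straight line fitted by least squares.
slope, intercept = np.polyfit(years, consumption, 1)
for y in (2010, 2020, 2030):
    print(y, round(slope * y + intercept, 2))
```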

Relevance: 10.00%

Abstract:

Formulating a quantum theory of gravitation has been a goal of theoretical physicists ever since the birth of quantum mechanics. Applying quantum mechanics to high-energy phenomena within the framework of general relativity leads to an operational noncommutativity of the spacetime coordinates. Noncommutative spacetime geometries are also encountered in certain low-energy limits of open string theories. A theory of gravitation on noncommutative spacetime could be compatible with quantum mechanics; it could make it possible to describe the presumably nonlocal physics of processes at very short distances and high energies, and it could reproduce general relativity at long distances. In this work I consider gravitation as a gauge field theory of the Poincaré symmetry and aim to generalize this view to noncommutative spacetimes. First I present the central role of Poincaré symmetry in relativistic physics and how the classical theory of gravitation is derived as a gauge field theory of Poincaré symmetry in commutative spacetime. I continue by introducing noncommutative spacetime and the formulation of quantum field theory on it. Because of the local nature of gauge symmetries, I examine carefully the formulation of gauge field theories in noncommutative spacetime. Particular attention is paid to the twisted Poincaré symmetry of these theories, a new type of quantum symmetry possessed by noncommutative spacetime. Next I consider the problems of formulating a noncommutative theory of gravitation and the solutions proposed for them in the literature. I explain how all approaches so far fail to formulate covariance under general coordinate transformations, which is the cornerstone of general relativity. Finally, I study the possibility of generalizing the twisted Poincaré symmetry into a local gauge symmetry, in the hope of obtaining a noncommutative gauge field theory of gravitation. I show that such a generalization cannot be achieved by twisting the Poincaré symmetry with a covariant twist element. Consequently, future research on noncommutative gravitation and twisted Poincaré symmetry should concentrate on other approaches.
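
For reference, the canonical spacetime noncommutativity discussed above is usually written with a constant antisymmetric parameter and the Moyal star product. This is the standard textbook form, not a formula quoted from the thesis:

```latex
% Canonical (constant) noncommutativity and the Moyal star product;
% theta^{mu nu} is a constant antisymmetric matrix.
\[
  [\hat{x}^{\mu}, \hat{x}^{\nu}] = i\,\theta^{\mu\nu},
  \qquad
  (f \star g)(x) = f(x)\,
    \exp\!\Bigl(\tfrac{i}{2}\,\theta^{\mu\nu}\,
      \overleftarrow{\partial}_{\mu}\,\overrightarrow{\partial}_{\nu}\Bigr)\, g(x).
\]
```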

Relevance: 10.00%

Abstract:

This master's thesis explores some of the most recent developments in noncommutative quantum field theory. This old theme, first suggested by Heisenberg in the late 1940s, has had a renaissance during the last decade due to the firmly held belief that space-time becomes noncommutative at small distances and also due to the discovery that string theory in a background field gives rise to noncommutative field theory as an effective low-energy limit. This has led to interesting attempts to create a noncommutative standard model, a noncommutative minimal supersymmetric standard model, noncommutative gravity theories, etc. This thesis reviews themes and problems such as UV/IR mixing, charge quantization, how to deal with noncommutative symmetries, how to solve the Seiberg-Witten map, its connection to fluid mechanics, and the problem of constructing general coordinate transformations to obtain a theory of noncommutative gravity. Emphasis has been put on presenting both the group-theoretical results and the string-theoretical ones, so that the two can be compared.
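
As a point of reference for the Seiberg-Witten map mentioned above, its first-order expansion for a U(1) gauge field is often quoted in the following form; this is a standard result from the literature, not a formula taken from the thesis itself:

```latex
% First-order Seiberg-Witten map for the gauge field (U(1) case);
% {.,.} denotes the anticommutator and F the field strength.
\[
  \hat{A}_{\mu} = A_{\mu}
    - \tfrac{1}{4}\,\theta^{\alpha\beta}\,
      \bigl\{ A_{\alpha},\, \partial_{\beta} A_{\mu} + F_{\beta\mu} \bigr\}
    + \mathcal{O}(\theta^{2}).
\]
```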

Relevance: 10.00%

Abstract:

Breast cancer is the most common cancer in women in the western countries. Approximately two-thirds of breast cancer tumours are hormone dependent, requiring estrogens to grow. Estrogens are formed in the human body via a multistep route starting from cholesterol. The final steps in the biosynthesis include the CYP450 aromatase enzyme, converting the male hormones androgens (preferred substrate androstenedione ASD) into estrogens(estrone E1), and the 17beta-HSD1 enzyme, converting the biologically less active E1 into the active hormone 17beta-hydroxyestradiol E2. E2 is bound to the nuclear estrogen receptors causing a cascade of biochemical reactions leading to cell proliferation in normal tissue, and to tumour growth in cancer tissue. Aromatase and 17beta-HSD1 are expressed in or near the breast tumour, locally providing the tissue with estrogens. One approach in treating hormone dependent breast tumours is to block the local estrogen production by inhibiting these two enzymes. Aromatase inhibitors are already on the market in treating breast cancer, despite the lack of an experimentally solved structure. The structure of 17beta-HSD1, on the other hand, has been solved, but no commercial drugs have emerged from the drug discovery projects reported in the literature. Computer-assisted molecular modelling is an invaluable tool in modern drug design projects. Modelling techniques can be used to generate a model of the target protein and to design novel inhibitors for them even if the target protein structure is unknown. Molecular modelling has applications in predicting the activities of theoretical inhibitors and in finding possible active inhibitors from a compound database. Inhibitor binding at atomic level can also be studied with molecular modelling. To clarify the interactions between the aromatase enzyme and its substrate and inhibitors, we generated a homology model based on a mammalian CYP450 enzyme, rabbit progesterone 21-hydroxylase CYP2C5. The model was carefully validated using molecular dynamics simulations (MDS) with and without the natural substrate ASD. Binding orientation of the inhibitors was based on the hypothesis that the inhibitors coordinate to the heme iron, and were studied using MDS. The inhibitors were dietary phytoestrogens, which have been shown to reduce the risk for breast cancer. To further validate the model, the interactions of a commercial breast cancer drug were studied with MDS and ligand–protein docking. In the case of 17beta-HSD1, a 3D QSAR model was generated on the basis of MDS of an enzyme complex with active inhibitor and ligand–protein docking, employing a compound library synthesised in our laboratory. Furthermore, four pharmacophore hypotheses with and without a bound substrate or an inhibitor were developed and used in screening a commercial database of drug-like compounds. The homology model of aromatase showed stable behaviour in MDS and was capable of explaining most of the results from mutagenesis studies. We were able to identify the active site residues contributing to the inhibitor binding, and explain differences in coordination geometry corresponding to the inhibitory activity. Interactions between the inhibitors and aromatase were in agreement with the mutagenesis studies reported for aromatase. Simulations of 17beta-HSD1 with inhibitors revealed an inhibitor binding mode with hydrogen bond interactions previously not reported, and a hydrophobic pocket capable of accommodating a bulky side chain. 
Pharmacophore hypothesis generation, followed by virtual screening, was able to identify several compounds that can be used in lead compound generation. The visualisation of the interaction fields from the QSAR model and the pharmacophores provided us with novel ideas for inhibitor development in our drug discovery project.
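
A heavily simplified sketch of the "train a QSAR model, then screen a compound library" idea described above, using invented descriptor data; it does not reproduce the descriptor set, pharmacophore software or compound collections actually used in the work.

```python
# Hypothetical sketch: fit a regression model on descriptors of known inhibitors,
# then rank a screening library by predicted activity. All numbers are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Hypothetical training set: 40 known inhibitors x 50 precomputed descriptors,
# with measured pIC50 values (all synthetic).
X_train = rng.normal(size=(40, 50))
y_train = rng.uniform(4.0, 8.0, size=40)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Hypothetical screening library: 10 000 drug-like compounds, same descriptors.
X_library = rng.normal(size=(10_000, 50))
scores = model.predict(X_library)

# Keep the top-ranked compounds as candidate leads for closer inspection.
top = np.argsort(scores)[::-1][:20]
print(top, scores[top])
```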

Relevance: 10.00%

Abstract:

A smooth map is said to be stable if small perturbations of the map only differ from the original one by a smooth change of coordinates. Smoothly stable maps are generic among the proper maps between given source and target manifolds when the source and target dimensions belong to the so-called nice dimensions, but outside this range of dimensions, smooth maps cannot generally be approximated by stable maps. This leads to the definition of topologically stable maps, where the smooth coordinate changes are replaced with homeomorphisms. The topologically stable maps are generic among proper maps for any dimensions of source and target. The purpose of this thesis is to investigate methods for proving topological stability by constructing extremely tame (E-tame) retractions onto the map in question from one of its smoothly stable unfoldings. In particular, we investigate how to use E-tame retractions from stable unfoldings to find topologically ministable unfoldings for certain weighted homogeneous maps or germs. Our first results are concerned with the construction of E-tame retractions and their relation to topological stability. We study how to construct the E-tame retractions from partial or local information, and these results form our toolbox for the main constructions. In the next chapter we study the group of right-left equivalences leaving a given multigerm f invariant, and show that when the multigerm is finitely determined, the group has a maximal compact subgroup and that the corresponding quotient is contractible. This means, essentially, that the group can be replaced with a compact Lie group of symmetries without much loss of information. We also show how to split the group into a product whose components only depend on the monogerm components of f. In the final chapter we investigate representatives of the E- and Z-series of singularities, discuss their instability and use our tools to construct E-tame retractions for some of them. The construction is based on describing the geometry of the set of points where the map is not smoothly stable, discovering that by using induction and our constructional tools, we already know how to construct local E-tame retractions along the set. The local solutions can then be glued together using our knowledge about the symmetry group of the local germs. We also discuss how to generalize our method to the whole E- and Z- series.
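
The notion of smooth stability used above can be stated compactly: a smooth map is stable when every nearby map is equivalent to it up to diffeomorphisms of source and target. This is the standard definition, written here only for orientation; topological stability replaces the diffeomorphisms with homeomorphisms, as described in the abstract.

```latex
% Smooth stability: every g sufficiently close to f factors through f via
% diffeomorphisms of the source N and the target P.
\[
  f\colon N \to P \text{ is stable}
  \iff
  \forall\, g \text{ near } f \;\; \exists\, \varphi \in \mathrm{Diff}(N),\ \psi \in \mathrm{Diff}(P):
  \quad g = \psi \circ f \circ \varphi^{-1}.
\]
```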

Relevance: 10.00%

Abstract:

This study comprises an introductory section and three essays analysing Russia's economic transition from the early 1990s up to the present. The papers present a combination of both theoretical and empirical analysis on some of the key issues Russia has faced during its somewhat troublesome transformation from state-controlled command economy to market-based economy. The first essay analyses fiscal competition for mobile capital between identical regions in a transition country. A standard tax competition framework is extended to account for two features of a transition economy: the presence of two sectors, old and new, which differ in productivity; and a non-benevolent regional decision-maker. It is shown that in very early phase of transition, when the old sector clearly dominates, consumers in a transition economy may be better off in a competitive equilibrium. Decision-makers, on the other hand, will prefer to coordinate their fiscal policies. The second essay uses annual data for 1992-2003 to examine income dispersion and convergence across 76 Russian regions. Wide disparities in income levels have indeed emerged during the transition period. Dispersion has increased most among the initially better-off regions, whereas for the initially poorer regions no clear trend of divergence or convergence could be established. Further, some - albeit not highly robust - evidence was found of both unconditional and conditional convergence, especially among the initially richer regions. Finally, it is observed that there is much less evidence of convergence after the economic crisis of 1998. The third essay analyses industrial firms' engagement in provision of infrastructure services, such as heating, electricity and road maintenance. Using a unique dataset of 404 large and medium-sized industrial enterprises in 40 regions of Russia, the essay examines public infrastructure provision by Russian industrial enterprises. It is found that to a large degree engagement in infrastructure provision, as proxied by district heating production, is a Soviet legacy. Secondly, firms providing district heating to users outside their plant area are more likely to have close and multidimensional relations with the local public sector.
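
For readers unfamiliar with convergence analysis, the unconditional (beta-) convergence test referred to above is typically a cross-sectional regression of average income growth on the initial income level; the exact specification and control variables used in the essay may differ.

```latex
% A standard unconditional beta-convergence regression across regions i:
% average growth over the period regressed on log initial income.
\[
  \frac{1}{T}\,\ln\!\frac{y_{i,T}}{y_{i,0}}
    = \alpha + \beta \,\ln y_{i,0} + \varepsilon_i ,
  \qquad \beta < 0 \;\Rightarrow\; \text{convergence.}
\]
```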

Relevance: 10.00%

Abstract:

The juvenile sea squirt wanders through the sea searching for a suitable rock or hunk of coral to cling to and make its home for life. For this task it has a rudimentary nervous system. When it finds its spot and takes root, it doesn't need its brain any more so it eats it. It's rather like getting tenure. Daniel C. Dennett (from Consciousness Explained, 1991) The little sea squirt needs its brain for a task that is very simple and short. When the task is completed, the sea squirt starts a new life in a vegetative state, after having a nourishing meal. The little brain is more tightly structured than our massive primate brains. The number of neurons is exact; no leeway in neural proliferation is tolerated. Each neuroblast migrates exactly to the correct position, and only a certain number of connections with the right companions is allowed. In comparison, growth of a mammalian brain is a merry mess. The reason is obvious: the squirt brain needs to perform only a few, predictable functions before becoming waste. The more mobile and complex mammals engage their brains in tasks requiring quick adaptation and plasticity in a constantly changing environment. Although the regulation of nervous system development varies between species, many regulatory elements remain the same. For example, all multicellular animals possess a collection of proteoglycans (PG); proteins with attached, complex sugar chains called glycosaminoglycans (GAG). In development, PGs participate in the organization of the animal body, such as the construction of parts of the nervous system. The PGs capture water with their GAG chains, forming a biochemically active gel at the surface of the cell, and in the extracellular matrix (ECM). In the nervous system, this gel traps inside it different molecules: growth factors and ECM-associated proteins. They regulate the proliferation of neural stem cells (NSC), guide the migration of neurons, and coordinate the formation of neuronal connections. In this work I have followed the role of two molecules contributing to the complexity of mammalian brain development. N-syndecan is a transmembrane heparan sulfate proteoglycan (HSPG) with cell signaling functions. Heparin-binding growth-associated molecule (HB-GAM) is an ECM-associated protein with high expression in the perinatal nervous system, and high affinity to HS and heparin. N-syndecan is a receptor for several growth factors and for HB-GAM. HB-GAM induces specific signaling via N-syndecan, activating c-Src, calcium/calmodulin-dependent serine protein kinase (CASK) and cortactin. By studying the gene knockouts of HB-GAM and N-syndecan in mice, I have found that HB-GAM and N-syndecan are involved as a receptor-ligand pair in neural migration and differentiation. HB-GAM competes with the growth factors fibroblast growth factor (FGF)-2 and heparin-binding epidermal growth factor (HB-EGF) in HS binding, causing NSCs to stop proliferation and to differentiate, and affects HB-EGF-induced EGF receptor (EGFR) signaling in neural cells during migration. N-syndecan signaling affects the motility of young neurons by boosting EGFR-mediated cell migration. In addition, these two receptors form a complex at the surface of the neurons, probably creating a motility-regulating structure.

Relevance: 10.00%

Abstract:

The cells of multicellular organisms have differentiated to carry out specific functions that are often accompanied by distinct cell morphology. The actin cytoskeleton is one of the key regulators of cell shape, thereby controlling multiple cellular events including cell migration, cell division, and endo- and exocytosis. A large set of actin-regulating proteins has evolved to achieve and tightly coordinate this wide range of functions. Some actin regulator proteins have so-called housekeeping roles and are essential for all eukaryotic cells, but some have evolved to meet the requirements of the more specialized cell types found in higher organisms, enabling the complex functions of differentiated organs such as the liver, kidney and brain. Processes mediated by the actin cytoskeleton, such as the formation of cellular protrusions during cell migration, are often intimately linked to plasma membrane remodeling. Thus, a close cooperation between these two cellular compartments is necessary, yet not much is known about the underlying molecular mechanisms. This study focused on a vertebrate-specific protein called missing-in-metastasis (MIM), which was originally characterized as a metastasis suppressor of bladder cancer. We demonstrated that MIM regulates the dynamics of the actin cytoskeleton via its WH2 domain and is expressed in a cell-type-specific manner. Interestingly, further examination showed that the IM-domain of MIM displays a novel membrane tubulation activity, which induces the formation of filopodia in cells. Subsequent studies demonstrated that this membrane deformation activity is crucial for cell protrusions driven by MIM. In mammals, there are five members of the IM-domain protein family. The functions and expression patterns of these family members have remained poorly characterized. To understand the physiological functions of MIM, we generated MIM knockout mice. MIM-deficient mice display no apparent developmental defects, but instead suffer from progressive renal disease and increased susceptibility to tumors. This indicates that MIM plays a role in the maintenance of specific physiological functions associated with distinct cell morphologies. Taken together, these studies implicate MIM in the regulation of both the actin cytoskeleton and the plasma membrane. Our results thus suggest that members of the MIM/IRSp53 protein family coordinate the actin cytoskeleton–plasma membrane interface to control cell and tissue morphogenesis in multicellular organisms.

Relevance: 10.00%

Abstract:

Large carnivore populations are currently recovering from past extirpation efforts and expanding back into their original habitats. At the same time, human activities have left very few wilderness areas with habitat suitable and large enough to maintain populations of large carnivores without human contact. Consequently, the long-term future of large carnivores depends on their successful integration into landscapes where humans live. Thus, understanding their behaviour and interaction with surrounding habitats is of utmost importance in the development of management strategies for large carnivores. This also applies to brown bears (Ursus arctos), which were almost exterminated from Scandinavia and Finland at the turn of the century but are now expanding their range, with current population estimates of approximately 2600 bears in Scandinavia and 840 in Finland. This thesis focuses on the large-scale habitat use and population dynamics of brown bears in Scandinavia with the objective of developing modelling approaches that support the management of bear populations. Habitat analysis shows that bear home ranges occur mainly in forested areas with a low level of human influence relative to surrounding areas. Habitat modelling based on these findings allows identification and quantification of the potentially suitable areas for bears in Scandinavia. Additionally, this thesis presents novel improvements to home range estimation that enable realistic estimates of the effective area required for the bears to establish a home range. This is achieved by fitting to the radio-tracking data, in order to establish the amount of temporal autocorrelation and the proportion of time spent in different habitat types. Together these form a basis for the landscape-level management of the expanding population. Successful management of bears also requires assessment of the consequences of harvest for population viability. An individual-based simulation model, accounting for sexually selected infanticide, was used to investigate the possibility of increasing the harvest using different hunting strategies, such as trophy harvest of males. The results indicated that the population can sustain twice the current harvest rate. However, the harvest should be changed gradually, while carefully monitoring population growth, as some effects of increased harvest may manifest themselves only after a time delay. The results and methodological improvements in this thesis can be applied to the Finnish bear population and to other large carnivores. They provide grounds for the further development of spatially realistic, management-oriented models of brown bear dynamics that can make projections of the future distribution of bears while accounting for the development of human activities.
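
A toy sketch of the individual-based harvest experiment described above, with invented demographic rates and without the sex and age structure or the sexually selected infanticide component; it only illustrates the simulation logic, not the thesis model.

```python
# Toy individual-based simulation: yearly survival, reproduction and harvest.
# All rates are invented for illustration only.
import random

def simulate(n0=800, years=30, harvest_rate=0.05,
             survival=0.90, litter_prob=0.15, litter_size=2, seed=42):
    random.seed(seed)
    n = n0
    for _ in range(years):
        # Each individual survives the year with probability `survival`.
        survivors = sum(random.random() < survival for _ in range(n))
        # Each survivor produces a litter with probability `litter_prob`.
        births = sum(litter_size for _ in range(survivors)
                     if random.random() < litter_prob)
        n = survivors + births
        n -= int(round(harvest_rate * n))   # harvest applied after reproduction
        if n <= 0:
            return 0
    return n

for h in (0.05, 0.10):   # e.g. a current vs. a doubled harvest rate
    print(f"harvest {h:.0%}: population after 30 years = {simulate(harvest_rate=h)}")
```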

Relevance: 10.00%

Abstract:

Accurate and stable time series of geodetic parameters can be used to help in understanding the dynamic Earth and its response to global change. The Global Positioning System, GPS, has proven to be invaluable in modern geodynamic studies. In Fennoscandia the first GPS networks were set up in 1993. These networks form the basis of the national reference frames in the area, but they also provide long and important time series for crustal deformation studies. These time series can be used, for example, to better constrain the ice history of the last ice age and the Earth's structure, via existing glacial isostatic adjustment models. To improve the accuracy and stability of the GPS time series, the possible nuisance parameters and error sources need to be minimized. We have analysed GPS time series to study two phenomena. First, we study the refraction of the GPS signal in the neutral atmosphere, and, second, we study the surface loading of the crust by environmental factors, namely the non-tidal Baltic Sea, the atmospheric load and varying continental water reservoirs. We studied the atmospheric effects on the GPS time series by comparing the standard method to slant delays derived from a regional numerical weather model. We have presented a method for correcting the atmospheric delays at the observational level. The results show that both standard atmosphere modelling and the atmospheric delays derived from a numerical weather model by ray-tracing provide a stable solution. The advantage of the latter is that the number of unknowns used in the computation decreases, and thus the computation may become faster and more robust. The computation can also be done with any processing software that allows the atmospheric correction to be turned off. The crustal deformation due to loading was computed by convolving Green's functions with surface load data, that is to say, global hydrology models, global numerical weather models and a local model for the Baltic Sea. The result was that the loading factors can be seen in the GPS coordinate time series. Subtracting the computed deformation from the vertical GPS coordinate time series reduces their scatter; however, the long-term trends are not influenced. We show that global hydrology models and the local sea surface can explain up to 30% of the GPS time series variation. On the other hand, the atmospheric loading admittance in the GPS time series is low, and the different hydrological surface load models could not be validated in the present study. To be usable for GPS corrections in the future, both the atmospheric loading and the hydrological models need further analysis and improvement.
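
The loading computation mentioned above ("convolving Green's functions with surface load data") can be summarized in the usual Farrell-type form; the notation below is generic and not taken from the thesis.

```latex
% Elastic loading deformation as a convolution of a surface mass load with a
% loading Green's function over the Earth's surface (generic Farrell-type form):
% psi is the angular distance between the evaluation point r and the load
% element at r', Delta m is the surface mass load, and G is the (vertical or
% horizontal) loading Green's function.
\[
  u(\mathbf{r}) \;=\; \int_{\Omega}
    G\bigl(\psi(\mathbf{r}, \mathbf{r}')\bigr)\,
    \Delta m(\mathbf{r}')\;\mathrm{d}A' .
\]
```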