40 results for Artificial source
Abstract:
In this master's thesis I examine source- and target-text orientation in drama translation. The objects of study were the vocabulary, syntax, stage technique, imagery, wordplay, metre and style of the translations. The aim of the study was to determine whether the theoretical shift of emphasis from source-text orientation to target-text orientation is visible in Finnish drama translation. The assumption was that the shift would show in the translation strategies used. The theoretical part of the study first discusses source- and target-text-oriented translation theories. Two source-text-oriented theories are presented first: Catford's (1965) formal correspondence and Nida's (1964) dynamic equivalence. Of the target-text-oriented theories, the theoretical views of Toury (1980) and Newmark (1981) are discussed, as well as the skopos theory introduced by Reiss and Vermeer (1986). The principles of foreignization and domestication are presented briefly. The theoretical part also covers drama translation, the language of William Shakespeare and the translation problems it poses. In addition, I briefly describe the translation of Shakespeare in Finland and the four translators of Julius Caesar. The material of the study consisted of four Finnish translations of Shakespeare's play Julius Caesar: Paavo Cajander's translation was published in 1883, Eeva-Liisa Manner's in 1983, Lauri Sipari's in 2006 and Jarkko Laine's in 2007. In the analysis, the translations were compared with the source text and with each other, and the translators' translation solutions were compared. The results were consistent with the assumption: source-text-oriented translation strategies were used less in the newer translations than in the older ones. The target-text-oriented strategies differed considerably from one another, and the newest translation can be called an adaptation. In further research, the material should be extended to cover other Finnish translations of Shakespeare's plays. Translations from different periods should be compared with one another in order to reliably describe the change in the use of source- and target-text-oriented translation strategies and to map the strategies typical of each period.
Abstract:
Persistent luminescence materials can store energy from solar radiation or artificial lighting and release it over a period of several hours without a continuous excitation source. These materials are widely used to improve human safety in emergency and traffic signalization. They can also be utilized in novel applications including solar cells, medical diagnostics, radiation detectors and structural damage sensors. The development of these materials currently relies on trial-and-error methods. The tailoring of new materials is also hindered by the lack of knowledge of the role that their intrinsic and extrinsic lattice defects play in the relevant mechanisms. The goal of this work was to clarify the persistent luminescence mechanisms by combining ab initio density functional theory (DFT) calculations with selected experimental methods. The DFT approach enables full control of both the nature of the defects and their locations in the host lattice. The materials studied in the present work, distrontium magnesium disilicate (Sr2MgSi2O7) and strontium aluminate (SrAl2O4), are among the most efficient persistent luminescence hosts when doped with divalent europium (Eu2+) and co-doped with trivalent rare earth ions R3+ (R: Y, La-Nd, Sm, Gd-Lu). The polycrystalline materials were prepared with the solid state method and their structural and phase purity was confirmed by X-ray powder diffraction. Their local crystal structure was studied by high-resolution transmission electron microscopy. The crystal and electronic structures of the non-doped materials, as well as of those containing Eu2+, R2+/3+ and other defects, were studied using DFT calculations. The experimental trap depths were obtained using thermoluminescence (TL) spectroscopy. The emission and excitation of Sr2MgSi2O7:Eu2+,Dy3+ were also studied. Significant modifications in the local crystal structure due to the Eu2+ ion and lattice defects were found by both the experimental and DFT methods. The charge compensation effects induced by the R3+ co-doping further increased the number of defects and distortions in the host lattice. As for the electronic structure of Sr2MgSi2O7 and SrAl2O4, the experimental band gap energy of the host materials was well reproduced by the calculations. The DFT-calculated Eu2+ and R2+/3+ 4f^n and 4f^(n-1)5d^1 ground states in the Sr2MgSi2O7 band structure provide independent verification of an empirical model constructed from rather sparse experimental data for the R3+ and especially the R2+ ions. The intrinsic and defect-induced electron traps were found to act together as energy storage sites, contributing to the materials' efficient persistent luminescence. The calculated trap energy range agreed with the trap structure of Sr2MgSi2O7 obtained from TL measurements. More experimental studies should be carried out on SrAl2O4 for comparison with the DFT calculations. The calculated and experimental results show that the electron traps created by both the rare earth ions and vacancies are modified by defect aggregation and charge compensation effects. The relationships between this modification and the energy storage properties of these solid state materials are discussed.
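As general background for how trap depths can be extracted from TL glow curves, the sketch below gives the standard first-order (Randall-Wilkins) kinetics expressions; the abstract does not state which analysis method was actually used, and the symbols (n_0 initial trapped-charge population, s frequency factor, E trap depth, beta linear heating rate, k_B Boltzmann constant) are conventional notation rather than values taken from this work.

```latex
% First-order (Randall--Wilkins) glow-curve model for a linear heating ramp with rate \beta:
I(T) \;=\; n_0\, s \exp\!\left(-\frac{E}{k_{\mathrm{B}}T}\right)
      \exp\!\left[-\frac{s}{\beta}\int_{T_0}^{T}
      \exp\!\left(-\frac{E}{k_{\mathrm{B}}T'}\right)\mathrm{d}T'\right]
% The glow-peak maximum temperature T_m satisfies the first-order peak condition
\frac{\beta E}{k_{\mathrm{B}}T_m^{2}} \;=\; s \exp\!\left(-\frac{E}{k_{\mathrm{B}}T_m}\right)
% so a measured T_m at a known heating rate allows the trap depth E to be estimated.
```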
Abstract:
The development of bioenergy on the basis of wood fuels has received considerable attention in recent decades. The combination of large forest resources and reliance on fossil fuels makes the use of wood chips in Russia a topical subject for analysis. The main objective of this study is to describe the current state of and prospects for the production of wood chips and their use as an energy source in the North-West of Russia. The study uses an integrated approach to explore the wood chip market on the basis of a comprehensive analysis of documentation and expert opinions. The analysis of the wood chip market was performed for eight regions of the North-West district of Russia along two major dimensions: its current state and its prospects over the next five years. The results of the study give a comprehensive picture of the wood chip market, including the potential for wood chip production, the specific features of production and consumption, and the prospects for market development within the regions of the North-West district of Russia. The study demonstrated that the wood chip market in the North-West of Russia is underdeveloped. The findings of the work may be used by forest companies for strategic planning.
Abstract:
Cyanobacteria are a diverse group of oxygenic photosynthetic bacteria that inhabit a wide range of environments. They are versatile and multifaceted organisms with great potential for different biotechnological applications. For example, cyanobacteria produce molecular hydrogen (H2), which is one of the most important alternatives for clean and sustainable energy. Apart from being beneficial, cyanobacteria also possess harmful characteristics and may become a threat to human health and other living organisms, as they are able to form surface blooms that produce a variety of toxic or bioactive compounds. The University of Helsinki Culture Collection (UHCC) maintains around 1,000 cyanobacterial strains representing a large number of genera and species isolated from the Baltic Sea and Finnish lakes. The culture collection covers different life forms, such as unicellular and filamentous, N2-fixing and non-N2-fixing, and planktonic and benthic cyanobacteria. In this thesis, the UHCC was screened to identify strains with potential for sustainable biohydrogen production as well as strains that produce compounds modifying the bioenergetic pathways of other cyanobacteria or terrestrial plants. Among the 400 cyanobacterial strains screened so far, ten were identified as high H2-producing strains. The enzyme systems involved in the H2 metabolism of cyanobacteria were analyzed using the Southern hybridization approach. This revealed the presence of the enzyme nitrogenase in all strains tested, while none of them appeared to contain alternative nitrogenases. All the strains tested, except for two Calothrix strains, XSPORK 36C and XSPORK 11A, were suggested to contain both uptake and bidirectional hydrogenases. Moreover, 55 methanol extracts of various cyanobacterial strains were screened to identify potent bioactive compounds affecting the photosynthetic apparatus of the model cyanobacterium Synechocystis PCC 6803. The extract from Nostoc XPORK 14A was the only one that modified the photosynthetic machinery and dark respiration. The compound responsible for this effect was identified, purified, and named M22. M22 demonstrated a dual-action mechanism: production of reactive oxygen species (ROS) under illumination, and an unknown mechanism that also prevailed in the dark. During summer, the Baltic Sea is occupied by toxic blooms of Nodularia spumigena (hereafter referred to as N. spumigena), which produces a hepatotoxin called nodularin. Long-term exposure of the terrestrial plant spinach to nodularin was studied. Such treatment resulted in inhibition of growth and chlorosis of the leaves. Moreover, the activity and amount of the mitochondrial electron transfer complexes increased in the leaves exposed to the nodularin-containing extract, indicating upregulation of respiratory reactions, whereas no marked changes were detected in the structure or function of the photosynthetic machinery. Nodularin-exposed plants suffered from oxidative stress, evidenced by oxidative modifications of various proteins. The plants initiated strategies to combat the stress by increasing the levels of alpha-tocopherol, mitochondrial alternative oxidase (AOX), and mitochondrial ascorbate peroxidase (mAPX).
Abstract:
The environmental aspect of corporate social responsibility (CSR), expressed through the implementation of environmental management systems (EMS) in oil and gas companies, is the main subject of this research. In the theoretical part, attention is paid primarily to justifying the link between CSR and environmental management. The achievement of sustainable competitive advantage as a result of environmental capital growth and the inclusion of socially responsible activities in corporate strategy is another issue of special significance here. In addition, two basic forms of environmental management systems (environmental decision support systems and environmental information management systems) are explored, and their role in effective stakeholder interaction is addressed. The most crucial benefits of an EMS are also analyzed to underline its importance as a source of sustainable development. The empirical research is based on a survey of 51 sampled oil and gas companies (both publicly owned and state-owned) originating from different countries around the world and providing openly accessible sustainability reports. To analyze their approach to sustainable development, a specifically designed evaluation matrix with 37 indicators was prepared in accordance with the Global Reporting Initiative (GRI) guidelines for non-financial reporting. Additionally, the quality of environmental information disclosure was measured on the basis of a quality-quantity matrix. According to the results, oil and gas companies prefer implementing reactive measures over costly and knowledge-intensive proactive techniques for eliminating negative environmental impacts. Moreover, environmental performance disclosure was found to be mostly rather limited, so the quality of non-financial reporting can be judged insufficient. Although most of the oil and gas companies in the sample claim that an EMS is currently embedded in their structure, they often do not provide any details of its implementation. As potential directions for the further development of EMS, the author mentions the possible integration of their different forms into a single entity, the extension of the existing structure by consolidating structural and strategic precautions, and the development of a unified certification standard to replace the several standards that exist today, in order to enhance control over EMS implementation.
Abstract:
The aim of this master's thesis is to study how an Agile method (Scrum) and open source software are utilized to produce software for a flagship product in a complex production environment. The empirical case and the artefacts used are taken from the Nokia MeeGo N9 product program and from the related software program, called Harmattan. The single research case is analysed using a qualitative method. Grounded Theory principles are utilized, first, to identify all the related concepts from the artefacts. These concepts are then analysed and finally categorized into a core category and six supporting categories. The result is formulated as a description of how such software practices can operate in circumstances where the accountable software development teams and the related context accept the open source nature of the software as part of the business vision and the whole organization supports Agile methods.
Abstract:
Open source and open source software development have been interesting phenomena during the past decade. Traditional business models do not apply to open source, where the actual product is free. However, it is possible to do business with open source, even successfully, but the question is: how? The aim of this study is to identify the key factors for successfully building a business on commercial open source software development. This is achieved by identifying the factors that influence open source projects, examining the relations between those factors, and finding out why some factors explain success better than others. The literature review first covers the background of open innovation, open source and open source software. Then business models, critical success factors and success measures are examined. Based on the existing literature, a framework was created. The framework contains categorized success factors that influence software projects in general as well as open source software projects. The main categories of success factors in the software business are community management, technology management, project management and market management. To find out which of the factors identified in the literature are the most critical, empirical research was conducted through unstructured personal interviews. The main finding based on the interviews is that the critical success factors in the open source software business do not differ from those in the traditional software business or, in fact, from those in any other business. Some factors in the framework emerged in the interviews as key factors: establishing and communicating hierarchy (community management), localization (technology management), good license know-how and IPR management (project management), and effective market management (market management). The interviewees also named critical success factors that are not listed in the framework: low price, a good product and good business model development.
Abstract:
Ceramides comprise a class of sphingolipids that exist only in small amounts in cellular membranes, but which have been associated with important roles in cellular signaling processes. The influences that ceramides have on the physical properties of bilayer membranes range from altered thermodynamic behavior to significant impacts on the molecular order and lateral distribution of membrane lipids. Along with the idea that the membrane physical state could influence the physiological state of a cell, the membrane properties of ceramides have gained increasing interest. Therefore, membrane phenomena related to ceramides have become a subject of intense study both in cellular and in artificial membranes. Artificial bilayers, the so-called model membranes, are substantially simpler in terms of contents and spatio-temporal variation than actual cellular membranes, and can be used to give detailed information about the properties of individual lipid species in different environments. This thesis focuses on investigating how the different parts of the ceramide molecule, i.e., the N-linked acyl chain, the long-chain sphingoid base and the membrane-water interface region, govern the interactions and lateral distribution of these lipids in bilayer membranes. With the emphasis on ceramide/sphingomyelin (SM) interactions, the relevance of the size of the SM head group for the interaction was also studied. Ceramides with methyl-branched N-linked acyl chains, sphingoid bases of varying length, or methylated 2N (amide nitrogen) and 3O (C3 hydroxyl) groups at the interface region, as well as SMs with a decreased head group size, were synthesized, and their bilayer properties were studied by calorimetric and fluorescence spectroscopic techniques. In brief, the results showed that the packing of the ceramide acyl chains was more sensitive to methyl-branching in the mid part than in the distal end of the N-linked chain, and that disrupting the interfacial structure at the amide nitrogen, as opposed to the C3 hydroxyl, had a greater effect on the interlipid interactions of ceramides. Interestingly, it appeared that the bilayer properties of ceramides could be more sensitive to small alterations in the length of the long-chain base than was previously reported for the N-linked acyl chain. Furthermore, the data indicated that the SM head group does not strongly influence the interactions between SMs and ceramides. The results in this thesis illustrate the pivotal role of some essential parts of the ceramide molecule in determining its bilayer properties. The thesis provides increased understanding of the molecular aspects of ceramides that possibly affect their functions in biological membranes, and could relate to distinct effects on cell physiology.
Abstract:
Developing bioimage informatics – from microscopy to software solutions – with the α2β1 integrin as an application example. When the human genome was sequenced in 2003, determining the functions of the different genes became the main task of the life sciences, and various bioimaging techniques became central research methods. Technological advances led to an explosive growth in the popularity of fluorescence-based light microscopy techniques in particular, but microscopy had to transform from a qualitative science into a quantitative one. This transformation gave rise to a new discipline, bioimage informatics, which has been said to have the potential to revolutionize the life sciences. This doctoral thesis presents an extensive, interdisciplinary body of work in the field of bioimage informatics. The first aim of the thesis was to develop protocols for four-dimensional confocal microscopy of living cells, which was one of the fastest-growing bioimaging methods. The human collagen receptor α2β1 integrin, an important molecule in many physiological and pathological processes, served as the application example. The work produced clear visualizations of integrin movement, clustering and internalization into the cell, but tools for the quantitative analysis of the image information were lacking. The second aim of the thesis therefore became the development of computer software suited to such analysis. At the same time, the field of bioimage informatics was born, and what the new field needed most urgently was specialized software. Consequently, the most important result of this thesis work became BioImageXD, a novel open source software package for the visualization, processing and analysis of multidimensional bioimages. BioImageXD grew into one of the largest and most versatile packages in its field. It was published in the bioimage informatics special issue of Nature Methods, and it became well known and widely used. The third aim of the thesis was to apply the developed methods to something more practical. Artificial silica nanoparticles were produced, carrying antibodies recognizing the α2β1 integrin as "address labels". With BioImageXD it was shown that the nanoparticles have potential in targeted drug delivery applications. One fundamental aim of this thesis work was to advance the new and little-known discipline of bioimage informatics, and this aim was achieved especially through BioImageXD and its numerous published applications. The thesis work has significant potential for the future, but bioimage informatics faces serious challenges. The field is too complex for the average biomedical researcher to master, and its most central element, open source software development, is undervalued. Several improvements are needed in these respects,
Abstract:
In this master's thesis, wind speeds and directions were modeled with the aim of developing suitable models for hourly, daily, weekly and monthly forecasting. Artificial neural networks implemented in MATLAB were used to perform the forecasts. Three main types of artificial neural network were built, namely feed-forward neural networks, Jordan Elman neural networks and cascade-forward neural networks. Four sub-models of each of these network types were also built, corresponding to the four forecast horizons, for both wind speeds and wind directions. A single neural network topology was used for each forecast horizon, regardless of the model type. All the models were then trained with real wind speed and direction data collected over a period of two years in the municipality of Puumala in Finland. The first 70% of the data was used for training, validation and testing of the models, while the second-to-last 15% of the data was presented to the trained models for verification. The model outputs were then compared with the last 15% of the original data by measuring the mean square errors and sum square errors between them. Based on the results, the feed-forward networks returned the lowest generalization errors for hourly, weekly and monthly wind speed forecasts, while Jordan Elman networks returned the lowest errors for daily wind speed forecasts. Cascade-forward networks gave the lowest errors for daily, weekly and monthly wind direction forecasts, and Jordan Elman networks the lowest errors for hourly wind direction forecasts. The errors were relatively low during training of the models, but increased sharply when the models were simulated with new inputs. In addition, a combination of hyperbolic tangent transfer functions for both the hidden and output layers returned better results than other combinations of transfer functions. In general, wind speeds were more predictable than wind directions, opening up opportunities for further research into building better models for wind direction forecasting.
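To make the described setup concrete, below is a minimal sketch of an hourly wind-speed forecaster: a single-hidden-layer feed-forward network with hyperbolic tangent transfer functions in both the hidden and output layers and a chronological 70/15/15 data split, echoing the description above. It is written in Python/NumPy rather than the MATLAB environment used in the thesis, and the synthetic data, 24-hour lag length, hidden-layer size and training loop are illustrative assumptions, not details taken from the work.

```python
"""
Minimal sketch of an hourly wind-speed forecaster with a feed-forward network
using tanh activations in both the hidden and output layers. This is NOT the
thesis' MATLAB implementation: the data are synthetic placeholders (the Puumala
measurements are not public), and the lag length, network size and training
loop are illustrative assumptions.
"""
import numpy as np

rng = np.random.default_rng(0)

# --- Synthetic stand-in for two years of hourly wind-speed measurements ---
hours = 2 * 365 * 24
t = np.arange(hours)
speed = (5.0
         + 2.0 * np.sin(2 * np.pi * t / 24)          # diurnal cycle
         + 1.5 * np.sin(2 * np.pi * t / (24 * 365))  # seasonal cycle
         + rng.normal(0.0, 0.8, hours))              # noise
speed = np.clip(speed, 0.0, None)

# --- Lagged inputs: the previous 24 hourly speeds predict the next hour ---
lag = 24
X = np.stack([speed[i:i + lag] for i in range(hours - lag)])
y = speed[lag:]

# Chronological split: 70 % training, then 15 % verification, then 15 % comparison
n = len(y)
i70, i85 = int(0.70 * n), int(0.85 * n)
X_tr, y_tr = X[:i70], y[:i70]
X_ver, y_ver = X[i70:i85], y[i70:i85]
X_te, y_te = X[i85:], y[i85:]

# Scale inputs and targets into tanh's useful range
x_mu, x_sd = X_tr.mean(0), X_tr.std(0)
y_max = y_tr.max()
def scale_x(a): return (a - x_mu) / x_sd
def scale_y(a): return a / y_max  # keeps targets inside tanh's (-1, 1) output range

# --- One-hidden-layer feed-forward net, tanh in hidden and output layers ---
hidden = 10
W1 = rng.normal(0, 0.1, (lag, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(0, 0.1, (hidden, 1));   b2 = np.zeros(1)

def forward(Xs):
    h = np.tanh(Xs @ W1 + b1)
    return np.tanh(h @ W2 + b2).ravel(), h

Xs_tr, ys_tr = scale_x(X_tr), scale_y(y_tr)
lr = 0.05
for epoch in range(2000):                 # plain batch gradient descent on MSE
    pred, h = forward(Xs_tr)
    err = pred - ys_tr                    # dMSE/dpred (up to a constant factor)
    d_out = err * (1 - pred ** 2)         # tanh derivative at the output layer
    gW2 = h.T @ d_out[:, None] / len(err); gb2 = d_out.mean()
    d_hid = (d_out[:, None] @ W2.T) * (1 - h ** 2)
    gW1 = Xs_tr.T @ d_hid / len(err); gb1 = d_hid.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

def report(name, Xa, ya):
    pred, _ = forward(scale_x(Xa))
    pred = pred * y_max                   # back to m/s
    mse = np.mean((pred - ya) ** 2)
    print(f"{name}: MSE = {mse:.3f}, SSE = {mse * len(ya):.1f}")

report("verification (second-to-last 15 %)", X_ver, y_ver)
report("comparison (last 15 %)", X_te, y_te)
```

The mean square and sum square errors reported at the end mirror the error measures named in the abstract; the Jordan Elman and cascade-forward variants would differ only in how the hidden layer is wired, which is omitted here for brevity.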
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014