21 results for SOA approaches
in Helda - Digital Repository of the University of Helsinki
Abstract:
Approximately 125 prehistoric rock paintings have been found in the modern territory of Finland. The paintings were done with red ochre and are almost without exception located on steep lakeshore cliffs associated with ancient water routes. Most of the sites are found in the central and eastern parts of the country, especially on the shores of Lakes Päijänne and Saimaa. Using shore displacement chronology, the art has been dated to ca. 5000 – 1500 BC. It was thus created mainly during the Stone Age and can be associated with the so-called ‘Comb Ware’ cultures of the Subneolithic period. The range of motifs is rather limited, consisting mainly of schematic depictions of stick-figure humans, elks, boats, handprints and geometric signs. Few paintings include any evidence of narrative scenes, making their interpretation a rather difficult task. In Finnish archaeological literature, the paintings have traditionally been associated with ‘sympathetic’ hunting magic, or the belief that the ritual shooting of the painted animals would increase hunting luck. Some writers have also suggested totemistic and shamanistic readings of the art. This dissertation is a critical review of the interpretations offered of Finnish rock art and an exploration of the potential of archaeological and ethnographic research to increase our knowledge of its meaning. The methods used include ‘formal’ approaches such as archaeological excavation, landscape analysis and the application of neuropsychological research to the study of rock art, as well as ethnographically ‘informed’ approaches that make use of Saami and Baltic Finnish ethnohistorical sources in interpretation. In conclusion, it is argued that although North European hunter-gatherer rock art is often thought to lie beyond the reach of ‘informed’ knowledge, the exceptional continuity of prehistoric settlement in Finland validates the informed approach in the interpretation of Finnish rock paintings. The art can be confidently associated with shamanism of the kind still practiced by the Saami of Northern Fennoscandia in the historical period. Evidence of similar shamanistic practices, concepts and cosmology is also found in traditional Finnish-Karelian epic poetry. Previous readings of the art based on ‘hunting magic’ and totemism are rejected. Most of the paintings appear to depict experiences of falling into a trance, of shamanic metamorphosis and trance journeys, and of ‘spirit helper’ beings comparable to those employed by the Saami shaman (noaidi). As demonstrated by the results of an excavation at the rock painting of Valkeisaari, the painted cliffs themselves find a close parallel in the Saami cult of the ‘sieidi’, or sacred cliffs and boulders worshipped as expressions of a supernatural power. Like the Saami, the prehistoric inhabitants of the Finnish Lake Region seem to have believed that certain cliffs were ‘alive’ and inhabited by the spirit helpers of the shaman. The rock paintings can thus be associated with shamanic vision quests, and the making of ‘art’ with an effort to familiarize the other members of the community, especially the ritual specialists, with trance visions. However, the paintings were not merely to be looked at. The red ochre handprints pressed on images of elks, as well as the fact that many paintings appear ‘smeared’, indicate that they were also to be touched – perhaps in order to tap into the supernatural potency inherent in the cliff and in the paintings of spirit animals.
Abstract:
The aim of this dissertation was to adapt a questionnaire for assessing students’ approaches to learning and their experiences of the teaching-learning environment. A further aim was to explore the validity of the modified Experiences of Teaching and Learning Questionnaire (ETLQ) by examining how the instrument measures the underlying dimensions of student experiences and learning. The focus was on the relation between students’ experiences of their teaching-learning environment and their approaches to learning. Moreover, the relation between students’ experiences and students’ and teachers’ conceptions of good teaching was examined. In Study I the focus was on the use of the ETLQ in two different contexts: Finnish and British. The study aimed to explore the similarities and differences between the factor structures that emerged from the two data sets. The results showed that the factor structures concerning students’ experiences of their teaching-learning environment and their approaches to learning were highly similar in the two contexts. Study I also examined how students’ experiences of the teaching-learning environment are related to their approaches to learning in the two contexts. The results showed that students’ positive experiences of their teaching-learning environment were positively related to a deep approach to learning and negatively to a surface approach to learning in both the Finnish and British data sets. This result was replicated in Study II, which examined the relation between approaches to learning and experiences of the teaching-learning environment at the group level. Furthermore, Study II aimed to explore students’ approaches to learning and their experiences of the teaching-learning environment in different disciplines. The results showed that the deep approach to learning was more common in the soft sciences than in the hard sciences. In Study III, students’ conceptions of good teaching were explored using qualitative methods, more precisely open-ended questions. The aim was to examine students’ conceptions, disciplinary differences and their relation to students’ approaches to learning. The focus was on three disciplines, which differed in terms of students’ experiences of their teaching-learning environment. The results showed that students’ conceptions of good teaching were in line with the theory of good teaching, and that there were disciplinary differences in these conceptions. Study IV examined university teachers’ conceptions of good teaching, which corresponded to the learning-focused approach to teaching. A further aim of this doctoral dissertation was to compare students’ and teachers’ conceptions of good teaching; the comparison showed that these conceptions appear to have much in common. The four studies indicated that the ETLQ appears to be a sufficiently robust measurement instrument in different contexts. Moreover, its strength is that it can serve simultaneously as a valid research instrument and as a practical tool for enhancing the quality of students’ learning. In addition, the four studies emphasise that, in order to enhance teaching and learning in higher education, various perspectives have to be taken into account. This study sheds light on the interaction between students’ approaches to learning, their conceptions of good teaching, their experiences of the teaching-learning environment, and finally, the disciplinary culture.
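As a rough illustration of the kind of analysis such validation involves, the sketch below fits an exploratory factor model to item-level ETLQ responses from two samples and compares the resulting loading patterns with Tucker's congruence coefficient. The file names, column prefixes and number of factors are hypothetical assumptions for illustration, not the analyses actually reported in the studies.

```python
# Hypothetical sketch: compare ETLQ factor structures from two samples and
# relate scale scores to each other. Data files, column names and the number
# of factors are illustrative assumptions only.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def loadings(items: pd.DataFrame, n_factors: int = 3) -> np.ndarray:
    """Items x factors loading matrix from an exploratory factor analysis."""
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
    fa.fit(items.values)
    return fa.components_.T                       # shape (n_items, n_factors)

def tucker_phi(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Tucker's congruence coefficients between two loading matrices."""
    a_n = a / np.linalg.norm(a, axis=0)
    b_n = b / np.linalg.norm(b, axis=0)
    return a_n.T @ b_n                            # values near 1 = similar factors

finnish = pd.read_csv("etlq_finnish_items.csv")   # hypothetical item responses
british = pd.read_csv("etlq_british_items.csv")
phi = tucker_phi(loadings(finnish), loadings(british))
print(np.round(np.diag(phi), 2))                  # per-factor similarity

# Scale-level relation between experiences and approaches to learning,
# e.g. mean item scores per (hypothetical) scale prefix.
finnish["deep_approach"] = finnish.filter(like="deep_").mean(axis=1)
finnish["good_teaching"] = finnish.filter(like="teach_").mean(axis=1)
print(finnish[["deep_approach", "good_teaching"]].corr())
```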
Abstract:
The aim of this dissertation was to explore teaching in higher education from the teachers’ perspective. Two of the four studies analysed the effect of pedagogical training on approaches to teaching and on teachers’ self-efficacy beliefs concerning teaching. Of these two studies, Study I analysed the effect of pedagogical training by applying a cross-sectional design. The results showed that short training made teachers less student-centred and decreased their self-efficacy beliefs, as reported by the teachers themselves. However, more sustained training enhanced the adoption of a student-centred approach to teaching and also increased teachers’ self-efficacy beliefs. The teacher-focused approach to teaching was more resistant to change. Study II, on the other hand, applied a longitudinal design. The results implied that, among teachers who had not acquired more pedagogical training after Study II, there were no changes on the student-focused approach scale between the measurements. However, teachers who had participated in further pedagogical training scored significantly higher on the scale measuring the student-focused approach to teaching. There were positive changes in self-efficacy beliefs both among teachers who had not participated in further training and among those who had; however, the analysis revealed that those teachers had the least teaching experience. Again, the teacher-focused approach was more resistant to change. Study III analysed approaches to teaching qualitatively, using a large and multidisciplinary sample in order to capture the variation in descriptions of teaching. Two broad categories of description were found: the learning-focused and the content-focused approach to teaching. The results implied that the purpose of teaching separates the two categories. In addition, the study aimed to identify different aspects of teaching in the higher-education context. Ten aspects of teaching were identified. While Study III explored teaching on a general level, Study IV analysed teaching on an individual level. The aim was to explore consonance and dissonance in the combinations of approaches to teaching that university teachers adopt. The results showed that some teachers were clearly and systematically either learning- or content-focused. On the other hand, the profiles of some teachers consisted of combinations of learning- and content-focused approaches or conceptions, making their profiles dissonant. Three types of dissonance were identified. The four studies indicated that pedagogical training organised for university teachers is needed in order to enhance the development of their teaching. The results implied that the shift from content-focused or dissonant profiles towards consonant learning-focused profiles is a slow process, and that teachers’ conceptions of teaching have to be addressed first in order to promote learning-focused teaching.
Abstract:
This study addresses the following question: how should we think about ethics in a technological world? The question is treated first thematically by framing central issues in the relationship between ethics and technology. This relationship has three distinct facets: i) technological advance poses new challenges for ethics, ii) traditional ethics may become poorly applicable in a technologically transformed world, and iii) the progress in science and technology has altered the concept of rationality in ways that undermine ethical thinking itself. The thematic treatment is followed by the description and analysis of three approaches to the questions framed. First, Hans Jonas's thinking on the ontology of life and the imperative of responsibility is studied. In Jonas's analysis, modern culture is found to be nihilistic because it is unable to understand organic life, to find meaning in reality, and to justify morals. At the root of nihilism Jonas finds dualism, the traditional Western way of seeing consciousness as radically separate from the material world. Jonas attempts to create a metaphysical grounding for an ethic that would take the technologically increased human powers into account and make the responsibility for future generations meaningful and justified. The second approach is Albert Borgmann's philosophy of technology, which mainly assesses the ways in which technological development has affected everyday life. Borgmann admits that modern technology has liberated humans from toil, disease, danger, and sickness. Furthermore, liberal democracy, possibilities for self-realization, and many of the freedoms we now enjoy would not be possible on a large scale without technology. Borgmann, however, argues that modern technology in itself does not provide a whole and meaningful life. In fact, technological conditions are often detrimental to the good life. Integrity in life, according to him, is to be sought among things and practices that evade technoscientific objectification and commodification. Larry Hickman's Deweyan philosophy of technology is the third approach under scrutiny. Central to Hickman's thinking is a broad definition of technology that is nearly equivalent to Deweyan inquiry. Inquiry refers to the reflective and experiential way humans adapt to their environment by modifying their habits and beliefs. In Hickman's work, technology consists of all kinds of activities that through experimentation and/or reflection aim at improving human techniques and habits. Thus, in addition to research and development, many arts and political reforms are technological for Hickman. He argues for recasting such distinctions as fact/value, poiesis/praxis/theoria, and individual/society. Finally, Hickman does not admit a categorical difference between ethics and technology: moral values and norms need to be submitted to experiential inquiry, as do all other notions. This study argues mainly for an interdisciplinary approach to the ethics of technology. This approach should make use of the potentialities of the research traditions in applied ethics, the philosophy of technology, and the social studies of science and technology, and attempt to overcome their limitations. The study also advocates an endorsement of mid-level ethics that concentrates on the practices, institutions, and policies of temporal human life. ‘Mid-level’ describes the realm between the instantaneous and individualistic micro-level and the universal and global macro-level.
Abstract:
One major reason for the global decline of biodiversity is habitat loss and fragmentation. Conservation areas can be designed to reduce biodiversity loss, but as resources are limited, conservation efforts need to be prioritized in order to achieve the best possible outcomes. The field of systematic conservation planning developed as a response to opportunistic approaches to conservation that often resulted in a biased representation of biological diversity. The last two decades have seen the development of increasingly sophisticated methods that account for information about biodiversity conservation goals (benefits), economic considerations (costs) and socio-political constraints. In this thesis I focus on two general topics related to systematic conservation planning. First, I address two aspects of the question of how biodiversity features should be valued. (i) I investigate the extremely important but often neglected issue of the differential prioritization of species for conservation. Species prioritization can be based on various criteria, and is always goal-dependent, but it can also be implemented in a scientifically more rigorous way than is the usual practice. (ii) I introduce a novel framework for conservation prioritization, which is based on continuous benefit functions that convert increasing levels of biodiversity feature representation into increasing conservation value, following the principle that more is better. Traditional target-based systematic conservation planning is a special case of this approach, in which a step function is used as the benefit function. We have further expanded the benefit function framework for area prioritization to address issues such as protected area size and habitat vulnerability. In the second part of the thesis I address the application of community-level modelling strategies to conservation prioritization. One of the most serious issues in systematic conservation planning currently is not a deficiency of methodology for selection and design, but simply the lack of data. Community-level modelling offers a surrogate strategy that makes conservation planning more feasible in data-poor regions. We have reviewed the available community-level approaches to conservation planning. These range from simplistic classification techniques to sophisticated modelling and selection strategies. We have also developed a general and novel community-level approach to conservation prioritization that significantly improves on previously available methods. This thesis introduces further degrees of realism into conservation planning methodology. The benefit-function-based conservation prioritization framework largely circumvents the problematic phase of target setting and, by allowing trade-offs between the representation levels of different species, provides a more flexible and hopefully more attractive approach for conservation practitioners. The community-level approach seems highly promising and should prove valuable for conservation planning especially in data-poor regions. Future work should focus on integrating prioritization methods to deal with the multiple aspects that jointly influence the prioritization process, and on further testing and refining the community-level strategies using real, large datasets.
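To make the contrast with target-based planning concrete, the benefit-function idea can be written in a simple additive form. The notation below is a minimal formalization under an assumed additive structure, not the exact formulation used in the thesis.

```latex
% Conservation value of a candidate solution S, summed over biodiversity
% features j, where r_j(S) is the representation level feature j attains in S
% and w_j is a feature-specific weight.
V(S) = \sum_j w_j \, B_j\bigl(r_j(S)\bigr)

% Target-based planning as a special case: a step-function benefit at target T_j.
B_j(r_j) = \begin{cases} 0, & r_j < T_j \\ 1, & r_j \ge T_j \end{cases}

% A continuous alternative expressing "more is better" with diminishing returns,
% e.g. a concave power function with exponent 0 < z < 1.
B_j(r_j) = r_j^{\,z}
```

A concave benefit function lets a shortfall for one feature be traded against gains for others, which a step function forbids below its target; this is the flexibility referred to above.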
Abstract:
Cord blood is a well-established alternative to bone marrow and peripheral blood as a source of stem cells for transplantation. To date, over 400 000 unrelated donor cord blood units have been stored in cord blood banks worldwide. To enable successful cord blood transplantation, recent efforts have focused on finding ways to increase the hematopoietic progenitor cell content of cord blood units. In this study, factors that may improve the selection and quality of cord blood collections for banking were identified. In 167 consecutive cord blood units collected from healthy full-term neonates and processed at a national cord blood bank, mean platelet volume (MPV) correlated with the numbers of cord blood unit hematopoietic progenitors (CD34+ cells and colony-forming units); this is a novel finding. Mean platelet volume can be thought of as representing general hematopoietic activity, as newly formed platelets have been reported to be large. Stress during delivery is hypothesized to lead to the mobilization of hematopoietic progenitor cells through cytokine stimulation. Accordingly, low-normal umbilical arterial pH, thought to be associated with perinatal stress, correlated with high cord blood unit CD34+ cell and colony-forming unit numbers. The associations were closer in vaginal deliveries than in Cesarean sections. Vaginal delivery entails specific physiological changes, which may also affect the hematopoietic system. Thus, different factors may predict cord blood hematopoietic progenitor cell numbers in the two modes of delivery. Theoretical models were created to enable the use of platelet characteristics (mean platelet volume) and perinatal factors (umbilical arterial pH and placental weight) in the selection of cord blood collections with high hematopoietic progenitor cell counts. These observations could thus be implemented as part of the evaluation of cord blood collections for banking. The quality of cord blood units has been the focus of several recent studies. However, hemostasis activation during cord blood collection is rarely evaluated in cord blood banks. In this study, hemostasis activation was assessed with prothrombin activation fragment 1+2 (F1+2), a direct indicator of thrombin generation, and platelet factor 4 (PF4), an indicator of platelet activation. Altogether three sample series were collected, during the set-up of the cord blood bank as well as after changes in personnel and collection equipment. The activation decreased from the first to the subsequent series, which were collected with the bank fully in operation and following international standards, and was at a level similar to that previously reported for healthy neonates. As hemostasis activation may have unwanted effects on cord blood cell contents, it should be minimized. The assessment of hemostasis activation could be implemented as part of process control in cord blood banks. Culture assays provide information about the hematopoietic potential of the cord blood unit. In processed cord blood units prior to freezing, megakaryocytic colony growth was evaluated in semisolid cultures with a novel scoring system. Three investigators analyzed the colony assays, and their scores were highly concordant. With such scoring systems, the growth potential of various cord blood cell lineages can be assessed. In addition, erythroid cells were observed in liquid cultures of cryostored and thawed, unseparated cord blood units without exogenous erythropoietin. This was hypothesized to be due to the erythropoietic effect of thrombopoietin, endogenous erythropoietin production, and diverse cell-cell interactions in the culture. This observation underscores the complex interactions of cytokines and supporting cells in the heterogeneous cell population of the thawed cord blood unit.
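As a rough illustration of how such selection models could be applied in a banking workflow, the sketch below regresses the (log-transformed) CD34+ cell count of a unit on mean platelet volume, umbilical arterial pH and placental weight, then flags promising collections. The variable names, the linear functional form and the cut-off are hypothetical assumptions, not the models reported in the thesis.

```python
# Hypothetical sketch of a selection model for cord blood collections:
# predict the log10 CD34+ cell count of a unit from mean platelet volume (MPV),
# umbilical arterial pH and placental weight, then flag collections expected
# to be rich in hematopoietic progenitors. Illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.api as sm

units = pd.read_csv("cord_blood_units.csv")        # hypothetical banking data

X = sm.add_constant(units[["mpv_fl", "ua_ph", "placenta_g"]])
y = np.log10(units["cd34_count"])

model = sm.OLS(y, X).fit()
print(model.summary())                             # direction and strength of predictors

# Flag collections whose predicted CD34+ count exceeds a chosen banking cut-off.
units["predicted_cd34"] = 10 ** model.predict(X)
units["bank_candidate"] = units["predicted_cd34"] > 2e6   # arbitrary threshold
print(units["bank_candidate"].mean())
```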
Abstract:
Cancer, when widely metastatic, is a devastating disease with a poor prognosis and no curative treatment. Conventional therapies, such as chemotherapy and radiotherapy, have efficacy but are not curative, and their systemic toxicity can be considerable. Almost all cancers are caused by changes in the genetic material of the transformed cells. Cancer gene therapy has emerged as a new treatment option, and the past decades have brought new insights into the development of therapeutic drugs for curing cancer. Oncolytic viruses constitute a novel therapeutic approach given their capacity to replicate in and specifically kill tumor cells, as well as to reach distant metastases. Adenoviral gene therapy has been suggested to cause liver toxicity. This study shows that newly developed adenoviruses, in particular Ad5/19p-HIT, can be redirected towards the kidney while adenovirus uptake by the liver is minimal. Moreover, low liver transduction resulted in a favorable tumor-to-liver ratio of virus load. Further, we established a new immunocompetent animal model, the Syrian hamster. Wild-type adenovirus 5 was found to replicate in Hap-T1 hamster tumors and normal tissues. There are no antiviral drugs available to inhibit adenovirus replication. In our study, chlorpromazine and cidofovir efficiently abrogated virus replication in vitro and significantly reduced it in vivo in tumors and liver. With these safety concerns addressed and new antiviral treatment options established, we further improved oncolytic adenoviruses for better tumor penetration, local amplification and modulation of the host system. Further, we created Ad5/3-9HIF-Δ24-VEGFR-1-Ig, an oncolytic adenovirus with improved infectivity and an antiangiogenic effect for the treatment of renal cancer. This virus exhibited an increased anti-tumor effect and specific replication in kidney cancer cells. A key factor in the efficacy of oncolytic virotherapy is the host immune response. Thus, we engineered a triple-targeted adenovirus, Ad5/3-hTERT-E1A-hCD40L, designed to lead to tumor elimination through tumor-specific oncolysis and apoptosis together with an anti-tumor immune response prompted by the immunomodulatory molecule. In conclusion, the results presented in this thesis constitute advances in our understanding of oncolytic virotherapy through successful tumor targeting, antiviral treatment options as a safety switch in case of replication-associated side effects, and modulation of the host immune system towards tumor elimination.
Abstract:
Aerosols impact the planet and our daily lives through various effects, perhaps most notably those related to their climatic and health-related consequences. While there are several primary particle sources, secondary new particle formation from precursor vapors is also known to be a frequent, global phenomenon. Nevertheless, the formation mechanism of new particles, as well as the vapors participating in the process, remain a mystery. This thesis consists of studies on new particle formation, specifically from the point of view of numerical modeling. A dependence of the formation rate of 3 nm particles on the sulphuric acid concentration to the power of 1-2 has been observed. This suggests that the nucleation mechanism is of first or second order with respect to the sulphuric acid concentration, in other words a mechanism based on activation or on kinetic collision of clusters. However, model studies have had difficulties in replicating the small exponents observed in nature. The work done in this thesis indicates that the exponents may be lowered by the participation of a co-condensing (and potentially nucleating) low-volatility organic vapor, or by increasing the assumed size of the critical clusters. On the other hand, the new and more accurate method presented here for determining the exponent indicates that the exponent exhibits high diurnal variability. Additionally, these studies included several semi-empirical nucleation rate parameterizations as well as a detailed investigation of the analysis used to determine the apparent particle formation rate. Because they cover a large proportion of the Earth's surface area, oceans could potentially prove to be climatically significant sources of secondary particles. In the absence of marine observation data, new particle formation events in a coastal region were parameterized and studied. Since the formation mechanism is believed to be similar, the new parameterization was applied in a marine scenario. The work showed that marine CCN production is feasible in the presence of additional vapors contributing to particle growth. Finally, a new method to estimate the concentrations of condensing organics was developed. The algorithm utilizes a Markov chain Monte Carlo method to determine the required combination of vapor concentrations by comparing a measured particle size distribution with one from an aerosol dynamics process model. The evaluation indicated excellent agreement with model data, and initial results with field data appear sound as well.
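For reference, the activation- and kinetic-type nucleation mechanisms mentioned above are commonly parameterized as follows (standard forms from the aerosol literature; the coefficients A and K are empirical and not values determined in this thesis):

```latex
% Activation-type nucleation: first order in the sulphuric acid concentration.
J_{\mathrm{act}} = A \,[\mathrm{H_2SO_4}]

% Kinetic (collision-controlled) nucleation: second order.
J_{\mathrm{kin}} = K \,[\mathrm{H_2SO_4}]^{2}

% The exponent n discussed in the text is the log-log slope of the apparent
% 3 nm particle formation rate against the sulphuric acid concentration:
n = \frac{\mathrm{d}\,\ln J_{3}}{\mathrm{d}\,\ln [\mathrm{H_2SO_4}]}
```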
Abstract:
Nucleation is the first step of a first-order phase transition; a new phase always emerges in nucleation phenomena. The two main categories of nucleation are homogeneous nucleation, where the new phase is formed within a uniform substance, and heterogeneous nucleation, where nucleation occurs on a pre-existing surface. In this thesis the main attention is paid to heterogeneous nucleation. The thesis treats nucleation phenomena from two theoretical perspectives: the classical nucleation theory and the statistical mechanical approach. The formulation of the classical nucleation theory relies on equilibrium thermodynamics and on the use of macroscopically determined quantities to describe the properties of small nuclei, sometimes consisting of just a few molecules. The statistical mechanical approach is based on interactions between single molecules and does not rest on the same assumptions as the classical theory. This work gathers up the present theoretical knowledge of heterogeneous nucleation and utilizes it in computational model studies. A new exact molecular approach to heterogeneous nucleation was introduced and tested by Monte Carlo simulations. The results obtained from the molecular simulations were interpreted by means of the concepts of the classical nucleation theory. Numerical calculations were carried out for a variety of substances nucleating on different substrates. The classical theory of heterogeneous nucleation was employed in calculations of one-component nucleation of water on newsprint paper, Teflon and cellulose film, and of binary nucleation of water-n-propanol and water-sulphuric acid mixtures on silver nanoparticles. The results were compared with experimental results. The molecular simulation studies involved homogeneous nucleation of argon and heterogeneous nucleation of argon on a planar platinum surface. It was found that the use of a microscopic contact angle as a fitting parameter in calculations based on the classical theory of heterogeneous nucleation leads to fair agreement between the theoretical predictions and the experimental results. In the presented cases the microscopic angle was found to be consistently smaller than the contact angle obtained from macroscopic measurements. Furthermore, the molecular Monte Carlo simulations revealed that the concept of a geometrical contact parameter in heterogeneous nucleation calculations can work surprisingly well even for very small clusters.
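As background to the contact-angle fitting described above, in the classical theory heterogeneous nucleation on a planar substrate lowers the homogeneous free-energy barrier by a geometric factor determined by the contact angle. The expressions below are the standard textbook forms, not results specific to this thesis.

```latex
% Free-energy barrier for heterogeneous nucleation on a flat substrate,
% written in terms of the homogeneous barrier and the contact angle theta:
\Delta G^{*}_{\mathrm{het}} = f(\theta)\,\Delta G^{*}_{\mathrm{hom}},
\qquad
f(\theta) = \frac{(2 + \cos\theta)(1 - \cos\theta)^{2}}{4}

% Classical homogeneous barrier for a spherical nucleus with surface tension
% sigma, molecular volume v, temperature T and saturation ratio S:
\Delta G^{*}_{\mathrm{hom}} = \frac{16\pi\,\sigma^{3} v^{2}}{3\,\bigl(k_{B} T \ln S\bigr)^{2}}
```

Since f(θ) decreases towards zero as θ decreases, fitting a microscopic contact angle smaller than the macroscopic one corresponds to a lower nucleation barrier and hence a higher predicted nucleation rate.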