21 results for Virtual elements

in Helda - Digital Repository of the University of Helsinki


Relevance: 30.00%

Abstract:

My thesis concerns the notion of existence as an encounter, as developed in the philosophy of Gilles Deleuze (1925–1995). What this denotes is a critical stance towards a major current in the Western philosophical tradition which Deleuze terms representational thinking. Such thinking strives to provide a stable ground for identities by appealing to transcendent structures behind apparent reality and explaining the manifest diversity of the given by such notions as essence, idea, God, or the totality of the world. In contrast to this, Deleuze states that abstractions such as these do not explain anything; rather, they themselves need to be explained. Yet Deleuze does not appeal merely to the given. He sees that one must posit a genetic element that accounts for experience, and this element must not be naïvely traced from the empirical. Deleuze names his philosophy transcendental empiricism and seeks to bring together the approaches of both empiricism and transcendental philosophy. In chapter one I look into the motivations of Deleuze's transcendental empiricism and analyse it as an encounter between Deleuze's readings of David Hume and Immanuel Kant. This encounter concerns, first of all, the question of subjectivity and results in a conception of identity as a non-essential process. A pre-given concept of identity does not explain the nature of things; the concept itself must be explained. From this point of view, the process of individualisation must become the central concern. In chapter two I discuss Deleuze's concept of the affect as the basis of identity and his affiliation with the theories of Gilbert Simondon and Jakob von Uexküll. From this basis develops a morphogenetic theory of individuation-as-process. In analysing such a process of individuation, the modal category of the virtual becomes of great value, being an open, indeterminate charge of potentiality. As the virtual concerns becoming, or the continuous process of actualisation, time rather than space will be the privileged field of consideration. Chapter three is devoted to the discussion of the temporal aspect of the virtual and difference-without-identity. The essentially temporal process of subjectification results in a conception of the subject as composition: an assemblage of heterogeneous elements. Art and aesthetic experience are therefore valued by Deleuze because they disclose the construct-like nature of subjectivity in the sensations they produce. Through the domain of the aesthetic the subject is immersed in the network of affectivity that is the material diversity of the world. Chapter four addresses a phenomenon displaying this diversified identity: the simulacrum, an identity that is not grounded in an essence. Developed on the basis of the simulacrum, a theory of identity as assemblage emerges in chapter five. As the problematic of simulacra concerns perhaps foremost artistic presentation, I look into the identity of a work of art as assemblage. To take an example of a concrete artistic practice and to remain within the problematic of the simulacrum, I finally address the question of reproduction, particularly in the case of recorded music and its identity in relation to the work of art. In conclusion, I propose that by overturning its initial representational schema, phonographic music addresses its own medium and turns it into an inscription of difference, exposing the listener to an encounter with the virtual.

Relevance: 20.00%

Abstract:

A 26-hour English reading comprehension course was taught to two groups of second-year Finnish Pharmacy students: a virtual group (33 students) and a teacher-taught group (25 students). The aims of the teaching experiment were to find out: 1. What has to be taken into account when teaching English reading comprehension to students of pharmacy via the Internet and using TopClass? 2. How will the learning outcomes of the virtual group and the control group differ? 3. How will the students and the Department of Pharmacy respond to the different and new method, i.e. the virtual teaching method? 4. Will it be possible to test English reading comprehension learning material using the groupware tool TopClass? The virtual exercises were written within the Internet authoring environment TopClass. The virtual group was given the reading material and grammar booklet on paper, but they did the reading comprehension tasks (written by the teacher) autonomously via the Internet. The control group was taught by the same teacher in twelve 2-hour sessions, while the virtual group could work independently within the given six weeks. Both groups studied the same material: ten pharmaceutical articles with reading comprehension tasks as well as grammar and vocabulary exercises. Both groups took the same final test. Students in both groups were asked to evaluate the course using a 1-to-5 rating scale, and they were also asked to assess their respective courses verbally. A detailed analysis of the different aspects of the student evaluation is given. Conclusions: 1. The virtual students learned pharmaceutical English relatively well, but not significantly better than the classroom students. 2. The overall student satisfaction in the virtual pharmacy English reading comprehension group was found to be higher than that in the teacher-taught control group. 3. Virtual learning is easier for linguistically more able students; less able students need more time with the teacher. 4. The sample in this study is rather small, but it is a pioneering study. 5. The Department of Pharmacy at the University of Helsinki wishes to incorporate virtual English reading comprehension teaching into its curriculum. 6. The sophisticated and versatile TopClass system is relatively easy for a traditional teacher and quite easy for the students to learn. It can be used, e.g., for automatic checking of routine answers and for document transfer, which lighten the workload of both parties. It is especially convenient for teaching reading comprehension. Key words: English reading comprehension, teacher-taught class, virtual class, attitudes of students, learning outcomes

Relevance: 20.00%

Abstract:

Strategies of scientific, question-driven inquiry are considered important cultural practices that should be taught in schools and universities. The present study focuses on investigating multiple efforts to implement a model of Progressive Inquiry and related Web-based tools in primary, secondary and university level education, in order to develop guidelines for educators in promoting students' collaborative inquiry practices with technology. The research consists of four studies. In Study I, the aims were to investigate how a human tutor contributed to the university students' collaborative inquiry process through virtual forums, and how the influence of the tutoring activities is demonstrated in the students' inquiry discourse. Study II examined an effort to implement technology-enhanced progressive inquiry as a distance working project in a middle school context. Study III examined multiple teachers' methods of organizing progressive inquiry projects in primary and secondary classrooms through a generic analysis framework. In Study IV, a design-based research effort consisting of four consecutive university courses applying progressive inquiry pedagogy was retrospectively re-analyzed in order to develop the generic design framework. The results indicate that appropriate teacher support for students' collaborative inquiry efforts involves an interplay between spontaneity and structure. Careful consideration should be given to the content mastery, critical working strategies or essential knowledge practices that the inquiry approach is intended to promote. In particular, those elements of students' activities should be structured and directed which are central to the aim of Progressive Inquiry, but which the students do not recognize or demonstrate spontaneously, and which are usually not taken into account in existing pedagogical methods or educational conventions. Such elements include productive co-construction activities; sustained engagement in improving produced ideas and explanations; critical reflection on the adopted inquiry practices; and sophisticated use of modern technology for knowledge work. Concerning the scaling-up of inquiry pedagogy, it was concluded that an individual teacher can also apply the principles of Progressive Inquiry in his or her own teaching in many innovative ways, even under various institutional constraints. The Pedagogical Infrastructure Framework developed in the study made it possible to recognize and examine central features, and their interplay, in the designs of the examined inquiry units. The framework may help to recognize and critically evaluate the invisible learning-cultural conventions in various educational settings and can mediate discussions about how to overcome or change them.

Relevance: 20.00%

Abstract:

The aim of the study was to analyze and facilitate collaborative design in a virtual learning environment (VLE). Discussions of virtual design in design education have typically focused on technological or communication issues rather than on pedagogical issues. Yet in order to facilitate collaborative design, it is also necessary to address the pedagogical issues related to the virtual design process. In this study, the progressive inquiry model of collaborative designing was used to give a structural level of facilitation to students working in the VLE. According to this model, all aspects of inquiry, such as creating the design context, constructing a design idea, evaluating the idea, and searching for new information, can be shared in a design community. The study consists of three design projects: 1) designing clothes for premature babies, 2) designing conference bags for an international conference, and 3) designing tactile books for visually impaired children. These design projects constituted a continuum of design experiments, each of which highlighted certain perspectives on collaborative designing. The design experiments were organized so that the participants worked in design teams, both face-to-face and virtually. The first design experiment focused on peer collaboration among textile teacher students in the VLE. The second design experiment took end-users' needs into consideration by using a participatory design approach. The third design experiment intensified computer-supported collaboration between students and domain experts. The virtual learning environments in these design experiments were designed to support knowledge-building pedagogy and progressive inquiry learning. These environments enabled a detailed recording of all computer-mediated interactions and data related to virtual designing. The data analysis was based on qualitative content analysis of design statements in the VLE. This study indicated four crucial issues concerning collaborative design in the VLE in craft and design education. Firstly, using the collaborative design process in craft and design education gives rise to special challenges of building learning communities, creating appropriate design tasks for them, and providing tools for collaborative activities. Secondly, the progressive inquiry model of collaborative designing can be used as a scaffold for design thinking and for reflection on the design process. Thirdly, participation and distributed expertise can be facilitated by identifying the key stakeholders related to the design task or design context and getting them to participate in virtual designing. Fourthly, in the collaborative design process it is important that team members create and improve visual and technical ideas together, not just agree or disagree about proposed ideas. Therefore, viewing the VLE as a medium for collaborative construction of the design objects appears crucial in order to understand and facilitate the complex processes in collaborative designing.

Relevance: 20.00%

Abstract:

Recent epidemiological studies have shown a consistent association of the mass concentration of urban air thoracic (PM10) and fine (PM2.5) particles with mortality and morbidity among cardiorespiratory patients. However, the chemical characteristics of different particulate size ranges and the biological mechanisms responsible for these adverse health effects are not well known. The principal aims of this thesis were to validate a high volume cascade impactor (HVCI) for the collection of particulate matter for physicochemical and toxicological studies, and to make an in-depth chemical and source characterisation of samples collected during different pollution situations. The particulate samples were collected with the HVCI, virtual impactors and a Berner low pressure impactor in six European cities: Helsinki, Duisburg, Prague, Amsterdam, Barcelona and Athens. The samples were analysed for particle mass, common ions, total and water-soluble elements, as well as elemental and organic carbon. Laboratory calibration and field comparisons indicated that the HVCI can provide a unique large-capacity, high-efficiency sampling of size-segregated aerosol particles. The cutoff sizes of the recommended HVCI configuration were 2.4, 0.9 and 0.2 μm. The HVCI mass concentrations were in good agreement with the reference methods, but the chemical composition of especially the fine particulate samples showed some differences. This implies that the chemical characterization of the exposure variable in toxicological studies needs to be done from the same HVCI samples as are used in cell and animal studies. The data from parallel, low volume reference samplers provide valuable additional information for chemical mass closure and source assessment. The major components of PM2.5 in the virtual impactor samples were carbonaceous compounds, secondary inorganic ions and sea salt, whereas those of coarse particles (PM2.5-10) were soil-derived compounds, carbonaceous compounds, sea salt and nitrate. The major and minor components together accounted for 77-106% and 77-96% of the gravimetrically measured masses of fine and coarse particles, respectively. Relatively large differences between sampling campaigns were observed in the organic carbon content of the PM2.5 samples as well as in the mineral composition of the PM2.5-10 samples. A source assessment based on chemical tracers suggested clear differences in the dominant sources (e.g. traffic, residential heating with solid fuels, metal industry plants, regional or long-range transport) between the sampling campaigns. In summary, the field campaigns exhibited different profiles with regard to particulate sources, size distribution and chemical composition, thus providing a highly useful setup for toxicological studies on the size-segregated HVCI samples.
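
For readers unfamiliar with chemical mass closure, the comparison reported above (the analysed components accounting for 77-106% of the gravimetric fine-particle mass) amounts to summing the measured component concentrations and expressing the sum as a percentage of the gravimetrically determined mass. The sketch below is a minimal illustration of that calculation only; the component names and concentration values are hypothetical and are not data from the thesis.

```python
# Minimal sketch of a chemical mass closure calculation for a PM2.5 sample.
# Component names and concentrations (ug/m3) are hypothetical illustration
# values, not results from the study.

fine_components = {
    "organic_matter": 4.1,             # organic carbon converted to organic matter
    "elemental_carbon": 0.9,
    "secondary_inorganic_ions": 3.2,   # sulphate + nitrate + ammonium
    "sea_salt": 0.6,
    "soil_derived": 0.5,
    "trace_elements": 0.2,
}

gravimetric_mass = 10.3  # gravimetrically measured PM2.5 mass, ug/m3

reconstructed_mass = sum(fine_components.values())
closure = 100.0 * reconstructed_mass / gravimetric_mass

print(f"Reconstructed mass: {reconstructed_mass:.1f} ug/m3")
print(f"Mass closure: {closure:.0f}% of the gravimetric mass")
```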

Relevance: 20.00%

Abstract:

This thesis presents methods for locating and analyzing cis-regulatory DNA elements involved in the regulation of gene expression in multicellular organisms. The regulation of gene expression is carried out by the combined effort of several transcription factor proteins collectively binding the DNA at the cis-regulatory elements. Only sparse knowledge of the 'genetic code' of these elements exists today. An automatic tool for the discovery of putative cis-regulatory elements could help their experimental analysis, which would result in a more detailed view of cis-regulatory element structure and function. We have developed a computational model for the evolutionary conservation of cis-regulatory elements. The elements are modeled as evolutionarily conserved clusters of sequence-specific transcription factor binding sites. We give an efficient dynamic programming algorithm that locates the putative cis-regulatory elements and scores them according to the conservation model. A notable proportion of the high-scoring DNA sequences show transcriptional enhancer activity in transgenic mouse embryos. The conservation model includes four parameters whose optimal values are estimated with simulated annealing. With good parameter values the model discriminates well between DNA sequences with evolutionarily conserved cis-regulatory elements and DNA sequences that have evolved neutrally. In further inquiry, the set of highest-scoring putative cis-regulatory elements was found to be sensitive to small variations in the parameter values. The statistical significance of the putative cis-regulatory elements is estimated with the Two Component Extreme Value Distribution. The p-values grade the conservation of the cis-regulatory elements above the neutral expectation. The parameter values for the distribution are estimated by simulating neutral DNA evolution. The conservation of the transcription factor binding sites can be used in the upstream analysis of regulatory interactions. This approach may provide mechanistic insight into transcription-level data from, e.g., microarray experiments. Here we give a method to predict shared transcriptional regulators for a set of co-expressed genes. The EEL (Enhancer Element Locator) software implements the method for locating putative cis-regulatory elements. The software facilitates both interactive use and distributed batch processing. We have used it to analyze the non-coding regions around all human genes with respect to the orthologous regions in various other species, including mouse. The data from these genome-wide analyses are stored in a relational database which is used in the publicly available web services for upstream analysis and visualization of the putative cis-regulatory elements in the human genome.
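
As a rough illustration of the kind of scoring the abstract describes, the sketch below counts transcription factor binding-site motifs that occur at the same position in two aligned sequences and picks the window containing the most conserved sites. This is a deliberately simplified toy, not the thesis's conservation model or the EEL algorithm: motifs are treated as exact strings, gaps are ignored, and the window scheme and example data are assumptions made for illustration only.

```python
# Toy illustration: score windows of a pairwise alignment by the number of
# transcription factor binding-site motifs conserved at the same aligned
# position in both species. This is NOT the EEL conservation model.

from typing import Dict, List, Tuple

def conserved_site_positions(seq_a: str, seq_b: str,
                             motifs: Dict[str, str]) -> List[Tuple[int, str]]:
    """Return (position, motif_name) pairs where a motif occurs in both
    aligned sequences starting at the same column."""
    hits = []
    for name, motif in motifs.items():
        k = len(motif)
        for i in range(min(len(seq_a), len(seq_b)) - k + 1):
            if seq_a[i:i + k] == motif and seq_b[i:i + k] == motif:
                hits.append((i, name))
    return hits

def best_window(hits: List[Tuple[int, str]], window: int = 50) -> Tuple[int, int]:
    """Return (start, score) of the window containing the most conserved sites."""
    best = (0, 0)
    positions = sorted(p for p, _ in hits)
    for start in positions:
        score = sum(1 for p in positions if start <= p < start + window)
        if score > best[1]:
            best = (start, score)
    return best

# Hypothetical example alignment and motifs (illustration only).
human = "ACGTTTGACGTCAGGGATTACGTCATTTTGGGCAATTGCCACGTCA"
mouse = "ACGATTGACGTCAGGTATTACGTCAGTTTGGCCAATTGCCACGTCA"
motifs = {"CREB-like": "TGACGTCA", "boxA": "ACGTCA"}

hits = conserved_site_positions(human, mouse, motifs)
print("Conserved site hits:", hits)
print("Best window (start, n_sites):", best_window(hits, window=30))
```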

Relevance: 20.00%

Abstract:

Volatility is central to options pricing and risk management. It reflects the uncertainty of investors and the inherent instability of the economy. Time series methods are among the most widely applied scientific methods used to analyze and predict volatility. Very frequently sampled data contain much valuable information about the different elements of volatility and may ultimately reveal the reasons for time-varying volatility. The use of such ultra-high-frequency data is common to all three essays of the dissertation. The dissertation belongs to the field of financial econometrics. The first essay uses wavelet methods to study the time-varying behavior of scaling laws and long memory in the five-minute volatility series of Nokia on the Helsinki Stock Exchange around the burst of the IT bubble. The essay is motivated by earlier findings which suggest that different scaling laws may apply to intraday time-scales and to larger time-scales, implying that the so-called annualized volatility depends on the data sampling frequency. The empirical results confirm the appearance of time-varying long memory and different scaling laws that, for a significant part, can be attributed to investor irrationality and to an intraday volatility periodicity called the New York effect. The findings have potentially important consequences for options pricing and risk management, which commonly assume constant memory and scaling. The second essay investigates modelling the duration between trades in stock markets. Durations convey information about investor intentions and provide an alternative view of volatility. Generalizations of standard autoregressive conditional duration (ACD) models are developed to meet needs observed in previous applications of the standard models. According to the empirical results, based on data on actively traded stocks on the New York Stock Exchange and the Helsinki Stock Exchange, the proposed generalization clearly outperforms the standard models and also performs well in comparison to another recently proposed alternative to the standard models. The distribution used to derive the generalization may also prove valuable in other areas of risk management. The third essay studies empirically the effect of decimalization on volatility and market microstructure noise. Decimalization refers to the change from fractional pricing to decimal pricing, and it was carried out on the New York Stock Exchange in January 2001. The methods used here are more accurate than in the earlier studies and put more weight on market microstructure. The main result is that decimalization decreased observed volatility by reducing noise variance, especially for the highly active stocks. The results help risk management and market mechanism design.
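
For readers unfamiliar with the ACD family mentioned in the second essay, the sketch below simulates the standard ACD(1,1) model of Engle and Russell, in which the conditional expected duration psi_i follows the recursion psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1} and the observed duration is x_i = psi_i * eps_i with unit-mean i.i.d. innovations. This is the textbook baseline specification, not the generalized model developed in the dissertation, and the parameter values are arbitrary illustrations.

```python
# Minimal simulation of a standard ACD(1,1) duration model:
#   x_i = psi_i * eps_i,   psi_i = omega + alpha * x_{i-1} + beta * psi_{i-1}
# with unit-mean exponential innovations eps_i. Parameter values are
# illustrative only, not estimates from the dissertation.

import random

def simulate_acd(n: int, omega: float = 0.1, alpha: float = 0.1,
                 beta: float = 0.8, seed: int = 42) -> list:
    rng = random.Random(seed)
    psi = omega / (1.0 - alpha - beta)   # unconditional mean duration
    x = psi                              # initialize at the unconditional mean
    durations = []
    for _ in range(n):
        psi = omega + alpha * x + beta * psi
        x = psi * rng.expovariate(1.0)   # unit-mean exponential innovation
        durations.append(x)
    return durations

if __name__ == "__main__":
    sims = simulate_acd(10_000)
    print(f"mean simulated duration: {sum(sims) / len(sims):.3f}")
    print(f"theoretical mean: {0.1 / (1 - 0.1 - 0.8):.3f}")
```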

Relevance: 20.00%

Abstract:

Background
How new forms arise in nature has engaged evolutionary biologists since Darwin's seminal treatise on the origin of species. Transposable elements (TEs) may be among the most important internal sources for intraspecific variability. Thus, we aimed to explore the temporal dynamics of several TEs in individual genotypes from a small, marginal population of Aegilops speltoides. A diploid cross-pollinated grass species, it is a wild relative of the various wheat species known for their large genome sizes contributed by an extraordinary number of TEs, particularly long terminal repeat (LTR) retrotransposons. The population is characterized by high heteromorphy and possesses a wide spectrum of chromosomal abnormalities including supernumerary chromosomes, heterozygosity for translocations, and variability in the chromosomal position or number of 45S and 5S ribosomal DNA (rDNA) sites. We propose that variability on the morphological and chromosomal levels may be linked to variability at the molecular level and particularly in TE proliferation.

Results
Significant temporal fluctuation in the copy number of TEs was detected when processes that take place in small, marginal populations were simulated. It is known that under critical external conditions, outcrossing plants very often transit to self-pollination. Thus, three morphologically different genotypes with chromosomal aberrations were taken from a wild population of Ae. speltoides, and the dynamics of the TE complex traced through three rounds of selfing. It was discovered that: (i) various families of TEs vary tremendously in copy number between individuals from the same population and the selfed progenies; (ii) the fluctuations in copy number are TE-family specific; (iii) there is a great difference in TE copy number expansion or contraction between gametophytes and sporophytes; and (iv) a small percentage of TEs that increase in copy number can actually insert at novel locations and could serve as a bona fide mutagen.

Conclusions
We hypothesize that TE dynamics could promote or intensify morphological and karyotypical changes, some of which may be potentially important for the process of microevolution, and allow species with plastic genomes to survive as new forms or even species in times of rapid climatic change.

Relevance: 20.00%

Abstract:

Most of the existing research within the business network approach is based on companies that operate on different levels within the same value chain, as buyer and supplier. Intercompetitor cooperation, i.e. cooperation between companies occupying the same level within different value chains, has not been studied to the same extent. Moreover, scholars within the business network approach have usually described industrial relationships as long-term, consisting of mutual commitment and trust. Industrial relationships are not static but dynamic, and they contain situations of both harmony and conflict. There is consequently a need for more research concerning both intercompetitor cooperation and conflicts. The purpose of this study is to develop our theoretical and empirical understanding of the nature of conflicts in intercompetitor cooperation from a business network perspective. The focus of the study lies on the issue and intensity of conflict. The issue of a conflict can be divided into cause and topic, while the intensity comprises the importance and outcome of a conflict. The empirical part of the study is based on two case studies of groups of cooperating competitors from two different industries. The research method applied is interviews. According to the findings of this study, causes of conflicts in intercompetitor cooperation can be divided into three groups: focus, awareness and capacity. Topics of conflict can be related to domain, delivery, advertising or cooperation. Moreover, the findings show that conflict situations may be grouped into not important, important or very important. Some conflicts may also be of varying importance, meaning that the importance varies from one point of time to another. Based on the findings of the study, the outcome or status of a conflict can be analyzed both on a concrete and a general level. The findings also indicate that several conflicts are partly hidden, which means that only one or some of the involved actors perceive the conflict. Furthermore, several conflict situations can be related to external network actors.

Relevance: 20.00%

Abstract:

This article expands the discussion of the impact of technology on services and contributes to a broader comprehension of the nature of virtual services. This is done by identifying dimensions that distinguish physical services from virtual services, i.e. services that are distributed by electronic means and in which the customer has no direct human interaction with the service provider. Differences in the core characteristics of services, the servicescape and service delivery are discussed. Moreover, dimensions that differentiate between virtual services are analysed. A classification scheme for virtual services is proposed, including the origin of the service, the element of the service offering, the customisation process, the stage of the service process performed, and the degree of mobility of the service.
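
Purely as a reading aid, the five proposed classification dimensions can be written down as a simple data structure. The dimension names follow the abstract, but the example values below are hypothetical illustrations and are not taken from the article.

```python
# Sketch of the proposed classification dimensions for virtual services as a
# data structure. Dimension names follow the abstract; the sample values are
# hypothetical illustrations only.

from dataclasses import dataclass

@dataclass
class VirtualServiceClassification:
    origin: str              # origin of the service (e.g. originally physical vs. born virtual)
    offering_element: str    # which element of the service offering is virtual
    customisation: str       # how the customisation process is carried out
    process_stage: str       # stage(s) of the service process performed virtually
    mobility: str            # degree of mobility of the service

example = VirtualServiceClassification(
    origin="born virtual",
    offering_element="core service",
    customisation="automated, rule-based",
    process_stage="entire delivery process",
    mobility="accessible from any location",
)
print(example)
```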