186 results for display technology


Abstract:

Increasing attention has been focused on methods that deliver pharmacologically active compounds (e.g. drugs, peptides and proteins) in a controlled fashion, so that constant, sustained, site-specific or pulsatile action can be attained. Ion-exchange resins have been widely studied in medical and pharmaceutical applications, including controlled drug delivery, leading to the commercialisation of some resin-based formulations. Ion-exchangers provide an efficient means to adjust and control drug delivery, as the electrostatic interactions enable precise control of the ion-exchange process and, thus, more uniform and accurate control of drug release than in systems based only on physical interactions. Unlike the resins, ion-exchange fibers have been the subject of only a few drug delivery studies. However, ion-exchange fibers have many advantageous properties compared to conventional ion-exchange resins, such as more efficient compound loading into and release from the ion-exchanger, easier incorporation of drug-sized compounds, enhanced control of the ion-exchange process, better mechanical, chemical and thermal stability, and good formulation properties, which make the fibers attractive materials for controlled drug delivery systems.

In this study, the factors affecting the nature and strength of the binding/loading of drug-sized model compounds into the ion-exchange fibers were evaluated comprehensively and, moreover, the controllability of subsequent drug release/delivery from the fibers was assessed by modifying the conditions of the external solutions. The feasibility of ion-exchange fibers for the simultaneous delivery of two drugs in combination was also studied by dual loading. Donnan theory and theoretical modelling were applied to gain a mechanistic understanding of these factors.

The experimental results imply that incorporation of the model compounds into the ion-exchange fibers was attained mainly as a result of ionic bonding, with an additional contribution from non-specific interactions. Increasing the ion-exchange capacity of the fiber or decreasing the valence of the loaded compounds increased the molar loading, while more efficient release of the compounds was observed consistently under conditions where the valence or concentration of the extracting counter-ion was increased. Donnan theory was capable of fully interpreting the ion-exchange equilibria, and the theoretical modelling supported the experimental observations precisely. The physico-chemical characteristics (lipophilicity, hydrogen bonding ability) of the model compounds and the framework of the fibrous ion-exchanger influenced the affinity of the drugs towards the fibers and may, thus, affect both drug loading and release. It was concluded that precisely controlled drug delivery may be tailored for each compound, in particular, by choosing a suitable ion-exchange fiber and optimizing the delivery system to take into account the external conditions, also when delivering two drugs simultaneously.
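
The valence and concentration effects above follow from the idealized Donnan description of ion-exchange equilibria. In a standard textbook form (activity coefficients neglected; not necessarily the exact model used in the thesis), equating the electrochemical potential of each counter-ion across the fiber-solution boundary gives, for two counter-ions A and B with valences z_A and z_B,

\[
\frac{1}{z_A}\ln\frac{c_A^{\mathrm{fiber}}}{c_A^{\mathrm{sol}}}
= \frac{1}{z_B}\ln\frac{c_B^{\mathrm{fiber}}}{c_B^{\mathrm{sol}}}
= -\frac{F\varphi_D}{RT},
\quad\text{i.e.}\quad
\left(\frac{c_A^{\mathrm{fiber}}}{c_A^{\mathrm{sol}}}\right)^{1/z_A}
= \left(\frac{c_B^{\mathrm{fiber}}}{c_B^{\mathrm{sol}}}\right)^{1/z_B},
\]

where \(\varphi_D\) is the Donnan potential. Raising the valence or solution concentration of the extracting counter-ion B shifts this equilibrium so that more of the loaded compound A is displaced into the solution, consistent with the release behaviour reported above.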

Abstract:

This thesis discusses the use of sub- and supercritical fluids as the medium in extraction and chromatography. Super- and subcritical extraction was used to separate essential oils from the herbal plant Angelica archangelica. The effect of the extraction parameters was studied, and sensory analyses of the extracts were performed by an expert panel. The results of the sensory analyses were compared to the analytically determined contents of the extracts.

Sub- and supercritical fluid chromatography (SFC) was used to separate and purify high-value pharmaceuticals. Chiral SFC was used to separate the enantiomers of racemic mixtures of pharmaceutical compounds. Very low (cryogenic) temperatures were applied to substantially enhance the separation efficiency of chiral SFC. The thermodynamic aspects affecting the resolving ability of chiral stationary phases are briefly reviewed.

The process production rate, which is a key factor in industrial chromatography, was optimized by empirical multivariate methods. A general linear model was used to optimize the separation of omega-3 fatty acid ethyl esters from esterified fish oil by reversed-phase SFC. The chiral separation of racemic mixtures of guaifenesin and a ferulic acid dimer ethyl ester was optimized by using the response surface method with three variables at a time. It was found that by optimizing four variables (temperature, load, flow rate and modifier content), the production rate of the chiral resolution of racemic guaifenesin by cryogenic SFC could be increased severalfold compared to published results for a similar application.

A novel pressure-compensated design of industrial high-pressure chromatographic column was introduced, using technology developed in building the deep-sea submersibles Mir 1 and Mir 2. A demonstration SFC plant was built, and the immunosuppressant drug cyclosporine A was purified to meet the requirements of the US Pharmacopoeia. A smaller semi-pilot-scale column of similar design was used for the cryogenic chiral separation of the aromatase inhibitor Finrozole for use in phase 2 of its development.
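
As a rough sketch of the kind of empirical response-surface optimization described above, one could fit a second-order model to experimental runs and search it for the settings that maximize the predicted production rate. The variable names follow the abstract (temperature, load, flow rate, modifier content), but the data, ranges and response function below are invented placeholders rather than the thesis's results; Python with scikit-learn is assumed.

# Hypothetical response-surface sketch for an SFC production rate.
# All numbers are made up; real runs would supply X and y.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Simulated design: temperature (deg C), load (mg), flow rate (ml/min),
# modifier content (%).
X = rng.uniform([-40, 10, 1.0, 5], [20, 100, 5.0, 30], size=(40, 4))

def true_rate(x):
    # Placeholder response with a quadratic optimum.
    t, load, flow, mod = x.T
    return (100 - 0.02 * (t + 10) ** 2 + 0.5 * load - 0.003 * load ** 2
            + 8 * flow - 1.0 * flow ** 2 + 1.2 * mod - 0.03 * mod ** 2)

y = true_rate(X) + rng.normal(0, 2, size=len(X))

# Second-order (quadratic) response-surface model.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

# Locate the predicted optimum on a coarse grid over the design space.
grid = np.stack(np.meshgrid(
    np.linspace(-40, 20, 13),   # temperature, cryogenic range included
    np.linspace(10, 100, 10),   # load
    np.linspace(1.0, 5.0, 9),   # flow rate
    np.linspace(5, 30, 6),      # modifier content
), axis=-1).reshape(-1, 4)

pred = model.predict(grid)
best = grid[np.argmax(pred)]
print("Predicted optimum (T, load, flow, modifier):", best)
print("Predicted production rate:", pred.max())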

Abstract:

Many active pharmaceutical ingredients (APIs) have both anhydrate and hydrate forms. Because the solid forms have different physicochemical properties, changes in the solid state may result in therapeutic, pharmaceutical, legal and commercial problems. In order to obtain good solid dosage form quality and performance, there is a constant need to understand and control these phase transitions during manufacturing and storage. It is thus important to detect, and also to quantify, the possible transitions between the different forms. In recent years, vibrational spectroscopy has become an increasingly popular tool for characterising solid-state forms and their phase transitions. It offers several advantages over other characterisation techniques, including the ability to obtain molecular-level information, minimal sample preparation, and the possibility of monitoring changes non-destructively in-line. Dehydration is the phase transition of hydrates that is frequently encountered during dosage form production and storage.

The aim of the present thesis was to investigate the dehydration behaviour of diverse pharmaceutical hydrates by near infrared (NIR), Raman and terahertz pulsed spectroscopy (TPS) monitoring together with multivariate data analysis. The goal was to reveal new perspectives for investigating dehydration at the molecular level. Solid-state transformations were monitored during the dehydration of diverse hydrates on a hot stage. The results obtained from the qualitative experiments were used to develop a method for, and to perform, the quantification of the solid-state forms during process-induced dehydration in a fluidised bed dryer. Both in situ and in-line process monitoring and quantification were performed.

This thesis demonstrated the utility of vibrational spectroscopy techniques and multivariate modelling for monitoring and investigating dehydration behaviour in situ and during fluidised bed drying. All three spectroscopic methods proved complementary in the study of dehydration. NIR spectroscopy models could quantify the solid-state forms in the binary system but were unable to quantify all the forms in the quaternary system. Raman spectroscopy models, on the other hand, could quantify all four solid-state forms that appeared upon isothermal dehydration. The speed of the spectroscopic methods makes them applicable for monitoring dehydration, and the quantification of multiple forms was performed during the phase transition. Thus, solid-state structural information at the molecular level was obtained directly. TPS detected the intermolecular phonon modes, while Raman spectroscopy detected mostly changes in intramolecular vibrations; both techniques revealed information about crystal structure changes. NIR spectroscopy, on the other hand, was more sensitive to the water content and the hydrogen bonding environment of the water molecules. This study provides a basis for real-time process monitoring using vibrational spectroscopy during pharmaceutical manufacturing.
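
In the spirit of the multivariate quantification described above, a minimal sketch of a PLS model mapping spectra to solid-state composition might look as follows. The spectra, band positions and hydrate fractions are synthetic placeholders for a binary hydrate/anhydrate system, not data from the thesis; Python with scikit-learn is assumed.

# Minimal PLS-regression sketch: quantify the hydrate fraction of a
# binary hydrate/anhydrate mixture from (synthetic) NIR-like spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
wavenumbers = np.linspace(4000, 10000, 600)   # hypothetical NIR axis

def band(center, width):
    return np.exp(-((wavenumbers - center) / width) ** 2)

# Invented pure-component "spectra" of the two forms.
hydrate = band(5200, 150) + 0.5 * band(6900, 200)    # water-related bands
anhydrate = band(5800, 180) + 0.4 * band(8300, 250)

# Mixtures with hydrate fraction between 0 and 1, plus noise.
frac = rng.uniform(0, 1, size=120)
X = np.outer(frac, hydrate) + np.outer(1 - frac, anhydrate)
X += rng.normal(0, 0.01, X.shape)

X_tr, X_te, y_tr, y_te = train_test_split(X, frac, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_tr, y_tr)
print("R^2 on held-out spectra:", pls.score(X_te, y_te))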

Abstract:

Modern drug discovery gives rise to a great number of potential new therapeutic agents, but in some cases efficient treatment of the patient may not be achieved because the delivery of active compounds to the target site is insufficient. Thus, drug delivery is one of the major challenges in current pharmaceutical research. Numerous nanoparticle-based drug carriers, e.g. liposomes, have been developed for enhanced drug delivery and targeting. Drug targeting may enhance the efficiency of the treatment and, importantly, reduce unwanted side effects by decreasing drug distribution to non-target tissues. Liposomes are biocompatible lipid-based carriers that have been studied for drug delivery over the last 40 years. They can be functionalized with targeting ligands and sensing materials for triggered activation. In this study, various external signal-assisted liposomal delivery systems were developed. Signals can be used to modulate drug permeation or release from the liposome formulation, and they provide accurate control of the time, place and rate of activation. Three types of signals were used to trigger drug permeation and release: electricity, heat and light.

An electrical stimulus was utilized to enhance the permeation of liposomal DNA across the skin. Liposome/DNA complex-mediated transfections were performed in a tight rat epidermal cell model. Various transfection media and current intensities were tested, and transfection efficiency was evaluated non-invasively by monitoring the concentration of a secreted reporter protein in the cell culture medium. The liposome/DNA complexes produced gene expression, but the electrical stimulus did not enhance the transfection efficiency significantly.

A heat-sensitive liposomal drug delivery system was developed by coating liposomes with a biodegradable and thermosensitive poly(N-(2-hydroxypropyl) methacrylamide mono/dilactate) polymer. Temperature-triggered liposome aggregation and contents release from the liposomes were evaluated. The cloud point temperature (CP) of the polymer was set to 42 °C. Polymer-coated liposome aggregation and contents release were observed above the CP of the polymer, while non-coated liposomes remained intact. The polymer precipitates above its CP and interacts with the liposomal bilayers; it is likely that this permeabilizes the liposomal membrane and causes contents release.

Light-sensitivity was introduced to the liposomes by incorporating small (< 5 nm) gold nanoparticles. Hydrophobic and hydrophilic gold nanoparticles were embedded in thermosensitive liposomes, and contents release was investigated upon UV light exposure. UV light-induced lipid phase transitions were examined with small angle X-ray scattering, and light-triggered contents release was also shown in a human retinal pigment epithelial cell line. Gold nanoparticles absorb light energy and convert it into heat, which induces phase transitions in the liposomes and triggers the contents release.

In conclusion, external signal-activated liposomes offer an advanced platform for numerous applications in drug delivery, particularly localized drug delivery. Drug release may be localized to the target site with a triggering stimulus, resulting in a better therapeutic response and fewer adverse effects. The triggering signal and mechanism of activation can be selected according to the specific application.

Abstract:

In order to improve and continuously develop the quality of pharmaceutical products, the process analytical technology (PAT) framework has been adopted by the US Food and Drug Administration. One of the aims of PAT is to identify critical process parameters and their effect on the quality of the final product. Real-time analysis of the process data enables better control of the processes to obtain a high-quality product. The main purpose of this work was to monitor crucial pharmaceutical unit operations (from blending to coating) and to examine the effect of processing on solid-state transformations and physical properties. The tools used were near-infrared (NIR) and Raman spectroscopy combined with multivariate data analysis, as well as X-ray powder diffraction (XRPD) and terahertz pulsed imaging (TPI).

To detect process-induced transformations in active pharmaceutical ingredients (APIs), samples were taken after blending, granulation, extrusion, spheronisation, and drying. These samples were monitored by XRPD, Raman, and NIR spectroscopy, showing hydrate formation in the case of theophylline and nitrofurantoin. For erythromycin dihydrate, formation of the isomorphic dehydrate was critical; thus, the main focus was on the drying process. NIR spectroscopy was applied in-line during a fluid-bed drying process. Multivariate data analysis (principal component analysis) enabled detection of the dehydrate formation at temperatures above 45 °C.

Furthermore, a small-scale rotating plate device was tested to provide insight into film coating. The process was monitored using NIR spectroscopy. A calibration model, using partial least squares regression, was set up and applied to data obtained by in-line NIR measurements of a coating drum process. The predicted coating thickness agreed with the measured coating thickness. For investigating the quality of film coatings, TPI was used to create a 3-D image of a coated tablet. With this technique it was possible to determine the coating layer thickness, distribution, reproducibility, and uniformity. In addition, it was possible to localise defects in either the coating or the tablet.

It can be concluded from this work that the applied techniques increased the understanding of the physico-chemical properties of drugs and drug products during and after processing. They additionally provided useful information to improve and verify the quality of pharmaceutical dosage forms.
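
As a schematic illustration of the PCA-based in-line detection mentioned above (synthetic drying data, not the thesis's measurements or model), a drift or jump in the principal-component score trajectory of successive NIR spectra can flag a change in the drying regime such as the onset of dehydrate formation:

# Schematic PCA monitoring of in-line NIR spectra during drying.
# All data are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
wavelengths = np.linspace(1100, 2500, 400)   # hypothetical NIR axis
water_band = np.exp(-((wavelengths - 1930) / 40) ** 2)

# Simulated run: the water band shrinks slowly, then rapidly after a
# regime change at t = 60 (standing in for dehydrate formation).
time = np.arange(100)
water_content = np.where(time < 60, 1 - 0.005 * time,
                         0.7 - 0.02 * (time - 60))
X = np.outer(water_content.clip(0), water_band)
X += rng.normal(0, 0.005, X.shape)

pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)

# A jump in the PC1 trajectory marks the change in drying regime.
print("PC1 score at t=50:", scores[50, 0])
print("PC1 score at t=90:", scores[90, 0])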

Abstract:

Nature, science and technology: the image of Finland through popular enlightenment texts, 1870-1920

This doctoral thesis looks at how Finnish popular enlightenment texts published between 1870 and 1920 took part in the process of forming a genuine Finnish national identity. The same process was occurring in the other Nordic countries at the time, and the process in Finland was in many ways influenced by them, particularly Sweden. In Finland, the political realities under Russian rule, especially during the Russification years, and the fact that its history was considered short compared to other European countries, made this nation-building process unique. The undertaking was led by members of the national elite, influential in the cultural, academic and political arenas, who were keen to support the foundation of a modern Finnish identity.

The political realities and the national philosophy of history necessitated a search for elements of identity in nature and the Finnish landscape, which were considered to have special national importance: Finland was very much defined as a political entity on the basis of its geography and nature. Nature was also used as a means of taking a cultural or political stance with regard to, for example, geographical facts such as the nation's borders or the country's geographical connections to Western Europe. In the building of a proper national identity the concept of nature was not, however, static, but was affected by political and economic progress in society. This meant that nature, or the image of the national landscape, was no longer seen only as a visual image of the national identity, but also as a source of science, technology and a prosperous future.

The role of technology in this process was very much connected to the ability to harness natural resources to serve national interests. The major change in this respect had occurred by the early 20th century, when indisputable scientific progress altered the relationship between nature and technology. Concerning technology, the thesis is mainly interested in the large and, at the time, modern technological manifestations, such as railways, factories and industrial areas in Finland. Although the symbiosis between national nature and international but successfully localized technology was depicted in Finnish popular enlightenment literature mostly as a national success story, concerns began to arise already in the last years of the 19th century. It was argued that the emerging technology would eventually destroy Finland's natural environment, and therefore the basis of its national identity. The question was not how to preserve nature through natural science, but rather how to conserve those natural resources and images that were considered the basis of national identity and thus of national history. National parks, isolated from technology and distant enough to have no economic value, were considered the solution to the problem. Methodologically, the thesis belongs to the genre of science and technology studies, and it offers new viewpoints on both the study of Finnish popular enlightenment literature and the national development process as a whole.

Abstract:

This thesis presents an interdisciplinary analysis of how models and simulations function in the production of scientific knowledge. The work is informed by three scholarly traditions: studies on models and simulations in the philosophy of science, so-called micro-sociological laboratory studies within science and technology studies, and cultural-historical activity theory. Methodologically, I adopt a naturalist epistemology and combine philosophical analysis with a qualitative, empirical case study of infectious-disease modelling. The study maintains a dual perspective throughout the analysis: it specifies the modelling practices and examines the models as objects of research.

The research questions addressed in this study are: 1) How are models constructed, and what functions do they have in the production of scientific knowledge? 2) What is interdisciplinarity in model construction? 3) How do models become a general research tool, and why is this process problematic?

The core argument is that mediating models as investigative instruments (cf. Morgan and Morrison 1999) take questions as their starting point, and hence their construction is intentionally guided. This argument applies the interrogative model of inquiry (e.g., Sintonen 2005; Hintikka 1981), which conceives of all knowledge acquisition as a process of seeking answers to questions. The first question addresses simulation models as Artificial Nature, which is manipulated in order to answer the questions that initiated the model building. This account further develops the "epistemology of simulation" (cf. Winsberg 2003) by showing the interrelatedness of researchers and their objects in the process of modelling. The second question clarifies why interdisciplinary research collaboration is demanding and difficult to maintain. The nature of the impediments to disciplinary interaction is examined by introducing the idea of object-oriented interdisciplinarity, which provides an analytical framework for studying changes in the degree of interdisciplinarity, the tools and research practices developed to support collaboration, and the mode of collaboration in relation to the historically mutable object of research. As my interest is in the models as interdisciplinary objects, the third research problem asks how we might characterise these objects, what is typical of them, and what kinds of changes happen in the process of modelling. Here I examine the tension between specified, question-oriented models and more general models, and suggest that the specified models form a group of their own. I call these Tailor-made models, in contrast to the process of building a simulation platform that aims at generalisability and utility for health policy. This tension also underlines the challenge of applying research results (or methods and tools) to discuss and solve problems in decision-making processes.

Abstract:

Tolerance of noise as a necessity of urban life: noise pollution as an environmental problem and its cultural perceptions in the city of Helsinki

This study looks at the noise pollution problem and the change in the urban soundscape in the city of Helsinki from the 1950s to the present day. The study investigates the formation of noise problems, the politicization of the noise pollution problem, noise-related civic activism, the development of environmental policies on noise, and the expectations that urban dwellers have had concerning their everyday soundscape. Both so-called street noise and the noise caused by, for example, neighbors are taken into account. The study investigates whether our society contains, or has for some time contained, cultural and other elements that cast noise pollution as an essential or normal state of affairs in urban life. It also discusses whether we are moving towards an artificial soundscape, meaning that the auditory reality, the soundscape, is more and more under human control. The concept of an artificial soundscape was used to crystallize the significance of human actions and the role of modern technology in shaping soundscapes, and also to link the changes in the modern soundscape to the economic, political, and social changes connected to the modernization process.

It was argued that the critical period in the definition of noise pollution as an environmental problem was the years from the end of the 1960s to the early 1970s. It seems that the massive increase in noise pollution caused by road traffic and the introduction of utopian traffic plans was the key point that launched the moral protest against the increase in noise pollution and, in general, against the basic structures and mindsets of society, including attitudes towards nature. The study argues that after noise pollution was politicized and institutionalized, the urban soundscape gradually became the target of systematic interventions. However, for various reasons, such as inconsistency in decision making, our increased capacity to shape the soundscape has not resulted in a healthy or pleasant urban soundscape. In fact, the number of people exposed to noise pollution is increasing.

It is argued that our society contains cultural and other elements that urge us to see noise as a normal part of urban life. It is also argued that the possibility of experiencing natural, silent soundscapes seems to be the yardstick against which the citizens of Helsinki have measured how successful we are in designing the (artificial) soundscape and whether the actions of noise control have been effective. This work discusses whose interests are served when we are asked to accept noise pollution as a normal state of affairs. It is also suggested that the quality of the artificial soundscape ought to be radically politicized, which might give all citizens a better and more equal chance to express their needs and wishes concerning the urban soundscape, and also to decide how it ought to be designed.

Abstract:

In the autumn of 1997, the Russian government was faced with media pressure when the owners of the TV channels ORT and NTV joined forces against it. This study is based on media sources from October 1997 to December 1997. It shows clearly how the enormous power of the media was able to dictate what happened in Russia. In the mid-1990s, Russians had started to talk about political technology, which became a commonly used term among professionals, journalists, politicians and the intelligence services.

As a result of this media campaign, two leading reformers in the government, Anatoliy Chubais and Boris Nemtsov, were dismissed from their highly influential posts as finance and energy ministers respectively, but retained their power as first deputy prime ministers. According to the correspondents, the real reason was to resolve a conflict within the parliament, which had demanded the dismissal of Mr. Chubais. This demand was presented after Chubais had accepted $90,000 as a reward for co-writing a book on privatization. Chubais was considered to be Russia's "business card" towards the West; the "Authors' case" (Delo avtorov) was only resolved after President Boris Yeltsin took part in the public debate.

According to the research, the media owned by the powerful businessmen Boris Berezovsky and Vladimir Gusinski were able to use their own security services to expose sensitive material (in Russian, 'kompromat') concerning any given person, if necessary. The so-called Authors' case can be considered part of the battle, and the tip of the iceberg, in arrangements designed to organize the funding of the Russian presidential election campaign in 2000. The reason why this particular incident was so widely covered on television was that several programs aimed to reveal to the public the "hidden bribes" that, as they claimed, government officials had received. The political consequences, however, were quite mild when the possible dismissals of ministers were debated in the parliament. Everything was dealt with as a "family matter" inside the Kremlin. Yeltsin's "family" consisted of practically anybody from the oligarch Berezovsky to Chubais, the father of Russia's privatization policy.

Methods of critical historical analysis have been used in this research in assessing the source material. Literature and interviews have also provided a good basis for the study. The study shows that the literature dealing with the subject has not paid enough attention to how the dismissal of Alexander Kazakov, deputy head of the President's administration, was linked directly to Gazprom, the state gas monopoly. Kazakov had to leave Gazprom and lost his position as Chubais' ally when the position of the influential ORT television company deteriorated.

Abstract:

Aptitude-based student selection: a study concerning the admission processes of some technically oriented healthcare degree programmes in Finland (Orthotics and Prosthetics, Dental Technology and Optometry)

The data studied consisted of convenience samples of preadmission information and the results of the admission processes of three technically oriented healthcare degree programmes (Orthotics and Prosthetics, Dental Technology and Optometry) in Finland during the years 1977-1986 and 2003. The numbers of subjects tested and interviewed in the first samples were 191, 615 and 606, and in the second 67, 64 and 89, respectively. The questions of the six studies were: I. How were the different kinds of preadmission data related to each other? II. Which were the major determinants of the admission decisions? III. Did the graduated students and those who dropped out differ from each other? IV. Was it possible to predict how well students would perform in the programmes? V. How was the student selection executed in the year 2003? VI. Should clinical or statistical prediction, or both, be used? (Some remarks are presented on Meehl's argument: "Always, we might as well face it, the shadow of the statistician hovers in the background; always the actuary will have the final word.")

The main results of the study were as follows. Ability tests, dexterity tests and judgements of personality traits (communication skills, initiative, stress tolerance and motivation) provided unique, non-redundant information about the applicants. The available demographic variables did not bias the judgements of personality traits. In all three programme settings, four-factor solutions (personality, reasoning, gender-technical and age-vocational, with factor scores) could be extracted by the maximum likelihood method with graphical Varimax rotation. The personality factor dominated the final aptitude judgements and very strongly affected the selection decisions. There were no clear differences between the graduated students and those who had dropped out with regard to the four factors. In addition, the factor scores did not predict how well the students performed in the programmes. Meehl's argument on the uncertainty of clinical prediction was supported by the results, which on the other hand did not provide any relevant data for rules on statistical prediction.

No clear arguments for or against aptitude-based student selection were presented. However, the structure of the aptitude measures and their impact on the admission process are now better known. The concept of "personal aptitude" is not necessarily included in the values and preferences of those in charge of organizing the schooling. Thus, the most well-founded and cost-effective way to execute student selection is obviously to rely on, e.g., the grade point averages of the matriculation examination and/or written entrance exams. This procedure, according to the present study, would result in a student group with a quite different makeup (60%) from the group selected on the basis of aptitude tests. For the recruiting organizations, however, "personal aptitude" may be a matter of great importance. The employers, of course, decide on personnel selection. The psychologists, if consulted, are responsible for the proper use of the psychological measures.
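
For illustration only, a maximum-likelihood factor extraction with varimax rotation of the kind reported above can be sketched as follows. The applicant data, the number of observed measures and the loadings are all invented, and scikit-learn's FactorAnalysis (whose rotation option requires a recent version) merely stands in for whatever software the original study used.

# Sketch: ML factor analysis with varimax rotation on synthetic
# "applicant" data echoing the abstract's four-factor solution.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
n = 600

# Hypothetical latent factors: personality, reasoning,
# gender-technical, age-vocational.
latent = rng.normal(size=(n, 4))
loadings = rng.normal(scale=0.8, size=(4, 10))   # 10 observed measures
X = latent @ loadings + rng.normal(scale=0.5, size=(n, 10))

fa = FactorAnalysis(n_components=4, rotation="varimax")
scores = fa.fit_transform(X)        # factor scores per applicant

print("Rotated loadings (10 measures x 4 factors):")
print(np.round(fa.components_.T, 2))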

Abstract:

Distraction in the workplace is increasingly common in the information age. Several tasks and sources of information compete for a worker's limited cognitive capacities in human-computer interaction (HCI). In some situations even very brief interruptions can have detrimental effects on memory, yet in other situations where persons are continuously interrupted, virtually no interruption costs emerge. This dissertation attempts to reveal the mental conditions and causalities that differentiate the two outcomes.

The explanation, building on the theory of long-term working memory (LTWM; Ericsson and Kintsch, 1995), focuses on the active, skillful aspects of human cognition that enable the storage of task information beyond the temporary and unstable storage provided by short-term working memory (STWM). Its key postulate is called a retrieval structure: an abstract, hierarchical knowledge representation built into long-term memory that can be utilized to encode, update, and retrieve the products of cognitive processes carried out during skilled task performance. If certain criteria of practice and task processing are met, LTWM allows for the storage of large representations for long periods, yet these representations can be accessed with the accuracy, reliability, and speed typical of STWM. The main thesis of the dissertation is that the ability to endure interruptions depends on the efficiency with which LTWM can be recruited for maintaining information.

An observational study and a field experiment provide ecological evidence for this thesis. Mobile users were found to be able to carry out heavy interleaving and sequencing of tasks while interacting, and they exhibited several intricate time-sharing strategies to orchestrate interruptions in a way sensitive to both external and internal demands. Interruptions are inevitable, because they arise as natural consequences of the top-down and bottom-up control of multitasking. In this process the function of LTWM is to keep some representations ready for reactivation and others in a more passive state to prevent interference.

The psychological reality of the main thesis received confirmatory evidence in a series of laboratory experiments. They indicate that after encoding into LTWM, task representations are safeguarded from interruptions, regardless of their intensity, complexity, or pacing. However, when LTWM cannot be deployed, the problems posed by interference in long-term memory and the limited capacity of STWM surface. A major contribution of the dissertation is the analysis of when users must resort to poorer maintenance strategies, such as temporal cues and STWM-based rehearsal. First, one experiment showed that task orientations can be associated with radically different patterns of retrieval cue encodings; thus the nature of the processing of the interface determines which features will be available as retrieval cues and which must be maintained by other means. In another study it was demonstrated that if the speed of encoding into LTWM, a skill-dependent parameter, is slower than the processing speed allowed for by the task, interruption costs emerge. Contrary to the predictions of competing theories, these costs turned out to involve intrusions in addition to omissions. Finally, it was learned that in rapid visually oriented interaction, perceptual-procedural expectations guide task resumption, and neither STWM nor LTWM is utilized, because access is too slow.

These findings imply a change in thinking about the design of interfaces. Several novel principles of design are presented, based on the idea of supporting the deployment of LTWM in the main task.

Abstract:

Humans are a social species with the internal capability to process social information from other humans. To understand others' behavior and to react accordingly, it is necessary to infer their internal states, emotions and aims, which are conveyed by subtle nonverbal bodily cues such as postures, gestures, and facial expressions. This thesis investigates the brain functions underlying the processing of such social information.

Studies I and II of this thesis explore the neural basis of perceiving pain from another person's facial expressions by means of functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG). In Study I, observing another's facial expression of pain activated the affective pain system (previously associated with self-experienced pain) in accordance with the intensity of the observed expression. The strength of the response in the anterior insula was also linked to the observer's empathic abilities. The cortical processing of facial pain expressions advanced from visual to temporal-lobe areas at latencies (around 300-500 ms) similar to those previously shown for emotional expressions such as fear or disgust.

Study III shows that perceiving a yawning face is associated with middle and posterior STS activity, and that the contagiousness of a yawn correlates negatively with amygdalar activity.

Study IV explored the brain correlates of interpreting social interaction between two members of the same species, in this case human or canine. Observing interaction engaged brain activity in a very similar manner for both species. Moreover, the body- and object-sensitive brain areas of dog experts differentiated interaction from non-interaction in both humans and dogs, whereas in the control subjects similar differentiation occurred only for humans.

Finally, Study V shows the engagement of the brain area associated with biological motion when subjects are exposed to the sounds produced by a single human walking. The more complex pattern of activation in response to the walking sounds of several persons, however, suggests that as the social situation becomes more complex, so does the brain response.

Taken together, these studies demonstrate the roles of distinct cortical and subcortical brain regions in the perception and sharing of others' internal states via facial and bodily gestures, and the connection of brain responses to behavioral attributes.

Abstract:

This study addresses four issues concerning technological product innovations. First, the nature of the very early phases or "embryonic stages" of technological innovation is addressed. Second, the study analyzes why and by what means people initiate innovation processes outside the technological community and the field of expertise of the established industry. In other words, it addresses the initiation of innovation that occurs without the expertise of established organizations, such as technology firms, professional societies and research institutes operating in the technological field under consideration. Third, the significance of interorganizational learning processes for technological innovation is dealt with. Fourth, this consideration is supplemented by examining how network collaboration and learning change when formalized product development work and the commercialization of the innovation advance. These issues are addressed through the empirical analysis of three product innovations: Benecol margarine, the Nordic Mobile Telephone system (NMT) and the ProWellness Diabetes Management System (PDMS).

This study utilizes the theoretical insights of cultural-historical activity theory on the development of human activities and learning. Activity-theoretical conceptualizations are used in the critical assessment and advancement of the concept of networks of learning, originally proposed by the research group of the organizational scientist Walter Powell. A network of learning refers to interorganizational collaboration that pools resources, ideas and know-how in the absence of market-based or hierarchical relations. The concept of an activity system is used in defining the nodes of the networks of learning. Network collaboration and learning are analyzed with regard to the shared object of development work.

According to this study, enduring dilemmas and tensions in activity explain the participants' motives for carrying out the actions that lead to novel product concepts in the early phases of technological innovation. These actions comprise the initiation of development work outside the relevant fields of expertise, and collaboration and learning across fields of expertise in the absence of market-based or hierarchical relations. Such networks of learning are fragile and impermanent. This study suggests that the significance of networks of learning across fields of expertise is becoming increasingly crucial for innovation activities.

Abstract:

The use of head-mounted displays (HMDs) can produce both positive and negative experiences. In an effort to increase positive experiences and avoid negative ones, researchers have identified a number of variables that may cause sickness and eyestrain, although the exact nature of their relationship to HMDs may vary depending on the tasks and the environments. Other, non-sickness-related aspects of HMDs, such as users' opinions and future decisions associated with task enjoyment and interest, have attracted little attention in the research community.

In this thesis, user experiences associated with the use of monocular and bi-ocular HMDs were studied. These include eyestrain and sickness caused by current HMDs, the advantages and disadvantages of adjustable HMDs, HMDs as accessories for small multimedia devices, and the impact of individual characteristics and evaluated experiences on reported outcomes and opinions.

The results indicate that today's commercial HMDs do not induce serious sickness or eyestrain. Reported adverse symptoms have some influence on HMD-related opinions, but the nature of the impact depends on the tasks and the devices used. As an accessory to handheld devices and as a personal viewing device, HMDs may increase the duration of use and enable users to perform tasks not suitable for small screens. Well-designed, functional and adjustable HMDs, especially monocular ones, increase viewing comfort and usability, which in turn may have a positive effect on product-related satisfaction. The role of individual characteristics in understanding HMD-related experiences has not changed significantly. Explaining other HMD-related experiences, especially forward-looking interests, also requires understanding more stable individual traits and motivations.

Abstract:

In the 1990s, the companies utilizing and producing new information technology, especially so-called new media, were also expected to be forerunners in new forms of work and organization. Researchers anticipated that new, more creative forms of work and a changing content of working life were about to replace old industrial and standardized ways of working. However, research on actual companies in the IT sector revealed a situation where only minor changes to existing organizational forms were seen. Many of the independent companies faced great difficulties trying to survive the rapid changes in the products and production forms of the emerging field. Most of the research on the new media field has been conducted as surveys, and an understanding of the actual everyday work process has remained thin.

My research is a longitudinal study of the early phases of one new media company in Finland. The study analyzes the challenges the company faced in a rapidly changing business field and its attempts to overcome them. The two main analyses in the study focus on the developmental phases of the company and the disturbances in the production process. Based on these analyses, I study changes and learning at work using the methodological framework of developmental work research. Developmental work research is a Finnish variant of cultural-historical activity theory applied to the study of learning and transformations at work.

The data was gathered over a three-year period of ethnographic fieldwork. I documented the production processes and everyday life in the company as a participant observer. I interviewed key persons, video- and audio-taped meetings, followed e-mail correspondence and collected various documents, such as agreements and memos. I developed a systematic method for analyzing the disturbances in the production process by combining the various data sources.

The systematic analysis of the disturbances depicted a very complex and only partly managed production process. The production process had a long duration, and no single actor had an understanding of it as a whole. Most of the disturbances had to do with the customer relationships. The disturbances were latent in nature: they were recognized but not addressed. In the particular production processes that I analyzed, the ending life span of a particular product, a CD-ROM, became obvious. This finding can be interpreted in relation to the developmental phase of the production and the transformation of the field as a whole. Based on the analysis of the developmental phases and the disturbances, I formulate a hypothesis about the contradictions and developmental potentials of the activity studied.

The conclusions of the study challenge the existing understanding of how to conceptualize and study organizational learning in production work. Most theories of organizational learning address neither qualitative changes in production nor the historical challenges of organizational learning itself. My study opens up a new horizon for understanding organizational learning in a rapidly changing field where a learning culture based on craft or mass-production work is insufficient. There is a need for anticipatory and proactive organizational learning: proactive learning is needed to anticipate changes in the production type and the life cycles of products.