Influence of surface functionalization on the behavior of silica nanoparticles in biological systems
Abstract:
Personalized nanomedicine has been shown to provide advantages over traditional clinical imaging, diagnosis, and conventional medical treatment. Nanoparticles can enhance and sharpen clinical targeting and imaging, guiding agents precisely to the site in the body that is the target of treatment. At the same time, the side effects that usually occur in non-targeted parts of the body can be reduced. Nanoparticles are of a size that allows them to penetrate into cells. Their surface functionalization offers a way to increase their sensitivity in detecting target molecules; it also increases the flexibility of particle design, their therapeutic function, and the possibilities for variation in diagnostics. Mesoporous nanoparticles of amorphous silica have attractive physical and chemical characteristics, such as controllable particle morphology and pore size, and a high surface area and pore volume. Additionally, the surface functionalization of silica nanoparticles is relatively straightforward, which enables optimization of the interaction between the particles and the biological system. The main goal of this study was to prepare traceable and targetable silica nanoparticles for medical applications, with a special focus on particle dispersion stability, biocompatibility, and targeting capabilities. Nanoparticle properties are highly particle-size dependent, and good dispersion stability is a prerequisite for active therapeutic and diagnostic agents. The study showed that traceable streptavidin-conjugated silica nanoparticles with good dispersibility could be obtained by choosing a suitable surface functionalization route. Theranostic nanoparticles should exhibit sufficient hydrolytic stability to carry the drug to the target cells effectively, after which they should disintegrate and dissolve.
Furthermore, the surface groups should remain at the particle surface until the particle has been internalized by the cell, in order to optimize cell specificity. Model particles with fluorescently labeled regions were tested in vitro using light microscopy and image processing, which allowed a detailed study of the disintegration and dissolution process. The study showed that nanoparticles degrade more slowly outside the cell than inside it. The main advantage of theranostic agents is successful targeting in vitro and in vivo. Non-porous nanoparticles with monoclonal antibodies as guiding ligands were tested in vitro in order to follow their targeting ability and internalization. In addition to successful targeting, a specific internalization route for the particles could be identified. In the last part of the study, the objective was to clarify the feasibility of applying traceable mesoporous silica nanoparticles, loaded with a hydrophobic cancer drug, for targeted drug delivery in vitro and in vivo. The particles were provided with a small-molecule targeting ligand. A significantly higher therapeutic effect could be achieved with the nanoparticles than with the free drug. The nanoparticles were biocompatible and remained in the tumor longer than the free drug did, before being eliminated by renal excretion. Overall, the results showed that mesoporous silica nanoparticles are biocompatible, biodegradable drug carriers and that cell specificity can be achieved both in vitro and in vivo.
Abstract:
In today’s world, because of rapid advancement in technology and business, requirements are often unclear and change continuously during the development process. These changing requirements make software development very difficult. Traditional software development methods such as the waterfall model are not a good option, as they are not flexible with respect to requirements, and the software can end up late and over budget. To develop high-quality software that satisfies the customer, organizations can use software development methods such as agile methods, which accommodate changing requirements at any stage of the development process. Agile methods are iterative and incremental; they can accelerate the delivery of initial business value through continuous planning and feedback, and through close communication between the customer and the developers. The main purpose of this thesis is to identify the problems in traditional software development and to show how agile methods reduce those problems. The study also examines the different success factors of agile methods, the success rate of agile projects, and a comparison between traditional and agile software development.
Abstract:
University of Turku, Faculty of Medicine, Department of Clinical Medicine, Department of Physical Activity and Health, Paavo Nurmi Centre, Doctoral Programme of Clinical Investigation, University of Turku, Turku, Finland. Annales Universitatis Turkuensis. Medica – Odontologica, Turku, Finland, 2014. Background: Atherosclerosis progression spans an entire lifetime and has a wide pool of risk factors. Oxidized LDL (oxLDL) is a crucial element in the progression of atherosclerosis. As a rather new member of the atherosclerosis risk factor family, its interaction with the traditional pro-atherogenic contributors that occur at different ages is poorly known. Aims: The aim of this study was to investigate oxLDL and its relation to major contributing risk factors in estimating atherosclerosis risk in data consisting mostly of adult men. The subjects consisted of four different data sets, one of which also contained women. The age range of the participants was 18-100 years, and they totaled 2,337 (of whom 69% were men). Data on anthropometric and hormonal parameters, laboratory measures, and medical records were assessed during 1998-2009. Results: Obesity was paralleled by high concentrations of oxLDL, which were consequently reduced by weight reduction. Importantly, successful weight maintenance preserved this benefit. A shift from insulin sensitivity to insulin resistance increased oxLDL. Smokers had more oxLDL than non-smokers. A combination of obesity and smoking, or of smoking and low serum total testosterone, resulted in even higher levels of oxLDL than any of the three conditions alone. Proportioning oxLDL to HDL-c or apoA1 stood out as a risk factor of all-cause mortality in the elderly. Conclusions: OxLDL was associated with aging, androgens, smoking, obesity, insulin metabolism, weight balance, and other circulating lipid classes.
Across this variety of metabolic environments, encompassing both fixed conditions (aging and gender) and lifestyle factors, these findings support an essential and multidimensional role for oxLDL in the pathogenesis of atherosclerosis.
Abstract:
This study discusses the significance of having service as a business logic and, more specifically, how value co-creation can be seen as a phenomenon that enhances business-to-business relationships in a traditional business sector. The purpose of this study is to investigate how value co-creation can enhance a business-to-business relationship in the heating, ventilation and air-conditioning (HVAC) industry of building services engineering, through three sub-objectives: to identify what value is in the industry, how value is co-created in the industry, and what value is in a business-to-business relationship in the industry. The theoretical part of this study consists of academic knowledge and literature related to the concepts of value, value co-creation and business-to-business relationships. In order to research value co-creation and business-to-business relationships in the HVAC industry of building services engineering, both the metaphorical and conceptual thinking of service-dominant (S-D) logic and the more managerial approach of service logic (SL) contributed to the theoretical part of the study. The empirical research conducted for this study is based on seven semi-structured interviews, which constituted the holistic, qualitative single-case study method chosen for the research. The data were collected in September 2014 from CEOs, managers and owners representing six building services engineering firms. The interviews were analysed with the help of transcriptions, role-ordered matrices and thematic networks. The findings of this study indicate that value in the HVAC industry consists of client expertise and supplier expertise. The result of applying client expertise and supplier expertise to the business-to-business relationship is characterized as value-in-reputation when the continuity, interaction, learning and rapport of the business relationship are ensured.
As a result, value co-creation in the industry consists of mutual and separate elements, which the client and the supplier apply in the process, in addition to proactive interaction. The findings of this study, together with the final framework, enhance the understanding of the connection between value co-creation and the business-to-business relationship. The findings suggest that value in the HVAC industry is characterized by both value-in-use and value-in-reputation. Value-in-reputation enhances the formation of value-in-use, and consequently, value co-creation enhances the business-to-business relationship. This study thus contributes to the existing knowledge on the concepts of value and value co-creation in business-to-business relationships.
Abstract:
Emerging technologies have recently challenged libraries to reconsider their role as a mere mediator between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions like national libraries, have not always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interact and work towards shared goals and objectives. In this paper, I will describe the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, which was the genesis and consolidation period of these literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The ‘deluge’ of popular literature in the 1920s and 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced into the language.
This was the beginning of a renaissance and period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight, especially lexical items specific to a given publication and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation’s Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to make sure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists as raw data for linguists. Our data is used for creating the morphological analyzers and online dictionaries at the Universities of Helsinki and Tromsø, for instance. In order to reach these targets, we will produce not only the digitized materials but also development tools for supporting linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology. The mission is to improve the usage and usability of digitized content. During the project, we have advanced methods that refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to be beyond its traditional playground?
The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages from a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded (OCRed) text often contains too many mistakes to be used as such in research, so the mistakes must be corrected. For enhancing the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary to implement because these rare and peripheral prints often include characters that have since fallen out of use; these are sadly neglected by modern OCR software developers, but they belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing application is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface, that is, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view there is a danger that the needs of linguists are not met. Another remarkable downside is the lack of a shared goal or social affinity: there is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012).
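The core of such an editing step can be sketched in a few lines. The snippet below is purely illustrative and is not the project's actual code: it assumes only that the OCRed words are stored in ALTO's standard `<String CONTENT="...">` elements (real ALTO files also carry an XML namespace and coordinates, omitted here for brevity) and shows how a corrected token could be written back into the XML.

```python
# Illustrative sketch of correcting one OCR token in an ALTO-style XML
# fragment. Not the Fenno-Ugrica editor itself, which is a web application
# with a REST back-end; this only demonstrates the ALTO editing idea.
import xml.etree.ElementTree as ET

# A minimal ALTO-like fragment; real files include a namespace,
# page layout metadata, and word coordinates.
SAMPLE_ALTO = """<alto>
  <Layout><Page><PrintSpace><TextBlock><TextLine>
    <String CONTENT="anatorny"/>
    <String CONTENT="of"/>
    <String CONTENT="organs"/>
  </TextLine></TextBlock></PrintSpace></Page></Layout>
</alto>"""

def correct_token(alto_xml: str, wrong: str, right: str) -> str:
    """Replace the CONTENT of every String element that matches `wrong`."""
    root = ET.fromstring(alto_xml)
    for string_el in root.iter("String"):
        if string_el.get("CONTENT") == wrong:
            string_el.set("CONTENT", right)
    return ET.tostring(root, encoding="unicode")

# Fix a typical OCR confusion: "rn" misread for "m".
fixed = correct_token(SAMPLE_ALTO, "anatorny", "anatomy")
```

In the actual editor the same correction would be made through the browser front-end and persisted via the REST API, but the underlying operation on the ALTO file is of this kind: locate the word element, replace its text content, and store the enhanced XML back into the collection.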
Also, there has been criticism that digital humanities makes the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become even more apparent once one leaves the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or in need of revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfying results for linguists. Thus, one has to carefully identify the potential niches in which the needed tasks can be completed. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be different. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools from which to draw resources, their specific richness in skill is suited to the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation. These communities can correspond to research needs more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we are trying to utilize the knowledge and skills of citizen scientists to provide qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in fields of vocabulary where the researchers require more information; for instance, there is a lack of Hill Mari words and terminology in anatomy.
We have digitized books on medicine, and we could try to track the words related to human organs by assigning the citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay in which the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary that is made freely available to the public, so the society can benefit, too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as serving ‘two masters’: research and society.
Abstract:
The National Library of Finland is implementing the Digitization Project of Kindred Languages in 2012–16. Within the project we will digitize materials in the Uralic languages and develop tools to support linguistic research and citizen science. Through this project, researchers will gain access to new corpora to which all users will have open access regardless of their place of residence. Our objective is to make sure that the new corpora are made available for the open and interactive use of both the academic community and the language societies as a whole. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various Uralic languages. The digitization will be completed by early 2015, when the Fenno-Ugrica collection will contain around 200,000 pages of editable text. Researchers cannot spend enough time with the material to retrieve a satisfactory amount of edited words, so the participation of a crowd in the editing work is needed. Often the targets in crowdsourcing have been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view there is a danger that the needs of linguistic research are not met. Also, the number of pages is too high to deal with. A remarkable downside is the lack of a shared goal or social affinity: there is no reward in traditional methods of crowdsourcing. Nichesourcing is a specific type of crowdsourcing in which tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools from which to draw resources, their specific richness in skill is suited to the complex tasks with high-quality product expectations found in nichesourcing.
Communities have a purpose and identity, and their regular interaction engenders social trust and reputation. These communities can correspond to research needs more precisely. Instead of repetitive and rather trivial tasks, we are trying to utilize the knowledge and skills of citizen scientists to provide qualitative results. Some selection must be made, since we are not aiming to correct all 200,000 pages we have digitized, but rather to give citizen scientists assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in fields of vocabulary where the researchers require more information; for instance, there is a lack of Hill Mari words in anatomy. We have digitized books on medicine, and we could try to track the words related to human organs by assigning the citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism play a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay in which the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary that is made freely available to the public, so the society can benefit, too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as serving “two masters”: research and society.
Abstract:
Organizations often consider investing in a new Enterprise Resource Planning (ERP) system as a way to enhance their business processes, as it allows information used by multiple different departments to be integrated into a harmonized computing system. The hope of gaining significant business benefits, such as reduced operating costs, is the key reason why organizations have invested in ERP systems since the 1990s. Still, not all ERP projects end in success, and deployment of an ERP system does not necessarily guarantee the expected results. This research studies why organizations invest in ERP, but also what downsides ERP projects currently have. Additionally, Enterprise Application Integration (EAI) is studied as a next-generation ERP solution that challenges and develops traditional ERP. The research questions are: What are the weaknesses of traditional ERP deployment in today’s business? How does the proposed next-generation ERP answer these weaknesses? At the beginning of the thesis, as an answer to the first research question, the basics of ERP implementation are introduced along with the pros and cons of investing in ERP. Key concepts such as IS integration and EAI are also studied. The empirical section of the thesis focuses on answering the second research question from the integration approach. Qualitative research is conducted by interviewing five experienced IT professionals about the benefits, limitations, and problems of EAI. The thematic interview and questionnaire follow the ERP main elements presented in the literature. The research shows that adopting traditional ERP involves multiple downsides, e.g. inflexibility and the requirement for large monetary investments. To avoid these critical issues, organizations could find a solution in integrations between their current IS.
Based on the empirical study, a new framework for the next-generation ERP is created, consisting of a model and a framework that deal with various features regarding IS adoption. With this framework, organizations can assess whether they should implement EAI or ERP. The model and framework suggest that there are multiple factors IT managers need to consider when planning their IT investments, including their current IS, the role of IT in the organization, as well as the new system’s flexibility, investment level, and number of vendors. The framework created in the thesis encourages IT management to holistically assess i) their organization, ii) its IT, and iii) the solution requirements in order to determine what kind of IS solution would suit their needs best.
Abstract:
The general aim of the thesis was to study university students’ learning from the perspective of regulation of learning and text processing. The data were collected from the two academic disciplines of medical and teacher education, which share the features of highly scheduled study, a multidisciplinary character, a complex relationship between theory and practice and a professional nature. Contemporary information society poses new challenges for learning, as it is not possible to learn all the information needed in a profession during a study programme. Therefore, it is increasingly important to learn how to think and learn independently, how to recognise gaps in and update one’s knowledge and how to deal with the huge amount of constantly changing information. In other words, it is critical to regulate one’s learning and to process text effectively. The thesis comprises five sub-studies that employed cross-sectional, longitudinal and experimental designs and multiple methods, from surveys to eye tracking. Study I examined the connections between students’ study orientations and the ways they regulate their learning. In total, 410 second-, fourth- and sixth-year medical students from two Finnish medical schools participated in the study by completing a questionnaire measuring both general study orientations and regulation strategies. The students were generally deeply oriented towards their studies. However, they regulated their studying externally. Several interesting and theoretically reasonable connections between the variables were found. For instance, self-regulation was positively correlated with deep orientation and achievement orientation and was negatively correlated with non-commitment. However, external regulation was likewise positively correlated with deep orientation and achievement orientation but also with surface orientation and systematic orientation. 
It is argued that external regulation might function as an effective coping strategy in the cognitively loaded medical curriculum. Study II focused on medical students’ regulation of learning and their conceptions of the learning environment in an innovative medical course where traditional lectures were combined with problem-based learning (PBL) group work. First-year medical and dental students (N = 153) completed a questionnaire assessing their regulation strategies of learning and views about the PBL group work. The results indicated that external regulation and self-regulation of the learning content were the most typical regulation strategies among the participants. In line with previous studies, self-regulation was connected with study success. Strictly organised PBL sessions were not considered as useful as lectures, although the students’ views of the teacher/tutor and the group were mainly positive. Therefore, developers of teaching methods are challenged to think of new solutions that facilitate reflection on one’s learning and that improve the development of self-regulation. In Study III, a person-centred approach to studying regulation strategies was employed, in contrast to the traditional variable-centred approach used in Study I and Study II. The aim of Study III was to identify different regulation strategy profiles among medical students (N = 162) across time and to examine to what extent these profiles predict study success in preclinical studies. Four regulation strategy profiles were identified, and connections with study success were found. Students with the lowest self-regulation and with an increasing lack of regulation performed worse than the other groups. As the person-centred approach enables us to identify individual students with diverse regulation patterns, it could be used in supporting student learning and in facilitating the early diagnosis of learning difficulties.
In Study IV, 91 student teachers participated in a pre-test/post-test design in which they answered open-ended questions about a complex science concept both before and after reading either a traditional, expository science text or a refutational text that prompted the reader to revise his or her beliefs towards the scientific view of the phenomenon. The student teachers completed a questionnaire concerning their regulation and processing strategies. The results showed that the students’ understanding improved after the text-reading intervention and that the refutational text promoted understanding better than the traditional text. Additionally, regulation and processing strategies were found to be connected with understanding the science phenomenon. A weak trend suggested that weaker learners would benefit more from the refutational text. It seems that learners with effective learning strategies are able to pick out the relevant content regardless of the text type, whereas weaker learners might benefit from refutational parts that contrast the most typical misconceptions with scientific views. The purpose of Study V was to use eye tracking to determine how third-year medical students (n = 39) and internal medicine residents (n = 13) read and solve patient case texts. The results revealed differences between medical students and residents in processing patient case texts; compared to the students, the residents were more accurate in their diagnoses and processed the texts significantly faster and with a lower number of fixations. Different reading patterns were also found. The observed differences between medical students and residents in processing patient case texts could be used in medical education to model expert reasoning and to teach how a good medical text should be constructed.
The main findings of the thesis indicate that even among very selected student populations, such as high-achieving medical students or student teachers, there seems to be considerable variation in regulation strategies of learning and text processing. As these learning strategies are related to successful studying, students enter educational programmes with rather different chances of coping and achieving success. Further, the ways of engaging in learning seldom centre on a single strategy or approach; rather, students seem to combine several strategies to a certain degree. Sometimes, which way of learning is best can be a matter of perspective; therefore, the reality of studying in higher education is often more complicated than the simplistic view of self-regulation as a good quality and external regulation as a harmful one. The beginning of university studies may be stressful for many, as the gap between high school and university studies is huge, and the strategies that were adequate during high school might not work as well in higher education. Therefore, it is important to map students’ learning strategies and to encourage them to engage in using high-quality learning strategies from the beginning. Instead of separate courses on learning skills, the integration of these skills into course contents should be considered. Furthermore, learning complex scientific phenomena could be facilitated by paying attention to high-quality learning materials and texts and to other support from the learning environment, also at the university level. Eye tracking seems to have great potential in evaluating performance and growing diagnostic expertise in text processing, although more research using texts as stimuli is needed.
Both medical and teacher education programmes and the professions themselves are challenging in terms of their multidisciplinary nature and increasing amounts of information and therefore require good lifelong learning skills during the study period and later in work life.