918 results for Reading strategies and techniques
Abstract:
Communication between nurses and oncology patients is fundamental to building the professional and therapeutic relationship, and essential for delivering care that is truly focused on the person as a holistic being rather than as a pathological entity. Different studies have demonstrated the positive influence of communication on patient satisfaction, and a relationship has even been found between effective communication and greater adherence to treatment, better pain control and a better psychological state. Communication, as the tool for establishing an effective therapeutic relationship, which is in turn basic to the care of any patient, is therefore "the tool" and an indispensable prerequisite for caring for these patients from a holistic perspective. Despite its centrality to nursing care, communication is in many cases not used correctly. Objectives: This work aims to identify the main skills for achieving effective therapeutic communication and how to use them in building and maintaining the therapeutic relationship with the patient and their family. Method: a literature search with the following keywords: communication, palliative, nursing. 27 articles from 17 different journals were included in the review. Results: The skills and factors found in the literature were classified as: a. barriers to therapeutic communication; b. conditions and skills that facilitate communication; c. relational skills; d. skills for eliciting information; e. communication strategies and models. Conclusions: Communication is the principal tool of nursing care in palliative care; it differs from social communication and its objective is to increase the patient's quality of life. Skill in communication is not an innate gift but the result of a continuous learning process.
Among the most frequently cited and effective skills are active listening (understood as a set of techniques), therapeutic touch, eye contact, empathy and the fundamental importance of non-verbal communication. The COMFORT communication model is the only one centered on the patient and, at the same time, on their family. Keywords: communication skills, palliative, nursing
Abstract:
We expose the ubiquitous interaction between an information screen and its viewers' mobile devices, highlight the communication vulnerabilities, suggest mitigation strategies, and finally implement these strategies to secure the communication. The screen transparently infers the information preferences of viewers within its vicinity from their mobile devices over Bluetooth. Backend processing then retrieves up-to-date versions of the preferred information from content providers. Retrieved content, such as sporting news, weather forecasts, advertisements, stock markets and aviation schedules, is systematically displayed on the screen. To maximise users' benefit, experience and acceptance, the service is provided with no user interaction at the screen while securely upholding preference privacy and viewer anonymity. Compelled by the personal nature of mobile devices, the privacy of their contents, the confidentiality of preferences, and the vulnerabilities the screen introduces, the service's security is fortified, predominantly through efficient cryptographic algorithms inspired by elliptic-curve cryptosystems, together with access control and anonymity mechanisms. These mechanisms are demonstrated to attain the set objectives within reasonable performance.
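As an illustration of the anonymity mechanisms this abstract mentions, one hypothetical approach (a minimal sketch, not the thesis implementation, which also relies on elliptic-curve cryptography) is to derive rotating salted-hash pseudonyms, so the screen can link a device's preferences within one session without ever learning a stable identity:

```python
import hashlib
import secrets

def session_pseudonym(device_id: str, session_salt: bytes) -> str:
    """Per-session pseudonym: stable within a session, unlinkable across sessions."""
    return hashlib.sha256(session_salt + device_id.encode()).hexdigest()

# The screen broadcasts a fresh random salt at the start of every session.
salt_a = secrets.token_bytes(16)
salt_b = secrets.token_bytes(16)

p1 = session_pseudonym("AA:BB:CC:DD:EE:FF", salt_a)  # hypothetical device address
p2 = session_pseudonym("AA:BB:CC:DD:EE:FF", salt_a)  # same device, same session
p3 = session_pseudonym("AA:BB:CC:DD:EE:FF", salt_b)  # same device, new session
```

Because the salt is fresh for each session, the same device produces unrelated pseudonyms across sessions, so an observer of the screen's logs cannot tie preferences to a persistent identity.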
Abstract:
The Writing of I is a project that begins an itinerary through the past, present and future experiences of each of our students, based on the following research activities: reading, creation and recreation of texts and other items that revolve essentially around autobiographical writing and culminate in the production of a free autobiography.
Abstract:
Speaker diarization is the process of sorting speech according to the speaker. Diarization helps to search and retrieve what a certain speaker uttered in a meeting. Applications of diarization systems extend to domains other than meetings, for example lectures, telephone, television, and radio. In addition, diarization enhances the performance of several speech technologies such as speaker recognition, automatic transcription, and speaker tracking. Methodologies previously used in developing diarization systems are discussed, and prior results and techniques are studied and compared. Methods such as Hidden Markov Models and Gaussian Mixture Models that are used in speaker recognition and other speech technologies are also used in speaker diarization. The objective of this thesis is to develop a speaker diarization system in the meeting domain. The experimental part of this work indicates that the zero-crossing rate can be used effectively to break the audio stream into segments, and that adaptive Gaussian Models fit short audio segments adequately. Results show that 35 Gaussian Models and an average segment length of one second are optimum values for building a diarization system for the tested data. Uniting the segments uttered by the same speaker is done in bottom-up clustering by a new approach of categorizing the mixture weights.
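The zero-crossing-rate segmentation mentioned above can be sketched as follows; the frame length, hop size and threshold here are illustrative assumptions, not the thesis parameters:

```python
import numpy as np

def zero_crossing_rate(signal, frame_len=400, hop=200):
    """Per-frame fraction of sign changes between consecutive samples."""
    rates = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = np.sign(signal[start:start + frame_len])
        frame[frame == 0] = 1                      # treat exact zeros as positive
        rates.append(np.count_nonzero(np.diff(frame)) / (frame_len - 1))
    return np.array(rates)

def segment_boundaries(zcr, threshold=0.25):
    """Frame indices where the ZCR moves across the threshold."""
    above = (zcr > threshold).astype(int)
    return np.nonzero(np.diff(above))[0] + 1

# Demo: one second of low-frequency signal followed by one second of
# high-frequency signal, at a 16 kHz sample rate.
fs = 16000
t = np.arange(fs) / fs
audio = np.concatenate([np.sin(2 * np.pi * 100 * t), np.sin(2 * np.pi * 3000 * t)])
boundaries = segment_boundaries(zero_crossing_rate(audio))
```

On this synthetic stream the detector reports a single boundary where the spectral character of the audio changes, which is the behaviour the thesis exploits for segmentation.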
Abstract:
The dominant role of English as an international language and other globalization trends also affect Swedish-speaking Finland. These trends in turn affect the conditions for learning and teaching English as a foreign language, that is, the teaching objectives, the expected pupil and teacher roles, the appropriateness of materials, and teachers' and pupils' initial experiences of English and English-speaking countries. This study examines the conditions for learning and professional development in the Swedish-language beginner classroom in English as a foreign language. The starting point of 351 beginners in English as a foreign language and 19 of their teachers is described and analysed. The results suggest that English is becoming a second language rather than a traditional foreign language for many young pupils. These pupils also have good opportunities to learn English outside school. This was not, however, the situation for all pupils, which indicates considerable heterogeneity and even regional variation in the Finland-Swedish classroom in English as a foreign language. The teacher results suggest that some teachers have managed to tackle the conditions they face in a constructive way. Other teachers express frustration with their work situation, the curriculum, the teaching materials, and other actors of importance for the school environment. The study shows that the conditions for learning and teaching English as a foreign language vary across Swedish-speaking Finland. To support pupils' and teachers' development, it is proposed that the dialogue between actors at different levels of society be improved and systematized.
Abstract:
Several recent works in history and philosophy of science have re-evaluated the alleged opposition between the theses put forth by logical empiricists such as Carnap and the so-called "post-positivists", such as Kuhn. Although the latter came to be viewed as having seriously challenged the logical positivist views of science, recent authors (e.g., Friedman, Reisch, Earman, Irzik and Grünberg) maintain that some of the most notable theses of the Kuhnian view of science have striking similarities with some aspects of Carnap's philosophy. Against that reading, Oliveira and Psillos argue that within Carnap's philosophy there is no place for the Kuhnian theses of incommensurability, holism, and theory-ladenness of observations. This paper presents each of those readings and argues that Carnap and Kuhn have non-opposing views on holism, incommensurability, the theory-ladenness of observations, and scientific revolutions. We note at the very end - without dwelling on the point, however - that they come apart on other matters, such as their views on metaphysics and on the context of discovery/justification distinction.
Abstract:
This study focuses on the integration of eco-innovation principles into strategy and policy at the regional level. The importance of regions as a level for integrating eco-innovative programs and activities served as the point of interest for this study. Eco-innovative activities and technologies are seen as means to meet the sustainable development objective of improving regions' quality of life. The study was conducted to gain an in-depth understanding of eco-innovation at the regional level and of the basic concepts that are important in integrating eco-innovation principles into regional policy. Its other specific objectives are to establish how eco-innovation is developed and practiced in the regions of the EU, and to analyze the main characteristic features of an eco-innovation model developed specifically in the Päijät-Häme Region in Finland. The Päijät-Häme Region is noted for its successful eco-innovation strategies and programs and is therefore taken as the case study. Both primary data (interviews) and secondary data (publicly available documents) are utilized. The study shows that eco-innovation plays an important role in regional strategy, as reviewed based on the experience of other regions in the EU. This is because of its localized nature, which makes it easier to facilitate in a regional setting. Since regional authorities and policy-makers normally focus on solving their localized environmental problems, eco-innovation principles can easily be integrated into regional strategy. The case study highlights the Päijät-Häme Region's eco-innovation strategies and projects, which are characterized by a strong connection of knowledge-producing institutions. Policy instruments supporting eco-innovation (e.g. environmental technologies) are very much focused on clean technologies, justifying the formation of cleantech clusters and business parks in the Päijät-Häme Region.
A newly conceptualized SAMPO model of eco-innovation has been developed in the Päijät-Häme Region to better capture the region's characteristics and eventually replace the current model employed by the Päijät-Häme Regional Authority. The SAMPO model is still under construction; however, a review of its principles points to its three important spearheads: practice-based innovation, design (eco-design), and clean or environmental technology (environment).
Abstract:
Novel biomaterials are needed to fill the demand for tailored bone substitutes required by an ever-expanding array of surgical procedures and techniques. Wood, a natural fiber composite, modified with heat treatment to alter its composition, may provide a novel approach to the further development of hierarchically structured biomaterials. The suitability of wood as a model biomaterial, as well as the effects of heat treatment on the osteoconductivity of wood, was studied by placing untreated and heat-treated (at 220 °C, 200 °C and 140 °C for 2 h) birch implants (size 4 × 7 mm) into drill cavities in the distal femur of rabbits. The follow-up period was 4, 8 and 20 weeks in all in vivo experiments. The flexural properties of wood, as well as dimensional changes and hydroxyapatite formation on the surface of wood (untreated, 140 °C and 200 °C heat-treated wood), were tested using 3-point bending and compression tests and immersion in simulated body fluid. The effect of pre-measurement grinding and the effect of heat treatment on the surface roughness and contour of wood were tested with contact stylus and non-contact profilometry. The effects of heat treatment of wood on its interactions with biological fluids were assessed using two different test media and real human blood in liquid penetration tests. The results of the in vivo experiments showed implanted wood to be well tolerated, with no implants rejected due to foreign body reactions. Heat treatment had significant effects on the biocompatibility of wood, allowing host bone to grow into tight contact with the implant, with occasional bone ingrowth into the channels of the wood implant. The results of the liquid immersion experiments showed hydroxyapatite formation only in the most extensively heat-treated wood specimens, which supported the results of the in vivo experiments.
Parallel conclusions could be drawn based on the results of the liquid penetration test where human blood had the most favorable interaction with the most extensively heat‐treated wood of the compared materials (untreated, 140 degrees C and 200 degrees C heat‐treated wood). The increasing biocompatibility was inferred to result mainly from changes in the chemical composition of wood induced by the heat treatment, namely the altered arrangement and concentrations of functional chemical groups. However, the influence of microscopic changes in the cell walls, surface roughness and contour cannot be totally excluded. The heat treatment was hypothesized to produce a functional change in the liquid distribution within wood, which could have biological relevance. It was concluded that the highly evolved hierarchical anatomy of wood could yield information for the future development of bulk bone substitutes according to the ideology of bioinspiration. Furthermore, the results of the biomechanical tests established that heat treatment alters various biologically relevant mechanical properties of wood, thus expanding the possibilities of wood as a model material, which could include e.g. scaffold applications, bulk bone applications and serving as a tool for both mechanical testing and for further development of synthetic fiber reinforced composites.
Abstract:
More and more of the innovations currently being commercialized exhibit network effects; in other words, the value of using the product increases as more and more people use the same or compatible products. Although this phenomenon has been the subject of much theoretical debate in economics, marketing researchers have been slow to respond to the growing importance of network effects in new product success. Despite an increase in interest in recent years, there is no comprehensive view of the phenomenon and, therefore, currently an incomplete understanding of the dimensions it incorporates. Furthermore, there is wide dispersion in operationalization, that is, in the measurement of network effects, and currently available approaches have various shortcomings that limit their applicability, especially in marketing research. Consequently, little is known today about how these products fare in the marketplace and how they should be introduced in order to maximize their chances of success. Hence, the motivation for this study was driven by the need to increase our knowledge and understanding of the nature of network effects as a phenomenon, and of their role in the commercial success of new products. This thesis consists of two parts. The first part comprises a theoretical overview of the relevant literature and presents the conclusions of the entire study. The second part comprises five complementary, empirical research publications. Quantitative research methods and two sets of quantitative data are utilized. The results of the study suggest that there is a need to update both the conceptualization and the operationalization of the phenomenon of network effects. Furthermore, there is a need for an augmented view of customers' perceived value in the context of network effects, given that the nature of value composition has major implications for the viability of such products in the marketplace.
The role of network effects in new product performance is not as straightforward as the existing theoretical literature suggests. The overwhelming result of this study is that network effects do not directly influence product success, but rather enhance or suppress the influence of product introduction strategies. The major contribution of this study lies, first, in conceptualizing the phenomenon of network effects more comprehensively than has been attempted thus far: the study gives an augmented view of the nature of customer value in network markets, which helps to explain why some products thrive in these markets whereas others never catch on. Second, the study discusses shortcomings in the way prior literature has operationalized network effects, suggesting that these limitations can be overcome in the research design. Third, the study provides some much-needed empirical evidence on how network effects, product introduction strategies, and new product performance are associated. In general terms, this thesis adds to our knowledge of how firms can successfully leverage network effects in product commercialization in order to improve market performance.
Abstract:
Carbohydrates are one of the most abundant classes of biomolecules on Earth. In the initial stages of research on carbohydrates much effort was focused on investigation and determination of the structural aspects and complex nature of individual monosaccharides. Later on, development of protective group strategies and methods for oligosaccharide synthesis became the main topics of research. Today, the methodologies developed early on are being utilized in the production of carbohydrates for biological screening events. This multidisciplinary approach has generated the new discipline of glycobiology, which focuses on research related to the appearance and biological significance of carbohydrates. In more detail, studies in glycobiology have revealed the essential roles of carbohydrates in cell-cell interactions, biological recognition events, protein folding, cell growth and tumor cell metastasis. As a result of these studies, carbohydrate-derived diagnostic and therapeutic agents are likely to be of growing interest in the future. In this doctoral thesis, a journey through the fundamentals of carbohydrate synthesis is presented. The research conducted on this journey was neither limited to the study of any particular phenomena nor to the addressing of a single synthetic challenge. Instead, the focus was deliberately shifted from time to time in order to broaden the scope of the thesis, to continue the learning process and to explore new areas of carbohydrate research. Throughout the work, several previously reported synthetic protocols, especially procedures related to glycosylation reactions and protective group manipulations, were evaluated, modified and utilized or rejected. The synthetic molecules targeted within this thesis were either required for biological evaluations or utilized to study phenomena occurring in larger molecules.
In addition, much effort was invested in the complete structural characterization of the synthesized compounds by a combination of NMR spectroscopic techniques and spectral simulations with the PERCH software. This thesis provides the basics of working with carbohydrate chemistry. In more detail, synthetic strategies and experimental procedures for many different reactions, as well as guidelines for the NMR spectroscopic characterization of oligosaccharides and glycoconjugates, are provided. Therefore, the thesis should prove valuable to researchers starting their own journeys in the ever-expanding field of carbohydrate chemistry.
Abstract:
Earlier management studies have found a relationship between managerial qualities and subordinate impacts, but the effect of managers' social competence on leader perceptions has not been solidly established. To fill the related research gap, the present work embarks on a quantitative empirical effort to identify predictors of successful leadership. In particular, this study investigates relationships between perceived leader behavior and three self-report instruments used to measure managerial capability: 1) the WOPI Work Personality Inventory, 2) Raven's general intelligence scale, and 3) the Emotive Communication Scale (ECS). This work complements previous research by resorting to both self-reports and other-reports: the results acquired from the managerial sample are compared to subordinate perceptions as measured through the ECS other-report and the WOPI360 multi-source appraisal. The quantitative research comprises a sample of 80 superiors and 354 subordinates operating in eight Finnish organizations. The strongest predictive value emerged from the ECS self- and other-reports and certain personality dimensions. In contrast, supervisors' logical intelligence did not correlate with leadership perceived as socially competent by subordinates. The 16 superiors rated as most socially competent by their subordinates were selected for case analysis. Their qualitative narratives evidence the role of life history and post-traumatic growth in developing managerial skills. The results contribute to leadership theory in several ways. First, the ECS self-report devised for this research offers a reliable scale for predicting socially competent leader ability. Second, the work identifies dimensions of personality and emotive skills that can be considered predictors of managerial ability and drawn on in leader recruitment and career planning.
Third, the Emotive Communication Model delineated on the basis of the empirical data allows for the systematic design and planning of communication and leadership education. Fourth, this work furthers understanding of personal growth strategies and the role of life history in leader development and training. Finally, this research advances educational leadership by conceptualizing and operationalizing effective managerial communications. The Emotive Communication Model devised directs pedagogic attention in engineering to assertion, emotional availability and inspiration skills. The proposed methodology addresses classroom management strategies drawing from problem-based learning, student empowerment, collaborative learning, and so-called socially competent teachership founded on teacher immediacy and perceived caring, all constituting strategies moving away from student compliance and teacher modelling. The ultimate educational objective embraces the development of individual engineers and organizational leaders who not only possess traditional analytical and technical expertise and substantive knowledge but are also creatively, practically, and socially intelligent.
Abstract:
The operation of cold storage chambers contributes largely to the quality and longevity of stored products. In recent years, the study of control strategies has intensified, with the aims of decreasing temperature variation inside the storage chamber and reducing electric power consumption. This study developed a system for data acquisition and process control, in the LabVIEW language, to be applied to the cooling system of a 30 m³ refrigerating chamber. The instrumentation and the application developed supported scientific experiments that studied the dynamic behavior of the refrigeration system and compared the performance of the control strategies and of the refrigeration machine, both in terms of the controlled temperature and of electricity consumption. The system was used to test on-off, PID and fuzzy control strategies. Regarding power consumption, the fuzzy controller showed the best result, saving 10% compared with the other tested strategies.
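As a minimal illustration of one of the tested strategy families, a discrete PID loop driving a chamber toward its setpoint might look as follows. This is a textbook sketch in Python rather than the study's LabVIEW application; the gains and the toy plant model are invented for the demo:

```python
class PID:
    """Textbook discrete PID controller (illustrative gains, not the study's)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

# Toy chamber model: constant heat leak, cooling proportional to actuator power.
pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=1.0)
temp = 5.0                                  # starting chamber temperature, degrees C
for _ in range(200):
    u = pid.update(0.0, temp)               # setpoint: 0 degrees C
    cooling = min(10.0, max(0.0, -u))       # actuator saturates between 0 and 10
    temp += 0.2 - 0.1 * cooling             # heat leak vs. cooling effect per step
```

An on-off controller would replace the PID update with a simple thermostat comparison, while a fuzzy controller maps the error and its rate of change through linguistic rules to a cooling power.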
Abstract:
In this thesis, traditional investment strategies (value and growth) are compared with modern investment strategies (momentum, contrarian and GARP) in terms of risk, performance and cumulative returns. The strategies are compared over the period from 1996 to 2010 in the Finnish stock market. The data includes all listed main-list stocks and dividends, and is adjusted for splits, mergers and acquisitions. The strategies are tested using different holding periods (6, 12 and 36 months), and the data is divided into tercile portfolios based on different ranking criteria. Contrarian and growth strategies are the only strategies whose cumulative returns improve when longer holding periods are used. Momentum (52-week high price [1]) and GARP strategies based on a short holding period have the best performance, and contrarian and growth strategies the worst. Momentum strategies (52-week high price), along with short-holding-period contrarian strategies (52-week low price [2]), have the lowest risk. The strategies with the highest risk are both growth strategies and two momentum strategies (52-week low price). The empirical results support the efficiency of momentum, GARP and value strategies. The least efficient strategies are contrarian and growth strategies in terms of risk, performance and cumulative returns. Most strategies outperform the market portfolio in all three measures.
[1] Stock ranking criterion (current price / 52-week highest price)
[2] Stock ranking criterion (current price / 52-week lowest price)
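The 52-week-high ranking criterion and the tercile split described above can be sketched as follows; the tickers and criterion values are invented for illustration, and this is not the thesis methodology in full detail:

```python
def ranking_criterion(current_price, weekly_highs):
    """52-week-high momentum criterion: current price / 52-week highest price."""
    return current_price / max(weekly_highs)

def tercile_portfolios(scores):
    """Split tickers into three portfolios by descending criterion value."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    n = len(ranked)
    return ranked[:n // 3], ranked[n // 3:2 * n // 3], ranked[2 * n // 3:]

# Hypothetical tickers with pre-computed criterion values.
scores = {"A": 0.95, "B": 0.50, "C": 0.70, "D": 0.99, "E": 0.60, "F": 0.80}
top, middle, bottom = tercile_portfolios(scores)
```

Stocks trading closest to their 52-week high land in the top tercile, which is the portfolio a momentum strategy of this kind would hold.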
Abstract:
Gastrointestinal stromal tumors account for 0.1 to 3% of all resected gastric tumors and are the most common submucosal mass found in the stomach. Preoperative diagnosis is often difficult; consequently, surgery is the best and only option in most cases. There are studies of different surgical techniques based on tumor location. The reported case led us to a literature review with the intent of establishing preoperative diagnosis, therapeutic strategies and prognosis.
Abstract:
Formal software development processes and well-defined development methodologies are nowadays seen as the definitive way to produce high-quality software within time limits and budgets. The variety of such high-level methodologies is huge, ranging from rigorous process frameworks like CMMI and RUP to more lightweight agile methodologies. The need for managing this variety, and the fact that practically every software development organization has its own unique set of development processes and methods, have created a profession of software process engineers. Different kinds of informal and formal software process modeling languages are essential tools for process engineers. These are used to define processes in a way which allows easy management of processes, for example process dissemination, process tailoring and process enactment. The process modeling languages are usually used as a tool for process engineering, where the main focus is on the processes themselves. This dissertation has a different emphasis: it analyses modern software development process modeling from the software developers' point of view. The goal of the dissertation is to investigate whether software process modeling and software process models aid software developers in their day-to-day work, and what the main mechanisms for this are. The focus of the work is on the Software Process Engineering Metamodel (SPEM) framework, which is currently one of the most influential process modeling notations in software engineering. The research theme is elaborated through six scientific articles which represent the dissertation research done on process modeling over an approximately five-year period. The research follows the classical engineering research discipline where the current situation is analyzed, a potentially better solution is developed, and finally its implications are analyzed.
The research applies a variety of different research techniques, ranging from literature surveys to qualitative studies done amongst software practitioners. The key finding of the dissertation is that software process modeling notations and techniques are usually developed in process engineering terms. As a consequence, the connection between the process models and actual development work is loose. In addition, modeling standards like SPEM are partially incomplete when it comes to pragmatic process modeling needs, like lightweight modeling and combining pre-defined process components. This leads to a situation where the full potential of process modeling techniques for aiding daily development activities cannot be achieved. Despite these difficulties, the dissertation shows that it is possible to use modeling standards like SPEM to aid software developers in their work. The dissertation presents a lightweight modeling technique which software development teams can use to quickly analyze their work practices in a more objective manner. The dissertation also shows how process modeling can be used to more easily compare different software development situations and to analyze their differences in a systematic way. Models also help to share this knowledge with others. A qualitative study done amongst Finnish software practitioners verifies the conclusions of the other studies in the dissertation. Although processes and development methodologies are seen as an essential part of software development, process modeling techniques are rarely used during daily development work. However, the potential of these techniques intrigues the practitioners. As a conclusion, the dissertation shows that process modeling techniques, most commonly used as tools for process engineers, can also be used as tools for organizing daily software development work.
This work presents theoretical solutions for bringing the process modeling closer to the ground-level software development activities. These theories are proven feasible by presenting several case studies where the modeling techniques are used e.g. to find differences in the work methods of the members of a software team and to share the process knowledge to a wider audience.
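A minimal, hypothetical sketch of the kind of model-based comparison described above (not the dissertation's SPEM-based technique) is to represent each team's process as activities mapped to practices and compute where two models disagree:

```python
def diff_processes(a, b):
    """Return the activities on which two process models disagree."""
    diffs = {}
    for activity in sorted(set(a) | set(b)):
        pa, pb = a.get(activity, set()), b.get(activity, set())
        if pa != pb:
            diffs[activity] = {"only_a": pa - pb, "only_b": pb - pa}
    return diffs

# Two teams' work practices, keyed by development activity (invented examples).
team_x = {"review": {"pair review"}, "testing": {"unit tests", "CI"}}
team_y = {"review": {"pull request"}, "testing": {"unit tests"}, "release": {"tagging"}}
differences = diff_processes(team_x, team_y)
```

Even this toy representation makes the differences between two teams' work methods explicit and shareable, which is the role the dissertation assigns to process models in daily development work.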