64 results for lot sizing and scheduling
Abstract:
IT outsourcing refers to the way companies focus on their core competencies and buy supporting functions from other companies specialized in that area. A service is the total outcome of numerous activities by employees and other resources to provide solutions to customers' problems. Outsourcing and service business each have their own unique characteristics. Service Level Agreements quantify the minimum acceptable service to the user. Service quality has to be objectively quantified so that its achievement or non-achievement can be monitored. Offshoring usually refers to the transfer of tasks to low-cost countries. Offshoring presents many challenges that require special attention and need to be assessed thoroughly. IT infrastructure management refers to the installation and basic usability assistance of operating systems, network and server tools and utilities. ITIL defines the industry best practices for organizing IT processes. This thesis analysed a server operations service and the customers' perception of the quality of its daily operations. The agreed workflows and processes should be followed more closely: the service provider's processes are thoroughly defined, but both the customer and the service provider may deviate from them. The service provider should review the workflows concerning customer-facing functions. Customer-facing functions require persistent skill development, as they communicate the quality to the customer. The service provider also needs to provide better organized communication and knowledge exchange methods between the specialists in different geographical locations.
The relationship between a virtual leader’s communication practices and a virtual team’s performance
Abstract:
A lot of research has been carried out into virtual teams and virtual leadership, yet there is hardly any research available on the communication behaviour of virtual leaders within a real business context. This research assessed the communication practices of virtual leaders and analysed the relationship between these practices and the performance of virtual teams. The objective was to examine the distinctive features of virtual teams, to study the leader's role in a virtual team and its performance, and to examine the leader's communication practices within virtual teams. The research involves a case study in which interviews were carried out within an international technology company headquartered in Finland. Qualitative research methods were applied. Based on the results of the study it can be said that there is a strong relationship between a virtual leader's communication practices and a virtual team member's job satisfaction. Through their communication practices, activities and message contents, leaders can affect the job satisfaction of virtual team members. In virtual leadership the focus is not on the virtual but on the leadership: it does not matter whether the context is virtual or face-to-face, as similar communication practices work well in both cases. As the global economic crisis strongly affected the sales results of the case company, no conclusions can be drawn about the relationship between a leader's communication practices and a virtual team's objective performance.
Abstract:
Browsing the web has become one of the most important features of high-end mobile phones, and in the future more and more people will use mobile phones for web browsing. Large touchscreens improve the browsing experience, but many web sites are designed to be used with a mouse. A touchscreen differs substantially from a mouse as a pointing device, and therefore mouse emulation logic is required in browsers to make more web sites usable. This Master's thesis lists the most significant cases where the differences between a mouse and a touchscreen affect web browsing. Five touchscreen mobile phones and their web browsers were evaluated to find out whether and how these cases are handled in them. As a part of this thesis, a simple QtWebKit-based mobile web browser with an advanced mouse emulation model was also implemented, aiming to solve all the problematic cases. The conclusion of this work is that it is feasible to emulate a mouse with a touchscreen and thus deliver a good user experience in mobile web browsing. However, current high-end touchscreen mobile phones have relatively underdeveloped mouse emulation in their web browsers, and there is a lot to improve.
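The kind of emulation logic discussed above can be illustrated with a small sketch. The gesture names and event tuples here are invented for illustration; real browsers layer timing, hover persistence and gesture cancellation on top of a mapping like this:

```python
def synthesize_mouse_events(touch_event):
    """Hypothetical sketch of a browser's mouse-emulation layer: map a
    touch gesture to the sequence of mouse events that a mouse-oriented
    web page expects."""
    x, y = touch_event["x"], touch_event["y"]
    if touch_event["type"] == "tap":
        # Move the pointer first so that :hover styles and mouseover
        # handlers fire before the click, as they would with a mouse.
        return [("mousemove", x, y), ("mousedown", x, y),
                ("mouseup", x, y), ("click", x, y)]
    if touch_event["type"] == "long-press":
        # A touchscreen has no hover state; a long press is a common
        # stand-in for right-click and hover-dependent menus.
        return [("mousemove", x, y), ("contextmenu", x, y)]
    return []
```

The core difficulty the thesis points at is visible even here: one physical gesture must stand in for several distinct mouse interactions, and pages that rely on hover alone have no natural touch equivalent.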
Abstract:
In recent years, the vulnerability of power networks to natural hazards has been recognized. Moreover, operating at the limits of the network's transmission capabilities has resulted in major outages during the past decade. One of the reasons for operating at these limits is that the network has become outdated. Therefore, new technical solutions are studied that could provide more reliable and more energy-efficient power distribution, as well as better profitability for the network owner. It is the development and price of power electronics that have made DC distribution an attractive alternative again. In this doctoral thesis, one type of low-voltage DC distribution system is investigated. More specifically, it is studied which current technological solutions, used at the customer-end, could provide better power quality for the customer compared with the present system. To study the effect of a DC network on the customer-end power quality, a bipolar DC network model is derived. The model can also be used to identify the supply parameters when the V/kW ratio is approximately known. Although the model provides knowledge of the average behavior, it is shown that the instantaneous DC voltage ripple should be limited. Guidelines are given for choosing an appropriate capacitance value for the capacitor located at the input DC terminals of the customer-end. The structure of the customer-end is also considered. A comparison between the most common solutions is made based on their cost, energy efficiency, and reliability. In the comparison, special attention is paid to passive filtering solutions, since the filter is considered a crucial element when the lifetime expenses are determined. It is found that the filter topology most commonly used today, namely the LC filter, does not provide an economic advantage over the hybrid filter structure. Finally, some typical control system solutions are introduced and their shortcomings are presented.
As a solution to the customer-end voltage regulation problem, an observer-based control scheme is proposed. It is shown how different control system structures affect the performance. In a rigid network, performance meeting the requirements is achieved using only one output measurement. Similar performance can be achieved in a weak grid by adding a DC voltage measurement. A further improvement can be achieved when adaptive gain-scheduling-based control is introduced. In conclusion, the final power quality is determined by the sum of various factors, and the thesis provides guidelines for designing a system that improves the power quality experienced by the customer.
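To make the capacitor-sizing question concrete, a generic textbook first-pass rule for a DC link feeding a single-phase AC load can be sketched; this is only an illustrative rule of thumb, not the design guideline actually derived in the thesis:

```python
import math

def dc_link_capacitance(p_load, v_dc, dv_ripple, f_grid=50.0):
    """Textbook first-pass sizing of the capacitor at the customer-end
    input DC terminals when the DC link feeds a single-phase AC load:
    the load power pulsates at twice the grid frequency, and
    C = P / (2 * omega * V_dc * dV) keeps the resulting voltage ripple
    amplitude below dv_ripple."""
    omega = 2 * math.pi * f_grid  # grid angular frequency [rad/s]
    return p_load / (2 * omega * v_dc * dv_ripple)
```

For example, a 3 kW customer-end on a 750 V DC bus with a 15 V allowed ripple amplitude would need roughly 420 uF under this rule; the thesis's own guidelines additionally weigh cost, losses and filter lifetime.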
Abstract:
This work contains a series of studies on the optimization of three real-world scheduling problems: school timetabling, sports scheduling and staff scheduling. These challenging problems are solved to customer satisfaction using the proposed PEAST algorithm; customer satisfaction here refers to the fact that implementations of the algorithm are in industrial use. The PEAST algorithm is a product of long-term research and development. The first version was introduced in 1998, and this thesis is the result of five years of further development of the algorithm. One of the most valuable characteristics of the algorithm has proven to be its ability to solve a wide range of scheduling problems, and it is likely that it can be tuned to tackle a range of other combinatorial problems as well. The algorithm draws features from numerous different metaheuristics, which is the main reason for its success. In addition, the implementation of the algorithm is fast enough for real-world use.
Abstract:
Supply chains are becoming increasingly dependent on information exchange in today's world, and any disruption can cause severe repercussions to the flow of materials in the chain. The speed, accuracy and amount of information are key factors. The aim in this thesis is to address a gap in the research by focusing on information exchange and the risks related to it in a multimodal wood supply chain operating between the Baltic States and Finland. The study involved interviewing people engaged in logistics management in the supply chain in question. The main risk the interviewees identified arose from the sea logistics system, which held a lot of different kinds of information. The threat of a breakdown in the Internet connection was also found to hinder the operations significantly. A vulnerability analysis was carried out in order to identify the main actors and channels of information flow in the supply chain. The analysis revealed that the most important and therefore most vulnerable information-exchange channels were those linking the terminal superintendent, the operative managers and the mill managers. The study gives a holistic picture of the investigated supply chain. Information-exchange-related risks varied greatly. One of the most frequently mentioned was the risk of information inaccuracy, which was usually due to the fact that those in charge of the various functions did not fully understand the consequences for the entire chain.
Abstract:
Electronic Government (e-Government) means delivering services and information to citizens and businesses through the use of Information and Communication Technology (ICT), in order to enable them to interact more effectively with the government and to increase the quality of the services. Like many other governments in developed and developing countries, the Kurdistan Regional Government (KRG) has embarked on e-Government initiatives. This study revealed that various challenges affect e-Government in the Kurdistan Region of Iraq (KRI), but also that considerable e-Government progress has been made. In addition, based on the United Nations' e-Government maturity level benchmarking, e-Government in the KRI is at the interactive stage. The study also identified the services that citizens want from the government in order to implement an appropriate e-Government.
Abstract:
Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel, using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own.
The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
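The guarded-command execution model described above can be illustrated in a few lines. This is an informal simulation, not the formal semantics of Action Systems or Event-B:

```python
import random

def execute(state, actions, max_steps=1000):
    """Informal simulation of guarded-command execution: repeatedly
    pick, nondeterministically, one action whose guard holds in the
    current state and apply its command; stop when no guard is
    enabled (termination) or after max_steps."""
    for _ in range(max_steps):
        enabled = [command for guard, command in actions if guard(state)]
        if not enabled:
            break  # no guard holds: the system terminates
        random.choice(enabled)(state)  # nondeterministic choice
    return state

# Illustration: Euclid's subtractive GCD written as two guarded commands.
gcd_actions = [
    (lambda s: s["x"] > s["y"], lambda s: s.update(x=s["x"] - s["y"])),
    (lambda s: s["y"] > s["x"], lambda s: s.update(y=s["y"] - s["x"])),
]
```

Whatever order the enabled actions fire in, `execute({"x": 12, "y": 18}, gcd_actions)` terminates with both variables equal to gcd(12, 18) = 6; proving such a result once, for all possible schedules, is exactly what these formalisms support.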
Abstract:
The aim of this study was to develop a theoretical model for information integration to support the decision making of intensive care charge nurses and physicians in charge – that is, ICU shift leaders. The study focused on the ad hoc decision-making and immediate information needs of shift leaders during the management of an intensive care unit's (ICU) daily activities. The term 'ad hoc decision-making' was defined as critical judgements that are needed for a specific purpose at a precise moment, with the goal of ensuring instant and adequate patient care and a fluent flow of ICU activities. Data collection and research analysis methods were tested in the identification of ICU shift leaders' ad hoc decision-making. The decision-making of ICU charge nurses (n = 12) and physicians in charge (n = 8) was observed using a think-aloud technique in two university-affiliated Finnish ICUs for adults. The ad hoc decisions of ICU shift leaders were identified using an application of protocol analysis. In the next phase, a structured online questionnaire was developed to evaluate the immediate information needs of ICU shift leaders. A national survey was conducted in all Finnish university-affiliated hospital ICUs for adults (n = 17). The questionnaire was sent to all charge nurses (n = 515) and physicians in charge (n = 223). Altogether, 257 charge nurses (50%) and 96 physicians in charge (43%) responded to the survey. The survey was also tested internationally in 16 Greek ICUs, where 50 charge nurses out of 240 (21%) responded. The think-aloud technique and protocol analysis were found to be applicable for identifying the ad hoc decision-making of ICU shift leaders. During one day, shift leaders made over 200 ad hoc decisions. Ad hoc decisions were made horizontally, related to the whole intensive care process, and vertically, concerning single intensive care incidents.
Most of the ICU shift leaders' ad hoc decisions were related to human resources and know-how, patient information and vital signs, and special treatments. Commonly, this ad hoc decision-making involved several multiprofessional decisions that constituted a bundle of immediate decisions and various information needs. Some of these immediate information needs were shared between the charge nurses and the physicians in charge; the majority of them concerned patient admission, the organisation and management of work, and staff allocation. In general, the information needs of charge nurses were more varied than those of physicians. It was found that many ad hoc decisions made by the physicians in charge produced several information needs for ICU charge nurses. This meant that before the task at hand was completed, various kinds of information were sought by the charge nurses to support the decision-making process. Most of the immediate information needs of charge nurses were related to the organisation and management of work and human resources, whereas the information needs of the physicians in charge mainly concerned direct patient care. Thus, information needs differ between professionals even if the goal of decision-making is the same. The results of the international survey confirmed these findings for charge nurses. Both in Finland and in Greece the information needs of charge nurses focused on the organisation and management of work and human resources. Many of the most crucial information needs of Finnish and Greek ICU charge nurses were shared. In conclusion, it was found that ICU shift leaders make hundreds of ad hoc decisions during the course of a day related to the allocation of resources and the organisation of patient care. The ad hoc decision-making of ICU shift leaders is a complex multiprofessional process, which requires a lot of immediate information.
Real-time support for information related to patient admission, the organisation and management of work, and the allocation of staff resources is especially needed. The preliminary information integration model can be applied when real-time enterprise resource planning systems are developed for intensive care daily management.
Abstract:
Long-term independent budget travel to countries far away has become increasingly common over the last few decades, and backpacking has now entered the tourism mainstream. Nowadays, backpackers are a very important segment of the global travel market. Backpacking is a type of tourism that involves a lot of information search activities. The Internet has become a major source of information as well as a platform for tourism business transactions. It allows travelers to gain information very effortlessly and to learn about tourist destinations and products directly from other travelers in the form of electronic word-of-mouth (eWOM). Social media has penetrated and changed the backpacker market, as modern travelers can now stay connected to people at home, read online recommendations, and organize and book their trips very independently. In order to create a wider understanding of modern-day backpackers and their information search and share behavior in the Web 2.0 era, this thesis examined contemporary backpackers and their use of social media as an information and communication platform. To achieve this goal, three sub-objectives were identified: 1. to describe contemporary backpacker tourism; 2. to examine contemporary backpackers' travel information search and share behavior; and 3. to explore the impacts of new information and communications technologies and Web 2.0 on backpacker tourism. The empirical data was gathered with an online survey; thus the method of analysis was mainly quantitative, and a qualitative method was used for a brief analysis of the open questions. The research included both descriptive and analytical approaches, as the goal was to describe modern-day backpackers and to examine possible interdependencies between information search and share behavior and background variables. The interdependencies were tested for statistical significance with the help of five research hypotheses.
The results suggested that backpackers no longer fit the original backpacker definitions formulated some decades ago. Now, they are mainly short-term travelers whose trips more closely resemble those of mainstream tourists. They use communication technologies very actively, particularly social media. Traditional information sources, mainly guidebooks and recommendations from friends, are of great importance to them, but eWOM sources are also widely used in travel decision making. The use of each source varies according to the stage of the trip. All in all, Web 2.0 and new ICTs have transformed the backpacker tourism industry in many ways. Although the experience has become less authentic in some travelers' eyes, the backpacker culture is still recognizable.
Abstract:
In just-in-time, assemble-to-order production environments, the scheduling of material requirements and production tasks - even though difficult - is of paramount importance. Different enterprise resource planning solutions with master scheduling functionality have been created to ease this problem, and they work as expected unless there is a problem in the material flow. This case-based candidate's thesis introduces a tool for a Microsoft Dynamics AX multisite environment that can be used by site managers and production coordinators to get an overview of the current open sales order base and to prioritize production in the event of material shortages, in order to avoid part-deliveries.
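The prioritization idea can be sketched roughly as follows. This is an illustrative greedy allocation, not the actual Dynamics AX tool, and the field names are invented:

```python
def prioritize_orders(open_orders, on_hand):
    """Illustrative sketch: walk the open sales orders in due-date
    order and greedily reserve stock so that only orders deliverable
    in full are released, avoiding part-deliveries; the rest are
    flagged for the site manager to act on."""
    releasable, short = [], []
    for order in sorted(open_orders, key=lambda o: o["due"]):
        lines = order["lines"]  # {item_id: required quantity}
        if all(on_hand.get(item, 0) >= qty for item, qty in lines.items()):
            for item, qty in lines.items():
                on_hand[item] -= qty       # reserve the material
            releasable.append(order["id"])
        else:
            short.append(order["id"])      # would cause a part-delivery
    return releasable, short
```

A real tool would of course also consider incoming purchase orders, production capacity and customer priorities rather than due date alone.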
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically up to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field. Digital filters are typically described with boxes and arrows, also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle, any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
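The firing discipline described above, where a node may fire only when its input queues hold sufficient tokens and queues are the only communication, can be sketched minimally (an illustration, not RVC-CAL):

```python
from collections import deque

class Actor:
    """Minimal dataflow node: one FIFO queue per input, fires only
    when every input queue holds a token, and communicates solely
    through queues, so all dependencies are explicit."""
    def __init__(self, fn, n_inputs):
        self.fn = fn
        self.inputs = [deque() for _ in range(n_inputs)]
        self.output = deque()

    def can_fire(self):
        return all(self.inputs)  # every input queue is non-empty

    def fire(self):
        tokens = [q.popleft() for q in self.inputs]  # consume inputs
        self.output.append(self.fn(*tokens))         # produce output
```

Because each actor's firings depend only on its own queues, a scheduler may fire any set of fireable nodes concurrently, which is where the implicit parallelism comes from.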
The explicit parallelism of a dataflow program is descriptive and enables an improved utilization of the available processing units; however, the independent nodes also imply that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable models, with minimal scheduling overhead, to dynamic ones where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
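The shape of a quasi-static schedule, a small set of pre-calculated static sequences selected by a few run-time decisions, can be sketched as a toy illustration (not the model-checking-based scheduler of the thesis):

```python
def quasi_static_run(mode_of, schedules, stream):
    """Toy illustration of quasi-static scheduling: the orderings are
    pre-calculated as static action sequences, and the only run-time
    decision is which sequence to run, made by inspecting the next
    token. A real quasi-static scheduler derives both the sequences
    and the decision points by analysis of the dataflow program."""
    trace = []
    for token in stream:
        mode = mode_of(token)            # the single dynamic decision
        for action in schedules[mode]:   # then a fixed static schedule
            trace.append(action(token))
    return trace
```

The payoff is that inside each static sequence no firing rules need to be evaluated at all; the dynamic overhead is confined to the mode selection.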
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
Abstract:
The advancement of science and technology makes it clear that no single perspective is any longer sufficient to describe the true nature of any phenomenon. That is why interdisciplinary research is gaining more attention over time. An excellent example of this type of research is natural computing, which stands on the borderline between biology and computer science. The contribution of research done in natural computing is twofold: on the one hand, it sheds light on how nature works and how it processes information and, on the other hand, it provides guidelines on how to design bio-inspired technologies. The first direction in this thesis focuses on a nature-inspired process called gene assembly in ciliates. The second one studies reaction systems, a modeling framework whose rationale is built upon the biochemical interactions happening within a cell. The process of gene assembly in ciliates has attracted a lot of attention as a research topic over the past 15 years. Two main modelling frameworks were initially proposed at the end of the 1990s to capture the ciliates' gene assembly process, namely the intermolecular model and the intramolecular model. They were followed by other model proposals such as template-based assembly and DNA rearrangement pathways recombination models. In this thesis we are interested in a variation of the intramolecular model called the simple gene assembly model, which focuses on the simplest possible folds in the assembly process. We propose a new framework called directed overlap-inclusion (DOI) graphs to overcome the limitations that previously introduced models faced in capturing all the combinatorial details of the simple gene assembly process. We investigate a number of combinatorial properties of these graphs, including a necessary property in terms of forbidden induced subgraphs.
We also introduce DOI graph-based rewriting rules that capture all the operations of the simple gene assembly model and prove that they are equivalent to the string-based formalization of the model. Reaction systems (RS) are another nature-inspired modeling framework studied in this thesis. The rationale of reaction systems is based upon two main regulation mechanisms, facilitation and inhibition, which control the interactions between biochemical reactions. Reaction systems are a complementary modeling framework to traditional quantitative frameworks, focusing on explicit cause-effect relationships between reactions. The explicit formulation of the facilitation and inhibition mechanisms behind reactions, as well as the focus on interactions between reactions (rather than on the dynamics of concentrations), makes their applicability potentially wide and useful beyond biological case studies. In this thesis, we construct a reaction system model corresponding to the heat shock response mechanism, based on a novel concept of dominance graph that captures the competition for resources in the ODE model. We also introduce for RS various concepts inspired by biology, e.g., mass conservation, steady state and periodicity, to enable model checking of reaction-system-based models. We prove that the complexity of the decision problems related to these properties varies from P to NP-complete and coNP-complete to PSPACE-complete. We further focus on the mass conservation relation in an RS and introduce the conservation dependency graph to capture the relation between the species, and we propose an algorithm to list the conserved sets of a given reaction system.
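The facilitation and inhibition mechanisms mentioned above follow the standard formalization of reaction systems, in which each reaction is a triple (R, I, P) of reactants, inhibitors and products; a minimal sketch of one step:

```python
def rs_step(state, reactions):
    """One step of a reaction system in its standard formalization:
    a reaction (R, I, P) is enabled on state W iff all reactants in R
    are present in W and no inhibitor from I is; the successor state
    is the union of the products of every enabled reaction. Note the
    absence of permanency: entities that are not produced disappear."""
    result = set()
    for reactants, inhibitors, products in reactions:
        if reactants <= state and not (inhibitors & state):
            result |= products
    return result
```

Iterating this step from an initial state yields the qualitative dynamics on which properties such as steady states and periodicity are checked.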
Abstract:
The focus of the work reported in this thesis was to study and clarify the effect of polyelectrolyte multilayer surface treatment on inkjet ink spreading, absorption and print quality. Surface sizing with a size press, film pressing with a pilot-scale coater, and spray coating were used to surface-treat uncoated wood-free, experimental wood-free and pigment-coated substrates. The role of the deposited cationic (polydiallyldimethylammonium chloride, PDADMAC) and anionic (sodium carboxymethyl cellulose, NaCMC) polyelectrolyte layers, with and without nanosilica, in liquid absorption and spreading was studied in terms of their interaction with water-based pigmented and dye-based inkjet inks. Contact angle measurements were made in an attempt to explain the ink spreading and wetting behavior on the substrate. First, it was noticed that the multilayer surface treatment decreased the contact angle of water, giving a hydrophilic character to the surface. The results showed that the number of cationic-anionic polyelectrolyte layers and the order of deposition of the polyelectrolytes had a significant effect on the print quality. This was seen, for example, as a higher print density on layers with a cationic polyelectrolyte in the outermost layer. The number of layers had an influence on the print quality; the print density increased with an increasing number of layers, although the increase was strongly dependent on the ink formulation and chemistry. The use of nanosilica clearly affected the rate of absorption of polar liquids, which was also seen as a higher density of the black dye-based print. Slightly unexpectedly, the use of nanosilica increased the tendency for lateral spreading of both the pigmented and dye-based inks. It was shown that the wetting behavior and wicking of the inks on the polyelectrolyte coatings were strongly affected by the hydrophobicity of the substrate, as well as by the composition and structure of the polyelectrolyte layers.
Coating with only a cationic polyelectrolyte was not sufficient to improve dye fixation, but it was demonstrated that a cationic-anionic complex structure led to good water fastness. A three-layered structure gave the same water fastness values as a five-layered structure. Interestingly, the water fastness values were strongly dependent not only on the formed cation-anion polyelectrolyte complexes but also on the tendency of the coating to dissolve during immersion in water. The results showed that by optimizing the chemistry of the layers, the ink-substrate interaction can be optimized.
Abstract:
The general aim of the thesis was to study university students’ learning from the perspective of regulation of learning and text processing. The data were collected from the two academic disciplines of medical and teacher education, which share the features of highly scheduled study, a multidisciplinary character, a complex relationship between theory and practice and a professional nature. Contemporary information society poses new challenges for learning, as it is not possible to learn all the information needed in a profession during a study programme. Therefore, it is increasingly important to learn how to think and learn independently, how to recognise gaps in and update one’s knowledge and how to deal with the huge amount of constantly changing information. In other words, it is critical to regulate one’s learning and to process text effectively. The thesis comprises five sub-studies that employed cross-sectional, longitudinal and experimental designs and multiple methods, from surveys to eye tracking. Study I examined the connections between students’ study orientations and the ways they regulate their learning. In total, 410 second-, fourth- and sixth-year medical students from two Finnish medical schools participated in the study by completing a questionnaire measuring both general study orientations and regulation strategies. The students were generally deeply oriented towards their studies. However, they regulated their studying externally. Several interesting and theoretically reasonable connections between the variables were found. For instance, self-regulation was positively correlated with deep orientation and achievement orientation and was negatively correlated with non-commitment. However, external regulation was likewise positively correlated with deep orientation and achievement orientation but also with surface orientation and systematic orientation. 
It is argued that external regulation might function as an effective coping strategy in the cognitively loaded medical curriculum. Study II focused on medical students’ regulation of learning and their conceptions of the learning environment in an innovative medical course where traditional lectures were combined with problem-based learning (PBL) group work. First-year medical and dental students (N = 153) completed a questionnaire assessing their regulation strategies of learning and views about the PBL group work. The results indicated that external regulation and self-regulation of the learning content were the most typical regulation strategies among the participants. In line with previous studies, self-regulation was connected with study success. Strictly organised PBL sessions were not considered as useful as lectures, although the students’ views of the teacher/tutor and the group were mainly positive. Therefore, developers of teaching methods are challenged to think of new solutions that facilitate reflection on one’s learning and that improve the development of self-regulation. In Study III, a person-centred approach to studying regulation strategies was employed, in contrast to the traditional variable-centred approach used in Study I and Study II. The aim of Study III was to identify different regulation strategy profiles among medical students (N = 162) across time and to examine to what extent these profiles predict study success in preclinical studies. Four regulation strategy profiles were identified, and connections with study success were found. Students with the lowest self-regulation and with an increasing lack of regulation performed worse than the other groups. As the person-centred approach makes it possible to identify students with diverse regulation patterns, it could be used in supporting student learning and in facilitating the early diagnosis of learning difficulties.
In Study IV, 91 student teachers participated in a pre-test/post-test design in which they answered open-ended questions about a complex science concept both before and after reading either a traditional, expository science text or a refutational text that prompted the reader to revise his/her beliefs in line with the scientific view of the phenomenon. The student teachers also completed a questionnaire concerning their regulation and processing strategies. The results showed that the students’ understanding improved after the text-reading intervention and that the refutational text promoted understanding better than the traditional text. Additionally, regulation and processing strategies were found to be connected with understanding of the science phenomenon. A weak trend suggested that weaker learners would benefit more from the refutational text. It seems that learners with effective learning strategies are able to pick out the relevant content regardless of the text type, whereas weaker learners might benefit from refutational parts that contrast the most typical misconceptions with scientific views. The purpose of Study V was to use eye tracking to determine how third-year medical students (n = 39) and internal medicine residents (n = 13) read and solve patient case texts. The results revealed differences between medical students and residents in processing patient case texts; compared to the students, the residents were more accurate in their diagnoses and processed the texts significantly faster and with fewer fixations. Different reading patterns were also found. The observed differences between medical students and residents in processing patient case texts could be used in medical education to model expert reasoning and to teach how a good medical text should be constructed.
The main findings of the thesis indicate that even among very selected student populations, such as high-achieving medical students or student teachers, there seems to be considerable variation in regulation strategies of learning and text processing. As these learning strategies are related to successful studying, students enter educational programmes with rather different chances of managing and achieving success. Further, the ways of engaging in learning seldom centre on a single strategy or approach; rather, students seem to combine several strategies to a certain degree. Sometimes, which way of learning can be considered best is a matter of perspective; therefore, the reality of studying in higher education is often more complicated than the simplistic view of self-regulation as a good quality and external regulation as a harmful quality. The beginning of university studies may be stressful for many, as the gap between high school and university studies is wide and the strategies that were adequate during high school might not work as well in higher education. Therefore, it is important to map students’ learning strategies and to encourage them to engage in high-quality learning strategies from the beginning. Instead of separate courses on learning skills, the integration of these skills into course contents should be considered. Furthermore, learning complex scientific phenomena could be facilitated by paying attention to high-quality learning materials and texts and to other support from the learning environment in the university as well. Eye tracking seems to have great potential in evaluating performance and growing diagnostic expertise in text processing, although more research using texts as stimuli is needed.
Both medical and teacher education programmes, and the professions themselves, are challenging in terms of their multidisciplinary nature and the increasing amount of information involved, and they therefore require good lifelong learning skills both during the study period and later in working life.