140 results for development methods
Abstract:
The purpose of this study was to expand the applicability of supplier segmentation and development approaches to the project-driven construction industry. These practices are less exploited and not as well documented in this operational environment as in the process-centric manufacturing industry. First, portfolio models for supply base segmentation and various supplier development efforts were investigated in a literature review. A step-wise framework was then structured for the empirical research. The empirical study employed multiple research methods in three case studies in a large Finnish construction company. The first study categorized the construction item classes into the purchasing portfolio and positioned suppliers in the power matrix by investigating buyer-supplier relations. Using statistical tests, the study also identified factors that affect suppliers’ performance. The final case study identified improvement areas in the interface between a main contractor and one of its largest suppliers. The results indicate that only by assessing the supply base, and the power circumstances within it, in a holistic manner can buyers establish appropriate supplier development strategies in the project environment.
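The abstract does not name the specific portfolio model; the Kraljic purchasing portfolio, the standard choice in this literature, segments item classes by profit impact and supply risk. The following Python sketch illustrates that style of segmentation under this assumption; the item names, scores, and cutoff are hypothetical, not data from the study.

```python
# Illustrative Kraljic-style portfolio segmentation: item classes are placed
# into four quadrants by profit impact and supply risk (both scored 0-1).
# All scores and the cutoff are hypothetical examples.

def classify(profit_impact: float, supply_risk: float, cutoff: float = 0.5) -> str:
    if profit_impact >= cutoff and supply_risk >= cutoff:
        return "strategic"       # partner closely, develop the supplier
    if profit_impact >= cutoff:
        return "leverage"        # exploit buying power
    if supply_risk >= cutoff:
        return "bottleneck"      # secure supply, reduce dependence
    return "non-critical"        # standardise and automate purchasing

items = {"concrete elements": (0.8, 0.7), "fasteners": (0.2, 0.1),
         "HVAC units": (0.7, 0.3), "special glass": (0.3, 0.8)}
for name, (impact, risk) in items.items():
    print(f"{name}: {classify(impact, risk)}")
```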
Abstract:
Nowadays, computer-based systems tend to become more complex and control increasingly critical functions affecting different areas of human activity. Failures of such systems might result in loss of human lives as well as significant damage to the environment. Therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process of critical computer-based systems remains insufficiently defined for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for integrating formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates the elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
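For readers unfamiliar with Event-B, the proof-based verification the abstract refers to discharges standard proof obligations for each event and refinement step. The LaTeX block below shows two of them in their usual schematic, simplified form (contexts and event parameters omitted); this is a textbook sketch, not a reproduction of the thesis’s models.

```latex
% Invariant preservation (INV): each event with guard G and before-after
% predicate BA must re-establish the invariant I.
I(v) \land G(v) \land \mathit{BA}(v, v') \;\Rightarrow\; I(v')

% Guard strengthening (GRD): a concrete event with guard H may fire only when
% the abstract event with guard G could, under the gluing invariant J.
I(v) \land J(v, w) \land H(w) \;\Rightarrow\; G(v)
```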
Abstract:
Virtual environments and real-time simulators (VERS) are becoming ever more important tools in the research and development (R&D) process of non-road mobile machinery (NRMM). Virtual prototyping techniques enable faster and more cost-efficient development of machines compared to the use of real-life prototypes. High energy efficiency has become an important topic in the world of NRMM because of environmental and economic demands. The objective of this thesis is to develop VERS-based methods for the research and development of NRMM. A process using VERS for assessing the effects of human operators on the life-cycle efficiency of NRMM was developed. Human-in-the-loop simulations were run with an underground mining loader to study the developed process. The simulations were run in the virtual environment of the Laboratory of Intelligent Machines of Lappeenranta University of Technology. A physically adequate real-time simulation model of NRMM was shown to be reliable and cost-effective in the testing of hardware components by means of hardware-in-the-loop (HIL) simulations. A control interface connecting an integrated electro-hydraulic energy converter (IEHEC) with a virtual simulation model of a log crane was developed. The IEHEC consists of a hydraulic pump-motor and an integrated electrical permanent magnet synchronous motor-generator. The results show that state-of-the-art real-time NRMM simulators are capable of resolving factors related to the energy consumption and productivity of NRMM. A significant variation between the test drivers was found. The results show that VERS can be used for assessing human effects on the life-cycle efficiency of NRMM. Comparing the HIL simulation responses to those achieved with a conventional simulation method demonstrates the advantages and drawbacks of various possible interfaces between the simulator and the hardware part of the system under study. Novel ideas for arranging the interface were successfully tested and compared with the more traditional one. The proposed process for assessing the effects of operators on life-cycle efficiency will be applied to a wider group of operators in the future. The driving styles of the operators can then be analysed statistically from sufficiently large result data, and the statistical analysis can identify the most life-cycle-efficient driving style for a specific environment and machinery. The proposed control interface for HIL simulation needs to be studied further: the robustness and adaptation of the interface in different situations must be verified. Future work will also include studying the suitability of the IEHEC for different working machines using the proposed HIL simulation method.
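A hardware-in-the-loop setup of the kind described couples a real-time simulation model with a physical component through a signal interface once per fixed time step. The Python sketch below shows that loop structure in minimal form; the load dynamics, parameters, and the HardwareStub standing in for the IEHEC device are all illustrative placeholders, not the thesis’s simulator.

```python
# Minimal sketch of a fixed-step hardware-in-the-loop (HIL) loop: a simulated
# load exchanges signals with a hardware interface once per time step.

class HardwareStub:
    """Placeholder for the real-time I/O link to the physical device."""
    def exchange(self, commanded_force: float) -> float:
        # A real HIL rig would write the command to the device and read back
        # the measured actuator force; here we echo it with a simple loss.
        return 0.9 * commanded_force

def run_hil(steps: int = 1000, dt: float = 0.001) -> float:
    hw = HardwareStub()
    mass, damping = 500.0, 200.0   # placeholder load parameters
    x, v = 0.0, 0.0                # position and velocity of the load
    for k in range(steps):
        command = 1000.0 if k * dt < 0.5 else 0.0   # simple force profile
        force = hw.exchange(command)                 # hardware in the loop
        a = (force - damping * v) / mass             # load dynamics
        v += a * dt                                  # explicit Euler step
        x += v * dt
    return x

if __name__ == "__main__":
    print(f"final position: {run_hil():.3f} m")
```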
Abstract:
This thesis studies the possibility of using lean tools and methods in a quotation process carried out in an office environment. The aim of the study was to identify and test the relevant lean tools and methods that can help to balance and standardize the quotation process and reduce the variance in quotation lead times and quality. Seminal works, research papers, and guidebooks related to the topic formed the basis for the theory development. Based on the literature review and the case company’s own lean experience, the applicable lean tools and methods were selected to be tested by a sales support team. Production leveling, through product categorization and value stream mapping, was the key method used to balance the quotation process. The 5S method was introduced concurrently to standardize the work. Results of the testing period showed that lean tools and methods are applicable in an office process and that the selected tools and methods helped to balance and standardize the quotation process. The case company’s sales support team decided to implement the new lean-based quotation process model.
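Leveling a quotation process of this kind starts from a takt-time calculation: the available working time is allocated across the demand for each product category, giving the pace each quotation must follow. The Python sketch below shows the arithmetic; all demand figures and effort weights are hypothetical, not the case company’s data.

```python
# Illustrative takt-time calculation for leveling a quotation process.
# Capacity is split across categories in proportion to their work content.
weekly_minutes = 5 * 7.5 * 60                       # five 7.5-hour workdays
demand = {"standard": 40, "configured": 15, "engineered": 5}
effort = {"standard": 1.0, "configured": 2.0, "engineered": 6.0}  # relative work content

total_work = sum(demand[c] * effort[c] for c in demand)
for category, count in demand.items():
    minutes = weekly_minutes * demand[category] * effort[category] / total_work
    takt = minutes / count                          # leveled pace per quotation
    print(f"{category}: {takt:.0f} min per quotation")
```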
Abstract:
Coronary artery disease is an atherosclerotic disease, which leads to narrowing of the coronary arteries, deteriorated myocardial blood flow, and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time that result in “adverse remodelling”. This remodelling may lead to a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia, and infarction for translational studies. In the first study, the coronary artery disease model had both induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were caused by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism, and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, a hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. Coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. The large animal models were also used in the testing of novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake. In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling, and decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and, later, infarction followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals, and interventions in coronary artery disease and heart failure.
Abstract:
This doctoral dissertation explores the contribution of environmental management practices, the so-called clean development mechanism (CDM) projects, and foreign direct investment (FDI) to achieving sustainable development in developing countries, particularly in Sub-Saharan Africa. Because climate change caused by greenhouse gas emissions is one of the most serious global environmental challenges, the main focus is on the causal links between carbon dioxide (CO2) emissions, energy consumption, and economic development in Sub-Saharan Africa. In addition, the dissertation investigates the factors that have affected the distribution of CDM projects in developing countries and the relationships between FDI and other macroeconomic variables of interest. The main contribution of the dissertation is empirical. One of the publications uses cross-sectional data and Tobit and Poisson regressions. Three of the studies use time-series data and vector autoregressive and vector error correction models, while two publications use panel data and panel data estimation methods; one of the publications thus uses both time-series and panel data. The concept of Granger causality is utilized in four of the publications. The results indicate that there are significant differences in the Granger causality relationships between CO2 emissions, energy consumption, economic growth, and FDI in different countries. It also appears that the causality relationships change over time. Furthermore, the results support the environmental Kuznets curve hypothesis, but only for some of the countries. As to CDM activities, past emission levels, institutional quality, and the size of the host country appear to be among the significant determinants of the distribution of CDM projects. FDI and exports are also found to be significant determinants of economic growth.
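A pairwise Granger causality test of the kind the dissertation applies can be run with standard tools. The Python sketch below uses statsmodels’ grangercausalitytests on synthetic placeholder series; the dissertation’s actual analysis also involves VAR and vector error correction models, which are not reproduced here.

```python
# Minimal Granger causality sketch: does energy use help predict GDP growth?
# The two series are synthetic placeholders, not the dissertation's data.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 100
energy = np.cumsum(rng.normal(size=n))                 # proxy energy series
gdp = 0.8 * np.roll(energy, 2) + rng.normal(size=n)    # lagged dependence
data = pd.DataFrame({"gdp_growth": gdp, "energy_use": energy})

# Tests H0: "energy_use does NOT Granger-cause gdp_growth"
# (the second column is tested as a predictor of the first).
results = grangercausalitytests(data[["gdp_growth", "energy_use"]], maxlag=4)
```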
Abstract:
The importance of university-company collaboration has increased during the last decades. The drivers for that are, on the one hand, changes in the business logic of companies and, on the other hand, the decreased state funding of universities. Many companies emphasize joint research with universities as an enabling input to their development processes, which aim at creating new innovations, products, and wealth. These factors have changed universities’ operations, and they have adopted several practices of dynamic business organizations, such as strategic planning and methods for monitoring and controlling internal processes. The objective of this thesis is to combine different characteristics of successful university-company partnerships and their development. The development process starts with identifying potential partners in the university’s interest group, which requires understanding the role of different partners in the innovation system. Next, in order to find a common development basis, matching the policy and strategy between partners is needed. The third phase is to combine the academic and industrial objectives of a joint project, which is a typical form of university-company collaboration. The optimum is a win-win situation where both partners, universities and companies, gain added value. For companies, added value typically means access to new research results before their competitors. For universities, it offers the possibility to carry out high-level scientific work, the output of which, in the form of published scientific articles, is evaluated by the international science community. Because the university-company partnership is often executed through joint projects, the different forms of such projects are discussed in this study. The most challenging form of collaboration is a semi-open project model, which is not based on bilateral activities between universities and companies but on a consortium of several universities, research institutes, and companies. The universities and companies are core actors in the innovation system, so the discussion of their roles and relations to public operators, such as publicly funded financiers, is important. In the Finnish innovation system, strategies and policies are executed by at least the following actors: the EU, the Academy of Finland, and TEKES. In addition to these, the Strategic Centres for Science, Technology and Innovation, which are owned jointly by companies, universities, and research organizations, have a very important role in their fields of business: they transfer research results into commercial activities to generate wealth. The thesis comprises two parts. The first part consists of an overview of the study, including the introduction, literature review, research design, synthesis of findings, and conclusions. The second part introduces four original research publications.
Abstract:
The liberalisation of the wholesale electricity markets has been considered an efficient way to organise the markets. In Europe, the target is to liberalise and integrate the common European electricity markets. However, insufficient transmission capacity between the market areas hampers integration, and therefore new investments are required. At the same time, massive transmission capacity investments are usually not easy to carry through. This doctoral dissertation aims to elaborate on the critical determinants required to deliver the necessary transmission capacity investments. The Nordic electricity market is used as an illustrative example. This study suggests that changes in the governance structure have affected the delivery of Nordic cross-border investments. In addition, the impacts of not fully delivered investments are studied in this doctoral dissertation. An insufficient transmission network can degrade market uniformity and may also create a need to split the market into smaller submarkets. This may have financial impacts on market actors when the targeted efficient sharing of resources is not achieved, and it may even encourage gaming. The research methods applied in this doctoral dissertation are mainly empirical, ranging from a Delphi study to case studies and numerical calculations.
Abstract:
Product assurance is an essential part of the product development process if developers want to ensure that the final product is safe and reliable. Product assurance can be supported with risk management and with different failure analysis methods. Product assurance is emphasized in the system development process of mission-critical systems, and the product assurance process in systems of this kind requires extra attention. In this thesis, the mission-critical systems considered are space systems, and their product assurance process is presented with the help of space standards. The product assurance process can be supported with agile development, because agile emphasizes transparency of the process and fast response to changes. Even though the development process of space systems is highly standardized and resembles the waterfall model, it is still possible to adopt agile development in space system development. This thesis aims to support the product assurance process of space systems with agile development so that the final product is as safe and reliable as possible. The main purpose of this thesis is to examine how well product assurance is performed in Finnish space organizations and how product assurance tasks and activities can be supported with agile development. The research part of this thesis was carried out as a survey.
Abstract:
In this doctoral thesis, a tomographic STED microscopy technique for 3D super-resolution imaging was developed and used to observe bone remodeling processes. To improve upon existing methods, we used a tomographic approach built on a commercially available stimulated emission depletion (STED) microscope. A region of interest (ROI) was observed at two oblique angles: one in the standard inverted configuration from below (bottom view) and another from the side (side view) via a micro-mirror positioned close to the ROI. The two views were reconstructed into a final tomogram. The technique, named tomographic STED microscopy, achieved an axial resolution of approximately 70 nm on microtubule structures in a fixed biological specimen. High-resolution imaging of osteoclasts (OCs) actively resorbing bone was achieved by creating an optically transparent coating on a microscope cover glass that imitates a fractured bone surface. 2D super-resolution STED microscopy on the bone layer showed a lateral resolution of approximately 60 nm on a resorption-associated organelle, allowing these structures to be imaged with super-resolution microscopy for the first time. The developed tomographic STED microscopy technique was further applied to study the resorption mechanisms of OCs cultured on the bone coating. The technique revealed specific actin cytoskeleton structures, comet tails, some facing upwards and others downwards. In our view, this indicates that the actin cytoskeleton is involved in vesicular exocytosis and endocytosis during bone resorption. The application of tomographic STED microscopy in bone biology demonstrated that 3D super-resolution techniques can provide new insights into biological 3D nano-structures beyond the diffraction limit when the optical constraints of super-resolution imaging are carefully taken into account.
Abstract:
The recent rapid development of biotechnological approaches has enabled the production of large whole-genome-level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics, and engineering. The need is also increasing for tools that can be used by biological researchers themselves, who may not have a strong statistical or computational background; this requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows, and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods, and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis, and the second examines cell lineage specification in mouse embryonic stem cells.
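As a concrete example of the kind of gene expression analysis step such workflows automate, the Python sketch below runs a per-gene two-group test with multiple-testing correction on a synthetic expression matrix. Real pipelines wrap this core in normalization, quality control, and reporting; this is an illustrative sketch, not the thesis’s own tool.

```python
# Minimal differential expression sketch: per-gene two-group t-tests with
# Benjamini-Hochberg correction. The expression matrix is synthetic.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(1)
n_genes, n_per_group = 1000, 6
control = rng.normal(8.0, 1.0, size=(n_genes, n_per_group))
treated = rng.normal(8.0, 1.0, size=(n_genes, n_per_group))
treated[:50] += 2.0  # spike in 50 differentially expressed genes

t_stat, p_values = ttest_ind(treated, control, axis=1)   # one test per gene
rejected, q_values, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
print(f"{rejected.sum()} genes significant at FDR 0.05")
```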
Abstract:
Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of the gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods’ applicability is lacking. There are some problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: What is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?

o Accuracy, i.e. the probability of detecting deception successfully
o Ease of use, i.e. how easy it is to apply the method correctly
o Time required to apply the method reliably
o No need for special equipment
o Unobtrusiveness of the method

In order to answer the main research question, the following supporting research questions were answered first: What kinds of interviewing and interrogation techniques exist, and how could they be used in the intelligence interview context? What kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof? What kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search and collect existing topic-related studies and other papers. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to get first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence, and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information, Content, Discourse, and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies are built around a scenario where roughly half of the assessed people are totally truthful and the other half are liars presenting a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for the modern ones still under development.
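The Analytic Hierarchy Process named in the abstract ranks alternatives by deriving priority weights from pairwise comparison matrices. The Python sketch below shows the standard eigenvector computation with a consistency check; the comparison values are illustrative, not the study’s actual judgements.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix.
import numpy as np

# Pairwise comparisons of three criteria (e.g. accuracy, ease of use, time)
# on Saaty's 1-9 scale; A[i, j] = importance of criterion i over criterion j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)          # principal eigenvalue
weights = eigvecs[:, k].real
weights /= weights.sum()             # normalised priority vector

# Consistency ratio (RI = 0.58 is Saaty's random index for n = 3);
# CR below 0.1 is conventionally taken as acceptably consistent.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print("weights:", weights.round(3), "CR:", round(cr, 3))
```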
Abstract:
This thesis focuses on consolidating the recommendations on the integration of consumers into new product development (NPD) given in the academic literature, and on the example of three NPD projects in the case company. The empirical findings suggest that the case company fulfills the principles of consumer-led NPD and is only one step away from a full consumer empowerment strategy. Therefore, its NPD can be seen as an example of consumer-led NPD implementation. The findings also suggest that a product can be developed in a consumer-led way regardless of the source of the idea (product- or need-driven), the target audience, and the resources assigned, provided that the consumer mindset is integrated at all levels of the organisation: strategic, cultural, operational, and process. This is possible with top-management commitment, an internal consumer research group, and sophisticated consumer research methods. Specific managerial recommendations are given on developing a consumer-led culture, strategy, and NPD process, and on choosing appropriate consumer research methods and techniques.
Abstract:
The aim of this study was to explore adherence to treatment among people with psychotic disorders through the development of a user-centered mobile technology (mHealth) intervention. More specifically, this study investigates treatment adherence as well as the mHealth intervention and the factors related to its possible usability. The data were collected from 2010 to 2013. First, patients’ and professionals’ perceptions of adherence management and the factors restricting adherence were described (n = 61). Second, the objectives and methods of the intervention were defined based on focus group interviews and previously used methods. Third, the views of patients and professionals about the barriers to and requirements of the intervention were described (n = 61). Fourth, the mHealth intervention was evaluated based on a literature review (n = 2) and patients’ preferences regarding the intervention (n = 562). Adherence management required support in everyday activities, social networks, and maintaining a positive outlook. The factors restricting adherence were related to illness, behavior, and the environment. The objective of the intervention was to support the intention to follow the treatment guidelines and recommendations with mHealth technology. The barriers to and requirements for the use of mHealth were related to technology, organizational issues, and the users themselves. During the course of the intervention, 33 (6%) of the 562 participants wanted to edit the content, timing, or amount of the mHealth tool, and 23 (4%) quit the intervention or study before its conclusion. According to the review, mHealth interventions were ineffective in promoting adherence. Prior to the intervention, participants perceived that adherence could be supported, and the use of mHealth as a part of treatment was seen as an acceptable and efficient method for doing so. In conclusion, the use of mHealth may be feasible among people with psychotic disorders. However, clear evidence of its effectiveness with regard to adherence remains inconclusive.