23 results for Statistical Convergence
Abstract:
The objective of this thesis is to enhance understanding of the evolution of convergence. Previous research has shown that technological interfaces between distinct industries are one of the major sources of radical cross-industry innovations. Although convergence in industry evolution has attracted substantial managerial interest, conceptual confusion persists within the field. Firstly, this study clarifies the convergence phenomenon and its impact on industry evolution. Secondly, it creates novel patent analysis methods to analyze technological convergence and provides tools for anticipating the early stages of convergence. Overall, the study combines the industry evolution perspective with the convergence view of industrial evolution. The theoretical background consists of industry life cycle theories, technology evolution, and technological trajectories. The study links several important concepts in analyzing industry evolution: technological discontinuities, path dependency, technological interfaces as a source of industry transformation, and the evolutionary stages of convergence. Based on the literature review, a generic understanding of industry transformation and industrial dynamics was generated. In the convergence studies, the theoretical basis lies in the discussion of different convergence types and their impacts on industry evolution, and in anticipating and monitoring the stages of convergence. The study is divided into two parts: the first gives a general overview, and the second comprises eight research publications. The case study uses two historically very distinct industries, paper and electronics, as a test environment to evaluate the importance of emerging business sectors and technological convergence as a source of industry transformation. Both qualitative and quantitative research methodologies are utilized. The results reveal that technological convergence and complementary innovations from different fields have a significant effect on the formation of new business sectors. Patent-based indicators of technological convergence can be utilized in analyzing technology competition, capability and competence development, knowledge accumulation, knowledge spillovers, and technology-based industry transformation, and they can provide insights into the future competitive environment. The results and conclusions of the empirical part do not appear to conflict with observations from industry.
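The abstract does not spell out the patent indicators themselves, but a common convergence proxy of this kind counts how often patent classes from previously separate fields are assigned to the same patent. A minimal, hypothetical sketch (the IPC symbols and data below are illustrative only, not the thesis's actual indicator):

```python
from collections import Counter
from itertools import combinations

def co_classification_counts(patents):
    """Count how often pairs of IPC classes appear on the same patent.

    `patents` is a list of sets of IPC class symbols, one set per patent.
    A rising count for a pair of classes from historically distinct
    fields is one simple signal of technological convergence.
    """
    pair_counts = Counter()
    for classes in patents:
        for a, b in combinations(sorted(classes), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

# Toy example: paper-industry (D21) and electronics (H01) classes
# starting to co-occur on the same patents.
patents = [
    {"D21H", "H01L"},          # hypothetical converging patent
    {"D21H", "D21C"},
    {"H01L", "H05K"},
    {"D21H", "H01L", "G06F"},
]
for pair, n in co_classification_counts(patents).most_common(3):
    print(pair, n)
```

Tracked over time, such pair counts (or growth rates derived from them) could serve as an early-warning signal of the kind the thesis anticipates, though the actual indicators used there may differ.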
Abstract:
The condensation rate has to be high in the pressure suppression pool systems of Boiling Water Reactors (BWRs) in order for them to fulfil their safety function. The phenomena associated with such a high direct contact condensation (DCC) rate are very challenging to analyse, whether with experiments or numerical simulations. In this thesis, suppression pool experiments carried out in the POOLEX facility of Lappeenranta University of Technology were simulated. Two different condensation modes were modelled using the two-phase CFD codes NEPTUNE CFD and TransAT. The DCC models applied were those typically used for separated flows in channels, and their applicability to rapidly condensing flow in the condensation pool context had not been tested earlier. A low Reynolds number case was simulated first. The POOLEX experiment STB-31 was operated near the boundary between the 'quasi-steady oscillatory interface condensation' mode and the 'condensation within the blowdown pipe' mode. The condensation models of Lakehal et al. and Coste & Laviéville predicted the condensation rate quite accurately, while the other tested models overestimated it. It was possible to get the direct phase change solution to settle near the measured values, but a very high grid resolution was needed. Secondly, a high Reynolds number case corresponding to the 'chugging' mode was simulated. The POOLEX experiment STB-28 was chosen because various standard and high-speed video samples of bubbles were recorded during it. In order to extract numerical information from the video material, a pattern recognition procedure was programmed, and the bubble size distributions and the frequencies of chugging were calculated with it. With the statistical data on bubble sizes and the temporal data on bubble/jet appearance, it was possible to compare the condensation rates between the experiment and the CFD simulations. In the chugging simulations, a spherically curvilinear calculation grid at the blowdown pipe exit improved convergence and decreased the required cell count. A compressible flow solver with complete steam tables was beneficial for the numerical success of the simulations. The Hughes-Duffey model and, to some extent, the Coste & Laviéville model produced realistic chugging behaviour. The initial level of the steam/water interface was an important factor in determining the initiation of chugging. If the interface was initialized with a water level high enough inside the blowdown pipe, the vigorous penetration of a water plug into the pool created a turbulent wake which invoked self-sustaining chugging. A 3D simulation with a suitable DCC model produced qualitatively very realistic shapes of the chugging bubbles and jets. A comparative FFT analysis of the bubble size data and the pool bottom pressure data gave useful information for distinguishing the eigenmodes of chugging, bubbling, and pool structure oscillations.
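As an illustration of the comparative FFT analysis mentioned above, the sketch below extracts the dominant frequency components of a uniformly sampled signal such as the pool bottom pressure. It is a minimal example with synthetic data; the sampling rate and the signal are assumptions, not POOLEX values:

```python
import numpy as np

def dominant_frequencies(signal, fs, n_peaks=3):
    """Return the strongest frequency components of a uniformly
    sampled signal, e.g. pool bottom pressure during chugging.
    Note: neighbouring bins of one broad peak may all rank highly;
    a peak-finding step would refine this for real data."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()          # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    order = np.argsort(spectrum)[::-1]       # strongest bins first
    return [(freqs[i], spectrum[i]) for i in order[:n_peaks]]

# Synthetic example: a 1.5 Hz "chugging" component buried in noise.
fs = 100.0                                   # samples per second (assumed)
t = np.arange(0, 60, 1.0 / fs)
pressure = np.sin(2 * np.pi * 1.5 * t) + 0.3 * np.random.randn(t.size)
print(dominant_frequencies(pressure, fs))
```

Running the same routine on the bubble size series and on the pressure series, as the thesis does, lets one match or separate spectral peaks belonging to chugging, bubbling, and structural oscillations.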
Abstract:
Longitudinal surveys are increasingly used to collect event history data on person-specific processes such as transitions between labour market states. Survey-based event history data pose a number of challenges for statistical analysis, including survey errors due to sampling, non-response, attrition and measurement. This study deals with non-response, attrition and measurement errors in event history data and the bias they cause in event history analysis. The study also discusses some of the choices faced by a researcher using longitudinal survey data for event history analysis and demonstrates their effects. These choices include whether a design-based or a model-based approach is taken, which subset of the data to use and, if a design-based approach is taken, which weights to use. The study takes advantage of the possibility of using combined longitudinal survey-register data: the Finnish subset of the European Community Household Panel (FI ECHP) survey for waves 1–5 was linked at the person level with longitudinal register data. Unemployment spells were used as the study variables of interest. Lastly, a simulation study was conducted in order to assess the statistical properties of the Inverse Probability of Censoring Weighting (IPCW) method in a survey data context. The study shows how combined longitudinal survey-register data can be used to analyse and compare the non-response and attrition processes, test the type of missingness mechanism and estimate the size of the bias due to non-response and attrition. In the empirical analysis, initial non-response turned out to be a more important source of bias than attrition. Reported unemployment spells were subject to seam effects, omissions and, to a lesser extent, overreporting. The use of proxy interviews tended to cause spell omissions. An often-ignored phenomenon, classification error in reported spell outcomes, was also found in the data. Neither the Missing At Random (MAR) assumption about the non-response and attrition mechanisms, nor the classical assumptions about measurement errors, turned out to be valid. Measurement errors in both spell durations and spell outcomes were found to bias estimates from event history models; low measurement accuracy affected the estimates of the baseline hazard most. The design-based estimates based on data from respondents to all waves of interest, weighted by the last-wave weights, displayed the largest bias. Using all the available data, including the spells of attriters up to the time of attrition, helped to reduce attrition bias. Lastly, the simulation study showed that the IPCW correction to the design weights reduces bias due to dependent censoring in design-based Kaplan-Meier and Cox proportional hazards model estimators. The study discusses the implications of the results for survey organisations collecting event history data, researchers using surveys for event history analysis, and researchers developing methods to correct for non-sampling biases in event history data.
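The IPCW idea is to reweight each subject by the inverse of its estimated probability of remaining uncensored, so that dependent censoring (here, attrition) no longer distorts the risk sets. A minimal sketch of a weighted Kaplan-Meier estimator, assuming the IPCW weights have already been estimated from a censoring model (all data and weights below are illustrative):

```python
import numpy as np

def weighted_kaplan_meier(time, event, weight):
    """Weighted Kaplan-Meier survival estimate.

    time   : spell durations
    event  : 1 if the spell ended in the event, 0 if censored
    weight : IPCW weights, e.g. design weight / P(still uncensored)
    Returns (event_times, survival_probabilities).
    """
    time, event, weight = map(np.asarray, (time, event, weight))
    order = np.argsort(time)
    time, event, weight = time[order], event[order], weight[order]

    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(time[event == 1]):
        at_risk = weight[time >= t].sum()              # weighted risk set
        deaths = weight[(time == t) & (event == 1)].sum()
        surv *= 1.0 - deaths / at_risk
        out_t.append(t)
        out_s.append(surv)
    return np.array(out_t), np.array(out_s)

# Toy unemployment spells in months; 0 = lost to attrition.
t = [3, 5, 5, 8, 12, 12, 15]
e = [1, 1, 0, 1, 1, 0, 1]
w = [1.0, 1.2, 1.2, 0.9, 1.1, 1.3, 1.0]                # illustrative IPCW weights
print(weighted_kaplan_meier(t, e, w))
```

With all weights equal to one, this reduces to the ordinary Kaplan-Meier estimator; the correction enters purely through the weights, which is why it transfers directly to weighted Cox estimation as well.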
Abstract:
Financial analysts play an important role in financial markets, in particular by conveying information through earnings forecasts. Analysts typically disagree to some extent in their earnings forecasts, and it is precisely this disagreement among analysts that this thesis studies. When a company reports losses, disagreement about the company's future tends to increase. Intuitively, it is easy to interpret this as increased uncertainty. This is also what one finds when studying analyst reports: analysts appear to become more uncertain when companies begin to make losses, and it is precisely then that the disagreement among analysts also increases. The mathematical-theoretical models describing analysts' decision processes, however, have the opposite implication: increased disagreement among analysts can only arise if the analysts become more certain at the individual level, with asymmetric information as the driving force. This thesis resolves the contradiction between increased certainty and increased uncertainty as the driver of dispersion in analyst forecasts. Once the amount of public information that becomes available through earnings reports is taken into account, the models of analysts' decision processes cannot generate the levels of forecast dispersion observed in the data. The conclusion is therefore that the underlying theoretical models of forecast dispersion are partly deficient, and that dispersion in forecasts more likely follows from increased uncertainty among analysts, in line with what analysts in fact state in their reports. The results are important because an understanding of the uncertainty surrounding, for example, earnings announcements contributes to a general understanding of the financial reporting environment, which in turn is of great importance for price formation in financial markets. Furthermore, increased forecast dispersion is typically used in accounting research as an indication of increased information asymmetry, a practice that this thesis thereby calls into question.
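Forecast dispersion in this literature is commonly measured as the cross-analyst standard deviation of earnings-per-share forecasts, scaled by price. A minimal sketch of that convention (the scaling choice and the numbers are illustrative, not the thesis's exact specification):

```python
import numpy as np

def forecast_dispersion(eps_forecasts, share_price):
    """Cross-analyst dispersion: sample standard deviation of EPS
    forecasts scaled by share price (one common convention;
    scaling by |mean forecast| is another)."""
    forecasts = np.asarray(eps_forecasts, dtype=float)
    return forecasts.std(ddof=1) / share_price

# Illustrative: dispersion widens when forecasts straddle a loss.
print(forecast_dispersion([0.40, 0.45, 0.42], share_price=20.0))
print(forecast_dispersion([-0.30, 0.10, -0.55], share_price=20.0))
```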
Abstract:
This is a sociological study of the views of officers in the Swedish Army and its Amphibious Forces on tactics in Irregular Warfare (IW), in particular Counterinsurgency (COIN). IW comprises struggles in which the militarily weaker party uses an indirect approach with smaller units and integrates the civilian and military dimensions across a spectrum of violence including subversion, terrorism, guerrilla warfare and infantry actions. IW is the main armed warfare style in insurgencies; COIN is the combined political, military, economic, social and legal action taken to counter insurgencies. Data were collected by means of interviews with almost all (n = 43) officers who were commanding battalions or rifle and manoeuvre companies while undergoing training for general warfare and international operations. The main theoretical and methodological inspiration is the tradition of research on social fields inaugurated by the French sociologist Pierre Bourdieu; the statistical technique used is Multiple Correspondence Analysis. As background and context, an inquiry inspired by the Begriffsgeschichte (Conceptual History) tradition explores the genesis and development of understandings of the term Irregular Warfare. The research question is: how can contemporary Swedish military thought on tactics in Irregular Warfare be characterized using descriptive patterns, mapped in relation to background factors and normative standards? The most significant finding is that two main opposing notions separate the officers' views on tactics in Irregular Warfare: (1) a focus on larger, combat-oriented and collectively operating military units versus smaller and larger, more intelligence-oriented and dispersed operating units, and (2) a focus on military tasks and kinetic effects versus military and civilian tasks as well as "soft" effects. The distribution of these views can be presented as a two-dimensional space structured by the two axes. This space represents four categories of tactics, partly diverging from normative military standards for Counterinsurgency. This social space of standpoints shows different structural tendencies for background factors of a social and cultural character, particularly dominant with respect to military backgrounds, international mission experience and civilian education. Compared to military standards for Counterinsurgency, the two tactical types characterized by a Regular Warfare mind-set stand out as counter-normative. Signs of creative thought on military practice and theory, as well as a still persistent Regular Warfare doxa, are apparent. Power struggles might thus develop, affecting the transformation towards a broadened warfare culture with an enhanced focus also on Irregular Warfare. The results do not support research arguing for a convergence of military thought in the European transformation of armed forces. The main argument goes beyond tactics and suggests sociological analysis of reciprocal effects regarding strategy, operational art, tactics and leadership, concerning the mind-set and preferences for Regular, Irregular and Hybrid Warfare.
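Multiple Correspondence Analysis places respondents in a low-dimensional space according to their categorical answer profiles, which is how a two-axis space of standpoints like the one above can be constructed. A minimal sketch via SVD of the indicator matrix, with purely illustrative variables and categories (not the study's actual questionnaire):

```python
import numpy as np
import pandas as pd

def mca_row_coordinates(df, n_components=2):
    """Multiple Correspondence Analysis via SVD of the standardized
    residuals of the indicator (complete disjunctive) matrix.
    Returns principal row coordinates, one point per respondent."""
    Z = pd.get_dummies(df.astype(str)).to_numpy(dtype=float)
    P = Z / Z.sum()
    r = P.sum(axis=1)                                  # row masses
    c = P.sum(axis=0)                                  # column masses
    S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c)) # std. residuals
    U, sing, _ = np.linalg.svd(S, full_matrices=False)
    return (U[:, :n_components] * sing[:n_components]) / np.sqrt(r)[:, None]

# Toy data in the spirit of the study: standpoints coded as
# categorical answers (variable names are illustrative only).
df = pd.DataFrame({
    "unit_focus": ["large", "small", "small", "large", "small"],
    "task_focus": ["kinetic", "soft", "soft", "kinetic", "kinetic"],
    "background": ["armour", "ranger", "ranger", "armour", "marine"],
})
print(mca_row_coordinates(df))
```

The first two coordinate columns would correspond to the two structuring axes; plotting respondents and category points in this plane is what yields the four-quadrant typology of tactics.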
Abstract:
In this research, the effectiveness of Naive Bayes and Gaussian Mixture Model classifiers in segmenting exudates in retinal images is studied, and the results are evaluated with metrics commonly used in medical imaging. In addition, a color variation analysis of retinal images is carried out to determine how effectively retinal images can be segmented using only the color information of the pixels.
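Both classifiers can be applied directly to per-pixel color vectors. A minimal sketch with synthetic RGB data, assuming labelled exudate and background pixels are available (the color distributions below are illustrative, not drawn from real fundus images):

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.mixture import GaussianMixture

# Synthetic training data: RGB values of labelled pixels
# (1 = exudate, 0 = background). In the real setting these
# would come from annotated retinal images.
rng = np.random.default_rng(0)
exudate = rng.normal([220, 200, 80], 15, size=(200, 3))    # yellowish lesions
background = rng.normal([150, 60, 50], 20, size=(800, 3))  # reddish fundus
X = np.vstack([exudate, background])
y = np.hstack([np.ones(200), np.zeros(800)])

# Supervised approach: Naive Bayes on pixel color.
nb = GaussianNB().fit(X, y)

# Generative approach: one GMM per class, classify by the higher
# class-conditional log-likelihood (equal class priors assumed).
gmm_ex = GaussianMixture(n_components=2, random_state=0).fit(exudate)
gmm_bg = GaussianMixture(n_components=2, random_state=0).fit(background)

test_pixels = np.array([[210, 195, 90], [140, 65, 55]], dtype=float)
print("NB :", nb.predict(test_pixels))
print("GMM:", (gmm_ex.score_samples(test_pixels) >
               gmm_bg.score_samples(test_pixels)).astype(int))
```

Applied to every pixel of an image, either decision rule yields a binary exudate mask that can then be scored with sensitivity, specificity, and similar medical imaging metrics.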
Abstract:
The customer's strongest wish concerning chemical pulp is consistent, uniform quality. Variation can be controlled and reduced using statistical methods, yet studies addressing the application and benefits of statistical methods in the forest product sector are scarce. This customer wish is thus the root motivation behind this dissertation. The research problem it addresses is that companies in the chemical forest product sector require new knowledge to improve their utilization of statistical methods. To gain this new knowledge, the research problem is studied from five complementary viewpoints: challenges and success factors, organizational learning, problem solving, economic benefit, and statistical methods as management tools. The five research questions generated on the basis of these viewpoints are answered in four research papers, which are case studies based on empirical data collection. As a whole, this research complements the literature on the use of statistical methods in the forest products industry. Practical examples of the application of statistical process control, case-based reasoning, the cross-industry standard process for data mining, and performance measurement methods in the context of chemical forest product manufacturing are brought to the knowledge of the scientific community, and the benefit of applying these methods is estimated or demonstrated. The purpose of this dissertation is to find pragmatic ideas that help companies in the chemical forest product sector improve their utilization of statistical methods. The main practical implications can be summarized in four points (see the sketch after this list for an example of point 2):
1. It is beneficial to reduce variation in chemical forest product manufacturing processes.
2. Statistical tools can be used to reduce this variation.
3. Problem solving in chemical forest product manufacturing processes can be intensified through the use of statistical methods.
4. There are certain success factors and challenges that need to be addressed when implementing statistical methods.
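As one example of the statistical process control named above, the sketch below computes Shewhart individuals-chart control limits for a pulp quality variable; the readings and the brightness variable are illustrative assumptions, not data from the dissertation:

```python
import numpy as np

def individuals_control_limits(x):
    """Shewhart individuals (I-MR) chart limits: centre line +/- 3 sigma,
    with sigma estimated from the mean moving range (MR-bar / d2)."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))              # moving ranges of successive points
    sigma = mr.mean() / 1.128            # d2 constant for subgroup size 2
    centre = x.mean()
    return centre - 3 * sigma, centre, centre + 3 * sigma

# Illustrative pulp quality readings, e.g. brightness (% ISO).
readings = [88.1, 88.4, 87.9, 88.2, 88.0, 88.6, 87.8, 88.3]
lcl, cl, ucl = individuals_control_limits(readings)
print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}")
out = [x for x in readings if not (lcl <= x <= ucl)]
print("out-of-control points:", out)
```

Points falling outside the limits flag special-cause variation worth investigating, which is the mechanism by which such charts support the variation reduction and problem solving described in points 1 to 3.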