74 results for large-angle stereo-projection
Abstract:
This thesis contains dynamical analyses on four different scales: the Solar system, the Sun itself, the Solar neighbourhood, and the central region of the Milky Way galaxy. All of these topics are handled through methods of potential theory and statistics. The central topic of the thesis is the orbits of stars in the Milky Way. An introduction to the general structure of the Milky Way is presented, with an emphasis on the evolution of the observed value for the scale-length of the Milky Way disc and the observations of two separate bars in the Milky Way. The basics of potential theory are also presented, as well as a potential model developed for the Milky Way. An implementation of the backwards restricted integration method is shown, rounding off the basic principles used in the dynamical studies of this thesis. The thesis examines the orbit of the Sun and its impact on the Oort cloud comets (Paper IV), showing that there is a clear link between these two dynamical systems. The statistical atypicalness of the orbit of the Sun is questioned (Paper I); we conclude that the orbit of the Sun shows some statistical typicalness, although it is not very significant. This depends slightly on whether one includes a bar or not, as a bar has a clear effect on the dynamical features seen in the Solar neighbourhood (Paper III), and this method can be used to constrain the possible properties of a bar. Finally, we look at the effect of a bar on a statistical system in the Milky Way, finding not only interesting effects that depend on the mass and size of the bar, but also that bars can capture disc stars (Paper II).
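As a hedged illustration of the kind of backward orbit integration underlying these studies, the sketch below traces a Sun-like orbit into the past with a leapfrog integrator in a simple logarithmic disc potential; the potential and all parameter values are illustrative assumptions, not the thesis's actual Milky Way model.

```python
import numpy as np

def accel_log_potential(pos, v0=220.0, rc=1.0):
    """Acceleration from a toy logarithmic potential
    Phi = 0.5 * v0^2 * ln(rc^2 + x^2 + y^2)  (illustrative, not the thesis model)."""
    r2 = rc**2 + pos[0]**2 + pos[1]**2
    return -v0**2 * pos / r2

def integrate_orbit_backwards(pos, vel, dt=-1e-4, n_steps=100_000):
    """Kick-drift-kick leapfrog with a negative time step, tracing the orbit
    into the past. Units: kpc and km/s; one time unit is then ~0.98 Gyr."""
    path = np.empty((n_steps, 2))
    acc = accel_log_potential(pos)
    for i in range(n_steps):
        vel_half = vel + 0.5 * dt * acc
        pos = pos + dt * vel_half
        acc = accel_log_potential(pos)
        vel = vel_half + 0.5 * dt * acc
        path[i] = pos
    return path

# Sun-like initial conditions: R ~ 8 kpc, circular speed ~ 220 km/s.
path = integrate_orbit_backwards(np.array([8.0, 0.0]), np.array([0.0, 220.0]))
```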
Abstract:
Transitional flow past a three-dimensional circular cylinder is a widely studied phenomenon, since the problem is relevant to many technical applications. In the present work, flow past a circular cylinder is simulated numerically using a commercial CFD code (ANSYS Fluent 12.1) with large eddy simulation (LES) and RANS (κ-ε and Shear-Stress Transport (SST) κ-ω) approaches. The turbulent flow at ReD = 1000 and 3900 is simulated to investigate the force coefficients, Strouhal number, flow separation angle, pressure distribution on the cylinder, and the complex three-dimensional vortex shedding in the cylinder wake region. The numerical results extracted from these simulations agree well with the experimental data (Zdravkovich, 1997). Moreover, the influence of grid refinement and time-step size has been examined. Numerical calculation of turbulent cross-flow in a staggered tube bundle continues to attract interest due to its importance in engineering applications, as well as the fact that this complex flow represents a challenging problem for CFD. In the present work, time-dependent two-dimensional simulations using the κ-ε, κ-ω and SST models are performed for a subcritical flow through a staggered tube bundle. The predicted turbulence statistics (mean and r.m.s. velocities) agree well with the experimental data (Balabani, 1996). Turbulent quantities such as the turbulent kinetic energy and dissipation rate are predicted using the RANS models and compared with each other. The sensitivity to grid and time-step size has been analyzed. A sensitivity study of the model constants has been carried out using the κ-ε model; the predicted turbulence statistics and turbulent quantities are found to be very sensitive to the model constants.
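The Strouhal number reported above is typically extracted from the simulated lift-force history; the sketch below shows one common post-processing step, locating the dominant shedding frequency in the spectrum of a lift-coefficient series. The function and all numbers are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def strouhal_from_lift(cl, dt, diameter, u_inf):
    """Estimate St = f_shed * D / U_inf from a lift-coefficient time series
    by locating the dominant peak of its amplitude spectrum."""
    cl = np.asarray(cl) - np.mean(cl)            # remove the mean offset
    freqs = np.fft.rfftfreq(len(cl), d=dt)
    spectrum = np.abs(np.fft.rfft(cl))
    f_shed = freqs[np.argmax(spectrum[1:]) + 1]  # skip the zero-frequency bin
    return f_shed * diameter / u_inf

# Synthetic check: shedding at 42 Hz with D = 0.01 m and U = 2.1 m/s gives St = 0.2.
t = np.arange(0.0, 2.0, 1e-4)
cl = 0.3 * np.sin(2 * np.pi * 42.0 * t) + 0.01 * np.random.randn(t.size)
print(strouhal_from_lift(cl, 1e-4, diameter=0.01, u_inf=2.1))  # ~0.2
```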
Abstract:
Credit risk assessment is an integral part of banking. Credit risk means that the expected return will not materialise if the customer fails to fulfil its obligations; a key component of banking is therefore setting acceptance criteria for granting loans. The theoretical part of the study focuses on the key components of banks' credit assessment methods, as described in the literature, when extending credit to large corporations. The main component is the Basel II Accord, which sets regulatory requirements for the credit risk assessment methods of banks. The empirical part comprises, as its primary source, an analysis of major Nordic banks' annual reports and risk management reports. As a secondary source, complementary interviews were carried out with senior credit risk assessment personnel. The findings indicate that all major Nordic banks use a combination of quantitative and qualitative information in their credit risk assessment models when extending credit to large corporations. The relative weight of qualitative information depends on the selected approach to credit rating, i.e., point-in-time or through-the-cycle.
Abstract:
The main objective of this research is to create a performance measurement system for the accounting services of a large paper industry company. The thesis compares different performance measurement systems; two systems are then selected, presented, and compared in more detail. The Performance Prism is the framework used in this research. The Performance Prism uses success maps to determine objectives, and its target areas are divided into five groups: stakeholder satisfaction, stakeholder contribution, strategy, processes, and capabilities. The creation of the measurement system began by identifying stakeholders and defining their objectives. A success map was created based on the objectives, and measures were derived from the objectives and the success map. The data needed for each measure was then defined. The final measurement system contains just over 40 measures in total, each with a specific target level and an assigned owner. The number of measures is fairly large, but as this is the first version of the measurement system, the amount is acceptable.
Abstract:
IT Service Management plays a key role in many IT organizations today. The first IT Service Management principles were formulated in the early 1980s, but widespread adoption only emerged in the late 2000s. IT Financial Management is one of the IT Service Management processes. The main purpose of this thesis was to study how the IT Financial Management approach can be improved in a case company. Budgeting, accounting, and charging are the IT Financial Management functions researched in this thesis. The thesis material consists of both qualitative and quantitative sources: the theoretical part draws mostly on IT Service Management literature, while interviews and the case company's information systems are examined in the empirical part. The thesis also reviews different kinds of systems that support and automate IT Financial Management functions. The biggest challenge in the case company is cost allocation with the current ERP system. It is worthwhile to adopt a group-based allocation system until a holistic system becomes available on the market. The case company should also continue to develop its IT service processes.
Abstract:
Visual data mining (VDM) tools employ information visualization techniques in order to represent large amounts of high-dimensional data graphically and to involve the user in exploring data at different levels of detail. The users look for outliers, patterns and models – in the form of clusters, classes, trends, and relationships – in different categories of data, e.g., financial and business information. The focus of this thesis is the evaluation of multidimensional visualization techniques, especially from the business user's perspective. We address three research problems. The first problem is the evaluation of projection-based visualizations with respect to their effectiveness in preserving the original distances between data points and the clustering structure of the data. In this respect, we propose the use of existing clustering validity measures. We illustrate their usefulness in evaluating five visualization techniques: Principal Components Analysis (PCA), Sammon's Mapping, the Self-Organizing Map (SOM), Radial Coordinate Visualization and Star Coordinates. The second problem concerns evaluating different visualization techniques as to their effectiveness in visual data mining of business data. For this purpose, we propose an inquiry evaluation technique and conduct an evaluation of nine visualization techniques: Multiple Line Graphs, Permutation Matrix, Survey Plot, Scatter Plot Matrix, Parallel Coordinates, Treemap, PCA, Sammon's Mapping and the SOM. The third problem is the evaluation of the quality of use of VDM tools. We provide a conceptual framework for evaluating the quality of use of VDM tools and apply it to the evaluation of the SOM. In this evaluation, we use an inquiry technique for which we developed a questionnaire based on the proposed framework. The contributions of the thesis consist of three new evaluation techniques and the results obtained by applying them. The thesis provides a systematic approach to the evaluation of various visualization techniques. First, we performed and described the evaluations in a systematic way, highlighting the evaluation activities and their inputs and outputs. Second, we integrated the evaluation studies into the broad framework of usability evaluation. The results of the evaluations are intended to help developers and researchers of visualization systems select appropriate visualization techniques in specific situations. The results also contribute to the understanding of the strengths and limitations of the evaluated visualization techniques and, further, to the improvement of these techniques.
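To make the first evaluation idea concrete, here is a minimal sketch of applying a clustering validity measure before and after projection, using the silhouette score and PCA from scikit-learn on synthetic data as stand-ins; the thesis's specific measures and datasets are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Hypothetical 10-dimensional data with three well-separated clusters.
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(100, 10)) for c in (0.0, 3.0, 6.0)])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Compare the validity measure in the original space and in the 2-D view:
# if the projection preserves the clustering structure, the scores stay close.
X_2d = PCA(n_components=2).fit_transform(X)
print("silhouette, original space:", silhouette_score(X, labels))
print("silhouette, PCA projection:", silhouette_score(X_2d, labels))
```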
Abstract:
The objective of this research was to describe how Nordic companies manage hazard risks in their operations in Russia and how the local business environment is considered to affect these risks. The research methods used were a literature review and expert interviews; twelve Nordic industrial companies operating in different fields of industry were interviewed. Large Nordic companies typically manage risk centrally from the parent company on behalf of the whole company group, and the risk management standards and policies are integrated in all subsidiaries. Parent companies typically control hazard risk management in Russia through regular risk management reporting, by auditing the Russian sites, and by training local managers and employees in risk management work. Many companies experienced several losses in their first years of operating in Russia before risk management policies were implemented in the Russian subsidiaries. The companies have learned through experience to take local characteristics better into account, and most companies are quite satisfied with their current risk management standards in Russia. The interviews indicate that companies find that especially the poor quality of infrastructure, certain features of Russian organizational culture, and the high level of criminality increase hazard risks in Russia. However, understanding these features and risks in the business environment makes managing these risks possible. Risks related to infrastructure can be managed in advance by reducing dependency on infrastructure and by considering infrastructure quality already when planning business operations. A good local network is also often considered critical for overcoming complications related to infrastructure. Russian personnel typically have a different attitude towards risk management than Nordic personnel, and neglecting safety and maintenance and concealing losses are more common in Russia. These risks can be reduced by training and guiding local personnel in risk management and safety work and in the desired ways of acting. Criminality risks are often managed, to a certain extent, by investing in security, increasing supervision, and paying attention to the reliability of the employees and other interest groups of the company.
Abstract:
Planar, large-area, position-sensitive silicon detectors are widely utilized in high energy physics research and in medical computed tomography (CT). This thesis describes the author's research work on the development of such detector components. The key motivation and objective for the research work has been the development of novel, position-sensitive detectors that improve the performance of the instruments they are intended for. Silicon strip detectors are the key components of barrel-shaped tracking instruments, which are typically the innermost structures of high energy physics experimental stations. Particle colliders such as the former LEP collider or the present LHC produce particle collisions, and silicon strip detector based trackers locate the trajectories of particles emanating from such collisions. Medical CT has become a regular part of everyday medical care in all developed countries. CT scanning enables x-ray imaging of all parts of the human body with outstanding structural resolution and contrast. Brain, chest and abdomen slice images with a resolution of 0.5 mm are possible, and the latest CT machines are able to image the whole human heart between heartbeats. The two application areas are presented briefly, and the radiation detection properties of planar silicon detectors are discussed. Fabrication methods and preamplifier electronics of the planar detectors are presented. The designs of the developed large-area silicon detectors are presented, and measurement results for the key operating parameters are discussed. The static and dynamic performance of the developed silicon strip detectors is shown to be very satisfactory for experimental physics applications. Results for the novel CT detector chips developed are found to be very promising for further development, and all key performance goals are met.
Abstract:
Social media is a multidimensional marketing and communications channel which can support and enhance a business's reputation, sales and even longevity. Social media as a business tool encourages interaction between customers and companies, which gives a company opportunities to better understand its customers, to target them more effectively, and to collaborate and create dialogues with them in a way that is not possible through traditional media channels. The aim of a social media strategy is to increase brand awareness, image, loyalty and recognition. The peer networks that social media creates allow a company to disseminate information through loyal customers to new and prospective customers and so ultimately increase reach. The purpose of the study is to understand the marketer's perspective on social media marketing and how it is currently utilized in marketing and communications activities in Finland. Three companies were interviewed, covering fourteen different implementations of social media marketing campaigns. These were then analysed to ascertain the utilization methods and the experience gained on recent campaigns in the Finnish market. The utilization of social media marketing was analysed using the methods of thematic analysis and inductive and abductive reasoning. Elements and themes were drawn out of the separate interviews to create a framework with which to explore, evaluate and match theories that define social media usage by companies. It became clear from all of the interviews that social media as a tool is most effective when it captures the viewer's interest through rich and entertaining content. This directed the theoretical research towards Engagement Theory and Content Marketing, which emphasize the importance of communities, collaboration, interaction, and peer-sharing as the key drivers of a social media marketing campaign.
Abstract:
The recent digitization, the fragmentation of the media landscape and consumers' changing media behavior are all changes that have had drastic effects on creating marketing communications. In order to create effective marketing communications, large advertisers now co-operate with a variety of marketing communications companies. The purpose of the study is to understand how advertisers perceive these different companies and, more importantly, how advertisers expect their roles to change in the future as the media landscape continues to evolve. The changing roles of advertising agencies and media agencies are examined in particular, as they are at the moment the most relevant partners of the advertisers. However, the research is conducted from a network perspective rather than focusing on single actors of the marketing communications industry network. The research was conducted using a qualitative theme interview method. The empirical data was gathered by interviewing representatives from nine of the 50 largest Finnish advertisers measured by media spending. Thus, the research was conducted solely from large B2C advertisers' perspective, while the views of the other relevant actors of the network were left unexplored. The interviewees were chosen with a focus on a variety of points of view. The analytical framework used to analyze the gathered data was built on the IMP Group's industrial network model, which consists of actors, their resources and their activities. As technology-driven media landscape fragmentation and consumers' changing media behavior continue to increase the complexity of creating marketing communications, advertisers are going to need to rely on a growing number of partnerships, as they expect that the current actors of the network will not be able to widen their expertise to answer these new needs. The advertisers expect to form new partnerships with actors that are more specialized and able to react and produce activities more quickly than at the moment. Thus, new, smaller and more agile actors with looser structures are going to appear to fill these new needs, and co-operation between the actors is going to become more important. These changes pose the biggest threat to traditional advertising agencies, as they were seen as least able to cope with the ongoing change. Media agencies are in a more favorable position for remaining relevant to the advertisers, as they will be able to justify their activities and the value they provide by leveraging their data handling abilities. In general, the advertisers expect to work with a limited number of close actors, complemented by a network of smaller actors used on a more ad hoc basis.
Abstract:
Currently, a high penetration level of Distributed Generation (DG) is observed in Danish distribution systems, and even more DG is foreseen in the upcoming years. How to utilize these units to maintain the security of the power supply in emergency situations has been of great interest for study. This master's project develops a control architecture for studying distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the modelling and simulation of a representative Danish LV network located on the island of Bornholm. Regarding the control architecture, two types of reactive power control techniques are implemented and compared. In addition, a network voltage control based on a tap-changer transformer is tested. After applying a genetic algorithm to five typical Danish domestic loads, the optimized results show lower power losses and voltage deviation using Q(U) control, especially at large consumption levels. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
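As an illustration of the Q(U) control mentioned above, the sketch below implements a generic piecewise-linear Q(U) droop curve; the deadband and voltage limits are hypothetical placeholders, not the EcoGrid EU or Bornholm settings.

```python
def q_u_droop(v_pu, q_max, v_dead_low=0.98, v_dead_high=1.02,
              v_min=0.94, v_max=1.06):
    """Piecewise-linear Q(U) droop: inject reactive power when the local
    voltage is low, absorb it when the voltage is high, stay idle inside
    the deadband. Positive Q means injection (voltage support); all
    breakpoints are hypothetical example values."""
    if v_dead_low <= v_pu <= v_dead_high:
        return 0.0
    if v_pu > v_dead_high:  # overvoltage -> absorb reactive power
        frac = min((v_pu - v_dead_high) / (v_max - v_dead_high), 1.0)
        return -q_max * frac
    frac = min((v_dead_low - v_pu) / (v_dead_low - v_min), 1.0)
    return q_max * frac     # undervoltage -> inject reactive power

for v in (0.95, 1.00, 1.04):
    print(v, q_u_droop(v, q_max=2.0))  # q_max in kvar, hypothetical rating
```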
Abstract:
Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction – created by R.-J. Back – is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can be easily reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. The thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion in the thesis covers various aspects of software development that relate to Stepwise Feature Introduction. More specifically, we evaluate the paradigm against the common practices of object-oriented programming and design and of agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
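A minimal sketch of the layered structure that Stepwise Feature Introduction produces is shown below. In the paradigm proper, layers are superposition refinements accompanied by correctness reasoning; this plain inheritance example only illustrates the idea of adding functionality in small increments.

```python
class Counter:                  # layer 1: the base feature
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

class BoundedCounter(Counter):  # layer 2: refines layer 1 with an upper bound
    def __init__(self, limit):
        super().__init__()
        self.limit = limit

    def increment(self):
        if self.value < self.limit:
            super().increment()

class LoggedBoundedCounter(BoundedCounter):  # layer 3: adds change logging
    def increment(self):
        super().increment()
        print(f"value is now {self.value}")

c = LoggedBoundedCounter(limit=2)
for _ in range(3):
    c.increment()               # the bound from layer 2 stops the third step
```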
Abstract:
In this work, image-based estimation methods, also known as direct methods, are studied; these avoid feature extraction and matching completely. The cost functions use raw pixels as measurements, and the goal is to produce precise 3D pose and structure estimates. The cost functions presented minimize the sensor error, because the measurements are not transformed or modified. In photometric camera pose estimation, 3D rotation and translation parameters are estimated by minimizing a sequence of image-based cost functions, which are non-linear due to perspective projection and lens distortion. In image-based structure refinement, on the other hand, 3D structure is refined using a number of additional views and an image-based cost metric. Image-based estimation methods are particularly useful in conditions where the Lambertian assumption holds and the 3D points have constant color regardless of viewing angle. The goal is to improve image-based estimation methods and to produce computationally efficient methods which can be accommodated in real-time applications. The developed image-based 3D pose and structure estimation methods are finally demonstrated in practice in indoor 3D reconstruction use and in a live augmented reality application.
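The sketch below illustrates the kind of image-based (direct) cost described above: residuals are differences between raw pixel values and stored reference intensities under a candidate pose. It is a simplified example under stated assumptions (pinhole projection, grayscale image, nearest-neighbour lookup, no lens distortion), not the thesis's implementation.

```python
import numpy as np

def photometric_residuals(pose, points_3d, intensities_ref, image, K):
    """Direct-method residuals: reproject each 3D point under the candidate
    pose (R, t) with intrinsics K and compare the raw pixel value in the
    current grayscale image against the point's reference intensity."""
    R, t = pose
    cam = (R @ points_3d.T).T + t             # world frame -> camera frame
    proj = (K @ cam.T).T
    uv = proj[:, :2] / proj[:, 2:3]           # perspective division
    u = np.clip(np.round(uv[:, 0]).astype(int), 0, image.shape[1] - 1)
    v = np.clip(np.round(uv[:, 1]).astype(int), 0, image.shape[0] - 1)
    return image[v, u] - intensities_ref      # error in sensor (pixel) space

# A non-linear least-squares solver (e.g. Gauss-Newton over the six pose
# parameters) would then minimize the sum of squared residuals.
```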
Abstract:
Global illumination algorithms are at the center of realistic image synthesis and account for non-trivial light transport and occlusion within scenes, such as indirect illumination, ambient occlusion, and environment lighting. Their computationally most difficult part is determining light source visibility at each visible scene point. Height fields, on the other hand, constitute an important special case of geometry and are mainly used to describe certain types of objects such as terrains and to map detailed geometry onto object surfaces. The geometry of an entire scene can also be approximated by treating the distance values of its camera projection as a screen-space height field. In order to shadow height fields from environment lights, a horizon map is usually used to occlude incident light. We reduce the per-receiver time complexity of generating the horizon map on N × N height fields from the O(N) of previous work to O(1) by using an algorithm that incrementally traverses the height field and reuses the information already gathered along the path of traversal. We also propose an accurate method to integrate the incident light within the limits given by the horizon map. Indirect illumination in height fields requires information about which other points are visible to each height field point. We present an algorithm that determines this intervisibility in a time complexity that matches the space complexity of the produced visibility information, in contrast to previous methods, which scale with the height field size. As a result, the amount of computation is reduced by two orders of magnitude in common use cases. Screen-space ambient obscurance methods approximate ambient obscurance from the depth buffer geometry and have been widely adopted by contemporary real-time applications. They work by sampling the screen-space geometry around each receiver point, but have previously been limited to near-field effects because sampling a large radius quickly exceeds the render time budget. We present an algorithm that reduces the quadratic per-pixel complexity of previous methods to a linear complexity by line-sweeping over the depth buffer and maintaining an internal representation of the processed geometry from which occluders can be efficiently queried. Another algorithm is presented to determine ambient obscurance from the entire depth buffer at each screen pixel. The algorithm scans the depth buffer in a quick pre-pass and locates important features in it, which are then used to evaluate the ambient obscurance integral accurately. We also propose an evaluation of the integral such that results within a few percent of the ray-traced screen-space reference are obtained at real-time render times.
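A one-dimensional sketch of the incremental horizon idea follows: maintaining the upper convex hull of the samples processed so far yields each new sample's horizon occluder in amortized O(1), since every sample is pushed and popped at most once. The thesis's algorithm operates on 2-D height fields along traversal directions; this 1-D version, with assumed uniform sample spacing, only illustrates the principle.

```python
import numpy as np

def horizon_angles_1d(heights, spacing=1.0):
    """Left-looking horizon angle at each sample of a 1-D height profile.
    The upper convex hull of the already-visited samples is maintained on a
    stack; after the hull update, the top of the stack is exactly the point
    with the maximal elevation angle seen from the new sample."""
    n = len(heights)
    angles = np.full(n, -np.pi / 2)   # no occluder: fully open horizon
    hull = [0]                        # indices of the upper-hull samples
    for i in range(1, n):
        x, h = i * spacing, heights[i]
        # Drop hull points that fall on or below the segment hull[-2] -> i;
        # they can never be the horizon for this or any later receiver.
        while len(hull) >= 2:
            a, b = hull[-2], hull[-1]
            cross = ((b - a) * spacing * (h - heights[a])
                     - (x - a * spacing) * (heights[b] - heights[a]))
            if cross >= 0:
                hull.pop()
            else:
                break
        j = hull[-1]                  # tangent point = horizon occluder
        angles[i] = np.arctan2(heights[j] - h, x - j * spacing)
        hull.append(i)
    return angles
```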