152 results for User Modelling


Relevance:

20.00%

Publisher:

Abstract:

Tightening competition and increasing dynamism have created an emerging need for flexible asset management. This means that changes in market demand should be met with adjustments in the amount of assets tied to companies' balance sheets. On the other hand, industrial maintenance has recently experienced drastic changes, which have led to an increase in the number of maintenance networks (consisting of customer companies that buy maintenance services, as well as various supplier companies) and inter-organizational partnerships. However, research on maintenance networks has not kept pace with these changes in the industry, and there is a growing need for new ways of collaboration between partnering companies to enhance the competitiveness of the whole maintenance network. In addition, it is more and more common for companies to pursue lean operations in their businesses. This thesis shows how flexible asset management can increase the profitability of maintenance companies and networks under dynamic operating conditions, and how the additional value can then be shared between the network partners. Firstly, I have conducted a systematic literature review to identify what kind of requirements for asset management models are set by the increasing dynamism. Then I have responded to these requirements by constructing an analytical model for flexible asset management, linking asset management to the profitability and financial state of a company. The thesis uses the model to show how flexible asset management can increase profitability in maintenance companies and networks, and how the created value can be shared in the networks to reach a win-win situation. The research indicates that the existing models for asset management are heterogeneous by nature due to the various definitions of ‘asset management’. I conclude that there is a need for practical asset management models which address assets comprehensively with an inter-organizational, strategic view. The comprehensive perspective, taking all kinds of asset types into account, is needed to integrate the research on asset management with the strategic management of companies and networks. I will show that maintenance companies can improve their profitability by increasing the flexibility of their assets. In maintenance networks, reorganizing the ownership of the assets among the different network partners can create additional value. Finally, I will introduce flexible asset management contracts for maintenance networks. These contracts address the value sharing related to reorganizing the ownership of assets according to the principles of a win-win situation.
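As a rough illustration of the link between asset flexibility and profitability claimed above, the sketch below compares return on assets before and after part of the fixed assets is shifted to a network partner. All figures and the `return_on_assets` helper are hypothetical and are not taken from the thesis model.

```python
def return_on_assets(operating_profit: float, total_assets: float) -> float:
    """Return on assets (ROA) as a percentage."""
    return 100.0 * operating_profit / total_assets

# Hypothetical maintenance company figures (in millions of euros).
operating_profit = 2.0
fixed_assets = 15.0
current_assets = 5.0

roa_before = return_on_assets(operating_profit, fixed_assets + current_assets)

# Suppose 40 % of the fixed assets are released, e.g. equipment ownership is
# transferred to (and the equipment leased back from) a network partner,
# and the extra leasing cost trims operating profit slightly.
released_share = 0.40
leasing_cost = 0.2
roa_after = return_on_assets(operating_profit - leasing_cost,
                             fixed_assets * (1 - released_share) + current_assets)

print(f"ROA before: {roa_before:.1f} %")  # 10.0 %
print(f"ROA after:  {roa_after:.1f} %")   # 12.9 %
```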

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study is to investigate whether there exists any relationship between the spot and futures prices of different commodities. Commodities such as cocoa, coffee, crude oil, gold, natural gas and silver are considered from January 3, 2000 to December 31, 2012. The ADF and KPSS tests are used to test for stationarity, whereas the Johansen cointegration test is used to test for a long-run relationship. The Johansen cointegration test shows that at least 5 of the 6 pairs are cointegrated, the exception being crude oil. Moreover, the Granger causality results support the view that if two or more time series are cointegrated, there exists either a uni-directional or a bi-directional relationship. However, our results revealed that although cointegration exists between the variables, one might not Granger-cause the other. A VAR model is also used to measure the proportion of the effects. These findings will help the derivatives market and arbitrageurs in developing strategies to gain the maximum profit in the financial market.
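Purely as an illustration of the workflow described above, the sketch below strings together the same tests using the statsmodels library on a hypothetical pair of spot and futures series; the file name `gold_prices.csv` and its column names are placeholders, not the data set used in the study.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller, kpss, grangercausalitytests
from statsmodels.tsa.vector_ar.vecm import coint_johansen
from statsmodels.tsa.api import VAR

# Hypothetical daily spot and futures prices for one commodity (e.g. gold);
# the CSV file and its column names are placeholders.
prices = pd.read_csv("gold_prices.csv", parse_dates=["date"], index_col="date")
spot, futures = prices["spot"], prices["futures"]

# 1. Stationarity: ADF (H0: unit root) and KPSS (H0: stationary) on log prices.
for name, series in [("spot", spot), ("futures", futures)]:
    adf_stat, adf_p, *_ = adfuller(np.log(series))
    kpss_stat, kpss_p, *_ = kpss(np.log(series), regression="c")
    print(f"{name}: ADF p={adf_p:.3f}, KPSS p={kpss_p:.3f}")

# 2. Long-run relationship: Johansen cointegration test (trace statistic).
johansen = coint_johansen(np.log(prices[["spot", "futures"]]), det_order=0, k_ar_diff=1)
print("trace statistics:", johansen.lr1, "95% critical values:", johansen.cvt[:, 1])

# 3. Short-run dynamics: Granger causality on log returns and a VAR model.
returns = np.log(prices[["spot", "futures"]]).diff().dropna()
grangercausalitytests(returns[["spot", "futures"]], maxlag=5)  # does futures Granger-cause spot?
var_result = VAR(returns).fit(maxlags=5, ic="aic")
print(var_result.summary())
```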

Relevance:

20.00%

Publisher:

Abstract:

At the beginning of its 10th year of existence, Facebook has engaged and connected 1.2 billion monthly active users. This article-based dissertation, Disconnect.Me – User Engagement and Facebook, approaches this engagement from the opposite direction: disconnection. The research articles focus on social media-specific phenomena including leaving Facebook, tactical media works such as Web 2.0 SuicideMachine, memorializing dead Facebook users, and Facebook trolling. The media-theoretical framework for this study is built around affect theory, software studies and biopolitics, as well as different critical studies of new media. The argument is that disconnection is a necessary condition of social media connectivity, and that exploring social media through disconnection – as an empirical phenomenon, future potential and theoretical notion – helps us to understand how users are engaged with social media, its uses and its business models. The results of the study indicate that engagement is a relation that precedes user participation, a notion often used to conceptualize social media. Furthermore, this engagement turns the focus from users' actions towards the platform and how the platform actively controls users and their behavior. Facebook aims to engage new users and retain the old ones by renewing its platform and user interface. User engagement with the platform is thus social but also technical and affective. When engaged, the user is positioned into algorithmic connectivity where machinic processes mine user data. This data is not only sold but also used to affect and engage other users. At the heart of this study is the notion that our networked engagements matter and that disconnection can bring us to the current limits of network culture.

Relevance:

20.00%

Publisher:

Abstract:

Hydrogen stratification and atmosphere mixing are very important phenomena in nuclear reactor containments when severe accidents are studied and simulated. Hydrogen generation, distribution and accumulation in certain parts of the containment may pose a great risk of a pressure increase induced by hydrogen combustion and thus challenge the integrity of the NPP containment. Accurate prediction of the hydrogen distribution is important with respect to the safety design of an NPP. Modelling methods typically used for containment analyses include both lumped parameter and field codes. The lumped parameter method is universally used in containment codes because of its versatility, flexibility and simplicity. The lumped parameter method allows fast, full-scale simulations, where different containment geometries with relevant engineering safety features can be modelled. Lumped parameter gas stratification and mixing modelling methods are presented and discussed in this master's thesis. Experimental research is widely used in containment analyses. The HM-2 experiment on hydrogen stratification and mixing, conducted at the THAI facility in Germany, is calculated with the APROS lumped parameter containment package and the APROS 6-equation thermal hydraulic model. The main purpose was to study whether the convection term included in the momentum conservation equation of the 6-equation model gives any remarkable advantage compared to the simplified lumped parameter approach. Finally, a simple containment test case (a high steam release into a narrow steam generator room inside a large dry containment) was calculated with both APROS models. In this case, the aim was to determine the extreme containment conditions where the effect of the convection term was expected to be potentially high. The calculation results showed that both the APROS containment model and the 6-equation model could model the hydrogen stratification in the THAI test well, provided that the vertical nodalisation was dense enough. However, in more complicated cases, numerical diffusion may distort the results. The calculation of light gas stratification could probably be improved by applying a second order discretisation scheme to the modelling of gas flows. If the gas flows are relatively high, the convection term of the momentum equation is necessary to model the pressure differences between adjacent nodes reasonably.
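The lumped parameter idea referred to above can be sketched with a toy two-node model: each node carries only bulk quantities (gas inventories), and inter-node flow is driven here by a simple density-difference term rather than a full momentum equation. This is an illustration of the approach only, not the APROS containment or 6-equation formulation; the node volumes, the exchange coefficient `k_exchange` and the release rate are hypothetical.

```python
# Toy two-node lumped parameter sketch: hydrogen released into a lower node
# mixes with an upper node through a buoyancy-driven exchange flow.
# All parameters are hypothetical and for illustration only.

R, T, p = 8.314, 298.0, 1.0e5          # gas constant, temperature [K], pressure [Pa]
M_AIR, M_H2 = 0.029, 0.002             # molar masses [kg/mol]

dt, t_end = 0.1, 600.0                 # time step and simulated time [s]
volume = [100.0, 300.0]                # node volumes [m^3]: 0 = lower, 1 = upper
air = [p * v / (R * T) for v in volume]  # initial air inventory [mol]
h2 = [0.0, 0.0]                        # hydrogen inventory [mol]
release = 2.0                          # hydrogen source into node 0 [mol/s]
k_exchange = 20.0                      # exchange flow per unit density difference [m^3 s^-1 / (kg m^-3)]

def density(i: int) -> float:
    """Mixture density of node i at constant p and T (ideal gas)."""
    x_h2 = h2[i] / (h2[i] + air[i])
    m_mix = x_h2 * M_H2 + (1.0 - x_h2) * M_AIR
    return p * m_mix / (R * T)

for _ in range(int(t_end / dt)):
    h2[0] += release * dt                               # source term in the lower node
    # Buoyancy-driven volumetric exchange: a lighter lower node drives upward flow.
    q = k_exchange * max(density(1) - density(0), 0.0)  # [m^3/s]
    for gas in (air, h2):
        up = q * dt * gas[0] / volume[0]                # lower-node composition rises
        down = q * dt * gas[1] / volume[1]              # upper-node composition descends
        gas[0] += down - up
        gas[1] += up - down

for i, name in enumerate(("lower", "upper")):
    print(f"{name} node H2 mole fraction: {h2[i] / (h2[i] + air[i]):.3f}")
```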

Relevance:

20.00%

Publisher:

Abstract:

Designing user interfaces for novel software systems can be challenging since the usability preferences of the users are not well known. This thesis presents a usability study conducted for the development of a user interface for game developers to enter game specific information. By conducting usability testing, the usability preferences of game developers were explored and the design was shaped according to their needs. An assessment of the overall usability of the final design is provided together with the main findings that include the usability preferences and design recommendations. The results showed that the most valuable usability preferences are quickness, error tolerance and the ability to constantly inspect the entered information.

Relevance:

20.00%

Publisher:

Abstract:

End-user development is a very common but largely overlooked phenomenon in information systems research and practice. End-user development means that regular people, the end-users of software, and not professional developers, are doing software development. A large number of people are directly or indirectly impacted by the results of these non-professional development activities. The number of users performing end-user development activities is difficult to ascertain precisely, but it is very large and still growing. Computer adoption is growing towards 100% and many new types of computational devices are continually introduced. In addition, other devices not previously programmable are becoming so. This means that, at this very moment, hundreds of millions of people are likely struggling with development problems. Furthermore, software itself is continually being adapted for more flexibility, enabling users to change the behaviour of their software themselves. New software and services are helping to transform users from consumers to producers, and much of this is now found online. The problem for the end-user developer is that little of this development is supported by anyone. Often organisations do not notice end-user development and consequently neither provide support for it nor are equipped to do so. Many end-user developers do not belong to any organisation at all. The end-user development process itself may also aggravate the problem. End-users are usually not really committed to the development process, which tends to be more iterative and ad hoc. This means support becomes a distant third priority behind getting the job done and figuring out the development issues involved. Sometimes the software itself may exacerbate the issue by simplifying the development process, de-emphasising the difficulty of the task being undertaken. Online support could be the lifeline the end-user developer needs. Going online, one can find all the knowledge one could ever need. However, that still does not help the end-user apply this information or knowledge in practice. A virtual community, through its ability to adopt the end-user's specific context, could surmount this final obstacle. This thesis explores the concept of end-user development and how it could be supported through online sources, in particular virtual communities, which, it is argued here, seem to fit the end-user developer's needs very well. The experiences of real end-user developers and prior literature were used in this process. Emphasis has been on those end-user developers, e.g. small business owners, who may literally have nowhere to turn for support. Adopting the viewpoint of the end-user developer, and assuming the common situation where the demand for support outstrips the supply, the thesis examines the question of how an end-user could use a virtual community effectively, improving the results of the support process.

Relevance:

20.00%

Publisher:

Abstract:

Can crowdsourcing solutions serve many masters? Can they be beneficial both for laymen and native speakers of minority languages on the one hand, and for serious linguistic research on the other? How did an infrastructure that was designed to support linguistics turn out to be a solution for raising awareness of native languages? Since 2012 the National Library of Finland has been developing the Digitisation Project for Kindred Languages, in which the key objective is to support a culture of openness and interaction in linguistic research, but also to promote crowdsourcing as a tool for participation of the language community in research. In the course of the project, over 1,200 monographs and nearly 111,000 pages of newspapers in Finno-Ugric languages will be digitised and made available in the Fenno-Ugrica digital collection. This material was published in the Soviet Union in the 1920s and 1930s, and users have had only sporadic access to it. The publication of open-access and searchable materials from this period is a goldmine for researchers. Historians, social scientists and laymen with an interest in specific local publications can now find text materials pertinent to their studies. The linguistically oriented population can also find writings to delight them: (1) lexical items specific to a given publication, and (2) orthographically documented specifics of phonetics. In addition to the open-access collection, we developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary since these rare and peripheral prints often include archaic characters, which are neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage. When modelling the OCR editor, it was essential to consider both the needs of researchers and the capabilities of lay citizens, and to have them participate in the planning and execution of the project from the very beginning. By implementing the feedback from both groups iteratively, it was possible to transform the requested changes into tools for research that not only supported the work of linguists but also encouraged the citizen scientists to face the challenge and work with the crowdsourcing tools for the benefit of research. This presentation will not only deal with the technical aspects, developments and achievements of the infrastructure but will highlight the way in which user groups, researchers and lay citizens were engaged in the process as an active and communicative group of users, and how their contributions were turned to mutual benefit.
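As a rough illustration of the character-level problem the OCR editor addresses, the sketch below applies a substitution map to one line of machine-encoded text. The confusion pairs and the `correct_line` helper are hypothetical examples, not the actual rules or code of the Fenno-Ugrica OCR editor.

```python
# Hypothetical OCR confusion map: recognised sequence -> intended archaic character.
# The pairs below are illustrative placeholders, not the project's actual rules.
CONFUSION_MAP = {
    "h'": "һ",   # e.g. an engine splitting a Cyrillic letter with a descender
    "æ": "ӕ",
    "e'": "ӭ",
}

def correct_line(line: str, mapping: dict[str, str]) -> str:
    """Apply simple string substitutions to one line of OCR output."""
    for wrong, right in mapping.items():
        line = line.replace(wrong, right)
    return line

ocr_output = "kæh' e'li"          # placeholder OCR text
print(correct_line(ocr_output, CONFUSION_MAP))
```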

Relevance:

20.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

20.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

20.00%

Publisher:

Abstract:

In the doctoral dissertation, low-voltage direct current (LVDC) distribution system stability, supply security and power quality are evaluated by computational modelling and measurements on an LVDC research platform. Computational models for the LVDC network analysis are developed. Time-domain simulation models are implemented in the time-domain simulation environment PSCAD/EMTDC. The PSCAD/EMTDC models of the LVDC network are applied to the transient behaviour and power quality studies. The LVDC network power loss model is developed in a MATLAB environment and is capable of fast estimation of the network and component power losses. The model integrates analytical equations that describe the power loss mechanism of the network components with power flow calculations. For an LVDC network research platform, a monitoring and control software solution is developed. The solution is used to deliver measurement data for verification of the developed models and analysis of the modelling results. In the work, the power loss mechanism of the LVDC network components and its main dependencies are described. Energy loss distribution of the LVDC network components is presented. Power quality measurements and current spectra are provided and harmonic pollution on the DC network is analysed. The transient behaviour of the network is verified through time-domain simulations. DC capacitor guidelines for an LVDC power distribution network are introduced. The power loss analysis results show that one of the main optimisation targets for an LVDC power distribution network should be reduction of the no-load losses and efficiency improvement of converters at partial loads. Low-frequency spectra of the network voltages and currents are shown, and harmonic propagation is analysed. Power quality in the LVDC network point of common coupling (PCC) is discussed. Power quality standard requirements are shown to be met by the LVDC network. The network behaviour during transients is analysed by time-domain simulations. The network is shown to be transient stable during large-scale disturbances. Measurement results on the LVDC research platform proving this are presented in the work.
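As a rough illustration of the loss mechanism discussed above, a converter's losses are often approximated as a constant no-load term plus load-dependent terms, which is why no-load losses and partial-load efficiency dominate the optimisation picture. The sketch below uses this generic quadratic loss model with hypothetical coefficients; it is not the dissertation's MATLAB power loss model.

```python
# Generic converter loss model: constant no-load losses plus terms proportional
# to output current and to current squared (conduction losses).
# The coefficients below are hypothetical, not taken from the dissertation.

P0 = 150.0      # no-load losses [W]
K1 = 2.0        # losses proportional to current [W/A] (e.g. switching)
K2 = 0.05       # losses proportional to current squared [W/A^2] (conduction)
V_DC = 750.0    # nominal DC voltage [V]
P_RATED = 50e3  # rated converter power [W]

def efficiency(load_fraction: float) -> float:
    """Converter efficiency at a given fraction of rated power."""
    p_out = load_fraction * P_RATED
    current = p_out / V_DC
    p_loss = P0 + K1 * current + K2 * current ** 2
    return p_out / (p_out + p_loss)

for load in (0.1, 0.25, 0.5, 1.0):
    print(f"load {load:4.0%}: efficiency {efficiency(load):.3f}")
```

Evaluating the model at partial loads shows how a fixed no-load term drags down efficiency far more at 10 % load than at full load, consistent with the optimisation targets identified in the work.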

Relevance:

20.00%

Publisher:

Abstract:

Presentation at a seminar organised by the KDK usability working group: How do users' expectations challenge our metadata practices? 30.9.2014.

Relevance:

20.00%

Publisher:

Abstract:

Presentation at a seminar organised by the KDK usability working group: How do users' expectations challenge our metadata practices? 30.9.2014.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this work is to obtain a better understanding of the effect of ultrasound on the mixing of fluid media. The research covers both Newtonian and non-Newtonian fluids. The application of ultrasound to liquids is modelled in the COMSOL Multiphysics software, where the influence of the ultrasound is introduced as a waveform equation. Turbulence in the Newtonian fluid is modelled with the k-ε model, while the modelling of ultrasound-assisted mixing in non-Newtonian fluids is based on the power law. To verify the modelling results, two experimental methods are used: Particle Image Velocimetry and measurements of mixing time. Particle Image Velocimetry allows continuous capture of the velocity flow field and provides a detailed depiction of the liquid dynamics. The second method of verification is the comparison of the mixing time required to reach homogeneity. Experimentally, the mixing time is determined by conductivity measurements; in the model it is obtained with the Transport of Diluted Species module of COMSOL Multiphysics. Both the experimental and the modelling parts show a similar radial mechanism of fluid flow under ultrasound: fluid moves from the horn tip towards the bottom and returns along the walls. The velocity profiles from the model and the experiments agree in the case of the Newtonian fluid, whereas in the case of the non-Newtonian fluid they do not. A development track for the modelling of ultrasound-assisted mixing is presented in the thesis.
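As background to the non-Newtonian case, the power law relates the apparent viscosity to the shear rate as μ_app = K·γ̇^(n−1). The sketch below evaluates this relation with hypothetical values of the consistency index K and flow behaviour index n (not the values used in the thesis), illustrating why a shear-thinning fluid behaves differently near the horn tip, where shear rates are high, than in the bulk.

```python
# Power-law (Ostwald-de Waele) apparent viscosity: mu_app = K * gamma_dot**(n - 1).
# K and n below are hypothetical example values, not those used in the thesis.

K = 2.0    # consistency index [Pa*s^n]
n = 0.6    # flow behaviour index [-]; n < 1 means shear-thinning

def apparent_viscosity(shear_rate: float) -> float:
    """Apparent viscosity [Pa*s] of a power-law fluid at a given shear rate [1/s]."""
    return K * shear_rate ** (n - 1.0)

# Near the ultrasound horn tip the shear rate is high and the apparent viscosity
# of a shear-thinning fluid drops; far from the horn it rises again.
for shear_rate in (1.0, 10.0, 100.0, 1000.0):
    print(f"shear rate {shear_rate:7.1f} 1/s -> mu_app {apparent_viscosity(shear_rate):.3f} Pa*s")
```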

Relevance:

20.00%

Publisher:

Abstract:

Innovative gas cooled reactors, such as the pebble bed reactor (PBR) and the gas cooled fast reactor (GFR), offer higher efficiency and new application areas for nuclear energy. Numerical methods were applied and developed to analyse the specific features of these reactor types with fully three dimensional calculation models. In the first part of this thesis, the discrete element method (DEM) was used for a physically realistic modelling of the packing of fuel pebbles in PBR geometries, and methods were developed for utilising the DEM results in subsequent reactor physics and thermal-hydraulics calculations. In the second part, the flow and heat transfer for a single gas cooled fuel rod of a GFR were investigated with computational fluid dynamics (CFD) methods. An in-house DEM implementation was validated and used for packing simulations, in which the effect of several parameters on the resulting average packing density was investigated. The restitution coefficient was found to have the most significant effect. The results can be utilised in further work to obtain a pebble bed with a specific packing density. The packing structures of selected pebble beds were also analysed in detail, and local variations in the packing density were observed, which should be taken into account especially in the reactor core thermal-hydraulic analyses. Two open source DEM codes were used to produce stochastic pebble bed configurations to add realism and improve the accuracy of criticality calculations performed with the Monte Carlo reactor physics code Serpent. Russian ASTRA criticality experiments were calculated. Pebble beds corresponding to the experimental specifications within measurement uncertainties were produced in DEM simulations and successfully exported into the subsequent reactor physics analysis. With the developed approach, two typical issues in Monte Carlo reactor physics calculations of pebble bed geometries were avoided. A novel method was developed and implemented as a MATLAB code to calculate porosities in the cells of a CFD calculation mesh constructed over a pebble bed obtained from DEM simulations. The code was further developed to distribute power and temperature data accurately between discrete based reactor physics and continuum based thermal-hydraulics models to enable coupled reactor core calculations. The developed method was also found useful for analysing sphere packings in general. CFD calculations were performed to investigate the pressure losses and heat transfer in three dimensional air cooled smooth and rib roughened rod geometries, housed inside a hexagonal flow channel representing a sub-channel of a single fuel rod of a GFR. The CFD geometry represented the test section of the L-STAR experimental facility at Karlsruhe Institute of Technology, and the calculation results were compared to the corresponding experimental results. Knowledge was gained of the adequacy of various turbulence models and of the modelling requirements and issues related to the specific application. The obtained pressure loss results were in relatively good agreement with the experimental data. Heat transfer in the smooth rod geometry was somewhat underpredicted, which can partly be explained by unaccounted heat losses and uncertainties. In the rib roughened geometry, heat transfer was severely underpredicted by the realisable k-ε turbulence model used. An additional calculation with a v2-f turbulence model showed significant improvement in the heat transfer results, which is most likely due to the better performance of the model in separated flow problems. Further investigations are suggested before using CFD to draw conclusions about the heat transfer performance of rib roughened GFR fuel rod geometries. It is suggested that the viewpoints of numerical modelling be included in the planning of experiments to ease the challenging model construction and simulations and to avoid introducing additional sources of uncertainty. To facilitate the use of advanced calculation approaches, multi-physical aspects of the experiments should also be considered and documented in reasonable detail.
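The porosity calculation described above (the fraction of each CFD cell volume not occupied by pebbles) can be sketched with a simple Monte Carlo estimate over random sample points in a cell. This is only an illustration of the idea, written in Python rather than the MATLAB implementation of the thesis, and the cell bounds, pebble centres and radius below are made-up values.

```python
import numpy as np

def cell_porosity(cell_min, cell_max, centres, radius, n_samples=100_000, seed=0):
    """Monte Carlo estimate of the porosity (void fraction) of an axis-aligned
    box-shaped cell partially occupied by equal-sized spheres."""
    rng = np.random.default_rng(seed)
    cell_min = np.asarray(cell_min, dtype=float)
    cell_max = np.asarray(cell_max, dtype=float)
    points = rng.uniform(cell_min, cell_max, size=(n_samples, 3))
    # A sample point counts as solid if it lies inside any sphere.
    dist2 = ((points[:, None, :] - np.asarray(centres)[None, :, :]) ** 2).sum(axis=2)
    solid = (dist2 <= radius ** 2).any(axis=1)
    return 1.0 - solid.mean()

# Made-up example: a 6 cm cube cell and two 3 cm diameter pebbles at opposite corners.
centres = [[0.0, 0.0, 0.0], [0.06, 0.06, 0.06]]
print(f"porosity ~ {cell_porosity([0.0, 0.0, 0.0], [0.06, 0.06, 0.06], centres, 0.015):.3f}")
```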