23 results for the ‘Modern’ Professional


Relevance: 90.00%

Publisher:

Abstract:

In a market where companies of similar size and resources compete, gaining any advantage over others is challenging. To stay afloat, a company needs the capability to perform with fewer resources and yet provide better service. Hence the development of efficient processes that can cut costs and improve performance is crucial. As a business expands, processes become complicated and a large amount of data needs to be managed and available on request. Companies use different tools to store and manage data, which facilitates better production and transactions. In the modern business world, the most widely used tool for that purpose is the ERP (Enterprise Resource Planning) system. The focus of this research is to study how competitive advantage can be achieved by implementing a proprietary ERP system in a company; that is, an ERP system created in-house and tailor-made to match and align with business needs and processes. The market is full of ERP software, but choosing the right one is a big challenge. Identifying the key features that need improvement in processes and data management, choosing the right ERP, implementing it, and following up is a long and expensive journey for companies. Some companies prefer to invest in a ready-made package bought from a vendor and adjust it to their own business needs, while others focus on creating their own system with in-house IT capabilities. This research uses a case company, and the author tries to identify and analyze why the organization in question decided to pursue the development of a proprietary ERP system, how it has been implemented, and whether it has been successful. The main conclusion and recommendation of this research is that companies should know their core capabilities and constraints before choosing and implementing an ERP system. Knowledge of the factors that affect the outcome of a system change is important in order to make the right decisions at the strategic level and implement them at the operational level. The project in the case company has lasted longer than anticipated; it has been reported that projects based on buying a ready product from a vendor are likewise often delayed and completed over budget. Overall, the case company's implementation of a proprietary ERP has been successful, both in terms of business performance figures and the usability of the system by employees. In terms of future research, a study that statistically calculates the ROI of both approaches, buying a ready product and creating one's own ERP, would be beneficial.

Relevance: 90.00%

Publisher:

Abstract:

This doctoral dissertation explores the intra-organizational dynamics of a strategic renewal process. The main research question is how the pursuit of change and organizational inertia co-exist, intertwine, and collide in organizational cognition and capabilities during strategic renewal. It is a comprehensive study of how organizational capabilities, organizational cognition, and structure enhance and inhibit change. Theoretically, the study is positioned in the modern tradition of strategy research, using the dynamic capability view and the organizational and managerial cognition research tradition as the main theoretical frames. Empirically, the study is a longitudinal case study of the Finnish Broadcasting Company (Yle), following the organizational changes during the years 2011–2014. The analysis is based on both quantitative and qualitative data, collected during the research process using surveys, interviews, and archives. The main theoretical contribution is the application of the two theoretical approaches in one study. Empirically, the study contributes to the operationalization of concepts related to the dynamic capability view and organizational cognition, in a media context that is going through drastic changes due to digitalization. Furthermore, the case of a public broadcasting company extends the application of the theoretical concepts to the context of public management. The results suggest that renewal is a complex process in which an organization's perceptions intertwine with strategic actions and decision-making. The change evolves path-dependently: past experiences, routines, and organizational structures tend to dictate future visions, desires, and actions. The study also reveals how the public nature of an organization adds to the tensions between change and organizational inertia and hampers decision-making. The doctoral dissertation consists of six research papers, each of which explores the phenomenon under study from a different perspective.

Relevance: 90.00%

Publisher:

Abstract:

The Master’s thesis "Biomass Utilization in a PFC Co-firing System with Slagging and Fouling Analysis" is a study of the modern technologies of different coal-firing systems: the PFC system, the FB system, and the GF system. The co-firing of biomass with coal is illustrated by research from the company Alstom Power Plant. Against the background of air pollution, greenhouse-effect problems, and national fuel security, the utilization of bioenergy is becoming more and more popular. However, while burning biomass is promoted to decrease the emissions of carbon dioxide and other air pollutants, new problems arise in the firing systems, such as slagging, fouling, and hot corrosion. The thesis presents a brief overview of the different coal-firing systems used around the world and focuses on biomass-coal co-firing in the PFC system. The biomass supply and the operation of the PFC system are described, and the new problems of hot corrosion, slagging, and fouling are discussed. The slagging and fouling problem is simulated using the software HSC Chemistry 6.1, and the emissions of coal firing and co-firing are compared in simulation as well.
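The abstract mentions an emissions comparison between coal firing and biomass co-firing. As a minimal illustration of the kind of fuel-side calculation involved (not the HSC Chemistry simulation used in the thesis), the sketch below estimates fossil CO2 per unit of fuel energy from assumed carbon contents and heating values, treating the biomass share as biogenic carbon; all fuel properties are placeholder values.

```python
# Illustrative fuel-side CO2 estimate for coal firing vs. biomass co-firing.
# All fuel properties below are assumed, round-number values for demonstration
# only; they are not taken from the thesis or from HSC Chemistry.

M_CO2 = 44.01   # molar mass of CO2, g/mol
M_C = 12.011    # molar mass of carbon, g/mol

def co2_per_mj(carbon_fraction, lhv_mj_per_kg):
    """Fossil CO2 emitted per MJ of fuel energy (kg CO2 / MJ)."""
    kg_co2_per_kg_fuel = carbon_fraction * M_CO2 / M_C
    return kg_co2_per_kg_fuel / lhv_mj_per_kg

# Assumed properties (dry basis): a bituminous coal and a wood-type biomass.
coal_c, coal_lhv = 0.65, 25.0      # carbon mass fraction, lower heating value MJ/kg
biomass_c, biomass_lhv = 0.50, 18.0

coal_only = co2_per_mj(coal_c, coal_lhv)

# 20 % biomass co-firing on an energy basis; biogenic CO2 counted as zero.
share_bio = 0.20
co_firing = (1 - share_bio) * co2_per_mj(coal_c, coal_lhv)

print(f"coal only : {coal_only:.3f} kg fossil CO2 / MJ")
print(f"co-firing : {co_firing:.3f} kg fossil CO2 / MJ "
      f"({share_bio:.0%} biomass, biogenic CO2 excluded)")
```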

Relevance: 90.00%

Publisher:

Abstract:

The emerging technologies have recently challenged libraries to reconsider their role as mere mediators between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions like national libraries, have not always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners so that they interplay and work towards shared goals and objectives. In this paper, I draw a picture of the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, and in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, during the genesis and consolidation period of the literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The ‘deluge’ of popular literature in the 1920s and 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced into the language. This was the beginning of a renaissance and a period of enlightenment (Rueter, 2013). The linguistically oriented population can also find writings to their delight, especially lexical items specific to a given publication and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation’s Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to make sure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists to be raw data for linguists. Our data is used, for instance, for creating morphological analyzers and online dictionaries at the Universities of Helsinki and Tromsø. In order to reach these targets, we will produce not only the digitized materials but also tools supporting linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology.
The mission is to improve the usage and usability of the digitized content. During the project, we have advanced methods that refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to lie beyond its traditional playground? The written materials from this period are a gold mine, so how can we retrieve these hidden treasures of languages from a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded text (OCR output) often contains too many mistakes to be used as such in research, so the mistakes in the OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. This tool was necessary to implement, since these rare and peripheral prints often include characters that have since fallen out of use and are sadly neglected by modern OCR software developers, yet belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing tool is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface, that is, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view there is a danger that the needs of linguists are not met. A further notable downside is the lack of a shared goal or social affinity; there is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012). There has also been criticism that the digital humanities make the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become even more apparent when one leaves the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or longing for revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfactory results for linguists. Thus, one has to carefully identify the potential niches in which the needed tasks can be completed. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be a different one. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools to draw resources from, their specific richness in skill suits the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation.
Such communities can respond to the needs of research more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we try to utilize the knowledge and skills of citizen scientists to produce qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in fields of vocabulary where researchers require more information. For instance, there is a lack of Hill Mari words and terminology in anatomy. We have digitized books on medicine, and we could try to track the words related to human organs by assigning citizen scientists to edit and collect them with the OCR editor. From the nichesourcing perspective, it is essential that altruism plays a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay in which the language communities benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary, which is made freely available to the public, so that society can benefit too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as serving ‘two masters’: research and society.
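The abstract above describes a crowdsourcing tool for correcting OCR output stored in the ALTO XML format. As a minimal sketch of what a single correction to such a file can look like (the String element and its CONTENT attribute follow the published ALTO schema, while the file names and the word being corrected are illustrative assumptions, not details of the Fenno-Ugrica editor), the snippet below rewrites every misrecognized word in one page file.

```python
# Minimal sketch: correct one misrecognized word in an ALTO XML page file.
# The file names and the example word are assumptions for illustration;
# real Fenno-Ugrica files may use a different ALTO schema version.
import xml.etree.ElementTree as ET

def correct_word(alto_path, wrong, corrected, out_path):
    """Replace every ALTO String element whose CONTENT equals `wrong`."""
    tree = ET.parse(alto_path)
    changed = 0
    for el in tree.getroot().iter():
        # Match the String element regardless of the ALTO namespace version.
        if el.tag.endswith("}String") and el.get("CONTENT") == wrong:
            el.set("CONTENT", corrected)
            changed += 1
    tree.write(out_path, encoding="utf-8", xml_declaration=True)
    return changed

# Hypothetical usage: fix an OCR confusion on one digitized page.
# n = correct_word("page_0001.xml", "sana?", "sana", "page_0001_corrected.xml")
```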

Relevance: 90.00%

Publisher:

Abstract:

This thesis discusses the basic problem of modern portfolio theory: how to find the optimal allocation for an investment portfolio. The theory provides a solution for an efficient portfolio, which minimises the risk of the portfolio with respect to the expected return. A central feature of all the portfolios on the efficient frontier is that the investor needs to provide the expected return for each asset. Market anomalies are persistent patterns seen in the financial markets that cannot be explained by current asset pricing theory. The goal of this thesis is to study whether these anomalies can be observed among different asset classes and, if persistent patterns are found, to investigate whether the anomalies hold valuable information for determining the expected returns used in the portfolio optimization. Market anomalies and investment strategies based on them are studied with a rolling estimation window, where the return for the following period is always based on historical information; this is also crucial when rebalancing the portfolio. The anomalies investigated in this thesis are value, momentum, reversal, and idiosyncratic volatility. The research data includes price series of country-level stock indices, government bonds, currencies, and commodities. Modern portfolio theory and the views given by the anomalies are combined by utilising the Black-Litterman model, which makes it possible to optimise the portfolio so that the investor's views are taken into account. When constructing the portfolios, the goal is to maximise the Sharpe ratio. The significance of the results is studied by assessing whether the strategies yield excess returns relative to those explained by the three-factor model. The most outstanding finding is that anomaly-based factors contain valuable information for enhancing efficient portfolio diversification. When the highest Sharpe ratios for each asset class are picked from the test factors and applied to the Black-Litterman model, the final portfolio results in a superior risk-return combination. The highest Sharpe ratios are provided by the momentum strategy for stocks and by long-term reversal for the rest of the asset classes. Additionally, a strategy based on the value effect was highly appealing and performs essentially as well as the previously mentioned Sharpe strategy. When studying the anomalies, it is found that 12-month momentum is the strongest effect, especially for stock indices. In addition, high idiosyncratic volatility seems to be positively correlated with returns for country stock indices.
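The abstract combines anomaly-based views with mean-variance optimisation through the Black-Litterman model. A minimal numpy sketch of that step is given below, assuming illustrative inputs (the covariance matrix, market weights, view matrix, and parameter values are placeholders, not the data or calibration used in the thesis): it computes the Black-Litterman posterior expected returns and the corresponding maximum-Sharpe (tangency) weights.

```python
# Black-Litterman posterior returns and tangency (max-Sharpe) weights.
# All numbers are illustrative placeholders, not values from the thesis.
import numpy as np

def black_litterman(Sigma, w_mkt, P, Q, delta=2.5, tau=0.05):
    """Posterior expected excess returns given the views (P, Q)."""
    pi = delta * Sigma @ w_mkt                        # implied equilibrium returns
    Omega = np.diag(np.diag(tau * P @ Sigma @ P.T))   # view uncertainty (He-Litterman choice)
    A = np.linalg.inv(tau * Sigma)
    B = P.T @ np.linalg.inv(Omega) @ P
    return np.linalg.solve(A + B, A @ pi + P.T @ np.linalg.inv(Omega) @ Q)

def tangency_weights(mu, Sigma):
    """Unconstrained maximum-Sharpe weights, scaled to sum to one."""
    w = np.linalg.solve(Sigma, mu)
    return w / w.sum()

# Three assets, e.g. a stock index, a bond index, and a commodity index.
Sigma = np.array([[0.040, 0.006, 0.004],
                  [0.006, 0.010, 0.002],
                  [0.004, 0.002, 0.020]])
w_mkt = np.array([0.5, 0.3, 0.2])
# One momentum-style view: asset 1 outperforms asset 2 by 2 % per year.
P = np.array([[1.0, -1.0, 0.0]])
Q = np.array([0.02])

mu_bl = black_litterman(Sigma, w_mkt, P, Q)
print("posterior returns:", np.round(mu_bl, 4))
print("tangency weights :", np.round(tangency_weights(mu_bl, Sigma), 3))
```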

Relevance: 90.00%

Publisher:

Abstract:

Intelligence from a human source that is falsely thought to be true is potentially more harmful than a total lack of it. The veracity assessment of gathered intelligence is one of the most important phases of the intelligence process. Lie detection and veracity assessment methods have been studied widely, but a comprehensive analysis of these methods' applicability is lacking. There are problems related to the efficacy of lie detection and veracity assessment. According to a conventional belief, an almighty lie detection method exists that is almost 100% accurate and suitable for any social encounter. However, scientific studies have shown that this is not the case, and popular approaches are often oversimplified. The main research question of this study was: what is the applicability of veracity assessment methods that are reliable and based on scientific proof, in terms of the following criteria?

o Accuracy, i.e. the probability of detecting deception successfully
o Ease of use, i.e. how easy the method is to apply correctly
o Time required to apply the method reliably
o No need for special equipment
o Unobtrusiveness of the method

In order to answer the main research question, the following supporting research questions were answered first: what kinds of interviewing and interrogation techniques exist and how could they be used in the intelligence interview context; what kinds of lie detection and veracity assessment methods exist that are reliable and based on scientific proof; and what kinds of uncertainty and other limitations are included in these methods? Two major databases, Google Scholar and Science Direct, were used to search for and collect existing studies and other papers related to the topic. After the search phase, an understanding of the existing lie detection and veracity assessment methods was established through a meta-analysis. A Multi-Criteria Analysis utilizing the Analytic Hierarchy Process was conducted to compare scientifically valid lie detection and veracity assessment methods in terms of the assessment criteria. In addition, a field study was arranged to gain first-hand experience of the applicability of different lie detection and veracity assessment methods. The Studied Features of Discourse and the Studied Features of Nonverbal Communication gained the highest ranking in overall applicability. They were assessed to be the easiest and fastest to apply, and to have the required temporal and contextual sensitivity. The Plausibility and Inner Logic of the Statement, the Method for Assessing the Credibility of Evidence, and the Criteria-Based Content Analysis were also found to be useful, but with some limitations. The Discourse Analysis and the Polygraph were assessed to be the least applicable. Results from the field study support these findings. However, it was also discovered that even the most applicable methods are not entirely trouble-free. In addition, this study highlighted that three channels of information, Content, Discourse, and Nonverbal Communication, can be subjected to veracity assessment methods that are scientifically defensible. There is at least one reliable and applicable veracity assessment method for each of the three channels. All of the methods require disciplined application and a scientific working approach. There are no quick gains if high accuracy and reliability are desired.
Since most current lie detection studies concentrate on a scenario where roughly half of the assessed people are completely truthful and the other half are liars presenting a well-prepared cover story, it is proposed that in future studies lie detection and veracity assessment methods be tested against partially truthful human sources. This kind of test setup would highlight new challenges and opportunities for the use of existing and widely studied lie detection methods, as well as for more modern ones that are still under development.
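The abstract mentions a Multi-Criteria Analysis based on the Analytic Hierarchy Process. A minimal sketch of the AHP weighting step is given below, assuming a hypothetical pairwise comparison matrix over the five criteria listed above (the judgments are illustrative, not the ones elicited in the study): it derives the priority vector from the principal eigenvector and checks consistency with Saaty's consistency ratio.

```python
# Analytic Hierarchy Process: priority weights from a pairwise comparison
# matrix, plus Saaty's consistency ratio. The matrix below is an illustrative
# assumption, not the comparison data used in the study.
import numpy as np

def ahp_priorities(A):
    """Return (weights, consistency_ratio) for a reciprocal comparison matrix."""
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                             # priority vector
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # Saaty's random index
    return w, ci / ri

# Criteria: accuracy, ease of use, time required, no special equipment,
# unobtrusiveness. Entry A[i, j] = how much more important criterion i is than j.
A = np.array([
    [1,   3,   5,   5,   7],
    [1/3, 1,   3,   3,   5],
    [1/5, 1/3, 1,   1,   3],
    [1/5, 1/3, 1,   1,   3],
    [1/7, 1/5, 1/3, 1/3, 1],
], dtype=float)

weights, cr = ahp_priorities(A)
print("criterion weights :", np.round(weights, 3))
print("consistency ratio :", round(cr, 3), "(acceptable if < 0.10)")
```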

Relevance: 90.00%

Publisher:

Abstract:

Patient information systems are crucial components of modern healthcare and medicine. It is obvious that without them healthcare cannot function properly; one can try to imagine how brain surgery could be done without information systems to gather and display the information needed for the operation. Thus, it can be stated that digital information is an irremovable part of modern healthcare. However, the legal ownership of patient information lacks a coherent and justified basis. The issue itself is in fact bypassed by controlling patient information with different laws and regulations on how patient information can be used and by whom. Nonetheless, the question of who owns patient information is commonly missed or bypassed. This dissertation shows the problems that arise when the legislation on patient information ownership is not clear. Without clear legislation, the outcome can be unexpected, as seems to be the case in Finland, Sweden, and the United Kingdom: the lack of clear regulation has led to unwanted consequences because of the problematic implementation of the European Union database directive in those countries. Legal ownership is in effect granted to the creators of the databases that contain the patient information, and this is not a desirable situation. In healthcare and medicine, we are dealing with issues such as life, health, and information, which are very sensitive and in many cases very personal. Thus, this dissertation leans on four philosophical theories, from Locke, Kant, Heidegger, and Rawls, to establish an ethically justified basis for regulating patient information in a proper way. Because of the problems of property and ownership in the context of information, a new concept is needed and presented to replace the concept of owning: Datenherrschaft (English: mastery over information). Datenherrschaft seems suitable for regulating patient information because its core is the protection of one's rights over information, and this aligns with the work of the philosophers whose theories are used in this study. The philosophical argumentation of this study shows that Datenherrschaft granted to patients is ethically acceptable. It supports the view that the patient should control the patient information about themselves unless there are specific circumstances that justify the authorities using the patient information to protect other people's basic rights. Thus, if patients were legally granted Datenherrschaft over patient information, we would endorse patients as individuals who have their own personal experience of their own life, and take a strong stance against any unjustified paternalism in healthcare.
Keywords: patient information, ownership, Datenherrschaft, ethics, Locke, Kant, Heidegger, Rawls

Relevance: 90.00%

Publisher:

Abstract:

The research towards efficient, reliable, and environmentally friendly power supply solutions is producing growing interest in the “Smart Grid” approach to developing electricity networks and managing increasing energy consumption. One of the novel approaches is the LVDC microgrid. The purpose of this research is to analyze the possibilities for implementing LVDC microgrids in public distribution networks in Russia. The research contains an analysis of the modern Russian electric power industry, the electricity market, the electricity distribution business, and the regulatory framework and standardization related to the implementation of the LVDC microgrid concept. To estimate economic feasibility, a theoretical case study is presented comparing low-voltage AC and medium-voltage AC solutions with an LVDC microgrid for a small settlement in Russia. The results of the market and regulatory framework analysis, along with the economic comparison of the AC and DC solutions, show that implementation of the LVDC microgrid concept in Russia is possible and can be economically feasible. From the electric power industry and regulatory framework points of view, there are no serious obstacles to LVDC microgrids in Russian distribution networks. However, the most suitable use cases at the moment are expected to be found in the electrification of remote settlements that are isolated from the Unified Energy System of Russia.
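The economic comparison above weighs the investment and operating costs of the AC and LVDC alternatives over their lifetime. A minimal sketch of one common way to do such a comparison is given below, assuming placeholder figures (the investment costs, annual losses and maintenance, interest rate, and planning horizon are illustrative values, not the ones used in the case study): it ranks the alternatives by the present value of their life-cycle cost.

```python
# Life-cycle cost comparison of network alternatives by present value.
# All cost figures and parameters are illustrative placeholders, not the
# values used in the thesis' case study.

def present_value_of_costs(investment, annual_cost, rate, years):
    """Investment plus discounted annual costs over the planning horizon."""
    annuity = sum(1.0 / (1.0 + rate) ** t for t in range(1, years + 1))
    return investment + annual_cost * annuity

alternatives = {
    # name: (investment in euros, annual losses + maintenance in euros/year)
    "LVAC network":   (120_000, 9_000),
    "MVAC network":   (150_000, 6_500),
    "LVDC microgrid": (135_000, 5_000),
}

rate, years = 0.05, 40   # assumed interest rate and planning horizon

for name, (capex, opex) in alternatives.items():
    lcc = present_value_of_costs(capex, opex, rate, years)
    print(f"{name:15s} life-cycle cost ≈ {lcc:10,.0f} EUR")
```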