774 results for Information privacy Framework
Abstract:
Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2014
Abstract:
The study of knowledge transfer (KT) has been proceeding in parallel but independently in health services and in business, presenting an opportunity for synergy and sharing. This paper uses a survey of 32 empirical KT studies, with their 96 uniquely named determinants of KT success, to identify ten unique determinants of horizontal knowledge transfer success. These determinants, the outcome measure of Knowledge Use, and separate explicit and tacit transfer flows constitute the KT Framework, extending the work of previous KT framework authors. Our Framework was validated through a case study of the transfer of clinical practice guideline knowledge between the cardiac teams of selected Ontario hospitals, using a survey of senders and receivers developed from the KT literature. The study findings were: 8 of 10 determinants were supported by the Successful Transfer Hospitals, and 4 of 10 determinants were found to a higher degree in the Successful than in the non-Successful transfer hospitals. Taken together, the results show substantive support for the KT Framework determinants, indicating aggregate support for 9 of these determinants, but not the 10th, Knowledge Complexity. The transfer of tacit knowledge was found to be related to the transfer of the explicit knowledge and expressed as the transfer or recreation of resource profile and internal process tacit knowledge, where this tacit transfer did not require interactions between Sender and Receiver. This study provides managers with the building blocks to assess and improve the success rates of their knowledge transfers.
Abstract:
Thesis (Ph.D.)--University of Washington, 2015-12
Abstract:
The paper looks into the dynamics of information society policy and its implementation in the Greek context. It argues that information society development is a contested process, influenced by pre-existing state, economy and society relations. Based on this, it looks into the different aspects of the idiosyncratic path which the evolution of the Greek information society has followed, particularly after 2000. Using Bob Jessop's strategic-relational approach (SRA) to the state as an analytical framework and drawing on a number of in-depth interviews with relevant political actors, it provides insights into policy implementation by examining: the public management of information technology projects, how such projects were received in bureaucratic structures and practices, as well as the relationship between the state and the information and communication technology (ICT) sector in public procurement processes. The emphasis is on the period 2000–2008, during which a major operational programme on the information society in Greece was put into effect. The paper also touches upon the post-2008 experience, suggesting that information society developments might include dynamics operating independently and even in contradiction to the state agenda.
Abstract:
Thesis to obtain the Master of Science Degree in Computer Science and Engineering
Abstract:
The recent technological advancements and market trends are causing an interesting phenomenon towards the convergence of High-Performance Computing (HPC) and Embedded Computing (EC) domains. On one side, new kinds of HPC applications are being required by markets needing huge amounts of information to be processed within a bounded amount of time. On the other side, EC systems are increasingly concerned with providing higher performance in real-time, challenging the performance capabilities of current architectures. The advent of next-generation many-core embedded platforms has the chance of intercepting this converging need for predictable high-performance, allowing HPC and EC applications to be executed on efficient and powerful heterogeneous architectures integrating general-purpose processors with many-core computing fabrics. To this end, it is of paramount importance to develop new techniques for exploiting the massively parallel computation capabilities of such platforms in a predictable way. P-SOCRATES will tackle this important challenge by merging leading research groups from the HPC and EC communities. The time-criticality and parallelisation challenges common to both areas will be addressed by proposing an integrated framework for executing workload-intensive applications with real-time requirements on top of next-generation commercial-off-the-shelf (COTS) platforms based on many-core accelerated architectures. The project will investigate new HPC techniques that fulfil real-time requirements. The main sources of indeterminism will be identified, proposing efficient mapping and scheduling algorithms, along with the associated timing and schedulability analysis, to guarantee the real-time and performance requirements of the applications.
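The abstract mentions timing and schedulability analysis for real-time workloads without giving a concrete technique. As a hedged illustration of the kind of test involved (an assumption for exposition, not P-SOCRATES's actual analysis), the classic Liu and Layland utilization bound gives a sufficient schedulability condition for rate-monotonic scheduling of periodic tasks on a single core:

```python
def rm_utilization_bound_test(tasks):
    """Sufficient (not necessary) schedulability test for
    rate-monotonic scheduling on a single core.

    tasks: list of (wcet, period) pairs, where wcet is the
    worst-case execution time of the task.
    Returns True if the task set is guaranteed schedulable."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1 / n) - 1)  # Liu & Layland bound: n(2^(1/n) - 1)
    return utilization <= bound

# Three periodic tasks: (worst-case execution time, period)
tasks = [(1, 4), (1, 5), (2, 10)]
print(rm_utilization_bound_test(tasks))  # utilization 0.65 <= bound ~0.78
```

A many-core analysis such as the one the project targets would additionally have to model contention on shared resources, which is exactly the "source of indeterminism" the abstract refers to.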
Abstract:
Dissertation presented to obtain the degree of Doctor in Electrical and Computer Engineering, specialization on Collaborative Enterprise Networks
Abstract:
This study aims to analyze and compare micro-firms’ organizational culture related to organizational performance. A case study methodology was used based on four firms, competitors among themselves in the Information Technology business, focusing on the years between 2008-2013. Findings pointed out many similarities to larger firms, but some specificities of micro-firms were found and propositions were defined: clan culture predominance is related to best performing micro-firms; the configuration of several culture types seemed to be the most suitable for obtaining good organizational results, provided that they do not focus only on hierarchy and market types of culture; the market culture predominance perception by employees is associated with low job satisfaction; and, after a certain time in business, micro-firms, as do larger companies, seek to standardize and control processes. Recognizing that organizational culture is considered important to firms’ results, this study sheds some light on that important factor for micro-firms.
Abstract:
Dissertation presented as a partial requirement for the degree of Master in Statistics and Information Management
Abstract:
Information systems are widespread and used by anyone with computing devices, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory, but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. As such, we also implemented a prototype typechecker that can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
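The core idea, a security label that depends on a runtime value stored in another field, can be sketched dynamically. The thesis enforces this statically with a typechecker; the snippet below is only a runtime analogue with hypothetical names, not the thesis's language or API:

```python
# Dynamic sketch of value-dependent security labels: the
# confidentiality compartment of a record's 'content' field is
# determined by the runtime value of its 'owner' field.

def label_of(record):
    # Each user's data lives in its own security compartment,
    # e.g. ("user", "alice"); the label depends on a runtime value.
    return ("user", record["owner"])

def read_content(record, reader):
    # Flow check: a reader may only observe data whose label matches
    # the reader's own compartment; otherwise the flow is illegal.
    if label_of(record) != ("user", reader):
        raise PermissionError("illegal information flow")
    return record["content"]

msg = {"owner": "alice", "content": "alice's secret"}
print(read_content(msg, "alice"))   # allowed: labels match
# read_content(msg, "bob")          # would raise PermissionError
```

In the dependent information flow type discipline described above, violations like the commented-out call are rejected at compile time rather than caught at runtime.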
Abstract:
This paper presents a theoretical model to analyze the privacy issues around location-based mobile business models. We report the results of an exploratory field experiment in Switzerland that assessed the factors driving user payoff in mobile business. We found that (1) the personal data disclosed has a negative effect on user payoff; (2) the amount of personalization available has a direct and positive effect, as well as a moderating effect on user payoff; (3) the amount of control over the user's personal data has a direct and positive effect, as well as a moderating effect on user payoff. The results suggest that privacy protection could be the main value proposition in the B2C mobile market. From our theoretical model we derive a set of guidelines to design a privacy-friendly business model pattern for third-party services. We discuss four examples to show that the mobile platform can play a key role in the implementation of these new business models.
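In such models a "moderating effect" is typically captured as an interaction term in the payoff equation. The coefficients below are hypothetical, chosen only to mirror the reported signs (disclosure hurts payoff; personalization and control help both directly and by softening the disclosure penalty); they are not estimates from the paper:

```python
# Linear payoff model with interaction (moderation) terms.
# D = data disclosed, P = personalization, C = control over data.
# All coefficients are illustrative assumptions, not paper estimates.

def user_payoff(D, P, C):
    b0, bD, bP, bC = 1.0, -0.8, 0.5, 0.4   # direct effects
    bPD, bCD = 0.3, 0.25                    # moderating (interaction) effects
    return b0 + bD * D + bP * P + bC * C + bPD * P * D + bCD * C * D

# With the same disclosure, more personalization and control
# soften the disclosure penalty and raise payoff:
low = user_payoff(D=1.0, P=0.0, C=0.0)
high = user_payoff(D=1.0, P=1.0, C=1.0)
print(low < high)  # True
```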
Abstract:
The aim of this thesis is to examine the harmonization and implementation of a global company's accounting information systems, using a project of the UPM-Kymmene group as an example. The thesis applies a hermeneutic research approach. Theoretically, the topic is examined from the perspective of globalization and the requirements placed on accounting information systems, as well as the variables to be considered at the different stages of the system change process. The benefits that a unified accounting information system brings to a global company are evident. Before developing a unified project, it is essential to study the preconditions for the project's success and to plan its different phases carefully. The thesis also finds that a global company must take into account different corporate cultures and the differing working practices of its business units. It is concluded that the commitment of both corporate management and the business units to the project, together with shared goals, is essential to the project's success.
Abstract:
Data management consists of collecting, storing, and processing data into a format that provides value-adding information for decision-making. Advances in data management have enabled the design of increasingly effective database management systems to support business needs; consequently, not only systems designed specifically for reporting but also operational systems now support reporting and data analysis. The research method in the theory part is qualitative research, and the research type in the empirical part is a case study. The objective of this paper is to examine database management system requirements from the perspectives of reporting management and data management. In the theory part these requirements are identified and the appropriateness of the relational data model is evaluated. In addition, key performance indicators applied to the operational monitoring of production are studied. The study reveals that appropriate operational key performance indicators for production take into account time, quality, flexibility, and cost aspects; manufacturing efficiency in particular has been highlighted. In this paper, reporting management is defined as the continuous monitoring of given performance measures. According to the literature review, a data management tool should cover performance, usability, reliability, scalability, and data privacy aspects in order to fulfil reporting management's demands. Based on these requirements, a framework is created for the system development phase and is used in the empirical part of the thesis, where such a system is designed and built for reporting management purposes for a company operating in the manufacturing industry. Relational data modelling and database architectures are utilized when the system is built on a relational database platform.
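A standard production KPI that combines the time, quality, and efficiency aspects mentioned above is Overall Equipment Effectiveness (OEE), the product of availability, performance, and quality. The sketch below uses the conventional OEE definition; the thesis's actual indicators may differ:

```python
# Overall Equipment Effectiveness (OEE): a common manufacturing KPI.
# OEE = availability * performance * quality, each a ratio in [0, 1].

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time              # uptime share
    performance = (ideal_cycle_time * total_count) / run_time  # speed vs ideal
    quality = good_count / total_count                  # good-part yield
    return availability * performance * quality

# 8 h shift (480 min), 420 min actually running, 0.5 min ideal cycle
# time, 700 parts produced, 665 of them good:
print(round(oee(480, 420, 0.5, 700, 665), 3))  # 0.693
```

An indicator like this is a natural candidate for the continuous monitoring of performance measures that the paper defines as reporting management.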
Abstract:
The model studies information sharing and the stability of cooperation in cost-reducing Research Joint Ventures (RJVs). In a four-stage game-theoretic framework, firms decide on participation in a RJV, information sharing, R&D expenditures, and output. An important feature of the model is that voluntary information sharing between cooperating firms increases information leakage from the RJV to outsiders. It is found that it is the spillover from the RJV to outsiders which determines the decision of insiders whether to share information, while it is the spillover affecting all firms which determines the level of information sharing within the RJV. RJVs representing a larger portion of firms in the industry are more likely to share information. It is also found that when sharing information is costless, firms never choose intermediate levels of information sharing: they share all the information or none at all. The size of the RJV is found to depend on three effects: a coordination effect, an information sharing effect, and a competition effect. Depending on the relative magnitudes of these effects, the size of the RJV may increase or decrease with spillovers. The effect of information sharing on the profitability of firms as well as on welfare is studied.
Abstract:
In the last decade, the potential macroeconomic effects of intermittent large adjustments in microeconomic decision variables such as prices, investment, consumption of durables or employment – a behavior which may be justified by the presence of kinked adjustment costs – have been studied in models where economic agents continuously observe the optimal level of their decision variable. In this paper, we develop a simple model which introduces infrequent information into a kinked adjustment cost model by assuming that agents do not observe continuously the frictionless optimal level of the control variable. Periodic releases of macroeconomic statistics or dividend announcements are examples of such infrequent information arrivals. We first solve for the optimal individual decision rule, which is found to be both state and time dependent. We then develop an aggregation framework to study the macroeconomic implications of such optimal individual decision rules. Our model has the distinct characteristic that a vast number of agents tend to act together, and more so when uncertainty is large. The average effect of an aggregate shock is inversely related to its size and to aggregate uncertainty. We show that these results differ substantially from the ones obtained with full information adjustment cost models.