90 results for information gap
Abstract:
25th International Cryogenic Engineering Conference and the International Cryogenic Materials Conference in 2014, ICEC 25–ICMC 2014
Abstract:
This work project focuses on developing new approaches to enhance Portuguese exports towards a defined German industry sector within the information technology and electronics fields. First and foremost, information was collected and a set of interviews with experts and top managers was conducted in order to gauge the demand of the German market while identifying compatible Portuguese supply capabilities. Among the main findings, Industry 4.0 presents itself as a valuable opportunity in the German market for Portuguese medium-sized companies with embedded systems expertise serving machinery and equipment companies. To achieve the purpose of the work project, an embedded systems platform targeting machinery and equipment companies was proposed, and several recommendations on how to implement it were developed. An alternative approach for this platform within the German market was also considered, namely the eHealth sector, with the purpose of enhancing current healthcare service provision.
Abstract:
Increasing disparity between executive compensation and that of the average worker (the pay gap) has generated a fierce debate about its causes and effects. This paper studies the determinants and performance effects of the pay gap through the prism of Tournament Incentives and Equity Fairness Theory. Results show that the size of the pay gap is driven primarily by the size of the firm and the standards of its industry, and also by the unionization rate and by whether the Chairman is also the CEO. The paper concludes by showing that the pay gap has a positive effect on firm performance in the United States. Keywords:
Abstract:
According to a recent Eurobarometer survey (2014), 68% of Europeans tend not to trust national governments. As the increasing alienation of citizens from politics endangers democracy and welfare, governments, practitioners and researchers look for innovative means to engage citizens in policy matters. One of the measures intended to overcome the so-called democratic deficit is the promotion of civic participation. Digital media proliferation offers a set of novel characteristics related to interactivity, ubiquitous connectivity, social networking and inclusiveness that enable new forms of society-wide collaboration with a potential impact on leveraging participative democracy. Following this trend, e-Participation is an emerging research area that consists of the use of Information and Communication Technologies to mediate and transform the relations among citizens and governments towards increasing citizens’ participation in public decision-making. However, despite the widespread efforts to implement e-Participation through research programs, new technologies and projects, exhaustive studies of the achieved outcomes reveal that it has not yet been successfully incorporated into institutional politics. Given the problems underlying e-Participation implementation, the present research suggested that, rather than project-oriented efforts, the cornerstone for successfully implementing e-Participation in public institutions as a sustainable added-value activity is systematic organisational planning, embodying the principles of open governance and open engagement. It further suggested that Business Process Management (BPM), as a management discipline, can act as a catalyst to enable the desired transformations towards value creation throughout the policy-making cycle, including political, organisational and, ultimately, citizen value.
Following these findings, the primary objective of this research was to provide an instrumental model to foster e-Participation sustainability across Government and Public Administration towards a participatory, inclusive, collaborative and deliberative democracy. The developed artefact, consisting of an e-Participation Organisational Semantic Model (ePOSM) underpinned by a BPM-steered approach, introduces this vision. This approach to e-Participation was modelled through a semi-formal lightweight ontology stack structured in four sub-ontologies, namely e-Participation Strategy, Organisational Units, Functions and Roles. The ePOSM facilitates e-Participation sustainability by: (1) promoting a common and cross-functional understanding of the concepts underlying e-Participation implementation, and of their articulation, that bridges the gap between technical and non-technical users; (2) providing an organisational model which allows a centralised and consistent roll-out of strategy-driven e-Participation initiatives, supported by operational units dedicated to the execution of transformation projects and participatory processes; (3) providing a standardised organisational structure, goals, functions and roles related to e-Participation processes that enhances process-level interoperability among government agencies; (4) providing a representation usable in software development for the automation of business processes, which allows advanced querying with a reasoner or inference engine to retrieve concrete and specific information about the e-Participation processes in place. An evaluation of the achieved outcomes, as well as a comparative analysis with existing models, suggested that this innovative approach, tackling the organisational planning dimension, can constitute a stepping stone to harness e-Participation value.
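The kind of query described in point (4) can be illustrated with a minimal sketch. The triple store and all names below (process and unit identifiers, predicates) are invented for illustration only; the actual artefact is a semi-formal ontology stack queried through a reasoner, not this toy pattern matcher.

```python
# Hypothetical sketch: an organisational model represented as
# subject-predicate-object triples, with a tiny pattern-match query.
# All identifiers below are invented, not taken from the ePOSM itself.
triples = {
    ("ConsultationProcess", "partOf", "eParticipationStrategy"),
    ("ConsultationProcess", "executedBy", "ParticipationUnit"),
    ("ParticipationUnit", "hasRole", "ProcessModerator"),
    ("PetitionProcess", "executedBy", "ParticipationUnit"),
}

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return sorted(t for t in triples
                  if (subject is None or t[0] == subject)
                  and (predicate is None or t[1] == predicate)
                  and (obj is None or t[2] == obj))

# Which participatory processes does the ParticipationUnit execute?
print(query(predicate="executedBy", obj="ParticipationUnit"))
```

A real reasoner would additionally infer triples not explicitly asserted (e.g. via transitive properties); the sketch only retrieves stated facts.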
Abstract:
Driven by the growth of the Internet and the Semantic Web, together with improvements in communication speed and the rapid growth of storage capacity, the volume of data and information rises considerably every day. Because of this, in the last few years there has been a growing interest in structures for formal representation with suitable characteristics, such as the possibility to organize data and information, as well as the reuse of their contents aimed at the generation of new knowledge. Controlled vocabularies, and specifically ontologies, stand out as one such representation structure with high potential. They not only allow data representation, but also the reuse of such data for knowledge extraction, coupled with its subsequent storage through relatively simple formalisms. However, to ensure that the knowledge in an ontology is always up to date, ontologies need maintenance. Ontology Learning is the area which studies the details of updating and maintaining ontologies. It is worth noting that the relevant literature already presents first results on automatic maintenance of ontologies, but still at a very early stage. Human-based processes are still the current way to update and maintain an ontology, which makes this a cumbersome task. The generation of new knowledge aimed at ontology growth can be based on Data Mining techniques, an area that studies techniques for data processing, pattern discovery and knowledge extraction in IT systems. This work proposes a novel semi-automatic method for knowledge extraction from unstructured data sources, using Data Mining techniques, namely pattern discovery, focused on improving the precision of the concepts and semantic relations present in an ontology. In order to verify the applicability of the proposed method, a proof of concept was developed and its results are presented, as applied in the building and construction sector.
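As an illustration of how pattern discovery over unstructured text can surface candidate knowledge for an ontology, the sketch below applies a classic lexico-syntactic ("Hearst") pattern. It is a simplified stand-in for the method proposed in the work, and the sample sentence is invented.

```python
import re

# A classic Hearst pattern, "X such as Y1, Y2 and Y3", yields candidate
# (hypernym, hyponym) pairs that could enrich an ontology's concepts and
# relations. Illustrative simplification, not the thesis's actual method.
PATTERN = re.compile(r"(\w+(?: \w+)?) such as ([^.]*\w)")

def extract_candidates(text):
    """Return (hypernym, hyponym) candidate pairs discovered in text."""
    pairs = []
    for hypernym, item_list in PATTERN.findall(text):
        for hyponym in re.split(r",? and |, ", item_list):
            pairs.append((hypernym.lower(), hyponym.strip().lower()))
    return pairs

sample = ("A building project uses construction materials such as "
          "concrete, steel and timber.")
print(extract_candidates(sample))
# [('construction materials', 'concrete'), ('construction materials', 'steel'),
#  ('construction materials', 'timber')]
```

In a semi-automatic pipeline such candidate pairs would be scored and presented to a human curator before entering the ontology, which is what keeps the method "semi"-automatic.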
Abstract:
In the early nineties, Mark Weiser wrote a series of seminal papers that introduced the concept of Ubiquitous Computing. According to Weiser, computers require too much attention from the user, drawing their focus from the tasks at hand. Instead of being the centre of attention, computers should be so natural that they would vanish into the human environment, becoming not only truly pervasive but also effectively invisible and unobtrusive to the user. This calls not only for smaller, cheaper and lower-power computers, but also for equally convenient display solutions that can be harmoniously integrated into our surroundings. With the advent of Printed Electronics, new ways to link the physical and the digital worlds became available. By combining common printing techniques such as inkjet printing with electro-optical functional inks, it is becoming possible not only to mass-produce extremely thin, flexible and cost-effective electronic circuits but also to introduce electronic functionalities into products where they were previously unavailable. Indeed, Printed Electronics is enabling the creation of novel sensing and display elements for interactive devices, free of form-factor constraints. At the same time, the rise in the availability and affordability of digital fabrication technologies, namely 3D printers, to the average consumer is fostering a new industrial (digital) revolution and the democratisation of innovation. Nowadays, end-users are already able to custom-design and manufacture their own physical products on demand, according to their own needs. In the future, they will be able to fabricate interactive digital devices with user-specific form and functionality from the comfort of their homes.
This thesis explores how task-specific, low-computation, interactive devices capable of presenting dynamic visual information can be created using Printed Electronics technologies, following an approach based on the ideals behind Personal Fabrication. Focus is given to the use of printed electrochromic displays as a medium for delivering dynamic digital information. According to the architecture of the displays, several approaches are highlighted and categorised. Furthermore, a pictorial computation model based on extended cellular automata principles is used to programme dynamic simulation models into matrix-based electrochromic displays. Envisaged applications include the modelling of physical, chemical, biological, and environmental phenomena.
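A flavour of driving a matrix display with a cellular automaton can be given by a minimal sketch. The rule below is Conway's Game of Life, a conventional two-state rule standing in for the thesis's extended pictorial model, and the 5x5 "display" is hypothetical.

```python
# Minimal sketch: a matrix electrochromic display driven by a cellular
# automaton. Each pixel/cell holds 0 (bleached) or 1 (coloured); one
# synchronous step of a simple outer-totalistic rule (Conway's Life,
# standing in for the thesis's extended model) computes the next frame.
def step(grid):
    """One synchronous update of the display frame (toroidal wrap)."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count the 8 live neighbours with wrap-around edges.
            n = sum(grid[(r + dr) % rows][(c + dc) % cols]
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0))
            nxt[r][c] = 1 if n == 3 or (grid[r][c] and n == 2) else 0
    return nxt

# A "blinker" oscillator on a 5x5 pixel matrix: a vertical bar of three
# coloured pixels becomes horizontal after one step.
frame = [[0] * 5 for _ in range(5)]
frame[1][2] = frame[2][2] = frame[3][2] = 1
print(step(frame)[2])  # [0, 1, 1, 1, 0]
```

On an actual electrochromic matrix, each updated cell value would be written out as a drive voltage to the corresponding pixel electrode rather than printed to a console.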
Abstract:
Equity research report
Abstract:
The flow of new information is what produces price changes, so understanding whether the market is unbalanced is fundamental to knowing how much inventory market makers should keep during an important economic release. After identifying which economic indicators impact the S&P and 10-year Treasuries, the Volume-Synchronized Probability of Information-Based Trading (VPIN) is used as a predictability measure. The results point to some power of the VPIN metric to predict economic surprises, mainly when it is calculated using the S&P; this finding appears to be supported by the analysis of depth imbalance before economic releases. Inferior results were achieved when using Treasuries. The final aim of this study is to bridge the gap between microstructural changes and macroeconomic events.
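For reference, VPIN groups trades into equal-volume buckets and averages the absolute buy/sell volume imbalance per bucket, normalised by the bucket size. A minimal sketch, assuming the buy and sell volumes per bucket have already been classified (the figures below are invented):

```python
# Minimal sketch of the VPIN metric: average absolute order-flow
# imbalance per equal-volume bucket, normalised by bucket size V.
# Assumes trades have already been classified as buys or sells.
def vpin(buy_volumes, sell_volumes):
    """buy_volumes[i] + sell_volumes[i] must equal the bucket size V."""
    n = len(buy_volumes)
    V = buy_volumes[0] + sell_volumes[0]   # bucket size (constant)
    imbalance = sum(abs(b - s) for b, s in zip(buy_volumes, sell_volumes))
    return imbalance / (n * V)

# Three buckets of V = 100: balanced flow contributes 0, one-sided flow
# pushes VPIN towards 1.
print(vpin([50, 70, 90], [50, 30, 10]))  # (0 + 40 + 80) / 300 = 0.4
```

In practice the buy/sell split is itself estimated (e.g. by bulk volume classification) rather than observed, and VPIN is tracked on a rolling window of the most recent buckets.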
Abstract:
This work presents research conducted to understand the role of indicators in decisions of technology innovation. A gap was detected in the innovation and technology assessment literature regarding the use and influence of indicators in this type of decision. It was important to address this gap because indicators are frequent elements of innovation and technology assessment studies. The research was designed to determine the extent of the use and influence of indicators in decisions of technology innovation, to characterize the role of indicators in these decisions, and to understand how indicators are used in these decisions. The latter involved the test of four possible explanatory factors: the type and phase of decision, and the context and process of construction of evidence. Furthermore, it focused on three Portuguese innovation groups: public researchers, business R&D&I leaders and policymakers. The research used a combination of methods to collect quantitative and qualitative information, such as surveys, case studies and social network analysis. This research concluded that the use of indicators is different from their influence in decisions of technology innovation. In fact, there is a high use of indicators in these decisions, but their influence is lower and differs across innovation groups. This suggests that political-behavioural methods are also involved in the decisions to different degrees. The main social influences on the decisions came mostly from hierarchies, knowledge-based contacts and users. Furthermore, the research established that indicators played mostly symbolic roles in decisions of policymakers and business R&D&I leaders, although their role with researchers was more differentiated.
Indicators were also described as helpful instruments to conduct a reasonable interpretation of data and to balance options in innovation and technology assessment studies, in particular when contextualised, described in detail and accompanied by discussion of the options made. Results suggest that there are four main explanatory factors for the role of indicators in these decisions. First, the type of decision appears to be a factor to consider when explaining the role of indicators. In fact, each type of decision influenced the way indicators are used differently, and each type of decision used different types of indicators. Results for policy-making were particularly different from decisions of acquisition and development of products/technology. Second, the phase of the decision can help to understand the role indicators play in these decisions. Results distinguished between two phases detected in all decisions – before and after the decision – as well as two other phases that can complement the decision process and in which indicators can be involved. Third, the context of decision is an important factor to consider when explaining the way indicators are taken into consideration in policy decisions. In fact, the role of indicators can be influenced by the particular context of the decision maker, in which all types of evidence can be selected or downplayed. More importantly, the use of persuasive analytical evidence appears to be related to the degree of dispute in the policy context. Fourth and last, the process of construction of evidence is a factor to consider when explaining the way indicators are involved in these decisions. In fact, indicators and other evidence were brought into the decision processes according to their availability and capacity to support the different arguments and interests of the actors and stakeholders.
In one case, an indicator lost much of its persuasive strength with the controversies that it went through during the decision process. Therefore, it can be argued that the use of indicators is high but not very influential; their role is mostly symbolic in policy and business decisions, but varies among researchers. The role of indicators in these decisions depends on the type and phase of the decision and on the context and process of construction of evidence. The latter two are related to the particular context of each decision maker and to the existence of elements of dispute and controversy that influence the way indicators are introduced into the decision-making process.
Abstract:
This thesis examines the effects of macroeconomic factors on inflation level and volatility in the Euro Area to improve the accuracy of inflation forecasts with econometric modelling. Inflation aggregates for the EU as well as inflation levels of selected countries are analysed, and the differences between these inflation estimates and forecasts are documented. The research proposes alternative models depending on the focus and the scope of inflation forecasts. I find that models with a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) in mean process have better explanatory power for inflation variance than regular GARCH models. The significant coefficients in individual EU countries differ from those in the aggregate EU-wide forecast of inflation. The presence of more pronounced GARCH components in certain countries with more stressed economies indicates that inflation volatility in these countries is likely to occur as a result of the stressed economy. In addition, other economies in the Euro Area are found to exhibit a relatively stable variance of inflation over time. Therefore, when analysing EU inflation, one has to take into consideration the large differences at the country level and focus on countries one by one.
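The GARCH-in-mean idea, where the conditional variance h_t enters the mean equation of inflation, can be sketched with a small simulation. The parameter values below are illustrative only, not estimates from the thesis.

```python
import random

# Minimal sketch of a GARCH(1,1)-in-mean process (illustrative
# parameters, not estimates from the thesis):
#   h_t = omega + alpha * e_{t-1}^2 + beta * h_{t-1}
#   r_t = mu + lam * h_t + e_t,   e_t = sqrt(h_t) * z_t,  z_t ~ N(0, 1)
# The in-mean term lam * h_t lets volatility itself move the inflation
# level, which plain GARCH models do not capture.
def simulate_garch_m(T, mu=0.1, lam=0.5, omega=0.05, alpha=0.1,
                     beta=0.85, seed=42):
    rng = random.Random(seed)
    h = omega / (1 - alpha - beta)   # start at the unconditional variance
    e = 0.0
    rs, hs = [], []
    for _ in range(T):
        h = omega + alpha * e * e + beta * h
        e = (h ** 0.5) * rng.gauss(0, 1)
        rs.append(mu + lam * h + e)
        hs.append(h)
    return rs, hs

rs, hs = simulate_garch_m(500)
print(len(rs), min(hs) > 0)  # 500 True: the variance stays positive
```

The persistence alpha + beta = 0.95 chosen here mimics the highly persistent volatility typical of stressed economies; setting lam = 0 recovers an ordinary GARCH(1,1) process.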
Abstract:
Information systems are widespread and used by anyone with computing devices, as well as by corporations and governments. It is often the case that security leaks are introduced during the development of an application. The reasons for these security bugs are multiple, but among them one can easily identify that it is very hard to define and enforce relevant security policies in modern software. This is because modern applications often rely on container sharing and multi-tenancy where, for instance, data can be stored in the same physical space but is logically mapped into different security compartments or data structures. In turn, these security compartments, into which data is classified by security policies, can also be dynamic and depend on runtime data. In this thesis we introduce and develop the novel notion of dependent information flow types, and focus on the problem of ensuring data confidentiality in data-centric software. Dependent information flow types fit within the standard framework of dependent type theory, but, unlike usual dependent types, crucially allow the security level of a type, rather than just the structural data type itself, to depend on runtime values. Our dependent function and dependent sum information flow types provide a direct, natural and elegant way to express and enforce fine-grained security policies on programs, namely programs that manipulate structured data types in which the security level of a structure field may depend on values dynamically stored in other fields. The main contribution of this work is an efficient analysis that allows programmers to verify, during the development phase, whether programs have information leaks, that is, whether programs protect the confidentiality of the information they manipulate. As such, we also implemented a prototype typechecker that can be found at http://ctp.di.fct.unl.pt/DIFTprototype/.
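The idea of a security level depending on a runtime value can be conveyed with a small dynamic analogue. The thesis enforces such policies statically, at typechecking time, through dependent information flow types; the sketch below (with invented record and clearance names) merely checks the same kind of policy at runtime.

```python
# Hypothetical runtime illustration of the idea behind dependent
# information flow types: the security level guarding the `content`
# field depends on the runtime value of the `level` field of the same
# record. (The thesis rejects leaky programs statically; this sketch
# only mimics the policy with a dynamic check.)
CLEARANCE_ORDER = {"public": 0, "internal": 1, "secret": 2}

def read_content(record, reader_clearance):
    """Allow the read only if the reader's clearance dominates the
    level stored -- at runtime -- in the record itself."""
    required = CLEARANCE_ORDER[record["level"]]
    if CLEARANCE_ORDER[reader_clearance] < required:
        raise PermissionError("information-flow violation")
    return record["content"]

memo = {"level": "secret", "content": "quarterly figures"}
print(read_content(memo, "secret"))  # quarterly figures
```

With a static analysis such as the one in the thesis, the failing read would be flagged during development rather than raising an error in production.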
Abstract:
This study is specifically concerned with the effect of Enterprise Resource Planning (ERP) on Business Process Redesign (BPR). The researcher's experience and the investigation of previous research imply that BPR and ERP are deeply related to each other, and that a study to explore this relation further is necessary. In order to elaborate the hypothesis, a case study is investigated, in particular the Turkish electricity distribution market and its privatization phase. Eight companies that have taken part in the privatization process and executed BPR serve as cases in this study. During the research, the cases are evaluated through critical success factors of both BPR and ERP. It was seen that combining ERP solution features with business processes led the companies to be successful in ERP and BPR implementation. When the companies' success and efficiency were compared before and after the ERP implementation, a considerable change was observed in organizational structure. It was also found that team composition is important in the success of ERP projects. Additionally, when the ERP plays a driver or enabler role, the companies can be considered successful; on the contrary, when the ERP has a neutral role towards business processes, the project fails. In conclusion, it can be said that the companies which have implemented ERP successfully have accomplished the goals of the BPR.
Abstract:
Urban mobility is one of the main challenges facing urban areas due to the growing population and to traffic congestion, resulting in environmental pressures. The pathway to sustainable urban mobility involves the strengthening of intermodal mobility. The integrated use of different transport modes is becoming more and more important, and intermodality has been mentioned as a way for public transport to compete with private cars. The aim of the current dissertation is to define a set of strategies to improve urban mobility in Lisbon and, by consequence, reduce the environmental impacts of transport. To that end, several intermodal practices across Europe were analysed, and the transport systems of Brussels and Lisbon were studied and compared, with special attention given to intermodal systems. In the case study, data were gathered from both cities in the field, by using and observing the different transport modes, and two surveys of the cities' users were carried out. As the study concludes, Brussels and Lisbon present significant differences. In Brussels the measures to promote intermodality are evident, while in Lisbon a lot still needs to be done. The study also made clear the need for improvements in Lisbon's public transport towards a more intermodal passenger transport system, through the integration of different transport modes and better information and ticketing systems. Some of the points requiring development are: interchanges' waiting areas; integration of the bicycle in public transport; information about connections with other transport modes; and real-time information to passengers pre-trip and on-trip, especially on buses and trams. After the identification of the best practices in Brussels and the weaknesses in Lisbon, the possibility of applying some of Brussels' practices to Lisbon was evaluated.
Brussels proved to be a good example of intermodality, and for that reason some of the recommendations to improve intermodal mobility in Lisbon can follow the practices in place in Brussels.