981 results for Eclipse modeling framework (EMF)
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many integrated cores of low computational power to perform parallel computations. The main novelty of the MIC architecture, relative to GPUs, is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a smaller learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition and inter-execution flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for the programming of the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used not only to compare its performance to that of the existing framework when executing on the co-processor, but also to assess the performance of the Xeon Phi versus a multi-GPU environment.
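To make the skeleton idea concrete, below is a minimal sketch of a "map" skeleton, which hides parallel decomposition, worker management and result gathering behind a single call. This is purely illustrative of the concept; it does not reflect Marrow's actual C++/OpenCL API, and the function names are invented:

```python
# Illustrative "map" algorithmic skeleton (not Marrow's API): the caller
# supplies only the per-element kernel; decomposition and resource
# management are handled by the skeleton.
from multiprocessing import Pool

def map_skeleton(kernel, data, workers=4):
    """Apply `kernel` to every element of `data` in parallel."""
    with Pool(processes=workers) as pool:
        return pool.map(kernel, data)

def saxpy_element(args):
    # Per-element Saxpy kernel, y = a*x + y (one of the benchmarks cited above).
    a, x, y = args
    return a * x + y

if __name__ == "__main__":
    a = 2.0
    xs = [float(i) for i in range(8)]
    ys = [1.0] * 8
    print(map_skeleton(saxpy_element, [(a, x, y) for x, y in zip(xs, ys)]))
```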
Abstract:
The rapid growth of big cities has been noticeable since the 1950s, when the majority of the world's population turned to living in urban areas rather than villages, seeking better job opportunities, higher-quality services and better lifestyle circumstances. This demographic transition from rural to urban is expected to keep increasing. Governments, especially in less developed countries, are going to face more challenges in different sectors, making it essential to understand the spatial pattern of growth for effective urban planning. The study aimed to detect, analyse and model the urban growth in the Greater Cairo Region (GCR), one of the fastest-growing megacities in the world, using remote sensing data. Knowing the current and estimated urbanization situation in GCR will help decision makers in Egypt adjust their plans and develop new ones. These plans should focus on resource reallocation to overcome the problems arising in the future and to achieve sustainable development of urban areas, especially given the high percentage of illegal settlements that appeared in the last decades. The study focused on a period of 30 years, from 1984 to 2014, and the major transitions to urban were modelled to predict future scenarios in 2025. Three satellite images with different time stamps (1984, 2003 and 2014) were classified using a Support Vector Machines (SVM) classifier, and the land cover changes were then detected by applying a high-level mapping technique. The results were subsequently analysed to estimate future urban growth up to 2025 more accurately, using the Land Change Modeler (LCM) embedded in the IDRISI software. Moreover, the spatial and temporal urban growth patterns were analysed using statistical metrics computed in the FRAGSTATS software. The study resulted in an overall classification accuracy of 96%, 97.3% and 96.3% for the 1984, 2003 and 2014 maps, respectively. Between 1984 and 2003, 19 179 hectares of vegetation and 21 417 hectares of desert changed to urban, while from 2003 to 2014 the transitions to urban from both land cover classes were found to be 16 486 and 31 045 hectares, respectively. The model results indicated that 14% of the vegetation and 4% of the desert in 2014 will turn into urban by 2025, representing 16 512 and 24 687 hectares, respectively.
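As a sketch of the per-pixel classification step, the fragment below trains an SVM on toy spectral features; the band values, labels and hyperparameters are synthetic stand-ins, not the satellite data or training samples used in the study:

```python
# Toy per-pixel SVM land-cover classification: each pixel is a vector of
# band reflectances labelled urban / vegetation / desert.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((300, 6))             # 300 pixels x 6 spectral bands (synthetic)
y = rng.integers(0, 3, size=300)     # 0=urban, 1=vegetation, 2=desert (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
print("overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```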
Abstract:
Nowadays, the consumption of goods and services on the Internet is increasing steadily. Small and Medium Enterprises (SMEs), mostly from traditional industry sectors, usually do business in weak and fragile market sectors, where customized products and services prevail. To survive and compete in today's markets, they have to readjust their business strategies by creating new manufacturing processes and establishing new business networks through new technological approaches. In order to compete with big enterprises, these partnerships aim at sharing resources, knowledge and strategies to boost the sector's business consolidation through the creation of dynamic manufacturing networks. To meet this demand, the development of a centralized information system is proposed, allowing enterprises to select and create dynamic manufacturing networks capable of monitoring the entire manufacturing process, including the assembly, packaging and distribution phases. Even networking partners from the same area have multiple, heterogeneous representations of the same knowledge, each denoting its own view of the domain. Thus, conceptually, semantically and, consequently, lexically diverse knowledge representations may occur in the network, causing non-transparent sharing of information and interoperability inconsistencies. The creation of a framework, supported by a tool, that flexibly enables the identification, classification and resolution of such semantic heterogeneities is therefore required. This tool will support the network in establishing semantic mappings, facilitating the integration of the various enterprises' information systems.
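A minimal sketch of the kind of semantic-mapping support described above, assuming a simple lookup structure that records, for each partner term, the shared network term and the kind of heterogeneity involved; both the structure and the example terms are hypothetical, not the framework's actual design:

```python
# Hypothetical semantic-mapping table between a partner's vocabulary and
# the network's shared vocabulary.
class SemanticMap:
    def __init__(self):
        self.mappings = {}  # source term -> (shared term, heterogeneity kind)

    def add(self, source_term, target_term, kind):
        self.mappings[source_term.lower()] = (target_term, kind)

    def resolve(self, term):
        """Translate a partner's term into the network's shared term."""
        return self.mappings.get(term.lower(), (term, "unmapped"))

m = SemanticMap()
m.add("screw", "fastener", "semantic")   # mapped to a broader shared concept
m.add("colour", "color", "lexical")      # spelling variant
print(m.resolve("Colour"))               # -> ('color', 'lexical')
print(m.resolve("gearbox"))              # -> ('gearbox', 'unmapped')
```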
Abstract:
As the complexity of markets and the dynamicity of systems evolve, the need for interoperable systems capable of strengthening enterprise communication effectiveness increases. This is particularly significant in collaborative enterprise networks, like manufacturing supply chains, where several companies work, communicate and depend on each other in order to achieve a specific goal. Once interoperability is achieved, that is, once all network parties are able to communicate with and understand each other, organisations are able to exchange information within a stable environment that follows agreed rules. However, as markets adapt to new requirements and demands, an evolutionary behaviour is triggered, giving rise to interoperability problems, thus disrupting the sustainability of interoperability and raising the need to develop monitoring activities capable of detecting and preventing unexpected behaviour. This work seeks to contribute to the development of monitoring techniques for interoperable SOA-based enterprise networks. It focuses on the automatic detection of harmonisation-breaking events during real-time communications, and strives to develop and propose a methodological approach to handle these disruptions with minimal or no human intervention, hence providing existing service-based networks with the ability to detect and promptly react to interoperability issues.
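A hedged sketch of the detection idea follows: each exchanged message is checked against the agreed information model, and any deviation is flagged as a harmonisation-breaking event. The schema and message fields are invented for illustration, not taken from the thesis:

```python
# Validate each message against the network's agreed information model and
# report any harmonisation-breaking deviations.
AGREED_SCHEMA = {"order_id": int, "quantity": int, "unit": str}

def detect_harmonisation_break(message: dict) -> list:
    """Return a list of detected interoperability problems (empty if none)."""
    problems = []
    for field, expected_type in AGREED_SCHEMA.items():
        if field not in message:
            problems.append(f"missing field: {field}")
        elif not isinstance(message[field], expected_type):
            problems.append(f"type drift on field: {field}")
    for field in message:
        if field not in AGREED_SCHEMA:
            problems.append(f"unknown field: {field}")  # sign of schema evolution
    return problems

# A partner silently renamed 'quantity' to 'qty': two problems are flagged.
print(detect_harmonisation_break({"order_id": 42, "qty": 3, "unit": "pcs"}))
```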
Abstract:
A potentially renewable and sustainable source of energy is the chemical energy associated with the solvation of salts. Mixing two aqueous streams with different saline concentrations is spontaneous and releases energy. The theoretically obtainable power from salinity gradient energy due to the discharge of the World's rivers into the oceans has been estimated to be within the range of 1.4-2.6 TW. Reverse electrodialysis (RED) is one of the emerging, membrane-based technologies for harvesting salinity gradient energy. A common RED stack is composed of alternately arranged cation- and anion-exchange membranes, stacked between two electrodes. The compartments between the membranes are alternately fed with concentrated (e.g., sea water) and dilute (e.g., river water) saline solutions. Migration of the respective counter-ions through the membranes leads to an ionic current between the electrodes, where an appropriate redox pair converts the chemical salinity gradient energy into electrical energy. Given the importance of new sources of energy for power generation, the present study aims at better understanding and solving the current challenges associated with RED stack design, fluid dynamics, ionic mass transfer and long-term RED stack performance with natural saline solutions as feedwaters. Chronopotentiometry was used to determine the diffusion boundary layer (DBL) thickness from diffusion relaxation data, and the flow entrance effects on mass transfer were found to enable an increase in power generation in RED stacks. Increasing the linear flow velocity also leads to a decrease of the DBL thickness, but at the cost of a higher pressure drop. The pressure drop inside RED stacks was successfully simulated by the developed mathematical model, which includes the contribution of several pressure drops that until now had not been considered. The effect of each pressure drop on the RED stack performance was identified and rationalized, and guidelines for the planning and/or optimization of RED stacks were derived. The design of new profiled membranes, with a chevron corrugation structure, was proposed using computational fluid dynamics (CFD) modeling. The performance of the suggested corrugation geometry was compared with already existing ones, as well as with the use of conductive and non-conductive spacers. According to the estimations, the use of chevron structures yields the highest net power density values, at the best compromise between the mass transfer coefficient and the pressure drop. Finally, long-term experiments with natural waters were performed, during which fouling was experienced. For the first time, 2D fluorescence spectroscopy was used to monitor RED stack performance, with a dedicated focus on following fouling on the ion-exchange membrane surfaces. To extract relevant information from the fluorescence spectra, parallel factor analysis (PARAFAC) was performed. The information obtained was then used to predict net power density, stack electric resistance and pressure drop through multivariate statistical models based on projection to latent structures (PLS) modeling. The use in such models of 2D fluorescence data, containing hidden information about fouling on the membrane surfaces that PARAFAC can extract, considerably improved the models' fit to the experimental data.
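For reference, the driving force and the performance metric discussed above are commonly written as follows (a sketch using standard RED relations, where α is the membrane permselectivity, a_c and a_d the activities of the concentrated and dilute solutions, z the ion valence, A_m the total membrane area, and P_pump the pumping power lost to pressure drop):

```latex
% Open-circuit voltage of one cation/anion membrane pair (Nernst relation):
E_{OCV} = \frac{2\,\alpha R T}{z F}\,\ln\frac{a_c}{a_d}
% Net power density: gross electric power minus pumping losses,
% normalised by membrane area:
P_{net} = \frac{P_{gross} - P_{pump}}{A_m},
\qquad
P_{pump} = \Delta p_c Q_c + \Delta p_d Q_d
```

The second relation makes the trade-off in the abstract explicit: thinner boundary layers raise P_gross, but the higher pressure drop raises P_pump.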
Abstract:
The aim of this work project is to find a model able to accurately forecast the daily Value-at-Risk for the PSI-20 Index, independently of market conditions, in order to expand the empirical literature on the Portuguese stock market. Hence, two subsamples, representing more and less volatile periods, were modeled through unconditional and conditional volatility models (since volatility is what drives returns). All models were evaluated through Kupiec's and Christoffersen's tests, by comparing forecasts with actual results. Using an out-of-sample period of 204 observations, it was found that a GARCH(1,1) is an accurate model for our purposes.
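For context, the GARCH(1,1) recursion and the one-day-ahead VaR it implies take the standard form below, where z_α is the α-quantile of the assumed innovation distribution:

```latex
% Return decomposition and GARCH(1,1) conditional variance:
r_t = \mu + \varepsilon_t, \qquad \varepsilon_t = \sigma_t z_t,
\qquad
\sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2
% One-day-ahead Value-at-Risk implied by the variance forecast:
\mathrm{VaR}_{t+1} = -\left(\mu + \sigma_{t+1}\, z_\alpha\right)
```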
Abstract:
The lives of humans and most living beings depend on sensation and perception for the best assessment of the surrounding world. Sensory organs acquire a variety of stimuli that are interpreted and integrated in our brain, for immediate use or storage in memory for later recall. Among other reasoning aspects, a person has to decide what to do with the available information. Emotions are classifiers of collected information, assigning a personal meaning to objects, events and individuals, and thus form part of our own identity. Emotions play a decisive role in cognitive processes such as reasoning, decision-making and memory, by assigning relevance to collected information. Access to pervasive computing devices, empowered by the ability to sense and perceive the world, provides new forms of acquiring and integrating information. But before data can be assessed for usefulness, systems must capture it and ensure that it is properly managed for diverse possible goals. Portable and wearable devices are now able to gather and store information from the environment and from our body, using cloud-based services and Internet connections. Systems' limitations in handling sensorial data, compared with our own sensorial capabilities, constitute an identified problem. Another problem is the lack of interoperability between humans and devices, as devices do not properly understand humans' emotional states and needs. Addressing these problems is the motivation for the present research work. The mission hereby assumed is to bring sensorial and physiological data into a Framework able to manage the collected data in line with human cognitive functions, supported by a new data model. By learning from selected human functional and behavioural models and by reasoning over the collected data, the Framework aims at evaluating a person's emotional state, empowering human-centric applications, along with the capability of storing episodic information on a person's life, with physiological indicators of emotional states, to be used by new-generation applications.
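Purely as an illustration of what an episodic record in such a data model might hold, the sketch below pairs environmental and physiological readings with an inferred emotional label; all field names are hypothetical and are not the thesis's actual model:

```python
# Hypothetical episodic record: one time-stamped entry combining sensed
# environment data, body data and the emotion label the framework infers.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class EpisodicRecord:
    timestamp: datetime
    sensor_readings: dict              # environment (e.g. light, sound level)
    physiological: dict                # body (e.g. heart rate, skin conductance)
    inferred_emotion: str = "unknown"  # label assigned by the framework
    tags: list = field(default_factory=list)

rec = EpisodicRecord(
    timestamp=datetime.now(),
    sensor_readings={"noise_db": 62.0},
    physiological={"heart_rate_bpm": 88},
    inferred_emotion="stressed",
)
print(rec)
```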
Abstract:
Organizations face serious difficulties in retaining talent. Authors argue that Talent Management (TM) practices create beneficial outcomes for individuals and organizations. However, there is no research on the leaders' role in the functioning of these practices. This study examines how LMX and role modeling influence the impact that TM practices have on employees' trust in their organizations and on retention. The analysis of two questionnaires (Nt1=175; Nt2=107) indicated that TM reduced turnover intentions, via an increase in trust in the organization, only when role modeling was high, not when it was low. Leaders are therefore crucial in the TM context and in sustaining a competitive advantage for organizations.
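A sketch of the moderation test implied by these findings: regress trust on TM practices, role modeling and their interaction, where a significant positive interaction coefficient indicates that role modeling moderates the TM effect. The data below are synthetic, purely to illustrate the approach, not the study's actual analysis:

```python
# Interaction-term regression as a simple moderation test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 175
tm = rng.normal(size=n)          # perceived TM practices
rm = rng.normal(size=n)          # leader role modeling
trust = 0.1 * tm + 0.2 * rm + 0.4 * tm * rm + rng.normal(size=n)

X = sm.add_constant(np.column_stack([tm, rm, tm * rm]))
model = sm.OLS(trust, X).fit()
print(model.params)  # last coefficient: the TM x role-modeling interaction
```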
Abstract:
Existing wireless networks are characterized by a fixed spectrum assignment policy. However, the scarcity of available spectrum and its inefficient usage demand a new communication paradigm that exploits the existing spectrum opportunistically. Future Cognitive Radio (CR) devices should be able to sense unoccupied spectrum and will allow the deployment of real opportunistic networks. Still, traditional Physical (PHY) and Medium Access Control (MAC) protocols are not suitable for this new type of network because they are optimized to operate over fixed assigned frequency bands. Therefore, novel PHY-MAC cross-layer protocols should be developed to cope with the specific features of opportunistic networks. This thesis focuses mainly on the design and evaluation of MAC protocols for Decentralized Cognitive Radio Networks (DCRNs). It starts with a characterization of the spectrum sensing framework based on the Energy-Based Sensing (EBS) technique, considering multiple scenarios. Then, guided by the sensing results obtained with this technique, we present two novel decentralized CR MAC schemes: the first designed to operate in single-channel scenarios and the second to be used in multichannel scenarios. Analytical models for the network goodput, packet service time and individual transmission probability are derived and used to compute the performance of both protocols. Simulation results assess the accuracy of the analytical models as well as the benefits of the proposed CR MAC schemes.
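A minimal sketch of the Energy-Based Sensing decision rule: the CR node averages the received signal energy over a window of samples and declares the channel occupied when the average exceeds a threshold. The signal model and threshold value below are illustrative assumptions:

```python
# Energy detector: compare average received energy against a threshold.
import numpy as np

def channel_busy(samples: np.ndarray, threshold: float) -> bool:
    energy = np.mean(np.abs(samples) ** 2)   # test statistic
    return energy > threshold

rng = np.random.default_rng(0)
noise = rng.normal(0, 1, 1000)               # idle channel: noise only
signal = noise + rng.normal(0, 2, 1000)      # occupied channel: signal + noise
print(channel_busy(noise, threshold=2.0))    # False -> spectrum opportunity
print(channel_busy(signal, threshold=2.0))   # True  -> primary user present
```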
Abstract:
This case study examined the use of the BeGloCal Framework applied to B2C e-commerce at a European fast-moving consumer goods manufacturing firm. It explains how the framework supported the team within the company in identifying the right local market in which to start the project; the company's problem was to find the most appealing area in which to invest resources. Going through all the steps of the framework led the company to London (Kensington and Chelsea). The study shows how managers should act when they have to find a trade-off between standardization and adaptation.
Abstract:
The purpose of this work is to develop a practicable approach for Telecom firms to manage their credit risk exposure to their commercial agents' network. In particular, it approaches the problem of credit concession to clients from a corporate perspective and explores the particular scenario of agents that are part of the corporation's commercial chain and therefore are not end-users. The agents' network that served as a model for the presented study is composed of companies that are, at the same time, both clients and suppliers of the Telecommunication Company. In that sense, the credit exposure analysis must take into consideration all financial flows, both inbound and outbound. The current strain on the Financial Sector in Portugal and other peripheral European economies, combined with the high leverage of most companies, generates an environment prone to credit default risk. Under these circumstances, managing credit risk exposure is increasingly becoming a critical function of every company's Financial Department. The approach designed in the current study combines two traditional risk monitoring tools: credit risk scoring and credit limitation policies. The objective was to design a new credit monitoring framework that is more flexible, uses both external information and internal relationship history to assess risk, and takes into consideration commercial objectives inside the agents' network. Although not explored at length, the blueprint of a Credit Governance model was created for implementing the new credit monitoring framework inside the telecom firm. The Telecom Company that served as a model for the present work decided to implement the new Credit Monitoring framework after it was presented to its Executive Commission.
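A hedged sketch of how the two tools can be combined: a blended risk score scales the credit limit, while exposure is computed net of both inbound and outbound flows, since agents are simultaneously clients and suppliers. All weights and cut-offs below are invented for illustration:

```python
# Combine credit scoring with credit limitation for a client-supplier agent.
def credit_limit(score: float, base_limit: float) -> float:
    """Scale the base credit limit by a risk score in [0, 1]."""
    if score < 0.3:
        return 0.0                   # too risky: no credit concession
    return base_limit * score

def net_exposure(receivables: float, payables: float) -> float:
    # Inbound and outbound financial flows offset each other.
    return receivables - payables

# Blend an external rating with internal relationship history (toy weights).
score = 0.7 * 0.8 + 0.3 * 0.6
print(credit_limit(score, base_limit=50_000.0))
print(net_exposure(receivables=30_000.0, payables=12_000.0))
```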
Abstract:
This project is based on the theme of capacity-building in social organisations to improve their impact readiness, that is, the predictability with which they deliver intended outcomes. All organisations that have a social mission, non-profit or for-profit, are considered to fall within the social sector for the purpose of this work. The thesis will look at (i) what impact readiness is and what the considerations are for building impact readiness in social organisations, (ii) what the international benchmark is in measuring and building impact readiness, (iii) the impact readiness of Portuguese social organisations and the current supply of capacity building for social impact in Portugal, and (iv) recommendations on the design of a framework for capacity building for impact readiness adapted to the Portuguese context. This work is of particular relevance to the Social Investment Laboratory, a sponsor of this project, in its policy work as part of the Portuguese Social Investment Taskforce (the "Taskforce"). This in turn will inform its contribution to the set-up of Portugal Inovação Social, a wholesale catalyst entity for social innovation and social investment in the country, launched in early 2015. Whilst the output of this work will be a set of recommendations for wider application to capacity-building programmes in Portugal, Portugal Inovação Social will also clearly have a role in coordinating the efforts of market players (foundations, corporations, the public sector and social organisations) in implementing these recommendations. In addition, the findings of this report could be relevant to other countries seeking to design capacity-building frameworks for their local markets and to any impact-driven organisations with an interest in enhancing the delivery of impact within their work.
Abstract:
This thesis is a case study on Corporate Governance and Business Ethics, using Portuguese corporate law as the general setting. The study was conducted in Portugal, with illustrations of past cases under the Business Judgment Rule of the State of Delaware, U.S.A., and of current cases under the Portuguese judicial setting, together with a comparative analysis of both. A debate is under way among scholars and executives on best practices within corporate governance and corporate law, associated with recent discoveries of unlawful investments that led to the bankruptcy of leading institutions and an aggravation of the crisis in Portugal. The study aimed at learning the possible reasons and causes for the current situation of the country's corporations, along with attempting to discover the best way to move forward. From the interviews and analysis conducted, this paper concluded that the corporate governance structure and legal frameworks in Portugal were neither the sole influencers behind the actions and decisions of corporate executives, nor the main triggers for the recent corporate mishaps. Rather, it was a combination of different factors that played a significant role, such as cultural and ethical aspects and individual personalities, all of which created gray areas beyond the legal structure, which in turn accelerated and aggravated the corporate governance crisis in the country.
Abstract:
The assessment of existing timber structures is often limited to information obtained from non- or semi-destructive testing, as mechanical testing is in many cases not possible due to its destructive nature. Therefore, the available data provide only an indirect measurement of the reference mechanical properties of timber elements, often obtained through empirically based correlations. Moreover, the data must result from the combination of different tests, so as to provide a reliable source of information for a structural analysis. Even if general guidelines are available for each typology of testing, there is still a need for a global methodology that allows information from different sources to be combined and inferences to be drawn upon it in a decision process. In this scope, the present work presents the implementation of a probability-based framework for the safety assessment of existing timber elements. This methodology combines information gathered at different scales and follows a probabilistic framework allowing for the structural assessment of existing timber elements, with the possibility of inference and updating of their mechanical properties through Bayesian methods. The probabilistic framework is based on four main steps: (i) scale of information; (ii) measurement data; (iii) probability assignment; and (iv) structural analysis. In this work, the proposed methodology is applied to a case study. Data were obtained through a multi-scale experimental campaign carried out on old chestnut timber beams, accounting for correlations of non- and semi-destructive tests with mechanical properties. Finally, different inference scenarios are discussed, aiming at the characterization of the safety level of the elements.
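The Bayesian updating step can be sketched as follows for a timber property θ (e.g., bending strength), where test data D from the non- and semi-destructive campaign update a prior f′(θ) into a posterior f″(θ); in the standard conjugate case of a normal prior and a normal likelihood with known measurement variance, the posterior mean is the precision-weighted combination of prior and measured information:

```latex
% Bayesian updating of a timber property from test data:
f''(\theta \mid D) \propto L(D \mid \theta)\, f'(\theta)
% Normal prior N(\mu', \sigma'^2), n measurements with mean \bar{x}
% and measurement variance \sigma^2:
\mu'' = \frac{\mu'/\sigma'^2 + n\bar{x}/\sigma^2}{1/\sigma'^2 + n/\sigma^2}
```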
Abstract:
This study presents an experimental program to assess the tensile strain distribution along prestressed carbon fiber reinforced polymer (CFRP) reinforcement flexurally applied on the tensile surface of RC beams according to the near-surface mounted (NSM) technique. Moreover, the current study proposes an analytical formulation, within a design framework, for the prediction of the distribution of the CFRP tensile strain and bond shear stress and, additionally, of the prestress transfer length. After demonstrating the good predictive performance of the proposed analytical approach, parametric studies were carried out to analytically evaluate the influence of the main material properties, and of the CFRP and groove cross sections, on the distribution of the CFRP tensile strain and bond shear stress, and on the prestress transfer length. The proposed analytical approach can also predict the evolution of the prestress transfer length during the curing time of the adhesive, by considering the variation of its elasticity modulus during this period.
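As a sketch of the equilibrium relation that typically underlies such formulations (stated here under the usual bond-analysis assumptions, not as the thesis's exact equations): along the transfer length, the gradient of the CFRP tensile strain is balanced by the bond shear stress acting over the bond perimeter,

```latex
% Equilibrium of a CFRP segment of axial stiffness E_f A_f bonded over
% perimeter p_f, with bond shear stress \tau(x):
E_f A_f\, \frac{d\varepsilon_f(x)}{dx} = \tau(x)\, p_f
```

so the prestress transfer length is the distance over which the CFRP strain develops from zero at the free end up to the applied prestrain.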