18 results for Macro-programming


Relevance: 20.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia da Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Work presented within the scope of the Master's programme in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Dissertation to obtain the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Dissertation to obtain the degree of Master in Biomedical Engineering.

Relevance: 20.00%

Abstract:

Dissertation to obtain the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Dissertation to obtain the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

Dissertation to obtain the degree of Master in Informatics Engineering.

Relevance: 20.00%

Abstract:

The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging many integrated, computationally simple cores to perform parallel computations. The main novelty of the MIC architecture, relative to GPUs, is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a smaller learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used both to compare its performance against the existing framework when executing on the co-processor and to assess the performance of the Xeon Phi versus a multi-GPU environment.
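
To make the skeleton idea concrete, the following minimal Python sketch shows a generic map skeleton that hides resource management and parallel decomposition behind a single call; it is an illustration of the concept only, not Marrow's API, and the worker count and the Saxpy-style usage example are assumptions.

    # Minimal sketch of an algorithmic "map" skeleton: the caller supplies only the
    # per-element function and the data; thread-pool creation, work partitioning and
    # result collection are hidden inside the skeleton. Not the Marrow API.
    from concurrent.futures import ThreadPoolExecutor
    from typing import Callable, List, Sequence, TypeVar

    T = TypeVar("T")
    R = TypeVar("R")

    def map_skeleton(fn: Callable[[T], R], data: Sequence[T], workers: int = 4) -> List[R]:
        """Apply fn to every element of data in parallel, hiding the decomposition."""
        with ThreadPoolExecutor(max_workers=workers) as pool:  # resource management
            return list(pool.map(fn, data))                    # decomposition + result gathering

    # Usage: a Saxpy-style computation (y = a*x + y), one of the benchmarks mentioned above.
    a, xs, ys = 2.0, [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]
    print(map_skeleton(lambda p: a * p[0] + p[1], list(zip(xs, ys))))  # [6.0, 9.0, 12.0]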

Relevance: 20.00%

Abstract:

The objective of this work project is to analyse and discuss the importance of the “Cost to Serve” as a key differentiation factor, by assessing the cost to serve customers of a Portuguese subsidiary of a multinational company operating in the fast-moving consumer goods (FMCG) sector – Unilever – Jerónimo Martins (UJM). I will also suggest and quantify key proposals to decrease costs and increase customer value. Hence, the scope of this work project is focused on the logistics and distribution processes of the company's supply chain.

Relevance: 20.00%

Abstract:

This study uses a VAR methodology to evaluate the impact of macroeconomic conditions and money supply on the fluctuation of nonperforming loans in the Portuguese economy. Additionally, the feedback effect of nonperforming-loan growth on the economy, and especially on credit supply, is analysed. The study is motivated by the hypothesis that loan quality is procyclical and that fast growth of credit supply is positively related to the growth of nonperforming loans. The hypothesis that nonperforming loans reinforce economic fragilities and credit market frictions is also tested. Empirical results corroborate both hypotheses. Hence, it was possible to establish that macroeconomic conditions, measured by GDP and unemployment, and the fast growth of credit supply contribute to the development of nonperforming loans. Furthermore, the growth of nonperforming loans reinforces the economic cycle, as it contributes to the deterioration of macroeconomic conditions and creates frictions in the credit market that may result in a credit crunch.
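
A minimal sketch of the kind of VAR estimation described above, assuming hypothetical quarterly series; the variable names and the synthetic data are placeholders, not the study's dataset:

    # Hedged sketch of a small VAR relating macro conditions, credit supply and
    # nonperforming-loan (NPL) growth; the series below are synthetic placeholders,
    # not the study's Portuguese data.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.api import VAR

    rng = np.random.default_rng(0)
    n = 80  # e.g. 80 quarterly observations
    df = pd.DataFrame({
        "gdp_growth":    rng.normal(0.4, 1.0, n),
        "unemployment":  rng.normal(9.0, 1.5, n),
        "credit_growth": rng.normal(1.0, 2.0, n),
        "npl_growth":    rng.normal(0.5, 1.5, n),
    })

    model = VAR(df)
    results = model.fit(2)  # fixed lag order for the sketch; a real study would select it by information criterion
    print(results.summary())

    # Impulse responses capture the feedback loop discussed above, e.g. how a shock
    # to npl_growth propagates to credit_growth and gdp_growth over eight periods.
    irf = results.irf(8)
    print(irf.irfs.shape)  # horizons x response variables x shocks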

Relevance: 20.00%

Abstract:

This study proposes a systematic model suited to the Global Macro investing universe. The Analog Model tests the possibility of capturing the likelihood of an optimal investment allocation based on similarity across different periods in history. Instead of observing macroeconomic data, the model uses financial market variables to classify unknown short-term regimes. This methodology is particularly relevant considering that asset classes and investment strategies react differently to specific shifts in the macro environment.
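
A hedged sketch of the analog idea: describe each historical period by a vector of market variables, find the past periods most similar to the latest one, and average their subsequent outcomes. The features, the distance measure and the number of analogs are illustrative assumptions, not the thesis' specification.

    # Hedged sketch of an "analog" lookup: classify the current short-term regime by
    # finding the k most similar historical periods in a space of market variables.
    # All names and data below are illustrative placeholders.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(1)
    n = 500  # e.g. 500 weekly observations
    features = pd.DataFrame({
        "equity_return": rng.normal(0.1, 2.0, n),
        "bond_return":   rng.normal(0.05, 0.5, n),
        "fx_carry":      rng.normal(0.0, 1.0, n),
        "realized_vol":  rng.normal(15.0, 4.0, n),
    })
    forward_equity = pd.Series(rng.normal(0.1, 2.0, n))  # next-period outcome per date

    def analog_forecast(features: pd.DataFrame, outcomes: pd.Series, k: int = 20) -> float:
        """Average the outcomes of the k historical periods closest to the latest one."""
        z = (features - features.mean()) / features.std()       # standardise each variable
        current, history = z.iloc[-1], z.iloc[:-1]
        dist = np.sqrt(((history - current) ** 2).sum(axis=1))  # Euclidean distance to today
        analogs = dist.nsmallest(k).index                        # the k most similar periods
        return outcomes.loc[analogs].mean()

    print(analog_forecast(features, forward_equity))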

Relevance: 20.00%

Abstract:

Machine ethics is an interdisciplinary field of inquiry that emerges from the need to imbue autonomous agents with the capacity for moral decision-making. While some approaches provide implementations in Logic Programming (LP) systems, they have not exploited LP-based reasoning features that appear essential for moral reasoning. This PhD thesis aims to investigate further the appropriateness of LP, notably a combination of LP-based reasoning features, including techniques available in LP systems, to machine ethics. Moral facets, as studied in moral philosophy and psychology, that are amenable to computational modeling are identified and mapped to appropriate LP concepts for representing and reasoning about them. The main contributions of the thesis are twofold. First, novel approaches are proposed for employing tabling in contextual abduction and updating – individually and combined – plus an LP approach to counterfactual reasoning; the latter is implemented on top of the aforementioned combined abduction and updating technique with tabling. They are all important for modeling various issues of the aforementioned moral facets. Second, a variety of LP-based reasoning features are applied to model the identified moral facets, through moral examples taken off the shelf from the morality literature. These applications include: (1) modeling moral permissibility according to the Doctrines of Double Effect (DDE) and Triple Effect (DTE), demonstrating deontological and utilitarian judgments via integrity constraints (in abduction) and preferences over abductive scenarios; (2) modeling moral reasoning under uncertainty of actions, via abduction and probabilistic LP; (3) modeling moral updating (which allows other – possibly overriding – moral rules to be adopted by an agent, on top of those it currently follows) via the integration of tabling in contextual abduction and updating; and (4) modeling moral permissibility and its justification via counterfactuals, where counterfactuals are used to formulate DDE.
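
As a rough illustration of one of the moral facets mentioned above, the toy sketch below checks the Doctrine of Double Effect conditions in plain Python; it is explicitly not the thesis' LP formulation (which relies on abduction, integrity constraints, preferences and counterfactuals), only a reading of what DDE permissibility requires.

    # Toy, plain-Python check of Doctrine of Double Effect (DDE) permissibility.
    # This is NOT the thesis' LP formulation; it only illustrates the conditions DDE imposes.
    from dataclasses import dataclass

    @dataclass
    class Action:
        intrinsically_wrong: bool  # the act itself must not be wrong
        harm_is_means: bool        # harm must not be the means to the good end
        harm_intended: bool        # harm may only be a foreseen side effect
        good: float                # magnitude of the good effect
        harm: float                # magnitude of the harmful side effect

    def dde_permissible(a: Action) -> bool:
        """An action passes DDE if the act is not wrong in itself, the harm is neither
        intended nor used as a means, and the good outweighs the harm."""
        return (not a.intrinsically_wrong
                and not a.harm_is_means
                and not a.harm_intended
                and a.good > a.harm)

    # Classic trolley-style contrast: diverting a trolley (harm foreseen, not a means)
    # versus pushing a bystander (harm used as a means) -- only the first passes.
    print(dde_permissible(Action(False, False, False, good=5, harm=1)))  # True
    print(dde_permissible(Action(False, True,  False, good=5, harm=1)))  # False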

Relevance: 20.00%

Abstract:

Equity research report

Relevance: 20.00%

Abstract:

Financial crises have happened in the past and will continue to happen in the future. In the most recent crisis, in 2008, global equities (as measured by the MSCI ACWI index) lost a staggering 54.2% in USD over the year. During such periods, wealth preservation rises to the top of most investors' concerns. The purpose of this paper is to develop a strategy that protects the investment during bear markets and significant market corrections, generates capital appreciation, and can support Millennium BCP's Wealth Management Unit in its asset allocation procedures. This strategy extends the Dual Momentum approach introduced by Gary Antonacci (2014) in two ways. First, the investable set of securities in the equities space increases from two to four: besides the US, it comprises the Japanese, European (excl. UK) and EM equity indices. Second, it adds a volatility filter as well as three indicators related to the business cycle and the state of the economy, which are relevant for deciding on the strategy's exposure to equities. Overall, the results attest to the resilience of the strategy before, during and after historical financial crashes, as it drastically reduces downside exposure and consistently outperforms the benchmark index, providing higher mean returns with lower variance.
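
A hedged sketch of the extended dual-momentum rule described above, with the four equity regions, an absolute-momentum comparison against T-bills and a simple volatility filter; the 12-month lookback, the 20% volatility threshold and the column names are illustrative assumptions, and the paper's three business-cycle indicators are not reproduced here.

    # Hedged sketch of the extended dual-momentum allocation rule. The lookback window,
    # volatility threshold and column names are illustrative assumptions only.
    import numpy as np
    import pandas as pd

    def dual_momentum_signal(monthly_returns: pd.DataFrame,
                             tbill: pd.Series,
                             lookback: int = 12,
                             vol_threshold: float = 0.20) -> str:
        """Pick US / Japan / Europe ex-UK / EM equities, or fall back to a defensive asset."""
        equities = ["US", "Japan", "EuropeExUK", "EM"]
        trailing = (1 + monthly_returns[equities].tail(lookback)).prod() - 1  # relative momentum
        hurdle = (1 + tbill.tail(lookback)).prod() - 1                        # absolute-momentum hurdle
        best = trailing.idxmax()
        realized_vol = monthly_returns[best].tail(lookback).std() * np.sqrt(12)  # annualised vol filter
        if trailing[best] > hurdle and realized_vol < vol_threshold:
            return best     # stay in the strongest equity market
        return "Bonds"      # defensive asset during weak or volatile regimes

    # Usage with placeholder monthly return data:
    rng = np.random.default_rng(2)
    rets = pd.DataFrame(rng.normal(0.006, 0.04, (36, 4)),
                        columns=["US", "Japan", "EuropeExUK", "EM"])
    print(dual_momentum_signal(rets, pd.Series(0.001, index=rets.index)))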