876 results for Achievable Benchmarks
Abstract:
The Intel® Xeon Phi™ is the first processor based on Intel's MIC (Many Integrated Cores) architecture. It is a co-processor specially tailored for data-parallel computations, whose basic architectural design is similar to that of GPUs (Graphics Processing Units), leveraging the use of many integrated, computationally simple cores to perform parallel computations. The main novelty of the MIC architecture relative to GPUs is its compatibility with the Intel x86 architecture. This enables the use of many of the tools commonly available for the parallel programming of x86-based architectures, which may lead to a shallower learning curve. However, programming the Xeon Phi still entails aspects intrinsic to accelerator-based computing in general, and to the MIC architecture in particular. In this thesis we advocate the use of algorithmic skeletons for programming the Xeon Phi. Algorithmic skeletons abstract the complexity inherent to parallel programming, hiding details such as resource management, parallel decomposition, and inter-execution-flow communication, thus removing these concerns from the programmer's mind. In this context, the goal of the thesis is to lay the foundations for the development of a simple but powerful and efficient skeleton framework for programming the Xeon Phi processor. For this purpose we build upon Marrow, an existing framework for the orchestration of OpenCL™ computations in multi-GPU and CPU environments. We extend Marrow to execute both OpenCL and C++ parallel computations on the Xeon Phi. To evaluate the newly developed framework, several well-known benchmarks, such as Saxpy and N-Body, are used, not only to compare its performance against the existing framework when executing on the co-processor, but also to assess the performance of the Xeon Phi versus a multi-GPU environment.
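To make the skeleton idea above concrete, here is a minimal sketch of a map skeleton in Python: the caller supplies only the per-element computation, while partitioning, worker management, and result collection stay inside the skeleton. This is purely illustrative and is not Marrow's actual API; the names `MapSkeleton` and `execute` are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical map skeleton: the user provides only the element-wise kernel;
# resource management and parallel decomposition are hidden, as the abstract
# describes for algorithmic skeletons in general.
class MapSkeleton:
    def __init__(self, kernel, workers=4):
        self.kernel = kernel      # user-defined element-wise computation
        self.workers = workers    # degree of parallelism (resource management)

    def execute(self, data):
        # The skeleton splits the input across workers and gathers results
        # transparently, so the caller never touches threads or queues.
        with ThreadPoolExecutor(max_workers=self.workers) as pool:
            return list(pool.map(self.kernel, data))

# Saxpy (y = a*x + y), one of the benchmarks named above, expressed through
# the skeleton element by element.
a = 2.0
saxpy = MapSkeleton(lambda xy: a * xy[0] + xy[1])
result = saxpy.execute([(1.0, 2.0), (3.0, 4.0), (5.0, 6.0)])
```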
Abstract:
Viral vectors are playing an increasingly important role in the vaccine and gene therapy fields. The broad spectrum of potential applications, together with expanding medical markets, drives the efforts to improve the production processes for viral vaccines and viral vectors. Developing countries, in particular, are becoming the main vaccine market. It is thus critical to decrease the cost per dose, which is only achievable by improving the production process. In particular, advances in upstream processing have substantially increased bioreactor yields, shifting the bioprocess bottlenecks towards downstream processing. The work presented in this thesis aimed to develop new processes for adenovirus purification. The use of state-of-the-art technology combined with innovative continuous processes contributed to building robust and cost-effective strategies for the purification of complex biopharmaceuticals. (...)
Abstract:
OutSystems Platform is used to develop, deploy, and maintain enterprise web and mobile web applications. Applications are developed through a visual domain-specific language, in an integrated development environment, and compiled to a standard stack of web technologies. At the platform's core, there is a compiler and a deployment service that transform the visual model into a running web application. As applications grow, compilation and deployment times increase as well, impacting the developer's productivity. In the previous model, the full application was the only compilation and deployment unit: when the developer published an application, even if only a very small aspect of it had changed, the application would be fully compiled and deployed. Our goal is to reduce compilation and deployment times for the most common use case, in which the developer performs small changes to an application before compiling and deploying it. We modified the OutSystems Platform to support a new incremental compilation and deployment model that reuses previous computations as much as possible in order to improve performance. In our approach, the full application is broken down into smaller compilation and deployment units, increasing what can be cached and reused. We also observed that this finer-grained model would benefit from a parallel execution model, so we created a task-driven scheduler that executes compilation and deployment tasks in parallel. Our benchmarks show a substantial improvement in compilation and deployment times for the aforementioned development scenario.
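The combination of caching per compilation unit and parallel task execution described above can be sketched roughly as follows. This is not the OutSystems implementation; the unit representation, `compile_unit`, and `publish` names are hypothetical stand-ins for the real compiler tasks.

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch: a unit is recompiled only if its source hash is not
# already cached, so an unchanged unit reuses the previous computation.
cache = {}  # source hash -> compiled artifact

def compile_unit(source):
    key = hashlib.sha256(source.encode()).hexdigest()
    if key in cache:                     # unchanged unit: reuse cached result
        return cache[key]
    artifact = f"compiled({source})"     # stand-in for real code generation
    cache[key] = artifact
    return artifact

def publish(units):
    # Independent units are dispatched to a pool, mirroring a task-driven
    # scheduler that runs compilation tasks in parallel.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(compile_unit, units))

publish(["screen A", "screen B"])     # first publish: both units compiled
publish(["screen A", "screen B v2"])  # incremental: only the changed unit rebuilt
```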
Abstract:
This paper studies the effects of monetary policy on mutual fund risk taking using a sample of Portuguese fixed-income mutual funds in the 2000-2012 period. First, I estimate time-varying measures of risk exposure (betas) for the individual funds, for the benchmark portfolio, and for a representative equally-weighted portfolio, through 24-month rolling regressions of a two-factor model with two systematic risk factors: interest rate risk (TERM) and default risk (DEF). In the second phase, using the estimated betas, I try to understand what portion of the risk exposure is in excess of the benchmark (active risk) and how it relates to monetary policy proxies (one-month rate, Taylor residual, real rate, and the first principal component of a cross-section of government yields and rates). Using this methodology, I provide empirical evidence that Portuguese fixed-income mutual funds respond to accommodative monetary policy by significantly increasing exposure, in excess of their benchmarks, to default risk, and slightly to interest rate risk as well. I also find that the increase in funds' risk exposure to gain a boost in return (search for yield) is more pronounced following the 2007-2009 global financial crisis, indicating that the current historically low interest rates may incentivize excessive risk taking. My results suggest that monetary policy affects the risk appetite of non-bank financial intermediaries.
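A standard way to write the two-factor model estimated in these rolling regressions is sketched below; the notation (fund excess return r_{i,t}, factor loadings beta) is assumed for illustration rather than taken from the paper itself.

```latex
% Two-factor model, estimated over 24-month rolling windows.
% r_{i,t}: excess return of fund i in month t; TERM_t, DEF_t: the two
% systematic risk factors named in the abstract (notation assumed).
\begin{equation}
  r_{i,t} = \alpha_i + \beta_i^{TERM}\,\mathit{TERM}_t
          + \beta_i^{DEF}\,\mathit{DEF}_t + \varepsilon_{i,t}
\end{equation}
% Active risk: the fund's estimated factor exposure in excess of the
% benchmark portfolio's (subscript B), per the second-phase analysis.
\begin{equation}
  \beta_{i,t}^{active} = \hat{\beta}_{i,t} - \hat{\beta}_{B,t}
\end{equation}
```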
Abstract:
A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics and Maastricht University School of Business and Economics
Abstract:
The need for more efficient illumination systems has led to the proliferation of Solid-State Lighting (SSL) systems, which offer optimized power consumption. SSL systems are composed of LEDs, which are intrinsically fast devices and permit very fast light modulation. This, along with the congestion of the radio-frequency spectrum, has paved the way for the emergence of Visible Light Communication (VLC) systems. VLC uses free space to convey information through light modulation. Notwithstanding, as VLC systems proliferate and cost competitiveness ensues, there are two important aspects to be considered. First, state-of-the-art VLC implementations use power-demanding power amplifiers (PAs), so it is important to investigate whether regular, existing Switched-Mode Power Supply (SMPS) circuits can be adapted for VLC use. A 28 W buck regulator was implemented using an off-the-shelf LED driver integrated circuit, using both series and parallel dimming techniques. Results show that optical clock frequencies up to 500 kHz are achievable without any major modification besides adequate component sizing. Second, the use of an LED as a sensor was investigated, in a short-range, low-data-rate perspective. Results show successful communication in an LED-to-LED configuration, with enhanced range when using LED strings as sensors. Moreover, LEDs present spectrally selective sensitivity, which makes them good contenders for a multi-colour LED-to-LED system, such as in the use of RGB displays and lamps. Ultimately, the present work shows evidence that LEDs can be used as dual-purpose devices, enabling not only illumination but also bi-directional data communication.
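For context, the simplest light-modulation scheme compatible with the on/off switching described above is on-off keying (OOK), sketched minimally here. The abstract does not state which modulation was used; only the 500 kHz optical clock figure comes from it, and the `modulate` helper is hypothetical.

```python
# Illustrative on-off keying (OOK) for VLC: each bit maps to the LED being
# driven fully on or off for one symbol period. At the 500 kHz optical clock
# reported above, one symbol lasts 2 microseconds.
SYMBOL_PERIOD_S = 1 / 500_000  # 500 kHz optical clock

def modulate(bits):
    # Returns (led_state, duration) pairs; a real driver would switch the
    # LED current source accordingly (series dimming).
    return [(bit, SYMBOL_PERIOD_S) for bit in bits]

frame = modulate([1, 0, 1, 1, 0])
```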
Abstract:
The purpose of this work project was to analyze and evaluate the potential impact of a technological innovation in the telecommunications sector across a wide range of business areas. A cost-benefit and competitive analysis for each pre-selected business area was conducted, as well as national and international benchmarks. As a result of the analysis, a list of prioritized business areas presenting the most immediate opportunities for Portugal Telecom was created, and implications for go-to-market strategies were inferred from the conclusions reached. In addition, a final recommendation that redefined the company's positioning strategy was made.
Abstract:
Evidence suggests that human semen quality may have been deteriorating in recent years. Most of the evidence is retrospective, based on analysis of data sets collected for other purposes. Measures of male infertility are needed if we want to monitor the biological capacity of males to reproduce over time or between different populations. We also need these measures in analytical epidemiology if we want to identify risk indicators, risk factors, or even causes of impaired male fecundity, that is, the male component in the biological ability to reproduce. The most direct evaluation of fecundity is to measure the time it takes to conceive. Since the time of conception may be missed in the case of an early abortion, time to pregnancy is often measured as the time it takes to obtain a conception that survives until a clinically recognized pregnancy, or even until a pregnancy that ends with a live-born child. A prolonged time required to produce a pregnancy may therefore be due to a failure to conceive or a failure to maintain a pregnancy until clinical recognition. Studies that focus on quantitative changes in fecundity (those that do not cause sterility) should in principle be possible in a pregnancy sample. The most important limitation in fertility studies is that the design requires equal persistence in trying to become pregnant and rather similar fertility desires and family-planning methods in the groups to be compared. This design is probably achievable in exposure studies that make comparisons between groups that are reasonably comparable in social conditions and use of contraceptive methods.
Abstract:
OBJECTIVE: To analyze surgical and pathological parameters, outcome, and prognostic factors of patients with non-small cell lung cancer (NSCLC) who were admitted to a single institution, and to correlate these findings with the current staging system. METHOD: Seven hundred and thirty-seven patients were diagnosed with NSCLC and admitted to Hospital do Cancer A. C. Camargo from 1990 to 2000. All patients were included in a continuous prospective database, and their data were analyzed. Following staging, a multidisciplinary team decision on adequate management was established. Variables included in this analysis were age, gender, histology, Karnofsky index, weight loss, clinical stage, surgical stage, chemotherapy, radiotherapy, and survival rates. RESULTS: 75.5% of patients were male. The distribution of histologic types was squamous cell carcinoma 51.8%, adenocarcinoma 43.1%, and undifferentiated large cell carcinoma 5.1%. Most patients (73%) presented significant weight loss and a Karnofsky index of 80%. Clinical staging was IA 3.8%, IB 9.2%, IIA 1.4%, IIB 8.1%, IIIA 20.9%, IIIB 22.4%, and IV 30.9%. Complete tumor resection was performed in 24.6% of all patients. Surgical stage distribution was IA 25.3%, IB 1.4%, IIB 17.1%, IIIA 16.1%, IIIB 20.3%, and IV 11.5%. Chemotherapy and radiotherapy were considered therapeutic options in 43% and 72% of cases, respectively. The overall 5-year survival rate of non-small cell lung cancer patients in our study was 28%, and median survival was 18.9 months. CONCLUSIONS: Patients with NSCLC admitted to our institution presented with histopathologic and clinical characteristics similar to those of previously published series from cancer hospitals. The best prognosis was associated with complete tumor resection with lymph node dissection, which is only achievable in earlier clinical stages.
Abstract:
The following project introduces a model of Growth Hacking strategies for business-to-business Software-as-a-Service startups that was developed in collaboration with, and applied to, a Portuguese startup called Liquid. The work addresses digital marketing channels such as content marketing, email marketing, and social marketing and selling. Further, the company's product, pricing strategy, partnerships, and website communication are examined. Applying best practices, competitor benchmarks, and interview insights from numerous industry influencers and experts, areas for improvement are deduced and procedures for each of those channels are recommended.
Performance evaluation of bond investment funds: evidence from the Brazilian market
Abstract:
Master's dissertation in Finance
Abstract:
Electric Vehicles (EVs) have limited energy-storage capacity, and their maximum autonomy range is strongly dependent on the driver's behaviour. Because batteries cannot be recharged quickly during a journey, it is essential that a precise range prediction is available to the driver of the EV. With this information, it is possible to check whether the desired destination is achievable without a stop to charge the batteries, or whether reaching it requires optimized driving (e.g., turning off the air conditioning, among other EV parameters). The outcome of this research work is the development of an Electric Vehicle Assistant (EVA): an application for mobile devices that will help users take efficient decisions about route planning, charging management, and energy efficiency. It will therefore contribute to fostering EV adoption as a new paradigm in the transportation sector.
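A back-of-the-envelope version of the range check described above divides usable battery energy by expected per-kilometre consumption, with auxiliary loads folded in. This is a minimal sketch under assumed parameter names and values, not EVA's actual model.

```python
# Hypothetical range estimate of the kind an assistant like EVA might show:
# remaining range = usable energy / expected consumption, where consumption
# reflects driving behaviour and auxiliary loads such as air conditioning.
def estimated_range_km(battery_kwh, state_of_charge, wh_per_km,
                       aux_w=0, avg_speed_kmh=50):
    usable_wh = battery_kwh * 1000 * state_of_charge
    # Auxiliary loads draw energy per hour, so convert them to an
    # equivalent per-km draw at the assumed average speed.
    effective_wh_per_km = wh_per_km + aux_w / avg_speed_kmh
    return usable_wh / effective_wh_per_km

normal = estimated_range_km(40, 0.8, 160, aux_w=1500)  # A/C on: ~168 km
eco    = estimated_range_km(40, 0.8, 140, aux_w=0)     # optimized: ~229 km
reachable = eco >= 120  # is a 120 km destination achievable without charging?
```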
Abstract:
Integrated master's dissertation in Industrial Engineering and Management
Abstract:
Doctoral thesis, Doctoral Program in Electronic and Computer Engineering.
Abstract:
Integrated master's dissertation in Civil Engineering