580 results for Benchmarks
Abstract:
Observation-based slicing is a recently introduced, language-independent slicing technique based on the dependencies observable from program behaviour. Due to the well-known limits of dynamic analysis, we may only compute an under-approximation of the true observation-based slice. However, because the observation-based slice captures all possible dependences that can be observed, even such approximations can yield insight into the limitations of static slicing. For example, a static slice, S, that is strictly smaller than the corresponding observation-based slice is guaranteed to be unsafe. We present the results of three sets of experiments on 12 different programs, including benchmarks and larger programs, which investigate the relationship between static and observation-based slicing. We show that, in extreme cases, observation-based slices can find the true static minimal slice, where static techniques cannot. For more typical cases, our results illustrate the potential for observation-based slicing to highlight unsafe static slices. Finally, we report on the sensitivity of observation-based slicing to test quality.
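As a minimal illustration of the containment argument in this abstract (the statement sets and line numbers below are invented for illustration, not the paper's data), a sketch of the check that flags an unsafe static slice:

```python
def unsafe_evidence(static_slice, observed_slice):
    """Statements whose observable dependence the static slice missed.
    Because the observation-based slice under-approximates the true slice,
    any non-empty result already proves the static slice unsafe."""
    return observed_slice - static_slice

# Hypothetical slices, identified by source line numbers.
static_slice   = {3, 5, 8}
observed_slice = {3, 5, 8, 12}   # deleting line 12 observably changed behaviour
print(unsafe_evidence(static_slice, observed_slice))   # {12}
```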
Abstract:
We present an improved, biologically inspired and multiscale keypoint operator. Models of single- and double-stopped hypercomplex cells in area V1 of the mammalian visual cortex are used to detect stable points of high complexity at multiple scales. Keypoints represent line and edge crossings, junctions and terminations at fine scales, and blobs at coarse scales. They are detected by applying first and second derivatives to responses of complex cells in combination with two inhibition schemes to suppress responses along lines and edges. A number of optimisations make our new algorithm much faster than previous biologically inspired models, achieving real-time performance on modern GPUs and competitive speeds on CPUs. In this paper we show that the keypoints exhibit state-of-the-art repeatability in standardised benchmarks, often yielding best-in-class performance. This makes them interesting both in biological models and as a useful detector in practice. We also show that keypoints can be used as a data selection step, significantly reducing the complexity in state-of-the-art object categorisation. (C) 2014 Elsevier B.V. All rights reserved.
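The detector above is biologically modelled (complex-cell responses combined with two inhibition schemes); as a rough, generic stand-in rather than the paper's method, the sketch below applies second derivatives to a response map and suppresses elongated, edge-like responses via a Hessian determinant/trace test, which is analogous in spirit to inhibiting responses along lines and edges:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blob_like_keypoints(response, sigma=2.0, edge_ratio=10.0, thresh=1e-3):
    """Keep points with strong 2-D structure (blobs, junctions) in a
    smoothed response map and suppress 1-D, edge-like responses.
    Parameter values are illustrative assumptions."""
    r = gaussian_filter(response.astype(float), sigma)
    ry, rx = np.gradient(r)          # first derivatives (rows, cols)
    ryy, ryx = np.gradient(ry)       # second derivatives
    rxy, rxx = np.gradient(rx)
    det = rxx * ryy - rxy * ryx      # Hessian determinant
    trace = rxx + ryy
    keep = (det > thresh) & (trace ** 2 < edge_ratio * det)
    return np.argwhere(keep)         # (row, col) coordinates of keypoints
```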
Abstract:
Acoustic predictions of the recently developed TRACEO ray model, which accounts for bottom shear properties, are benchmarked against tank experimental data from the EPEE-1 and EPEE-2 (Elastic Parabolic Equation Experiment) experiments. Both experiments are representative of signal propagation in a Pekeris-like shallow-water waveguide over a non-flat isotropic elastic bottom, where significant interaction of the signal with the bottom can be expected. The benchmarks show, in particular, that the ray model can be as accurate as a parabolic approximation model benchmarked in similar conditions. The benchmarking results are important, on the one hand, as a preliminary experimental validation of the model and, on the other hand, as a demonstration of the reliability of the ray approach for seismo-acoustic applications. (C) 2012 Acoustical Society of America. [http://dx.doi.org/10.1121/1.4734236]
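The abstract compares model predictions with tank measurements but does not state the error metric; a plain RMS mismatch between transmission-loss curves, as sketched below with made-up values, is one common way such a benchmark comparison is scored (illustrative only, not the paper's exact metric):

```python
import numpy as np

def tl_rms_mismatch(predicted_tl_db, measured_tl_db):
    """RMS difference (dB) between modelled and measured transmission-loss
    curves sampled at the same ranges."""
    p = np.asarray(predicted_tl_db, dtype=float)
    m = np.asarray(measured_tl_db, dtype=float)
    return float(np.sqrt(np.mean((p - m) ** 2)))

# Example with invented curves:
print(tl_rms_mismatch([40.0, 43.5, 47.0], [41.0, 43.0, 48.0]))  # ~0.87 dB
```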
Abstract:
Doctoral thesis, Fine Arts (Equipment Design), Universidade de Lisboa, Faculdade de Belas-Artes, 2015
Abstract:
Supervisor: Dr. Anabela Mesquita Teixeira Sarmento
Abstract:
In embedded systems, the timing behaviour of the control mechanisms is sometimes of critical importance for operational safety. These high-criticality systems require strict compliance with the offline predicted task execution time. The execution of a task when subject to preemption may vary significantly in comparison to its non-preemptive execution. Hence, when preemptive scheduling is required to operate the workload, preemption delay estimation is of paramount importance. In this paper a preemption delay estimation method for floating non-preemptive scheduling policies is presented. This work builds on [1], extending the model and optimising it considerably. The preemption delay function is subject to a major tightness improvement, considering the WCET analysis context. Moreover, additional information is provided in the form of an extrinsic cache-misses function, which enables the method to provide a solution in situations where the non-preemptive region sizes are small. Finally, experimental results from the implementation of the proposed solutions in Heptane are provided for real benchmarks, which validate the significance of this work.
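The paper's preemption delay and extrinsic cache-misses functions are not reproduced in the abstract; as a hedged sketch of the kind of bound involved, the fragment below caps the cost of a single preemption by both the useful-cache-block count and an extrinsic-miss bound (all names and constants are assumptions for illustration):

```python
BLOCK_RELOAD_COST_CYCLES = 10   # assumed cost of refetching one cache block

def preemption_delay_bound(useful_cache_blocks, extrinsic_miss_bound):
    """Upper bound on the delay one preemption can add: a preempting task
    cannot force more reloads than there are useful blocks at the
    preemption point, nor more than the extrinsic-miss bound allows."""
    evictable = min(useful_cache_blocks, extrinsic_miss_bound)
    return evictable * BLOCK_RELOAD_COST_CYCLES

print(preemption_delay_bound(useful_cache_blocks=12, extrinsic_miss_bound=8))  # 80
```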
Abstract:
Sparse matrix-vector multiplication (SMVM) is a fundamental operation in many scientific and engineering applications. In many cases sparse matrices have thousands of rows and columns where most of the entries are zero, while the non-zero data are spread over the matrix. This lack of data locality reduces the effectiveness of the data cache in general-purpose processors, considerably lowering their performance efficiency compared to what is achieved with dense matrix multiplication. In this paper, we propose a parallel processing solution for SMVM in a many-core architecture. The architecture is tested with known benchmarks using a ZYNQ-7020 FPGA. The architecture is scalable in the number of core elements and limited only by the available memory bandwidth. It achieves performance efficiencies of up to almost 70% and better performance than previous FPGA designs.
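For readers unfamiliar with the operation, a minimal CSR-format SMVM sketch in Python (the paper's FPGA mapping is not shown here) makes the irregular, cache-unfriendly access to the input vector visible:

```python
import numpy as np

def spmv_csr(values, col_idx, row_ptr, x):
    """y = A @ x with A stored in compressed sparse row (CSR) form:
    only non-zero entries are visited, and x is accessed through the
    column indices, which is what breaks spatial locality."""
    y = np.zeros(len(row_ptr) - 1)
    for i in range(len(row_ptr) - 1):
        for k in range(row_ptr[i], row_ptr[i + 1]):
            y[i] += values[k] * x[col_idx[k]]
    return y

# 3x3 example: [[4, 0, 1], [0, 2, 0], [3, 0, 5]]
values  = np.array([4.0, 1.0, 2.0, 3.0, 5.0])
col_idx = np.array([0, 2, 1, 0, 2])
row_ptr = np.array([0, 2, 3, 5])
print(spmv_csr(values, col_idx, row_ptr, np.array([1.0, 1.0, 1.0])))  # [5. 2. 8.]
```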
Abstract:
Objective: To examine the combined effects of physical activity and weight status on blood pressure (BP) in preschool-aged children. Study design: The sample included 733 preschool-aged children (49% female). Physical activity was objectively assessed on 7 consecutive days by accelerometry. Children were categorized as sufficiently active if they met the recommendation of at least 60 minutes daily of moderate-to-vigorous physical activity (MVPA). Body mass index was used to categorize children as nonoverweight or overweight/obese, according to the International Obesity Task Force benchmarks. BP was measured using an automated BP monitor and categorized as elevated or normal using BP percentile-based cut-points for age, sex, and height. Results: The prevalence of elevated systolic BP (SBP) and diastolic BP was 7.7% and 3.0%, respectively. The prevalence of overweight/obesity was 32%, and about 15% of children did not accomplish the recommended 60 minutes of daily MVPA. After controlling for age and sex, overweight/obese children who did not meet the daily MVPA recommendation were 3 times more likely (OR 3.8; CI 1.6-8.6) to have elevated SBP than nonoverweight children who met the daily MVPA recommendation. Conclusions: Overweight or obese preschool-aged children with insufficient levels of MVPA are at significantly greater risk for elevated SBP than their nonoverweight and sufficiently active counterparts. (J Pediatr 2015;167:98-102).
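As a worked illustration of the kind of odds-ratio comparison reported above (the 2x2 counts below are invented, and the paper's OR of 3.8 was additionally adjusted for age and sex, which a crude ratio does not do):

```python
def crude_odds_ratio(exposed_cases, exposed_noncases,
                     unexposed_cases, unexposed_noncases):
    """Crude odds ratio from a 2x2 table: odds of elevated SBP among
    overweight/obese, insufficiently active children divided by the odds
    among nonoverweight children who met the MVPA recommendation."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

print(crude_odds_ratio(15, 60, 20, 300))   # 3.75 with these illustrative counts
```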
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Management from the NOVA – School of Business and Economics
Abstract:
This paper studies the effects of monetary policy on mutual fund risk taking using a sample of Portuguese fixed-income mutual funds over the 2000-2012 period. First, I estimate time-varying measures of risk exposure (betas) for the individual funds, for the benchmark portfolio, and for a representative equally-weighted portfolio, through 24-month rolling regressions of a two-factor model with two systematic risk factors: interest rate risk (TERM) and default risk (DEF). Next, using the estimated betas, I examine what portion of the risk exposure is in excess of the benchmark (active risk) and how it relates to monetary policy proxies (one-month rate, Taylor residual, real rate, and the first principal component of a cross-section of government yields and rates). Using this methodology, I provide empirical evidence that Portuguese fixed-income mutual funds respond to accommodative monetary policy by significantly increasing their exposure, in excess of their benchmarks, to default risk and, to a lesser extent, to interest rate risk. I also find that the increase in funds’ risk exposure to gain a boost in return (search-for-yield) is more pronounced following the 2007-2009 global financial crisis, indicating that the current historic low interest rates may incentivize excessive risk taking. My results suggest that monetary policy affects the risk appetite of non-bank financial intermediaries.
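A minimal sketch of the rolling two-factor estimation described above, assuming monthly return series held in pandas (the variable names and data handling are illustrative assumptions, not the thesis code):

```python
import numpy as np
import pandas as pd

def rolling_two_factor_betas(excess_returns, term, default, window=24):
    """24-month rolling OLS of excess returns on TERM and DEF factors;
    returns one (beta_TERM, beta_DEF) pair per window end."""
    rows = []
    for end in range(window, len(excess_returns) + 1):
        y = excess_returns.iloc[end - window:end].to_numpy()
        X = np.column_stack([np.ones(window),
                             term.iloc[end - window:end].to_numpy(),
                             default.iloc[end - window:end].to_numpy()])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rows.append(coef[1:])                      # drop the intercept
    idx = excess_returns.index[window - 1:]
    return pd.DataFrame(rows, columns=["beta_TERM", "beta_DEF"], index=idx)

# Active risk would then be the fund's betas minus the benchmark's betas,
# related in a second step to a monetary policy proxy such as the one-month rate.
```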
Abstract:
A Work Project, presented as part of the requirements for the Award of a Master's Degree in Finance from the NOVA – School of Business and Economics and Maastricht University School of Business and Economics
Abstract:
The purpose of this work project was to analyze and evaluate the potential impact of a technological innovation in the telecommunications sector, across a wide range of business areas. A cost-benefit and competitive analysis was conducted for each pre-selected business area, along with national and international benchmarking. As a result of the analysis, a list of prioritized business areas, presenting more immediate opportunities for Portugal Telecom, was created, and implications for go-to-market strategies were inferred from the conclusions reached. In addition, a final recommendation that redefined the company’s positioning strategy was made.
Abstract:
The following project introduces a model of Growth Hacking strategies for business-to-business Software-as-a-Service startups that was developed in collaboration with and applied to a Portuguese startup called Liquid. The work addresses digital marketing channels such as content marketing, email marketing, social marketing and selling. Further, the company’s product, pricing strategy, partnerships and website communication are examined. Applying best practices, competitor benchmarks and interview insights from numerous industry influencers and experts, areas for improvement are deduced and procedures for each of those channels are recommended.
Abstract:
Assays that measure a patient's immune response play an increasingly important role in the development of immunotherapies. The inherent complexity of these assays and independent protocol development between laboratories result in high data variability and poor reproducibility. Quality control through harmonization, based on integration of laboratory-specific protocols with standard operating procedures and assay performance benchmarks, is one way to overcome these limitations. Harmonization guidelines can be widely implemented to address assay performance variables. This process enables objective interpretation and comparison of data across clinical trial sites and also facilitates the identification of relevant immune biomarkers, guiding the development of new therapies.