970 results for Offer calculation


Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to compare accumulated oxygen deficit data derived from two different exercise protocols, with a view to producing a less time-consuming test specifically for use with athletes. Six road and four track male endurance cyclists performed two series of cycle ergometer tests. The first series involved five 10 min sub-maximal cycle exercise bouts, a V̇O2peak test and a test at 115% of V̇O2peak. Data from these tests were used to estimate the accumulated oxygen deficit according to the calculations of Medbø et al. (1988). In the second series of tests, participants performed a 15 min incremental cycle ergometer test followed, 2 min later, by a 2 min variable-resistance test in which they completed as much work as possible while pedalling at a constant rate. Analysis revealed that the accumulated oxygen deficit calculated from the first series of tests was higher (P < 0.02) than that calculated from the second series: 52.3 ± 11.7 and 43.9 ± 6.4 ml·kg⁻¹, respectively (mean ± s). Other significant differences between the two protocols were observed for V̇O2peak, total work and maximal heart rate; all were higher during the modified protocol (P
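The accumulated oxygen deficit of Medbø et al. (1988) is obtained by extrapolating the linear V̇O2–power relationship fitted to the submaximal bouts up to the supramaximal intensity and subtracting the oxygen actually consumed. A minimal sketch of that calculation, using illustrative numbers rather than data from the study:

```python
import numpy as np

# Submaximal steady-state data (illustrative values, not from the study)
power_w = np.array([100, 150, 200, 250, 300])      # work rates (W)
vo2_l_min = np.array([1.6, 2.2, 2.8, 3.4, 4.0])    # steady-state VO2 (L/min)

# Linear VO2-power relationship fitted to the submaximal bouts
slope, intercept = np.polyfit(power_w, vo2_l_min, 1)

# Supramaximal bout at 115% of the power at VO2peak (illustrative)
supra_power_w = 400
duration_min = 2.5
measured_vo2_l = 9.5      # accumulated O2 uptake actually measured (L)

# Predicted O2 demand extrapolated from the submaximal regression
demand_l = (slope * supra_power_w + intercept) * duration_min

# Accumulated oxygen deficit = predicted demand - measured uptake
aod_l = demand_l - measured_vo2_l
aod_ml_per_kg = aod_l * 1000 / 70      # expressed per kg body mass (assumed 70 kg)
print(f"AOD ≈ {aod_ml_per_kg:.1f} ml/kg")
```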

Relevance:

20.00%

Publisher:

Abstract:

Steel-concrete composite beams are widely used in building and bridge construction. Combining steel with concrete yields more economical structures, since the best characteristics of each material are exploited. In the hogging (negative) moment regions of a continuous composite beam, the bottom flange and part of the web are in compression; if the web of the profile is not stiff enough to prevent lateral bending, it distorts, producing a lateral displacement and a rotation of the compressed flange, a buckling mode known as lateral distortional buckling (LDB). The LDB verification procedure of EN 1994-1-1:2004 gave rise to the design method of ABNT NBR 8800:2008; however, EN 1994-1-1:2004 provides no expression for the elastic critical moment, whereas ABNT NBR 8800:2008 prescribes a formulation proposed by Roik, Hanswille and Kina (1990) developed for composite beams with flat-web steel profiles. Although the codes prescribe an LDB verification method for composite beams with flat-web profiles, few studies have addressed this limit state. Moreover, neither ABNT NBR 8800:2008 nor the international codes cover sinusoidal-web profiles. In this work, elastic buckling analyses were carried out with the ANSYS 14.0 (2011) software on finite element models that reproduce the LDB behaviour of steel-concrete composite beams with flat and sinusoidal webs. The numerical models consisted of the steel profile, a rotational spring that partially restrains the rotation of the top flange, and a lateral displacement restraint along the entire beam length. The numerical results are compared with those obtained from the formulations of Roik, Hanswille and Kina (1990) and of Hanswille (2002), adapted to account for the corrugation of the steel web. To assess these formulations and the consistency of the numerical modelling, the elastic critical moment was first determined for composite beams with flat-web steel profiles. As a result, a method for calculating the elastic critical moment of composite beams with sinusoidal webs is proposed.
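In the inverted-U frame model that EN 1994-1-1:2004 and ABNT NBR 8800:2008 use for this limit state, the rotational spring restraining the compressed flange combines the cracked slab and the distorting web as two springs in series. As a hedged illustration (the expressions for k_1 and k_2 below are the flat-web forms of EN 1994-1-1 quoted from memory and should be checked against the code text):

\[
k_s = \frac{k_1\,k_2}{k_1 + k_2}, \qquad
k_1 = \frac{\alpha\,(EI)_2}{a}, \qquad
k_2 = \frac{E_a\,t_w^{3}}{4\,(1-\nu_a^{2})\,h_s},
\]

where \((EI)_2\) is the cracked flexural stiffness of the slab per unit beam length, \(a\) the spacing between parallel beams, \(\alpha\) a coefficient depending on the beam position, \(t_w\) and \(h_s\) the web thickness and depth, and \(E_a\), \(\nu_a\) the steel modulus and Poisson's ratio. For a sinusoidal web, \(k_2\) is the term that changes, which is where the corrugation enters the adapted formulations.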

Relevance:

20.00%

Publisher:

Abstract:

Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually oriented towards the imperative or object paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative to slicing functional programs.
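As a toy illustration of the projection-based criterion (a Python analogue of the Bird-Meertens-style calculation, not code from the paper): composing a projection with a multi-output program specifies the slice, and simplifying the composition yields a program that computes only the selected output.

```python
# Hypothetical multi-output program: returns both the sum and the product
# of a list of numbers.
def sum_and_product(xs):
    total, prod = 0, 1
    for x in xs:
        total += x
        prod *= x
    return total, prod

# Slicing criterion expressed as a projection on the program's output.
def fst(pair):
    return pair[0]

# The composition fst . sum_and_product is the specification of the slice ...
def sliced_spec(xs):
    return fst(sum_and_product(xs))

# ... and calculation (done by hand here) simplifies it to a program that no
# longer computes the product: the slice with respect to the first output.
def sliced(xs):
    total = 0
    for x in xs:
        total += x
    return total

assert sliced_spec([1, 2, 3, 4]) == sliced([1, 2, 3, 4]) == 10
```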

Relevance:

20.00%

Publisher:

Abstract:

Program slicing is a well-known family of techniques used to identify code fragments which depend on, or are depended upon by, specific program entities. They are particularly useful in the areas of reverse engineering, program understanding, testing and software maintenance. Most slicing methods, usually targeting either the imperative or the object-oriented paradigms, are based on some sort of graph structure representing program dependencies. Slicing techniques amount, therefore, to (sophisticated) graph traversal algorithms. This paper proposes a completely different approach to the slicing problem for functional programs. Instead of extracting program information to build an underlying dependency structure, we resort to standard program calculation strategies, based on the so-called Bird-Meertens formalism. The slicing criterion is specified either as a projection or a hiding function which, once composed with the original program, leads to the identification of the intended slice. Going through a number of examples, the paper suggests this approach may be an interesting, even if not completely general, alternative to slicing functional programs.
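The complementary criterion named in the abstract, a hiding function, can be pictured in the same toy style (again an illustration, not the paper's code): instead of selecting an output, it masks the part of the result the slice must not depend on, and simplification removes the corresponding computation.

```python
# Hiding criterion: keep the product, hide the sum.
def hide_first(pair):
    _, prod = pair
    return (None, prod)

# Composing hide_first with the multi-output program from the earlier sketch
# and simplifying leaves only the computation of the product.
def product_slice(xs):
    prod = 1
    for x in xs:
        prod *= x
    return prod
```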

Relevance:

20.00%

Publisher:

Abstract:

The main purpose of this work is to determine the Specific Absorption Rate (SAR) in human head tissues exposed to radiation from 900 and 1800 MHz sources, since these are the typical frequencies of present-day mobile communication systems. The SAR was determined with the FDTD (Finite-Difference Time-Domain) method, a time-domain numerical method obtained from Maxwell's equations in differential form. To this end, a two-dimensional computational model of the human head was implemented, using cells of the smallest size allowed by the available computational resources. The results confirm the efficiency of the FDTD method in solving this type of problem.
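A minimal sketch of the quantities involved (illustrative tissue parameters, a bare 2D TMz update on a homogeneous grid with no absorbing boundaries, not the head model used in the work): the local SAR follows from the conductivity, mass density and rms electric field, SAR = σ·E_rms²/ρ.

```python
import numpy as np

# Illustrative tissue parameters near 900 MHz (placeholders, not the paper's values)
eps0 = 8.854e-12
mu0 = 4e-7 * np.pi
eps_r, sigma, rho = 45.0, 0.9, 1000.0   # rel. permittivity, conductivity (S/m), density (kg/m^3)

dx = 1e-3                       # cell size (m)
dt = dx / (2 * 3e8)             # time step within the 2D Courant limit

n = 100
Ez = np.zeros((n, n))
Hx = np.zeros((n, n - 1))
Hy = np.zeros((n - 1, n))
emax = np.zeros((n, n))

# Lossy-medium update coefficients for Ez (standard Yee/FDTD form)
loss = sigma * dt / (2 * eps_r * eps0)
ca = (1 - loss) / (1 + loss)
cb = (dt / (eps_r * eps0 * dx)) / (1 + loss)

for step in range(1000):
    Hx -= (dt / (mu0 * dx)) * (Ez[:, 1:] - Ez[:, :-1])
    Hy += (dt / (mu0 * dx)) * (Ez[1:, :] - Ez[:-1, :])
    Ez[1:-1, 1:-1] = ca * Ez[1:-1, 1:-1] + cb * (
        (Hy[1:, 1:-1] - Hy[:-1, 1:-1]) - (Hx[1:-1, 1:] - Hx[1:-1, :-1])
    )
    # Hard point source with arbitrary amplitude at 900 MHz
    Ez[n // 2, n // 2] += np.sin(2 * np.pi * 900e6 * step * dt)
    emax = np.maximum(emax, np.abs(Ez))   # track peak field amplitude per cell

# Local SAR from the peak amplitude (E_rms^2 = |E_peak|^2 / 2 for a sinusoid)
sar = sigma * (emax ** 2 / 2) / rho
print("peak local SAR (W/kg):", sar.max())
```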

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a methodology supported by the knowledge discovery in databases (KDD) process to find the failure probability of electrical equipment belonging to a real high-voltage electrical network. Data Mining (DM) techniques are used to discover a set of failure probabilities and, therefore, to extract knowledge concerning the unavailability of electrical equipment such as power transformers and high-voltage power lines. The framework includes several steps: the analysis of the real database, data pre-processing, the application of DM algorithms and, finally, the interpretation of the discovered knowledge. To validate the proposed methodology, a case study based on real databases is used. These data carry heavy uncertainty due to climate conditions, so fuzzy logic was used to determine the set of failure probabilities of the electrical components needed to re-establish the service. The results reflect the interesting potential of this approach and encourage further research on the topic.
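As a hypothetical illustration of how climate uncertainty might be folded into a failure probability with fuzzy logic (the function names, thresholds and scaling below are invented for the example, not taken from the paper): a triangular membership grades the severity of a weather variable, and the component's base failure probability is scaled accordingly.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership: 0 at a and c, 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def failure_probability(base_prob, wind_speed_kmh):
    """Scale a base failure probability by the fuzzy severity of the wind."""
    severity = triangular(wind_speed_kmh, 40.0, 90.0, 140.0)   # invented thresholds
    return min(1.0, base_prob * (1.0 + 4.0 * severity))        # invented scaling

print(failure_probability(0.02, 110.0))   # stronger wind -> higher probability
```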

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Nuclear Medicine - Specialization area: Positron Emission Tomography.

Relevance:

20.00%

Publisher:

Abstract:

Final Master's project submitted for the degree of Master in Electronics and Telecommunications Engineering.

Relevance:

20.00%

Publisher:

Abstract:

In this paper a new method for the calculation of fractional expressions in the presence of sensor redundancy and noise is presented. An algorithm, taking advantage of the signal characteristics and the sensor redundancy, is tuned and optimized through genetic algorithms. The results demonstrate good performance for different types of expressions and distinct noise levels.
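A hypothetical sketch of the kind of GA tuning the abstract describes (the fitness function, sensor model and GA settings are invented for illustration): a small genetic algorithm searches for fusion weights that combine redundant, noisy sensor readings so as to minimise the error against a reference signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reference signal and three redundant, noisy sensor readings (synthetic data)
t = np.linspace(0.0, 1.0, 200)
truth = np.sin(2 * np.pi * 5 * t)
sensors = np.stack([truth + rng.normal(0, s, t.size) for s in (0.05, 0.15, 0.30)])

def fitness(weights):
    """Negative mean squared error of the weighted sensor fusion."""
    w = np.abs(weights)
    w = w / w.sum()
    fused = w @ sensors
    return -np.mean((fused - truth) ** 2)

# Plain generational GA: tournament selection, blend crossover, Gaussian mutation
pop = rng.random((40, 3))
for gen in range(100):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = []
    for _ in range(len(pop)):
        i, j = rng.integers(0, len(pop), 2)
        parent_a = pop[i] if scores[i] > scores[j] else pop[j]
        i, j = rng.integers(0, len(pop), 2)
        parent_b = pop[i] if scores[i] > scores[j] else pop[j]
        alpha = rng.random()
        child = alpha * parent_a + (1 - alpha) * parent_b   # blend crossover
        child += rng.normal(0, 0.05, 3)                     # Gaussian mutation
        new_pop.append(child)
    pop = np.array(new_pop)

best = max(pop, key=fitness)
print("fused sensor weights:", np.abs(best) / np.abs(best).sum())
```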

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the calculation of derivatives of fractional order for non-smooth data. The noise is avoided by adopting an optimization formulation using genetic algorithms (GA). Given the flexibility of the evolutionary schemes, a hierarchical GA composed of a series of two GAs, each one with a distinct fitness function, is established.
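For context, a common numerical route to a fractional derivative of order α is the truncated Grünwald-Letnikov series; a minimal sketch of that textbook approximation (not the hierarchical GA scheme of the paper):

```python
import math
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Truncated Grünwald-Letnikov approximation of D^alpha f on a uniform grid.

    f: samples f[0..N] with spacing h; returns the left-sided derivative at
    each sample, using all available past samples.
    """
    f = np.asarray(f, dtype=float)
    n = len(f)
    # Coefficients (-1)^k * C(alpha, k), built by the standard recursion
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (k - 1 - alpha) / k
    out = np.empty(n)
    for i in range(n):
        out[i] = np.dot(c[: i + 1], f[i::-1]) / h ** alpha
    return out

# Example: half-derivative of f(t) = t; the exact result is 2*sqrt(t/pi)
h = 0.001
t = np.arange(0, 1 + h, h)
approx = gl_fractional_derivative(t, 0.5, h)
print(approx[-1], 2 * math.sqrt(1 / math.pi))
```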

Relevance:

20.00%

Publisher:

Abstract:

Construction planning has often been regarded as mere bureaucracy and used essentially as a temporal guideline for the progress of the works, which frequently results in the inadequate use of planning techniques. This dissertation aims to offer a new perspective on planning techniques and software and on how best to use them. Focusing on the CPM and LOB methods, the planning is carried out in Microsoft Project and CCS Candy, respectively. The work begins with a brief review of the state of the art of construction planning methodologies and software. Two case studies are then presented. The first concerns the planning of the construction of a bridge in Microsoft Project, including the calculation of production rates, the sizing of the work crews and the analysis of the planning diagrams produced by Microsoft Project. The second case study demonstrates the planning of the structure of two buildings in the CCS Candy software, again with the calculation of production rates, the sizing of the work crews and the analysis, essentially, of the space/time (LOB) chart. After planning the two case studies, the two methodologies, CPM and LOB, are briefly compared, noting the advantages and disadvantages of their use in construction planning, followed by a short conclusion. The dissertation ends with the general conclusions and proposals for future work.
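As a generic illustration of the CPM calculation underlying such schedules (a toy activity network, not one of the dissertation's case studies): a forward pass gives the earliest start and finish times, a backward pass the latest ones, and activities with zero total float form the critical path.

```python
# Activities: duration and predecessors (hypothetical toy network)
activities = {
    "A": (3, []),
    "B": (4, ["A"]),
    "C": (2, ["A"]),
    "D": (5, ["B", "C"]),
    "E": (1, ["C"]),
    "F": (2, ["D", "E"]),
}

# Forward pass: earliest start (ES) and earliest finish (EF)
es, ef = {}, {}
for act, (dur, preds) in activities.items():   # insertion order is topological here
    es[act] = max((ef[p] for p in preds), default=0)
    ef[act] = es[act] + dur

project_end = max(ef.values())

# Backward pass: latest finish (LF) and latest start (LS)
lf = {a: project_end for a in activities}
for act in reversed(list(activities)):
    succs = [s for s, (_, preds) in activities.items() if act in preds]
    if succs:
        lf[act] = min(lf[s] - activities[s][0] for s in succs)
ls = {a: lf[a] - activities[a][0] for a in activities}

critical_path = [a for a in activities if ls[a] == es[a]]   # zero total float
print("project duration:", project_end, "critical path:", critical_path)
```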

Relevance:

20.00%

Publisher:

Abstract:

The calculation of fractional derivatives is an important topic in scientific research. While the formal definitions are clear from the mathematical point of view, they pose limitations in the applied sciences that have not yet been tackled. This paper addresses the problem of obtaining left- and right-side derivatives when adopting numerical approximations. The results reveal the relationship between the distinct resulting values for different fractional orders and types of signals.
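For reference, the left- and right-sided Grünwald-Letnikov approximations make the distinction in question concrete: they differ only in whether past or future samples enter the sum (standard definitions, not a result of the paper):

\[
{}_{a}D_{t}^{\alpha}f(t) \approx \frac{1}{h^{\alpha}}\sum_{k=0}^{\lfloor (t-a)/h\rfloor}(-1)^{k}\binom{\alpha}{k}\,f(t-kh),
\qquad
{}_{t}D_{b}^{\alpha}f(t) \approx \frac{1}{h^{\alpha}}\sum_{k=0}^{\lfloor (b-t)/h\rfloor}(-1)^{k}\binom{\alpha}{k}\,f(t+kh),
\]

so for non-smooth data the two sides weight the samples on either side of \(t\) differently and in general yield distinct values, which is precisely the relationship examined numerically in the paper.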

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a method for market segmentation based on customers' lifestyles. A quantitative and qualitative segmentation based on the Whitaker Lifestyle™ Method was created in order to define a concrete and clear identification of the customer, by understanding the behaviour, style and preferences of each segment. After conducting 18 in-depth interviews, it was concluded that four main personas characterize the customer base of the company. These four personas will support the creation of 'quick wins' that address the expectations of each lifestyle, projecting a significant impact on the lifetime value of the company's customer base.

Relevance:

20.00%

Publisher:

Abstract:

This project consists of a case study of Tagus' takeover offer for Brisa. The case study describes each player involved in the operation: the target company, Brisa, and the acquirers, José de Mello and Arcus, as well as the circumstances surrounding the takeover, with a description of the takeover itself and the conflict between the acquirers and Abertis. The case is accompanied by a set of six questions and their respective answers, covering the motive for the takeover, the price per share, the positions Brisa's shareholders should take regarding the offer, and the reason Brisa's share price declined after the success of the operation.

Relevance:

20.00%

Publisher:

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
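The "a posteriori" adjustment referred to above is typically a maximum a posteriori (MAP) Bayesian fit of the individual pharmacokinetic parameters; in its common least-squares form (a generic textbook formulation, not the algorithm of any particular program reviewed here) the individual estimate minimises

\[
\hat{\theta} = \arg\min_{\theta}\;
\sum_{i}\frac{\bigl(C_i^{\mathrm{obs}}-C_i^{\mathrm{pred}}(\theta)\bigr)^{2}}{\sigma_i^{2}}
\;+\;
\sum_{j}\frac{\bigl(\theta_j-\theta_{j,\mathrm{pop}}\bigr)^{2}}{\omega_j^{2}},
\]

trading off the fit to the measured concentrations \(C_i^{\mathrm{obs}}\) (with residual variance \(\sigma_i^{2}\)) against the deviation of the individual parameters from the population means \(\theta_{j,\mathrm{pop}}\) (with inter-individual variances \(\omega_j^{2}\)); the fitted parameters are then used to simulate and select a dosage regimen that keeps concentrations in the target range.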