973 results for direct-subtracting method


Relevance: 20.00%

Publisher:

Abstract:

Adhesive bonding has become more efficient in the last few decades due to developments in adhesives, which grant higher strength and ductility. Natural fibre composites, in turn, have recently gained interest due to their low cost and density. It is therefore essential to predict the fracture behaviour of joints between these materials, to assess the feasibility of joining or repairing with adhesives. In this work, the tensile fracture toughness (Gc^n) of adhesive joints between natural fibre composites is studied, considering bonding with a ductile adhesive and co-curing. Conventional methods to obtain Gc^n are used for the co-cured specimens, while for the adhesive within the bonded joint the J-integral is considered. For the J-integral calculation, an optical measurement method is developed to evaluate the crack tip opening and the adherends' rotation at the crack tip during the test, supported by a Matlab sub-routine for the automated extraction of these quantities. As output of this work, an optical method (based on the J-integral technique) that allows an easier and quicker extraction of the parameters to obtain Gc^n than the available methods is proposed, and the fracture behaviour in tension of bonded and co-cured joints in jute-reinforced natural fibre composites is also provided for subsequent strength prediction. Additionally, for the adhesively-bonded joints, the tensile cohesive law of the adhesive is derived by the direct method.
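The direct method mentioned at the end can be sketched numerically: the cohesive traction is the derivative of the J-integral with respect to the crack tip opening w. The exponential J(w) curve below is purely illustrative, standing in for the optically measured data.

```python
import numpy as np

# Hypothetical measured data: J-integral (N/mm) versus crack tip opening
# displacement w (mm), as would be extracted by the optical method.
w = np.linspace(0.0, 0.2, 50)
J = 1.0 * (1.0 - np.exp(-30.0 * w))   # illustrative J(w); plateau ~ Gc^n = 1.0 N/mm

# Direct method: the cohesive traction is t(w) = dJ/dw.
t = np.gradient(J, w)                 # numerical derivative, MPa

Gc = J[-1]                            # fracture toughness = plateau of the J curve
t_max = t.max()                       # peak cohesive traction
```

In practice the measured J(w) data would be smoothed (e.g. by a fitted function) before differentiation, since numerical differentiation amplifies measurement noise.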

Relevance: 20.00%

Publisher:

Abstract:

Component joining is typically performed by welding, fastening, or adhesive bonding. For bonded aerospace applications, adhesives must withstand high temperatures (200°C or above, depending on the application), which implies their mechanical characterization under identical conditions. The extended finite element method (XFEM) is an enhancement of the finite element method (FEM) that can be used for the strength prediction of bonded structures. This work proposes and validates damage laws for a thin layer of an epoxy adhesive at room temperature (RT), 100, 150, and 200°C using the XFEM. The fracture toughness (GIc) and maximum load in pure tensile loading were defined by testing double-cantilever beam (DCB) and bulk tensile specimens, respectively, which permitted building the damage laws for each temperature. The bulk test results revealed that the maximum load decreased gradually with the temperature. On the other hand, the value of GIc of the adhesive, extracted from the DCB data, was shown to be relatively insensitive to temperature up to the glass transition temperature (Tg), while above Tg (at 200°C) a great reduction took place. The output of the DCB numerical simulations for the various temperatures showed a good agreement with the experimental results, which validated the obtained data for the strength prediction of bonded joints in tension. Based on these results, the XFEM proved to be an alternative for the accurate strength prediction of bonded structures.
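A common way to turn a measured GIc and peak traction into a damage law for such simulations is a triangular traction-separation shape whose enclosed area equals GIc. The sketch below assumes that triangular shape and illustrative property values, not the paper's measured data.

```python
# Sketch: building a triangular (bilinear) traction-separation law from a
# fracture toughness G_Ic (N/mm) and a peak traction t_n (MPa).
def triangular_law(G_Ic, t_n, E_over_h):
    """Return (delta_0, delta_f): damage-onset and failure separations (mm).

    E_over_h is the initial stiffness of the thin adhesive layer
    (elastic modulus over layer thickness); an assumption of this sketch.
    """
    delta_0 = t_n / E_over_h       # end of the elastic branch
    delta_f = 2.0 * G_Ic / t_n     # area under the triangle equals G_Ic
    return delta_0, delta_f

# Illustrative values: G_Ic = 0.5 N/mm, t_n = 40 MPa, E = 1870 MPa, h = 0.2 mm.
d0, df = triangular_law(G_Ic=0.5, t_n=40.0, E_over_h=1870.0 / 0.2)
```

One such (delta_0, delta_f, t_n) triple per test temperature would define the temperature-dependent laws described above.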

Relevance: 20.00%

Publisher:

Abstract:

Coronary artery disease (CAD) is currently one of the most prevalent diseases in the world population, and calcium deposits in coronary arteries are one direct risk factor. These can be assessed by the calcium score (CS) application, available via a computed tomography (CT) scan, which gives an accurate indication of the development of the disease. However, the ionising radiation applied to patients is high. This study aimed to optimise the acquisition protocol in order to reduce the radiation dose and to explain the flow of procedures to quantify CAD. The main differences in the clinical results, when automated or semi-automated post-processing is used, will be shown, and the epidemiology, imaging, risk factors and prognosis of the disease described. The software steps and the values that allow the risk of developing CAD to be predicted will be presented. A 64-row multidetector CT scan with dual source and two phantoms (pig hearts) were used to demonstrate the advantages and disadvantages of the Agatston method. The tube energy was balanced. Two measurements were obtained in each of the three experimental protocols (64, 128, 256 mAs). Considerable changes appeared between the values of CS relating to the protocol variation. The predefined standard protocol provided the lowest dose of radiation (0.43 mGy). This study found that the variation in the radiation dose between protocols, taking into consideration the dose control systems attached to the CT equipment and image quality, was not sufficient to justify changing the default protocol provided by the manufacturer.
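The Agatston method referenced above weights each calcified lesion's area by a factor derived from its peak attenuation. A minimal sketch with the standard 130-HU threshold and density bins, applied to made-up lesions:

```python
# Agatston density weight from a lesion's peak attenuation in Hounsfield units.
def agatston_weight(peak_hu):
    if peak_hu < 130:
        return 0      # below the calcification threshold
    if peak_hu < 200:
        return 1
    if peak_hu < 300:
        return 2
    if peak_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """lesions: iterable of (area_mm2, peak_hu) pairs, one per lesion per slice."""
    return sum(area * agatston_weight(hu) for area, hu in lesions)

# Illustrative lesions; the last one falls below the 130-HU threshold.
cs = agatston_score([(4.0, 250), (2.5, 410), (1.0, 120)])
```

The patient's total score, summed over slices, is then mapped to a CAD risk category.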

Relevance: 20.00%

Publisher:

Abstract:

I (Pedagogical practice) - This section of the Internship Report presents elements concerning the Specialized Music Teaching internship in saxophone instruction, carried out at the Escola de Música Luís António Maldonado Rodrigues during the 2012/2013 school year. The internship involved and analysed three students at different levels of development but with similar guidelines regarding the organization of the work. For each student, thirty lesson plans, one annual plan and three video/audio recordings in the classroom context were produced, allowing a deeper analysis of and reflection on the teaching work. The section comprises a characterization of the school where the internship took place, covering its context and operation, its spaces and equipment, existing human resources and pedagogical organization. Subsequently, the three students involved in the internship are characterized, based on teaching experience and on the knowledge provided by the curricular units of the Master's in Music Teaching. Next, the teaching practices developed by the teacher throughout the school year are described, incorporating the guiding principles of teaching applied in the pedagogical practice. A critical analysis of the teaching activity within the Specialized Music Teaching internship is presented and, finally, a conclusion to this first section.

Relevance: 20.00%

Publisher:

Abstract:

Global warming and the associated climate changes have been the subject of intensive research due to their major impact on social, economic and health aspects of human life. Surface temperature time-series characterise Earth as a slow-dynamics spatiotemporal system, evidencing long-memory behaviour typical of fractional-order systems. Such phenomena are difficult to model and analyse, demanding alternative approaches. This paper studies the complex correlations between global temperature time-series using the multidimensional scaling (MDS) approach. MDS provides a graphical representation of the pattern of climatic similarities between regions around the globe. The similarities are quantified through two mathematical indices that correlate the monthly average temperatures observed in meteorological stations over a given period of time. Furthermore, time dynamics is analysed by performing the MDS analysis over slices sampling the time series. MDS generates maps describing the stations' loci such that stations perceived to be similar to each other are placed on the map forming clusters. We show that MDS provides an intuitive and useful visual representation of the complex relationships present among temperature time-series, which are not perceived on traditional geographic maps. Moreover, MDS avoids sensitivity to the irregular distribution density of the meteorological stations.
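The MDS step can be sketched with classical (Torgerson) scaling on a correlation-based distance. The station series below are synthetic stand-ins for real meteorological data, and the single correlation index used here is only one of the two indices the paper considers.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic monthly-average temperature series for 6 "stations" (10 years).
base = np.sin(np.linspace(0.0, 20.0 * np.pi, 120))
series = np.stack([base + 0.1 * k + 0.05 * rng.standard_normal(120)
                   for k in range(6)])

# Similarity index: Pearson correlation between stations, turned into a distance.
corr = np.corrcoef(series)
dist = 1.0 - corr

def classical_mds(D, k=2):
    """Classical MDS: embed a distance matrix D into k dimensions."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centred squared distances
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]           # top-k eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

coords = classical_mds(dist)                   # 2-D map: similar stations cluster
```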

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE To analyze the direct medical costs of HIV/AIDS in Portugal from the perspective of the National Health Service. METHODS A retrospective analysis of medical records was conducted for 150 patients from five specialized centers in Portugal in 2008. Data on the utilization of medical resources during 12 months and on patients' characteristics were collected. A unit cost was applied to each care component using official sources and accounting data from National Health Service hospitals. RESULTS The average cost of treatment was €14,277/patient/year. The main cost driver was antiretroviral treatment (€9,598), followed by hospitalization costs (€1,323). Treatment costs increased with the severity of disease, from €11,901 (> 500 CD4 cells/µl) to €23,351 (CD4 count ≤ 50 cells/µl). Cost progression was mainly due to the increase in hospitalization costs, while antiretroviral treatment costs remained stable over disease stages. CONCLUSIONS The high burden related to antiretroviral treatment is counterbalanced by relatively low hospitalization costs, which, however, increase with severity of disease. The relatively modest progression of total costs highlights that alternative public health strategies that do not affect transmission of the disease may have only a limited impact on expenditure, since treatment costs are largely dominated by constant antiretroviral treatment costs.
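The bottom-up costing in METHODS reduces to utilisation multiplied by unit cost, summed over care components. The figures below are invented for illustration and are not the study's unit costs:

```python
# Hypothetical unit costs (EUR) and annual utilisation per care component.
unit_costs = {"art_month": 800.0, "inpatient_day": 450.0,
              "outpatient_visit": 90.0, "lab_panel": 60.0}
utilisation = {"art_month": 12, "inpatient_day": 2,
               "outpatient_visit": 6, "lab_panel": 4}

# Per-patient annual cost: sum of utilisation x unit cost over components.
annual_cost = sum(utilisation[item] * unit_costs[item] for item in utilisation)

# Share of the total attributable to antiretroviral treatment.
art_share = utilisation["art_month"] * unit_costs["art_month"] / annual_cost
```

With these invented numbers, antiretroviral treatment dominates the total, mirroring the pattern the study reports.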

Relevance: 20.00%

Publisher:

Abstract:

Dragonflies show flight performance that is unique and superior to that of most other insect species and birds. They are equipped with two pairs of independently controlled wings, granting unmatchable flying performance and robustness. This paper presents an adaptive scheme for controlling a nonlinear model inspired by a dragonfly-like robot. A hybrid adaptive (HA) law is proposed for adjusting the parameters by analyzing the tracking error. At the current stage of the project, the development of computational simulation models based on the dynamics is considered essential for testing control strategies and algorithms, parts of the system (such as different wing configurations or the tail), and the complete system. The performance analysis proves the superiority of the HA law over the direct adaptive (DA) method in terms of faster and improved tracking and parameter convergence.
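The paper's HA law is not reproduced here, but the flavour of error-driven parameter adjustment can be illustrated with a plain gradient (MIT-rule) direct adaptive update on a scalar model. Everything below, from the plant to the gains, is a simplified stand-in:

```python
import math

# Scalar plant y = theta_true * u; the adjustable parameter theta is driven
# toward theta_true by the tracking error e = y - y_ref (MIT rule).
theta_true, theta = 2.0, 0.0
gamma, dt = 0.5, 0.01                # adaptation gain and time step (illustrative)

for step in range(2000):
    u = math.sin(0.01 * step)        # persistently exciting input
    y_ref = theta_true * u           # reference model output
    y = theta * u                    # adjustable model output
    e = y - y_ref                    # tracking error
    theta -= gamma * e * u * dt      # gradient parameter update
```

A hybrid law of the kind the paper proposes would combine such a continuous update with additional terms to speed up convergence; this sketch shows only the direct-adaptive baseline it is compared against.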

Relevance: 20.00%

Publisher:

Abstract:

Introduction - The estimation of relative renal function (RRF) by renal scintigraphy (RS) with technetium-99m-labelled dimercaptosuccinic acid (99mTc-DMSA) can be influenced by renal depth (RD), given the attenuation effect of the soft tissues surrounding the kidneys. Since the RD is rarely known, different attenuation correction (AC) methods have been developed, namely those using empirical formulas, such as the Raynaud, Taylor or Tonnesen methods, or the direct application of the geometric mean (GM). Objectives - To identify the influence of the different AC methods on the quantification of relative renal function by 99mTc-DMSA RS and to evaluate the variability of the corresponding RD results. Methodology - Thirty-one patients referred for 99mTc-DMSA RS underwent the same acquisition protocol. Processing was performed by two independent operators, three times per examination, varying for each run the method used to determine the RRF: Raynaud, Taylor, Tonnesen, GM or no attenuation correction (NAC). The Friedman test was applied to study the influence of the different AC methods, and Pearson's correlation was used to assess the association and significance of the RD values with the variables age, weight and height. Results - The Friedman test revealed statistically significant differences between the various methods (p=0.000), except for the comparisons NAC/Raynaud, Tonnesen/GM and Taylor/GM (p=1.000), for both kidneys. Pearson's correlation shows that weight has a strong positive correlation with all the RD calculation methods. Conclusions - Among the three RD calculation methods, the Taylor method yields RRF values closest to the GM. The choice of AC method significantly influences the quantitative RRF parameters.
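The direct application of the geometric mean can be sketched as follows: counts from the anterior and posterior views are combined per kidney, and the relative function of each kidney is its geometric-mean counts over the corrected total. The count values are invented for illustration.

```python
import math

def geometric_mean_rrf(left_ant, left_post, right_ant, right_post):
    """Relative renal function (%) of each kidney from anterior/posterior counts."""
    left_gm = math.sqrt(left_ant * left_post)      # depth-compensated left counts
    right_gm = math.sqrt(right_ant * right_post)   # depth-compensated right counts
    total = left_gm + right_gm
    return 100.0 * left_gm / total, 100.0 * right_gm / total

# Illustrative background-corrected counts per kidney and view.
left_pct, right_pct = geometric_mean_rrf(52000, 43000, 48000, 41000)
```

The empirical formulas (Raynaud, Taylor, Tonnesen) instead estimate the renal depth from patient age, weight and height and apply an explicit attenuation factor; their coefficients are not reproduced here.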

Relevance: 20.00%

Publisher:

Abstract:

Erythropoietin (EPO) is a substance that stimulates the production of erythrocytes, increasing muscle oxygenation; it is secreted naturally by the body and excreted in urine at low concentrations. Due to its properties and characteristics, EPO was quickly introduced into the world of sport as an illicit substance, providing advantages in athletic performance. In early 2000, a method for the direct detection of recombinant EPO (rHuEPO) in human urine was developed by Lasne, based on isoelectric focusing (IEF) in polyacrylamide gel followed by double blotting; this method was published and validated. In 2002, the World Anti-Doping Agency (WADA) implemented this method, which is currently one of the official methods used by WADA-accredited laboratories. The starting point for this work was thus the need to implement and validate the IEF reference method for the detection of rHuEPO in human urine. The work was carried out at the Doping Analysis Laboratory (LAD) of the Instituto do Desporto de Portugal (IDP), now the Instituto Português do Desporto e Juventude (IPDJ). The main objective of this work was to study different validation parameters (specificity/selectivity, identification capability, limit of detection, accuracy and repeatability), in accordance with the internal General Procedure of the Lisbon Doping Analysis Laboratory (LAD). The screening and confirmation method proved to have performance characteristics compliant with the applicable requirements and is therefore considered validated and fit for purpose.

Relevance: 20.00%

Publisher:

Abstract:

Eur. J. Biochem. 271, 1329–1338 (2004)

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE To analyze cervical and breast cancer mortality in Brazil according to socioeconomic and welfare indicators. METHODS Data on breast and cervical cancer mortality covering a 30-year period (1980-2010) were analyzed. The data were obtained from the National Mortality Database, population data from the Brazilian Institute of Geography and Statistics database, and socioeconomic and welfare information from the Institute of Applied Economic Research. Moving averages were calculated, disaggregated by capital city and municipality. The annual percent change in mortality rates was estimated by segmented linear regression using the joinpoint method. Pearson's correlation coefficients were computed between the average mortality rate at the end of the three-year period and selected indicators for the state capitals and each Brazilian state. RESULTS There was a decline in cervical cancer mortality rates throughout the period studied, except in municipalities outside the capitals in the North and Northeast. There was a decrease in breast cancer mortality in the capitals from the end of the 1990s onwards. Favorable socioeconomic indicators were inversely correlated with cervical cancer mortality. A strong direct correlation was found between favorable indicators and breast cancer mortality in inner cities, and an inverse correlation with the fertility rate. CONCLUSIONS There is an ongoing dynamic process of increased risk of cervical and breast cancer and attenuation of mortality because of increased, albeit unequal, access to and provision of screening, diagnosis and treatment.
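The annual percent change used with joinpoint segments is conventionally obtained from a log-linear fit of the rate on calendar year within each segment, with APC = (exp(slope) - 1) * 100. A minimal sketch on synthetic rates:

```python
import numpy as np

# Synthetic mortality rates declining ~2% per year over 1980-2010
# (illustrative, not the study's data).
years = np.arange(1980, 2011)
rates = 12.0 * np.exp(-0.02 * (years - 1980))

# Log-linear fit within one joinpoint segment: log(rate) = a + b * year.
b, a = np.polyfit(years, np.log(rates), 1)   # polyfit returns slope first

apc = (np.exp(b) - 1.0) * 100.0              # annual percent change, here ~ -1.98%
```

The joinpoint method additionally searches for the years where the slope b changes, fitting one such segment between consecutive change points.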

Relevance: 20.00%

Publisher:

Abstract:

OBJECTIVE To propose a method of redistributing ill-defined causes of death (IDCD) based on the investigation of such causes. METHODS In 2010, an evaluation was performed of the results of investigating the causes of death classified as IDCD, in accordance with chapter 18 of the International Classification of Diseases (ICD-10), by the Mortality Information System. The redistribution coefficients were calculated according to the proportional distribution of ill-defined causes reclassified after investigation into any chapter of the ICD-10 except chapter 18, and were used to redistribute the remaining, non-investigated ill-defined causes by sex and age. The IDCD redistribution coefficient was compared with two usual methods of redistribution: a) the total redistribution coefficient, based on the proportional distribution of all the defined causes originally notified, and b) the non-external redistribution coefficient, similar to the previous one but excluding external causes. RESULTS Of the 97,314 deaths from ill-defined causes reported in 2010, 30.3% were investigated, and 65.5% of those were reclassified as defined causes after the investigation. Endocrine diseases, mental disorders, and maternal causes had a higher representation among the reclassified ill-defined causes, contrary to infectious diseases, neoplasms, and genitourinary diseases, which had higher proportions among the defined causes reported. External causes represented 9.3% of the reclassified ill-defined causes. Correcting mortality rates by the total redistribution coefficient and the non-external redistribution coefficient increased the magnitude of the rates by a relatively similar factor for most causes, contrary to the IDCD redistribution coefficient, which corrected the different causes of death with differentiated weights. CONCLUSIONS The proportional distribution of causes among the ill-defined causes reclassified after investigation was not similar to the original distribution of defined causes. Therefore, the redistribution of the remaining ill-defined causes based on the investigation allows for more appropriate estimates of the mortality risk due to specific causes.
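The redistribution described in METHODS can be sketched directly: the proportions observed among the reclassified (investigated) causes become coefficients, which are then applied to the non-investigated remainder. The counts below are invented for illustration:

```python
# Hypothetical counts of investigated IDCD deaths reclassified into defined
# cause groups (illustrative, not the study's data).
reclassified = {"circulatory": 300, "neoplasms": 120,
                "external": 60, "endocrine": 120}
total_reclassified = sum(reclassified.values())

# IDCD redistribution coefficients: proportional distribution of the
# reclassified causes.
coefficients = {cause: n / total_reclassified for cause, n in reclassified.items()}

# Apply the coefficients to the ill-defined deaths that were never investigated.
remaining_idcd = 1000
redistributed = {cause: coef * remaining_idcd
                 for cause, coef in coefficients.items()}
```

In the study this is done separately by sex and age group, so each stratum gets its own coefficient table.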

Relevance: 20.00%

Publisher:

Abstract:

Optimization problems arise in science, engineering, economics, etc., and we need to find the best solutions for each reality. The methods used to solve these problems depend on several factors, including the amount and type of accessible information, the algorithms available for solving them, and, obviously, the intrinsic characteristics of the problem. There are many kinds of optimization problems and, consequently, many kinds of methods to solve them. When the involved functions are nonlinear and their derivatives are unknown or very difficult to calculate, such methods are scarcer. These kinds of functions are frequently called black-box functions. To solve such problems without constraints (unconstrained optimization), we can use direct search methods, which do not require any derivatives or approximations of them. But when the problem has constraints (nonlinear programming problems) and, additionally, the constraint functions are black-box functions, it is much more difficult to find the most appropriate method. Penalty methods can then be used. They transform the original problem into a sequence of other problems derived from the initial one, all without constraints, and this sequence of unconstrained problems can be solved using the methods available for unconstrained optimization. In this chapter, we present a classification of some of the existing penalty methods and describe some of their assumptions and limitations. These methods allow solving optimization problems with continuous, discrete, and mixed constraints, without requiring continuity, differentiability, or convexity. Thus, penalty methods can be used as the first step in the resolution of constrained problems, by means of methods typically used for unconstrained problems. We also discuss a new class of penalty methods for nonlinear optimization, which adjusts the penalty parameter dynamically.
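A minimal quadratic-penalty sketch of the idea, using a tiny derivative-free coordinate search as the inner solver. Both the test problem and the solver are illustrative, not the chapter's specific methods:

```python
# Quadratic penalty method: min f(x) s.t. g(x) <= 0 becomes a sequence of
# unconstrained problems min f(x) + mu * max(0, g(x))**2 with mu increasing.
def coordinate_search(func, x, step=0.5, tol=1e-6, max_iter=10_000):
    """Derivative-free inner solver: greedy moves along coordinate axes."""
    x = list(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x.copy()
                trial[i] += d
                if func(trial) < func(x):
                    x, improved = trial, True
        if not improved:
            step *= 0.5              # no axis move helped: refine the mesh
            if step < tol:
                break
    return x

f = lambda x: (x[0] - 2.0) ** 2      # objective (black box to the solver)
g = lambda x: x[0] - 1.0             # constraint x <= 1; solution is x = 1

x = [0.0]
for mu in (1.0, 10.0, 100.0, 1000.0):
    penalised = lambda x, mu=mu: f(x) + mu * max(0.0, g(x)) ** 2
    x = coordinate_search(penalised, x)   # warm-start from the previous solution
```

As mu grows, the unconstrained minimisers approach the constrained solution; the dynamic-parameter methods the chapter discusses adjust mu adaptively instead of on a fixed schedule.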

Relevance: 20.00%

Publisher:

Abstract:

Search optimization methods are needed to solve optimization problems where the objective function and/or constraint functions may be non-differentiable or non-convex, or where it may not be possible to determine their analytical expressions, either due to complexity or to cost (monetary, computational, time, ...). Many optimization problems in engineering and other fields have these characteristics, because function values can result from experimental or simulation processes, can be modelled by functions with complex expressions or by noisy functions, and it can be impossible or very difficult to calculate their derivatives. Direct search optimization methods use only function values and do not need any derivatives or approximations of them. In this work we present a Java API that includes several derivative-free methods and algorithms to solve constrained and unconstrained optimization problems. Traditional API access, by installing it on the developer's and/or user's computer, and remote access to it using Web Services are both presented. Remote access to the API has the advantage of always allowing access to its latest version. For users who simply want a tool to solve nonlinear optimization problems and do not want to integrate these methods into applications, two applications were also developed: a standalone Java application and a Web-based application, both using the developed API.
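As an example of the kind of derivative-free routine such an API exposes (the work itself is a Java API; Python is used here only for illustration, and the method names are not the API's), golden-section search minimises a black-box univariate function using nothing but its values:

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Minimise a unimodal black-box function f on [a, b] using only values."""
    phi = (math.sqrt(5.0) - 1.0) / 2.0           # inverse golden ratio ~0.618
    c, d = b - phi * (b - a), a + phi * (b - a)  # interior probe points
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c                          # minimum lies in [a, d]
            c = b - phi * (b - a)
        else:
            a, c = c, d                          # minimum lies in [c, b]
            d = a + phi * (b - a)
    return (a + b) / 2.0

# A "black box" objective: only values are available, no derivatives.
x_min = golden_section(lambda x: (x - 1.5) ** 2 + 1.0, 0.0, 4.0)
```

Each iteration shrinks the bracket by a constant factor without ever touching a derivative, which is exactly the property that makes direct search usable on simulation-based or noisy objectives.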

Relevance: 20.00%

Publisher:

Abstract:

A simple procedure to measure the cohesive laws of bonded joints under mode I loading using the double cantilever beam test is proposed. The method only requires recording the applied load–displacement data and measuring the crack opening displacement at the crack tip in the course of the experimental test. The strain energy release rate is obtained by a procedure involving the Timoshenko beam theory, the specimen's compliance and the crack equivalent concept. Following the proposed approach, the influence of the fracture process zone is taken into account, which is fundamental for an accurate estimation of the failure process details. The cohesive law is obtained by differentiation of the strain energy release rate as a function of the crack opening displacement. The model was validated numerically considering three representative cohesive laws. Numerical simulations using finite element analysis including cohesive zone modeling were performed. The good agreement between the input and resulting laws for all the cases considered validates the model. An experimental confirmation was also performed by comparing the numerical and experimental load–displacement curves. The numerical curves were obtained by adjusting typical cohesive laws to the ones measured experimentally following the proposed approach and using finite element analysis including cohesive zone modeling. Once again, good agreement was obtained in the comparisons, demonstrating the good performance of the proposed methodology.
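One ingredient of the compliance-based route can be illustrated with the classical Irwin-Kies relation, G_I = P^2/(2b) * dC/da, applied to the plain beam-theory compliance of a DCB specimen. The properties are illustrative, and the Timoshenko shear correction that the proposed procedure includes is omitted here for brevity:

```python
# Illustrative DCB properties (MPa, mm, N); not measured data.
E, b, h = 70.0e3, 25.0, 12.7      # adherend modulus, width, arm thickness
P, a = 150.0, 60.0                # applied load and (equivalent) crack length

# Plain beam-theory compliance of a DCB, C(a) = 8 a^3 / (E b h^3),
# and its analytical derivative with respect to crack length.
C = 8.0 * a**3 / (E * b * h**3)
dC_da = 24.0 * a**2 / (E * b * h**3)

# Irwin-Kies: strain energy release rate from the compliance derivative.
G_I = P**2 / (2.0 * b) * dC_da    # N/mm
```

In the proposed procedure the real crack length is replaced by an equivalent one computed from the measured compliance, so the fracture process zone is folded into a, and the cohesive law then follows by differentiating G_I with respect to the measured crack opening displacement.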