966 results for irreducibility criterion


Relevance: 10.00%

Abstract:

This work reports on an experimental and finite element method (FEM) parametric study of adhesively bonded single- and double-strap repairs on carbon-epoxy structures under buckling-unrestrained compression. The influence of the overlap length and patch thickness was evaluated. This loading is particularly significant because of the failure mechanisms characteristic of structures under compression, such as fibre microbuckling in buckling-restrained structures, or global buckling of the assembly when no transverse restriction exists. The FEM analysis is based on cohesive elements with mixed-mode criteria to simulate cohesive fracture of the adhesive layer. Trapezoidal laws in pure modes I and II were used to account for the ductility of most structural adhesives. These laws were estimated for the adhesive used from double cantilever beam (DCB) and end-notched flexure (ENF) tests, respectively, using an inverse technique. The pure mode III cohesive law was assumed equal to the pure mode II law. Compression failure in the laminates was predicted using a stress-based criterion. The accuracy of the FEM predictions opens a good prospect for reducing the extensive experimentation involved in the design of carbon-epoxy repairs. Design principles were also established for these repairs under buckling.
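As an illustration of the trapezoidal traction-separation laws mentioned in this abstract, the sketch below evaluates a generic pure-mode trapezoidal cohesive law and its fracture energy. All parameter values (stiffness K, strength t_u, separations delta_2 and delta_f) are hypothetical, not the laws identified from the DCB/ENF tests.

```python
import numpy as np

def trapezoidal_cohesive_law(delta, K=1e4, t_u=20.0, delta_2=0.05, delta_f=0.2):
    """Generic trapezoidal traction-separation law (pure-mode sketch).

    Rising branch with stiffness K up to delta_1 = t_u / K, a plateau at the
    cohesive strength t_u up to delta_2, then linear softening to zero
    traction at delta_f.  Units and values are hypothetical.
    """
    delta_1 = t_u / K
    if delta <= delta_1:                     # elastic branch
        return K * delta
    if delta <= delta_2:                     # ductile plateau
        return t_u
    if delta <= delta_f:                     # linear softening
        return t_u * (delta_f - delta) / (delta_f - delta_2)
    return 0.0                               # complete failure

# Fracture toughness of this sketch law = area under the curve
deltas = np.linspace(0.0, 0.2, 2001)
tractions = np.array([trapezoidal_cohesive_law(d) for d in deltas])
G_c = np.trapz(tractions, deltas)
print(f"Toughness of the sketch law: {G_c:.3f} (traction x separation units)")
```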

Relevance: 10.00%

Abstract:

OBJECTIVE: To validate a screening instrument using self-reported assessment of frailty syndrome in older adults. METHODS: This cross-sectional study used data from the Saúde, Bem-estar e Envelhecimento study conducted in Sao Paulo, SP, Southeastern Brazil. The sample consisted of 433 older adults (≥ 75 years) assessed in 2009. The self-reported instrument can be applied to older adults or their proxy respondents and consists of dichotomous questions directly related to each component of the frailty phenotype, which is considered the gold-standard model: unintentional weight loss, fatigue, low physical activity, decreased physical strength, and decreased walking speed. The same classification proposed in the phenotype was used: not frail (no component identified), pre-frail (one or two components), and frail (three or more components). Because this is a screening instrument, “process of frailty” was included as a category (pre-frail and frail). Psychometric analysis included Cronbach’s α for reliability and, for criterion validity, sensitivity, specificity, and positive and negative predictive values. Factor analysis was used to assess the suitability of the proposed number of components. RESULTS: Decreased walking speed and decreased physical strength showed good internal consistency (α = 0.77 and 0.72, respectively), whereas low physical activity was less satisfactory (α = 0.63). The sensitivity and specificity for identifying pre-frail individuals were 89.7% and 24.3%, respectively, and those for identifying frail individuals were 63.2% and 71.6%, respectively. In addition, 89.7% of the individuals were identified in the “process of frailty” category in both evaluations. CONCLUSIONS: The self-reported assessment of frailty can identify the syndrome among older adults and can be used as a screening tool. Its advantages include simplicity, rapidity, low cost, and the possibility of use by different professionals.
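The classification rule described above reduces to counting the self-reported phenotype components; the snippet below is a minimal sketch of that scoring, assuming simple boolean answers rather than the instrument's actual question wording.

```python
from typing import Dict

COMPONENTS = (
    "unintentional_weight_loss",
    "fatigue",
    "low_physical_activity",
    "decreased_physical_strength",
    "decreased_walking_speed",
)

def classify_frailty(answers: Dict[str, bool]) -> str:
    """Classify by the number of self-reported phenotype components present."""
    n = sum(bool(answers.get(c, False)) for c in COMPONENTS)
    if n == 0:
        return "not frail"
    if n <= 2:
        return "pre-frail"
    return "frail"

# Example respondent reporting fatigue and decreased walking speed
print(classify_frailty({"fatigue": True, "decreased_walking_speed": True}))  # pre-frail
```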

Relevance: 10.00%

Abstract:

The trajectory planning of redundant robots is an important area of research, and efficient optimization algorithms are needed. Pseudoinverse control is not repeatable, causing a drift in joint space that is undesirable for physical control. This paper presents a new technique that combines the closed-loop pseudoinverse method with genetic algorithms, leading to an optimization criterion for the repeatable control of redundant manipulators and avoiding the joint-angle drift problem. Computer simulations with redundant and hyper-redundant planar manipulators show that, when the end-effector traces a closed path in the workspace, the robot returns to its initial configuration. The solution is repeatable for a workspace with and without obstacles in the sense that, after executing several cycles, the initial and final states of the manipulator are very close.
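A minimal sketch of the closed-loop pseudoinverse scheme referred to above, applied to a redundant planar 3R arm tracing a closed circular path; the link lengths, gain and path are made up, and the genetic-algorithm optimization of the paper is not included.

```python
import numpy as np

L = np.array([1.0, 0.8, 0.6])          # hypothetical link lengths
K = 5.0                                 # closed-loop gain
dt = 1e-3

def fk(q):
    """End-effector position of a planar 3R manipulator."""
    a = np.cumsum(q)
    return np.array([np.sum(L * np.cos(a)), np.sum(L * np.sin(a))])

def jacobian(q):
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(a[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(a[i:]))
    return J

def track(q0, path, dpath):
    """Closed-loop pseudoinverse tracking: q_dot = pinv(J) (x_dot_d + K e)."""
    q = q0.copy()
    for xd, xd_dot in zip(path, dpath):
        e = xd - fk(q)                              # task-space error
        q_dot = np.linalg.pinv(jacobian(q)) @ (xd_dot + K * e)
        q = q + dt * q_dot                          # Euler integration
    return q

# Closed circular path starting at the initial end-effector position
q0 = np.array([0.3, 0.8, 0.9])
x0 = fk(q0)
t = np.arange(0.0, 2.0 * np.pi, dt)
r = 0.3
path = np.stack([x0[0] + r * (np.cos(t) - 1.0), x0[1] + r * np.sin(t)], axis=1)
dpath = np.stack([-r * np.sin(t), r * np.cos(t)], axis=1)

qf = track(q0, path, dpath)
print("joint drift after one cycle:", qf - q0)
```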

Relevance: 10.00%

Abstract:

In cluster analysis, it can be useful to interpret the partition built from the data in the light of external categorical variables which are not directly involved in clustering the data. An approach is proposed in the model-based clustering context to select a number of clusters which both fits the data well and takes advantage of the potential illustrative ability of the external variables. This approach makes use of the integrated joint likelihood of the data and the partitions at hand, namely the model-based partition and the partitions associated with the external variables. It is noteworthy that each mixture model is fitted to the data by maximum likelihood, excluding the external variables, which are used only to select a relevant mixture model. Numerical experiments illustrate the promising behaviour of the derived criterion. © 2014 Springer-Verlag Berlin Heidelberg.
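The following is only a loose sketch of the general idea of letting an external categorical variable inform the choice of the number of clusters: it scores each K by the mixture BIC plus a penalized log-likelihood of the external labels given the model-based partition. It is not the integrated joint likelihood criterion derived in the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_k_with_external(X, u, k_range=range(1, 9), random_state=0):
    """Score each K by the mixture BIC of the data plus a penalized
    log-likelihood of the external categorical variable u given the
    model-based partition (a sketch of the idea only)."""
    n = X.shape[0]
    cats = list(np.unique(u))
    idx = {a: i for i, a in enumerate(cats)}
    u_idx = np.array([idx[a] for a in u])
    best_k, best_score = None, -np.inf
    for k in k_range:
        gm = GaussianMixture(n_components=k, random_state=random_state).fit(X)
        z = gm.predict(X)                           # model-based partition
        fit_term = -gm.bic(X)                       # higher is better
        loglik_u = 0.0
        for c in range(k):                          # u | cluster as a multinomial
            uc = u_idx[z == c]
            if uc.size == 0:
                continue
            props = np.bincount(uc, minlength=len(cats)) / uc.size
            loglik_u += np.log(np.clip(props[uc], 1e-12, None)).sum()
        penalty = 0.5 * k * (len(cats) - 1) * np.log(n)   # BIC-style penalty
        score = fit_term + 2.0 * (loglik_u - penalty)
        if score > best_score:
            best_k, best_score = k, score
    return best_k

# Toy demonstration with two well-separated groups and a binary external label
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3, 1, (150, 2)), rng.normal(3, 1, (150, 2))])
u = np.array([0] * 150 + [1] * 150)
print("selected K:", select_k_with_external(X, u))
```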

Relevance: 10.00%

Abstract:

European Transactions on Telecommunications, vol. 18

Relevance: 10.00%

Abstract:

The search for patterns in data in order to form groups is known as data clustering, one of the most common tasks in data mining and pattern recognition. This dissertation addresses the concept of entropy and uses algorithms with entropic criteria to cluster biomedical data. The use of entropy for clustering is relatively recent and arises from an attempt to exploit entropy's ability to extract higher-order information from the data distribution, either as the criterion for forming groups (clusters) or to complement and improve existing algorithms in search of better results. Some works involving algorithms based on entropic criteria have shown positive results in the analysis of real data. In this work, several algorithms based on entropic criteria were explored and applied to biomedical data, in an attempt to assess their suitability for this type of data. The results of the tested algorithms are compared with those obtained by more "conventional" algorithms, such as k-means, spectral clustering algorithms, and a density-based algorithm.
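As a toy illustration of using an entropic quantity as a clustering criterion (not any of the specific algorithms studied in the dissertation), the sketch below compares candidate k-means partitions by the average Shannon entropy of the discretized feature distributions within each cluster.

```python
import numpy as np
from sklearn.cluster import KMeans

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a histogram of counts."""
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

def mean_within_cluster_entropy(X, labels, bins=10):
    """Average entropy of the discretized feature distributions inside each
    cluster -- lower means more concentrated, hence 'tighter' clusters."""
    total, n = 0.0, X.shape[0]
    for c in np.unique(labels):
        Xc = X[labels == c]
        h = 0.0
        for j in range(X.shape[1]):
            counts, _ = np.histogram(Xc[:, j], bins=bins)
            h += shannon_entropy(counts)
        total += (Xc.shape[0] / n) * h / X.shape[1]
    return total

# Toy comparison of two candidate numbers of clusters on synthetic data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
for k in (2, 4):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(k, round(mean_within_cluster_entropy(X, labels), 3))
```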

Relevance: 10.00%

Abstract:

This paper studies the performance of integer- and fractional-order controllers in a hexapod robot whose leg joints exhibit viscous friction and flexibility. For that purpose, the robot's prescribed motion is characterized in terms of several locomotion variables. The controller performance is analysed through the Nyquist stability criterion. A set of model-based experiments reveals the influence of the different controller implementations on the proposed metrics.
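A minimal sketch of this kind of Nyquist-based assessment, for a hypothetical single joint with inertia and viscous friction under integer- and fractional-order PD control; the fractional derivative is evaluated directly as (jω)^α on the frequency axis, and the distance of the open-loop response to the critical point -1 is used as a simple robustness indicator. The model and parameter values are assumptions, not the paper's robot model.

```python
import numpy as np

# Hypothetical single-joint model with viscous friction: G(s) = 1 / (J s^2 + B s)
J, B = 0.5, 0.2

def G(s):
    return 1.0 / (J * s**2 + B * s)

def C_frac(s, Kp=10.0, Kd=2.0, alpha=0.8):
    """PD^alpha controller evaluated on the imaginary axis;
    alpha = 1 gives the classical integer-order PD case."""
    return Kp + Kd * s**alpha

w = np.logspace(-2, 3, 2000)            # frequency grid (rad/s)
for alpha in (1.0, 0.8):                # integer vs fractional derivative order
    L = C_frac(1j * w, alpha=alpha) * G(1j * w)   # open-loop frequency response
    dist = np.abs(L + 1.0)              # distance of the Nyquist curve to -1
    print(f"alpha={alpha}: closest approach to -1 is {dist.min():.3f}")
```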

Relevance: 10.00%

Abstract:

For a better assessment and definition of an individual's intervention plan, the existence of valid and reliable assessment instruments for the Portuguese population is increasingly important. Objective: To translate and adapt the Trunk Impairment Scale (TIS) to the Portuguese population in post-stroke patients and to evaluate its psychometric properties. Methodology: The TIS was translated into Portuguese and culturally adapted to the Portuguese population. Its psychometric properties, including validity, reliability, inter-rater agreement, internal consistency, sensitivity, specificity, and responsiveness, were evaluated in a population diagnosed with stroke and in a control group of healthy participants. Eighty individuals took part in this study, divided into two groups: post-stroke individuals (40) and a group without pathology (40). Participants were assessed with the Berg Balance Scale, the Functional Independence Measure, the Fugl-Meyer Physical Performance Scale, and the TIS, in order to evaluate the latter's psychometric properties. Assessments were carried out by two experienced physiotherapists and the re-test was performed after 48 hours. Data were recorded and processed with SPSS 21.0. Results: The internal consistency of the TIS was moderate to high (Cronbach's alpha = 0.909). Regarding inter-rater reliability, the items with the lowest values were items 1 and 4 (0.759 and 0.527, respectively) and the items with the highest Kappa values were items 5 and 6 (0.830 and 0.893, respectively). Regarding criterion validity, no correlation was found with the Fugl-Meyer Physical Performance Scale, the Berg Balance Scale, or the Functional Independence Measure (r = 0.166, r = 0.017, and r = -0.002, respectively). Regarding construct validity, the median value was higher for items 1 to 5, suggesting differences between the post-stroke group and the healthy group (p < 0.001). For the other two items (6 and 7), no differences were found between the two groups (p > 0.001). Conclusion: The results of this study suggest that the Portuguese version of the TIS shows good levels of reliability and internal consistency, as well as good inter-rater agreement.
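Cronbach's alpha, reported above for internal consistency, can be computed directly from an item-score matrix; the sketch below uses made-up scores for illustration only (they are not TIS data).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the total score
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical scores of five subjects on seven items (illustrative only)
scores = np.array([
    [3, 2, 3, 1, 2, 3, 2],
    [1, 1, 0, 0, 1, 1, 0],
    [2, 2, 2, 1, 2, 2, 1],
    [0, 1, 0, 0, 0, 1, 0],
    [3, 3, 2, 2, 3, 3, 2],
])
print(round(cronbach_alpha(scores), 3))
```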

Relevance: 10.00%

Abstract:

Volatile organic compounds are a common source of groundwater contamination that can be easily removed by air stripping in columns with random packing, using a counter-current flow between the phases. This work proposes a new design methodology, for any particular type of packing and contaminant, that avoids the need for the pre-defined column diameter used in the classical approach. It also renders unnecessary the graphical Eckert generalized correlation for pressure drop estimates. The hydraulic features are chosen first as a design criterion, and only afterwards are the mass transfer phenomena incorporated, in contrast to the conventional approach. The design procedure was translated into a convenient algorithm using C++ as the programming language. A column was built in order to test the models used both in the design and in the simulation of the column performance. The experiments were carried out using a solution of chloroform in distilled water. Another model was built to simulate the operational performance of the column, both at steady state and under transient conditions. It consists of a system of two nonlinear partial differential equations (distributed parameters). Nevertheless, when the flows are steady, the system becomes linear, although no analytical solution is evident. At steady state the resulting system of ODEs can be solved, allowing for the calculation of the concentration profiles in both phases inside the column. In the transient state, the system of PDEs was solved numerically by finite differences, after a previous linearization.
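The classical transfer-unit sizing on which air-stripping column design is based (stripping factor, NTU, HTU) can be sketched as follows; the parameter values are hypothetical and this is not the paper's hydraulics-first algorithm.

```python
import math

def packing_height(C_in, C_out, Q_w, Q_a, H, KLa, area):
    """Classical transfer-unit sizing of a counter-current air-stripping column.

    C_in, C_out : liquid inlet/outlet contaminant concentrations (same units)
    Q_w, Q_a    : water and air volumetric flow rates (m3/s)
    H           : dimensionless Henry's constant (gas/liquid)
    KLa         : overall liquid-phase mass transfer coefficient (1/s)
    area        : column cross-sectional area (m2)
    """
    R = H * Q_a / Q_w                                   # stripping factor
    NTU = (R / (R - 1.0)) * math.log(((C_in / C_out) * (R - 1.0) + 1.0) / R)
    HTU = (Q_w / area) / KLa                            # liquid loading / KLa
    return HTU * NTU

# Hypothetical chloroform example: 95 % removal
Z = packing_height(C_in=1.0, C_out=0.05, Q_w=1e-3, Q_a=2e-2, H=0.15,
                   KLa=8e-3, area=0.05)
print(f"required packing height: {Z:.2f} m")
```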

Relevance: 10.00%

Abstract:

Volatile organic compounds are a common source of groundwater contamination that can be easily removed by air stripping in columns with random packing, using a counter-current flow between the phases. This work proposes a new methodology for column design, for any type of packing and contaminant, which avoids the need for an arbitrarily chosen diameter. It also avoids the use of the usual graphical Eckert correlations for pressure drop. The hydraulic features are chosen first as a design criterion. The design procedure was translated into a convenient algorithm in the C++ language. A column was built in order to test the design and the theoretical steady-state and dynamic behaviour. The experiments were conducted using a solution of chloroform in distilled water. The results allowed for a correction of the theoretical global mass transfer coefficient previously estimated by the Onda correlations, which depend on several parameters that are not easy to control in experiments. To better describe the column behaviour under steady-state and dynamic conditions, an original mathematical model was developed. It consists of a system of two nonlinear partial differential equations (distributed parameters). Nevertheless, when the flows are steady, the system becomes linear, although no analytical solution is evident. At steady state the resulting ODEs can be solved by analytical methods, and in the dynamic state the discretization of the PDEs by finite differences overcomes this difficulty. A numerical algorithm was used to estimate the contaminant concentrations in both phases along the column. The high number of resulting algebraic equations and the impossibility of generating a recursive procedure did not allow the construction of a generalized program, but an iterative procedure developed in an electronic worksheet allowed for the simulation. The solution is stable only for compatible time and space discretization steps; with other combinations of the discretization parameters it easily becomes unstable. The dynamic behaviour of the system was simulated for common liquid-phase perturbations: step, impulse, rectangular pulse, and sinusoidal. The final results do not show any unexpected behaviour.
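A simplified explicit upwind finite-difference sketch of a counter-current stripping column (liquid flowing down, gas flowing up, linear interphase transfer) is shown below. It ignores phase holdups, uses hypothetical parameters, and is not the model or worksheet implementation of the paper.

```python
import numpy as np

# Simplified counter-current stripping column: liquid flows down (-z), gas up (+z).
#   d c_L/dt = +u_L d c_L/dz - KLa (c_L - c_G / H)
#   d c_G/dt = -u_G d c_G/dz + KLa (c_L - c_G / H)
# Explicit upwind scheme; all parameter values are hypothetical.
Z, N = 2.0, 100                    # column height (m), grid cells
dz = Z / N
u_L, u_G = 0.01, 0.20              # phase velocities (m/s)
KLa, H = 0.02, 0.15                # transfer coefficient (1/s), Henry constant
dt = 0.4 * dz / max(u_L, u_G)      # CFL-limited time step
c_L = np.zeros(N + 1)              # liquid concentration profile
c_G = np.zeros(N + 1)              # gas concentration profile
c_L_in = 1.0                       # contaminated water fed at the top

for step in range(int(600.0 / dt)):            # march towards steady state
    flux = KLa * (c_L - c_G / H)               # interphase transfer term
    new_L = c_L.copy()
    new_G = c_G.copy()
    # liquid moves towards smaller z: upwind uses the node above
    new_L[:-1] = c_L[:-1] + dt * (u_L * (c_L[1:] - c_L[:-1]) / dz - flux[:-1])
    new_L[-1] = c_L_in                         # liquid inlet at the top
    # gas moves towards larger z: upwind uses the node below
    new_G[1:] = c_G[1:] + dt * (-u_G * (c_G[1:] - c_G[:-1]) / dz + flux[1:])
    new_G[0] = 0.0                             # clean air fed at the bottom
    c_L, c_G = new_L, new_G

print(f"removal after the simulated period: {(1.0 - c_L[0] / c_L_in) * 100.0:.1f} %")
```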

Relevance: 10.00%

Abstract:

Project work presented to the Instituto de Contabilidade e Administração do Porto for the degree of Master in Auditing. Supervisor: Doutora Alcina Augusta de Sena Portugal Dias. Co-supervisor: Doutora Amélia Cristina Ferreira Silva.

Relevance: 10.00%

Abstract:

Scientific article currently available in Early View (Online Version of Record published before inclusion in an issue).

Relevance: 10.00%

Abstract:

Beam-like structures are among the most common components in engineering practice, and single-side damage is often encountered. In this study, single-side damage in a free-free beam is analysed numerically with three different finite element models, namely solid, shell and beam models, to assess their performance in simulating real structures. As in the experiment, damage is introduced into one side of the beam, and natural frequencies are extracted from the simulations and compared with experimental and analytical results. Mode shapes are also analysed with the modal assurance criterion. The simulations show that all three models perform well in extracting natural frequencies; in the intact state the solid model performs better than the shell model, which in turn performs better than the beam model. For damaged states, the natural frequencies obtained from the solid model are more sensitive to damage severity than those from the shell model, while the shell model performs similarly to the beam model in distinguishing damage. The main contribution of this paper is a comparison between three finite element models, experimental data, and analytical solutions. Overall, the finite element results show relatively good performance.
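The modal assurance criterion used above to compare mode shapes is a normalized squared projection between two mode-shape vectors; below is a minimal sketch with toy shapes (not the beam's actual free-free modes).

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors (0..1)."""
    phi_a = np.asarray(phi_a, dtype=float).ravel()
    phi_b = np.asarray(phi_b, dtype=float).ravel()
    return np.dot(phi_a, phi_b) ** 2 / (np.dot(phi_a, phi_a) * np.dot(phi_b, phi_b))

def mac_matrix(modes_a, modes_b):
    """MAC matrix comparing two sets of mode shapes (one mode per column)."""
    return np.array([[mac(a, b) for b in modes_b.T] for a in modes_a.T])

# Toy example: two shapes sampled at 10 points, one set slightly perturbed
x = np.linspace(0.0, 1.0, 10)
experimental = np.column_stack([np.sin(np.pi * x), np.sin(2 * np.pi * x)])
numerical = np.column_stack([np.sin(np.pi * x) + 0.02, np.sin(2 * np.pi * x)])
print(np.round(mac_matrix(experimental, numerical), 3))
```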

Relevance: 10.00%

Abstract:

Feature discretization (FD) techniques often yield adequate and compact representations of the data, suitable for machine learning and pattern recognition problems. These representations usually decrease the training time and yield higher classification accuracy, while allowing humans to better understand and visualize the data, as compared to the use of the original features. This paper proposes two new FD techniques. The first is based on the well-known Linde-Buzo-Gray quantization algorithm coupled with a relevance criterion, and is able to perform unsupervised, supervised, or semi-supervised discretization. The second technique works in supervised mode and is based on the maximization of the mutual information between each discrete feature and the class label. Our experimental results on standard benchmark datasets show that these techniques scale up to high-dimensional data, attaining in many cases better accuracy than existing unsupervised and supervised FD approaches, while using fewer discretization intervals.
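A simplified stand-in for the second (supervised, mutual-information-based) technique: for one feature, choose the equal-frequency binning that maximizes mutual information with the class label. This only illustrates the criterion, not the authors' algorithm.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def discretize_by_mi(x, y, max_bins=8):
    """Pick, for one feature, the equal-frequency binning (2..max_bins bins)
    that maximizes mutual information with the class label y."""
    best_bins, best_mi, best_codes = None, -1.0, None
    for b in range(2, max_bins + 1):
        edges = np.quantile(x, np.linspace(0.0, 1.0, b + 1)[1:-1])
        codes = np.searchsorted(edges, x)          # discrete feature values
        mi = mutual_info_score(y, codes)
        if mi > best_mi:
            best_bins, best_mi, best_codes = b, mi, codes
    return best_codes, best_bins, best_mi

# Toy feature whose sign carries the class information
rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = (x > 0).astype(int)
codes, bins, mi = discretize_by_mi(x, y)
print(f"chosen bins: {bins}, MI with label: {mi:.3f}")
```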