966 results for irreducibility criterion


Relevance:

10.00%

Publisher:

Abstract:

Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Early Intervention.

The ECG signal has been shown to contain relevant information for human identification. Even though results validate the potential of these signals, the data acquisition methods and apparatus explored so far compromise user acceptability by requiring acquisition of the ECG at the chest. In this paper, we propose a finger-based ECG biometric system that uses signals collected at the fingers through a minimally intrusive one-lead ECG setup, with gel-free Ag/AgCl electrodes as the interface with the skin. The collected signal is significantly noisier than an ECG acquired at the chest, motivating the application of feature extraction and signal processing techniques to the problem. Time-domain ECG signal processing is performed, comprising the usual steps of filtering, peak detection, heartbeat waveform segmentation and amplitude normalization, plus an additional step of time normalization. Using a simple minimum-distance criterion between the test patterns and the enrollment database, results reveal this to be a promising technique for biometric applications.
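The final matching step described above, amplitude-normalized heartbeat templates compared by a minimum-distance rule, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the Euclidean distance, the mean-template enrollment and the toy waveforms are all assumptions.

```python
import numpy as np

np.random.seed(0)

def normalize(beat):
    """Amplitude-normalize a segmented heartbeat waveform to [0, 1]."""
    beat = np.asarray(beat, dtype=float)
    return (beat - beat.min()) / (beat.max() - beat.min())

def identify(test_beat, enrollment):
    """Return the subject whose mean enrollment template is closest
    (Euclidean distance) to the normalized test heartbeat."""
    t = normalize(test_beat)
    best, best_d = None, np.inf
    for subject, templates in enrollment.items():
        template = np.mean([normalize(b) for b in templates], axis=0)
        d = np.linalg.norm(t - template)
        if d < best_d:
            best, best_d = subject, d
    return best

# Toy example: two subjects with distinct synthetic "heartbeat" shapes.
x = np.linspace(0, 1, 50)
enrollment = {
    "alice": [np.sin(np.pi * x) + 0.01 * np.random.randn(50) for _ in range(3)],
    "bob":   [x ** 2 + 0.01 * np.random.randn(50) for _ in range(3)],
}
print(identify(np.sin(np.pi * x), enrollment))  # "alice"
```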

OBJECTIVE: To translate the Need for Recovery Scale (NFR) into Brazilian Portuguese, culturally adapt it, and assess the stability, internal consistency and convergent validity of the Brazilian scale among industrial workers. METHODS: The translation process followed the guidelines for cultural adaptation of questionnaires, including the steps of translation, synthesis, back translation, expert committee review and pre-testing. The final Brazilian Portuguese version of the NFR (Br-NFR) was assessed for stability (n=52), internal consistency (n=192) and convergent validity through simultaneous assessment with other instruments: the Borg Scale (n=59), the Chalder Fatigue Questionnaire (n=57) and three subscales of the SF-36 (n=56). RESULTS: Stability and internal consistency met the criterion for a reliable measure (ICC=0.80 and Cronbach's alpha=0.87, respectively). The convergent validity between the Br-NFR and the other instruments also showed good results: Borg Scale (r=0.64); Chalder Questionnaire (r=0.67); SF-36 subscales: vitality (r=-0.84), physical functioning (r=-0.54) and role-physical (r=-0.47). CONCLUSIONS: The Br-NFR proved to be a reliable instrument to evaluate work-related fatigue symptoms in industrial workers. Furthermore, it showed significant and good correlations with well-established instruments such as the Borg Scale, the Chalder Questionnaire and the SF-36 vitality subscale, supporting the validity of the Br-NFR.
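Internal consistency above is reported as Cronbach's alpha; a minimal sketch of how that statistic is computed from an item-score matrix (standard formula, hypothetical scores):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Three perfectly consistent items yield the maximum alpha of 1.0.
scores = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [5, 5, 5]])
print(cronbach_alpha(scores))  # 1.0
```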

This work aims to simulate and verify the operation of three distillation columns (T-0303, T-0306 and T-0307) of the p-xylene production process at the Galp refinery in Porto, based on data from 2008. The approach used Aspen Plus for both simulation and optimization, the latter complemented with experimental design and optimization in Minitab 15. The optimization criterion was established from an analysis of the current process, which showed that, at the limit, it would be possible to produce an additional 15.30 ton/year of p-xylene in columns T-0306 and T-0307 together, remove an additional 1.36 ton/year of desorbent in column T-0303, and reduce the energy required by the process. Optimizing column T-0303 improved desorbent removal by 0.34 ton/year and reduced the required energy to 333.24×10^6 kWh per year; achieving this required exceeding the power of pump P0306A/S by 109.852 kW and changing the bottom reflux ratio to 46.1. Joint optimization of columns T-0306 and T-0307 yields an improvement of only 3.4 ton/year of p-xylene, whereas optimizing column T-0307 alone, with column T-0306 kept at current conditions, improved p-xylene production by 14.62 ton/year; in that run, the duties of condenser E-0314, reboiler E-0306 and pump P0314A/S exceed the current ones by 35.71 kW, 35.74 kW and 0.12 kW, respectively. While p-xylene currently costs 722.17 €/ton to produce, it costs 723.39 €/ton under the simultaneous optimization of columns T-0303 and T-0307, and 722.81 €/ton under the optimization of column T-0307 alone. Against a current p-xylene selling price of 749.10 €/ton, all scenarios are favourable. In short, process optimization is possible, but the cost per ton of p-xylene ends up higher than the current one.

The aim of this paper is to develop models for experimental open-channel water delivery systems and to assess the use of three data-driven modeling tools toward that end. Water delivery canals are nonlinear dynamical systems and thus should be modeled to meet given operational requirements while capturing all relevant dynamics, including transport delays. Typically, the derivation of first-principles models for open-channel systems is based on the Saint-Venant equations for shallow water, which is a time-consuming task and demands specific expertise. The present paper proposes and assesses the use of three data-driven modeling tools: artificial neural networks, composite local linear models and fuzzy systems. The canal of the Hydraulics and Canal Control Nucleus (Évora University, Portugal) is used as a benchmark: the models are identified using data collected from the experimental facility, and their performance is then assessed against a suitable validation criterion. The performance of all models is compared with each other and against the experimental data to show the effectiveness of such tools in capturing all significant dynamics within the canal system and, therefore, in providing accurate nonlinear models that can be used for simulation or control. The models are available upon request to the authors.
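Of the three tools, the composite local linear models admit a particularly compact sketch: a few local linear models blended by normalized validity functions. The Gaussian validity functions and the toy one-input models below are assumptions for illustration, not the paper's identified canal models.

```python
import numpy as np

def local_linear_predict(u, centers, widths, coeffs):
    """Composite local linear model: each local model i is a line
    a_i*u + b_i, blended by normalized Gaussian validity functions
    centred on 'centers' with the given 'widths'."""
    w = np.exp(-0.5 * ((u - centers) / widths) ** 2)
    w = w / w.sum()                           # normalized validity weights
    local = coeffs[:, 0] * u + coeffs[:, 1]   # each local model's output
    return float(w @ local)

# Two local models approximating y = |u| around u = -1 and u = +1.
centers = np.array([-1.0, 1.0])
widths = np.array([0.5, 0.5])
coeffs = np.array([[-1.0, 0.0],   # y = -u, valid near u = -1
                   [ 1.0, 0.0]])  # y = +u, valid near u = +1
print(local_linear_predict(1.0, centers, widths, coeffs))  # close to 1.0
```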

Research on the problem of feature selection for clustering continues to develop. This is a challenging task, mainly due to the absence of class labels to guide the search for relevant features. Categorical feature selection for clustering has rarely been addressed in the literature, with most of the proposed approaches focusing on numerical data. In this work, we propose an approach to simultaneously cluster categorical data and select a subset of relevant features. Our approach is based on a modification of a finite mixture model (of multinomial distributions), in which a set of latent variables indicates the relevance of each feature. To estimate the model parameters, we implement a variant of the expectation-maximization algorithm that simultaneously selects the subset of relevant features, using a minimum message length criterion. The proposed approach compares favourably with two baseline methods: a filter based on an entropy measure and a wrapper based on mutual information. The results obtained on synthetic data illustrate the ability of the proposed expectation-maximization method to recover the ground truth. An application to real data, concerning official statistics, shows its usefulness.
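The entropy-based filter used as a baseline can be illustrated minimally: rank categorical features by the Shannon entropy of their value distributions, so that near-constant features (which carry little information for clustering) fall to the bottom. The ranking rule below is an illustrative simplification of such a filter, not the paper's implementation.

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of a sequence of categorical values."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def rank_features(rows):
    """Rank the features of a categorical dataset (list of equal-length
    tuples) by entropy, highest first."""
    cols = list(zip(*rows))
    scores = [entropy(col) for col in cols]
    return sorted(range(len(cols)), key=lambda j: scores[j], reverse=True)

# Feature 0 is balanced (entropy 1 bit); feature 1 is nearly constant.
data = [("a", "x"), ("b", "x"), ("a", "x"), ("b", "y")]
print(rank_features(data))  # [0, 1]
```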

Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. In order to estimate the number of clusters, one often resorts to information criteria, such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters; this EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach, when compared to the criteria referred to above, is the speed of execution, which is especially relevant when dealing with large datasets.
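The model-selection step can be illustrated with BIC, one of the information criteria cited above: each candidate number of clusters is scored by a penalized log-likelihood, and the minimizer is chosen. The log-likelihood values and parameter counts below are hypothetical.

```python
import math

def bic(log_likelihood, n_params, n_samples):
    """Bayesian information criterion (lower is better):
    BIC = -2 ln L + p ln n."""
    return -2.0 * log_likelihood + n_params * math.log(n_samples)

# Hypothetical fits of mixtures with k = 1..4 components to n = 500 points:
# the log-likelihood improves with k, but the penalty favours parsimony.
n = 500
fits = {1: (-1400.0, 3), 2: (-1250.0, 7), 3: (-1245.0, 11), 4: (-1243.0, 15)}
scores = {k: bic(ll, p, n) for k, (ll, p) in fits.items()}
best_k = min(scores, key=scores.get)
print(best_k)  # 2
```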

The demands on modern engineering keep growing: lighter structures with attractive mechanical properties, often with complex geometries, are almost always sought. Under such requirements, composite materials have seen increasing application. However, the structural analysis of these materials is considerably more complex, since they are generally formed by stacking several layers of heterogeneous material, which may be arranged in different orientations. Software that can predict the mechanical properties of a composite structure through micromechanics, apply Classical Lamination Theory, and apply a failure criterion such as Tsai-Hill is therefore essential to streamline the study of the structure to be manufactured. To meet this need, an application named CAFE (Composite Analysis For Engineers) was developed as a MATLAB® GUI with an appealing graphical environment, which determines all the relevant variables in the study of composite structures. The application aims to support and speed up learning in this field, and also gives the user access to the computation code, so that the equations used can be inspected and, eventually, extended in future developments. The program was validated by comparing its results with those of another, highly reliable program. It was concluded that the CAFE software yields valid results and is ready for use.
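The Tsai-Hill failure criterion mentioned above has a standard plane-stress form that is easy to sketch. The lamina strengths in the example are hypothetical, not values from the CAFE validation.

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill failure index for a unidirectional lamina under plane
    stress: (s1/X)^2 - s1*s2/X^2 + (s2/Y)^2 + (t12/S)^2, where s1, s2
    are the stresses along/across the fibres, t12 the in-plane shear
    stress, and X, Y, S the corresponding strengths. Failure is
    predicted when the index reaches 1."""
    return (s1 / X) ** 2 - (s1 * s2) / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2

# Hypothetical lamina strengths (MPa): X = 1500, Y = 40, S = 68.
# Pure in-plane shear equal to S sits exactly on the failure envelope:
print(tsai_hill_index(0.0, 0.0, 68.0, 1500.0, 40.0, 68.0))  # 1.0
```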

This study compared the ground reaction forces (GRF) and plantar pressures between unloaded and occasional loaded gait. The GRF and plantar pressures of 60 participants were recorded during unloaded gait and occasional loaded gait (wearing a backpack that raised their body mass index to 30); this load criterion was adopted because it is considered potentially harmful in permanent loaded gait (as in obese people). The results indicate an overall increase (in absolute values) of GRF and plantar pressures during occasional loaded gait (p < 0.05); higher normalized (by total weight) values in the medial midfoot and toes, and lower values in the lateral rearfoot region, were also observed. During loaded gait the magnitude of the vertical GRF (impact and thrust maxima) decreased and the shear forces increased more than in proportion to the load (normalized values). These data suggest a different pattern of GRF and plantar pressure distribution during occasional loaded gait compared to unloaded gait.

The basic motivation of this work was the integration of biophysical models within the interval constraints framework for decision support. Comparing the major features of biophysical models with the expressive power of the existing interval constraints framework, it was clear that the most important inadequacy was the representation of differential equations. System dynamics is often modelled through differential equations, but there was no way of expressing a differential equation as a constraint and integrating it within the constraints framework. Consequently, the goal of this work is focused on the integration of ordinary differential equations within the interval constraints framework, which for this purpose is extended with the new formalism of Constraint Satisfaction Differential Problems. This framework allows the specification of ordinary differential equations, together with related information, by means of constraints, and provides efficient propagation techniques for pruning the domains of their variables. It enables the integration of all such information in a single constraint whose variables may subsequently be used in other constraints of the model. The specific method used for pruning its variable domains can then be combined with the pruning methods associated with the other constraints in an overall propagation algorithm for reducing the bounds of all model variables. The application of the constraint propagation algorithm for pruning the variable domains, that is, the enforcement of local consistency, turned out to be insufficient to support decisions in practical problems that include differential equations: the domain pruning achieved is, in general, not sufficient to allow safe decisions, mainly because of the non-linearity of the differential equations.
Consequently, a complementary goal of this work is a new strong consistency criterion, Global Hull-consistency, particularly suited to decision support with differential models, which presents an adequate trade-off between domain pruning and computational effort. Several alternative algorithms are proposed for enforcing Global Hull-consistency and, given their complexity, an effort was made to provide implementations able to supply anytime pruning results. Since the consistency criterion depends on the existence of canonical solutions, a local search approach is proposed that can be integrated with constraint propagation in continuous domains and, in particular, with the enforcement algorithms, to anticipate the finding of canonical solutions. The last goal of this work is the validation of the approach as an important contribution to the integration of biophysical models within decision support. To that end, a prototype application integrating all the proposed extensions to the interval constraints framework was developed and used for solving problems in different biophysical domains.
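The kind of domain pruning discussed above can be illustrated on the simplest possible case: hull-consistency for a single linear constraint x + y = c over interval domains. This is local (hull) pruning only, a much weaker operation than the Global Hull-consistency enforcement the thesis proposes, and the constraint is an assumption chosen for clarity.

```python
def prune_sum(x, y, c):
    """Hull-consistency pruning for the constraint x + y = c over the
    interval domains x = (xl, xh) and y = (yl, yh): each bound of x is
    narrowed using c - y, and then y is narrowed using c - x."""
    xl, xh = x
    yl, yh = y
    x2 = (max(xl, c - yh), min(xh, c - yl))
    y2 = (max(yl, c - x2[1]), min(yh, c - x2[0]))
    return x2, y2

# x, y in [0, 8] with x + y = 10: both domains are pruned to [2, 8],
# since e.g. x < 2 would force y > 8.
print(prune_sum((0.0, 8.0), (0.0, 8.0), 10.0))  # ((2.0, 8.0), (2.0, 8.0))
```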

The main objective of this work was to investigate the application of experimental design techniques to the identification of Michaelis-Menten kinetic parameters. More specifically, this study attempts to elucidate the relative advantages and disadvantages of employing complex experimental design techniques, in relation to equidistant sampling, when applied to different reactor operation modes. All studies were supported by simulated data of a generic enzymatic process that obeys the Michaelis-Menten kinetic equation. Different aspects were investigated, such as the influence of the reactor operation mode (batch, fed-batch with pulse-wise feeding and fed-batch with continuous feeding) and of the experimental design optimality criterion on the effectiveness of kinetic parameter identification. The following experimental design optimality criteria were investigated: 1) minimization of the trace of the inverse of the Fisher information matrix (FIM) (A-criterion); 2) maximization of the determinant of the FIM (D-criterion); 3) maximization of the smallest eigenvalue of the FIM (E-criterion); and 4) minimization of the quotient between the largest and the smallest eigenvalue (modified E-criterion). The comparison and assessment of the different methodologies was made on the basis of the Cramér-Rao lower bound (CRLB) errors of the parameters vmax and Km of the Michaelis-Menten kinetic equation. Concerning the reactor operation mode, it was concluded that fed-batch (pulses) operation is better than batch operation for parameter identification: when the former is adopted, the vmax CRLB error is lowered by 18.6% and the Km CRLB error by 26.4% relative to batch operation.
Regarding the optimality criteria, the best method was the A-criterion, with an average vmax CRLB error of 6.34% and 5.27% for batch and fed-batch (pulses), respectively, and a Km CRLB error of 25.1% and 18.1%, respectively. As a general conclusion of the present study, experimental design is justified if the starting parameter CRLB errors are below 19.5% (vmax) and 45% (Km) for batch processes, and below 42% and 50%, respectively, for fed-batch (pulses) processes; otherwise, equidistant sampling is the more rational choice. This conclusion clearly supports that, for fed-batch operation, the use of experimental design is likely to largely improve the identification of Michaelis-Menten kinetic parameters.
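The four optimality criteria above are all simple functions of the FIM, which makes them easy to sketch; the 2x2 FIM in the example is a hypothetical stand-in for a real (vmax, Km) information matrix.

```python
import numpy as np

def design_criteria(fim):
    """The four optimality criteria named above, computed from a Fisher
    information matrix: A (trace of the inverse, to minimize),
    D (determinant, to maximize), E (smallest eigenvalue, to maximize)
    and modified E (largest/smallest eigenvalue, to minimize)."""
    eig = np.linalg.eigvalsh(fim)
    return {
        "A": float(np.trace(np.linalg.inv(fim))),
        "D": float(np.linalg.det(fim)),
        "E": float(eig.min()),
        "modE": float(eig.max() / eig.min()),
    }

fim = np.diag([4.0, 1.0])    # toy 2x2 FIM for (vmax, Km)
print(design_criteria(fim))  # A=1.25, D=4.0, E=1.0, modE=4.0
```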

This project aims to intervene in the Human Resources area of its host organization; in this context, the Centro Social e Paroquial de S. Martinho de Brufe was identified for its implementation. The diagnosis carried out identified the Human Resources Management System as an opportunity for intervention. Considering the requirements defined by the Modelo de Avaliação da Qualidade das Respostas Sociais (MAQRS), a diagnosis of the host organization was performed, followed by the precise specification of the identified opportunity and the strategic and operational planning of the intervention. The next phase involved the implementation of the project. We concluded with the evaluation and the presentation of the measures needed to achieve the proposed goal. The evaluation results show that the planning and implementation of the project were efficient and effective, since the final audit found no non-conformities in the intervention project. As the project's purpose was to ensure that the Centro Social e Paroquial de S. Martinho de Brufe meets all the requirements of Criterion 2 (People) of the MAQRS, of the Instituto da Segurança Social, so as to successfully submit the certification process in July 2014, this document contains all the procedures needed to ensure that goal is achieved. The Centro Social e Paroquial de S. Martinho de Brufe has the next six months (January to June 2014) to present evidence of formalization, which is also a necessary condition preceding submission of the certification process.

OBJECTIVE To analyze evidence of the validity and reliability of a Brazilian Portuguese version of the Quality of Care Scale from the perspective of people with physical and intellectual disabilities. METHODS A total of 162 people with physical disabilities and 156 people with intellectual disabilities from Porto Alegre and its metropolitan region participated in the study in 2008. Classical psychometrics was used to analyze the two samples independently. Hypotheses for evidence of criterion validity (concurrent type) were tested with the Mann-Whitney test for non-normal distributions. Principal component analysis was used to explore factorial models. Evidence of reliability was calculated with Cronbach's alpha for the scales and subscales. Test-retest reliability was analyzed for individuals with intellectual disabilities through the intra-class correlation coefficient and the Wilcoxon test. RESULTS The principal components in the group with physical disabilities replicated the original model presented as a solution for the international project data. Evidence of discriminant validity and test-retest reliability was found. CONCLUSIONS The transcultural factor model found with the international project sample seems appropriate for the samples investigated in this study, especially the physical disabilities group. Depression, pain, satisfaction with life and disability may play a mediating role in the evaluation of quality of care. Additional research is needed to add to the evidence of validity of the instruments.

Discrete data representations are necessary, or at least convenient, in many machine learning problems. While feature selection (FS) techniques aim at finding relevant subsets of features, the goal of feature discretization (FD) is to find concise (quantized) data representations, adequate for the learning task at hand. In this paper, we propose two incremental methods for FD. The first method belongs to the filter family, in which the quality of the discretization is assessed by a (supervised or unsupervised) relevance criterion. The second method is a wrapper, where discretized features are assessed using a classifier. Both methods can be coupled with any static (unsupervised or supervised) discretization procedure and can be used to perform FS as pre-processing or post-processing stages. The proposed methods attain efficient representations suitable for binary and multi-class problems with different types of data, being competitive with existing methods. Moreover, using well-known FS methods with the features discretized by our techniques leads to better accuracy than with the features discretized by other methods or with the original features. (C) 2013 Elsevier B.V. All rights reserved.
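A filter-style incremental FD loop of the kind described, growing the quantizer until a relevance criterion is met, can be sketched as follows. The equal-width quantizer and the correlation-based relevance measure used here are illustrative stand-ins, not the paper's methods.

```python
import numpy as np

def incremental_discretize(x, relevance, threshold, max_bits=8):
    """Filter-style incremental feature discretization sketch: grow the
    number of quantization levels (2**bits) until an unsupervised
    relevance criterion on the quantized feature reaches the threshold."""
    for bits in range(1, max_bits + 1):
        edges = np.linspace(x.min(), x.max(), 2 ** bits + 1)[1:-1]
        q = np.digitize(x, edges)          # equal-width quantization
        if relevance(x, q) >= threshold:
            return bits, q
    return max_bits, q

# Relevance here (an illustrative choice): absolute correlation between
# the original feature and its quantized version.
corr = lambda x, q: abs(np.corrcoef(x, q)[0, 1])
x = np.linspace(0.0, 1.0, 200)
bits, q = incremental_discretize(x, corr, threshold=0.99)
print(bits)
```

For a uniformly spread feature, a handful of bits already preserves almost all of the linear relationship, which is why the loop terminates early.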

OBJECTIVE To develop a model for evaluating the efficacy of the drug-dispensing service in primary health care. METHODS An efficacy criterion was adopted to determine the level of achievement of the service objectives. The evaluation model was developed on the basis of a literature search and discussions with experts. The applicability test of the model was conducted in 15 primary health care units in the city of Florianópolis, state of Santa Catarina, in 2010, and data were recorded in structured, pre-tested questionnaires. RESULTS The model developed was evaluated using five dimensions of analysis. It was suitable for evaluating service efficacy and helped to identify the critical points of each service dimension. CONCLUSIONS Adaptations to the data collection technique may be required to adjust for the reality and needs of each situation. The evaluation of the drug-dispensing service should promote adequate access to medications supplied through the public health system.