963 results for Applicant criterion


Relevance:

10.00%

Publisher:

Abstract:

Research on the problem of feature selection for clustering continues to develop. This is a challenging task, mainly due to the absence of class labels to guide the search for relevant features. Categorical feature selection for clustering has rarely been addressed in the literature, with most of the proposed approaches having focused on numerical data. In this work, we propose an approach to simultaneously cluster categorical data and select a subset of relevant features. Our approach is based on a modification of a finite mixture model (of multinomial distributions), where a set of latent variables indicates the relevance of each feature. To estimate the model parameters, we implement a variant of the expectation-maximization algorithm that simultaneously selects the subset of relevant features, using a minimum message length criterion. The proposed approach compares favourably with two baseline methods: a filter based on an entropy measure and a wrapper based on mutual information. The results obtained on synthetic data illustrate the ability of the proposed expectation-maximization method to recover ground truth. An application to real data from official statistics shows its usefulness.
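The abstract does not include code. As a point of reference only, a minimal EM sketch for a plain mixture of independent categorical (multinomial) features can look as follows; it omits the feature-saliency latent variables and the minimum message length penalty that are the paper's actual contribution, and all names are illustrative.

```python
import numpy as np

def em_multinomial_mixture(X, k, n_iter=50, seed=0):
    """EM for a mixture of independent categorical features.
    X: (n, d) integer array, each column a categorical feature.
    Returns mixing weights pi and per-cluster category probabilities theta."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    n_cat = int(X.max()) + 1
    pi = np.full(k, 1.0 / k)
    # theta[c, j, v] = P(feature j takes value v | cluster c)
    theta = rng.dirichlet(np.ones(n_cat), size=(k, d))
    for _ in range(n_iter):
        # E-step: responsibilities from per-cluster log-likelihoods
        log_r = np.log(pi)[None, :] + np.zeros((n, k))
        for j in range(d):
            log_r += np.log(theta[:, j, :][:, X[:, j]]).T
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights and (smoothed) category probabilities
        pi = r.mean(axis=0)
        for c in range(k):
            for j in range(d):
                counts = np.bincount(X[:, j], weights=r[:, c], minlength=n_cat)
                theta[c, j] = (counts + 1e-9) / (counts.sum() + n_cat * 1e-9)
    return pi, theta
```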

Relevance:

10.00%

Publisher:

Abstract:

Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. In order to estimate the number of clusters, one often resorts to information criteria, such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters. This EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach, when compared to the criteria referred to above, is the speed of execution, which is especially relevant when dealing with large data sets.
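The key difference from standard EM in the Figueiredo-Jain scheme is the M-step for the mixing weights, which subtracts half the per-component parameter count from each component's expected support, driving weakly supported components to zero weight and thus folding model selection into estimation. A minimal sketch of that update alone (names illustrative; the per-component parameter count is an input assumption, not taken from the paper):

```python
import numpy as np

def mml_weight_update(resp, n_params_per_component):
    """Figueiredo-Jain style M-step for mixing weights: a component whose
    expected support falls below half its parameter count is annihilated
    (its weight driven to zero)."""
    support = resp.sum(axis=0)  # expected number of points per component
    w = np.maximum(0.0, support - n_params_per_component / 2.0)
    total = w.sum()
    if total == 0:
        raise ValueError("all components annihilated")
    return w / total
```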

Relevance:

10.00%

Publisher:

Abstract:

The challenges facing modern engineering are ever greater: lighter structures are almost always sought, with attractive mechanical properties and often complex geometries. With such requirements, composite materials have seen increasing application. However, the structural analysis of these materials is considerably more complex, since they are generally built by stacking several layers of heterogeneous material, which may be laid up in different orientations. Software that can predict the mechanical properties of a composite structure through micromechanics, apply the Classical Laminate Theory, and apply a failure criterion such as Tsai-Hill is therefore fundamental to streamline the study of the structure to be manufactured. To meet this need, an application named CAFE (Composite Analysis For Engineers) was developed as a MATLAB® GUI, with an appealing graphical environment, which determines all the important variables in the study of composite structures. The application aims to support and speed up learning in this field, and also gives the user access to the calculation code, so that the equations used are known and can be the target of future development. The program was validated by comparing its results with those obtained from another, highly reliable program. It was concluded that the CAFE software produces valid results and is ready for use.
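For reference, the Tsai-Hill criterion mentioned above has a closed form for a unidirectional ply under plane stress; a minimal sketch (in Python rather than the MATLAB used by CAFE, with illustrative names):

```python
def tsai_hill_index(s1, s2, t12, X, Y, S):
    """Tsai-Hill failure index for a unidirectional ply under plane stress.
    s1, s2: in-plane normal stresses; t12: in-plane shear stress.
    X, Y, S: longitudinal, transverse and shear strengths.
    Ply failure is predicted when the index reaches 1."""
    return (s1 / X) ** 2 - s1 * s2 / X ** 2 + (s2 / Y) ** 2 + (t12 / S) ** 2
```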

Relevance:

10.00%

Publisher:

Abstract:

This study compared the ground reaction forces (GRF) and plantar pressures between unloaded and occasional loaded gait. The GRF and plantar pressures of 60 participants were recorded during unloaded gait and occasional loaded gait (wearing a backpack that raised their body mass index to 30); this load criterion was adopted because it is considered potentially harmful in permanently loaded gait (as in obese people). The results indicate an overall increase (in absolute values) of GRF and plantar pressures during occasional loaded gait (p < 0.05); higher normalized (by total weight) values were also observed in the medial midfoot and toes, and lower values in the lateral rearfoot region. During loaded gait the magnitude of the vertical GRF (impact and thrust maximum) decreased and the shear forces increased more than the proportion of the load (normalized values). These data suggest a different pattern of GRF and plantar pressure distribution during occasional loaded gait compared to unloaded gait.
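The two quantities driving the protocol above (forces normalized by total weight, and the backpack load implied by the BMI-30 criterion) are simple to state; a minimal sketch with illustrative names:

```python
def normalize_grf(force_n, total_mass_kg, g=9.81):
    """Express a ground reaction force (N) as a multiple of total weight,
    the normalization used to compare loaded and unloaded gait."""
    return force_n / (total_mass_kg * g)

def backpack_load_to_bmi(mass_kg, height_m, target_bmi=30.0):
    """Backpack mass (kg) needed to raise a subject's BMI to the target
    value (the loading criterion used in the study)."""
    return max(0.0, target_bmi * height_m ** 2 - mass_kg)
```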

Relevance:

10.00%

Publisher:

Abstract:

The basic motivation of this work was the integration of biophysical models within the interval constraints framework for decision support. Comparing the major features of biophysical models with the expressive power of the existing interval constraints framework, it was clear that the most important inadequacy was related with the representation of differential equations. System dynamics is often modelled through differential equations but there was no way of expressing a differential equation as a constraint and integrate it within the constraints framework. Consequently, the goal of this work is focussed on the integration of ordinary differential equations within the interval constraints framework, which for this purpose is extended with the new formalism of Constraint Satisfaction Differential Problems. Such framework allows the specification of ordinary differential equations, together with related information, by means of constraints, and provides efficient propagation techniques for pruning the domains of their variables. This enabled the integration of all such information in a single constraint whose variables may subsequently be used in other constraints of the model. The specific method used for pruning its variable domains can then be combined with the pruning methods associated with the other constraints in an overall propagation algorithm for reducing the bounds of all model variables. The application of the constraint propagation algorithm for pruning the variable domains, that is, the enforcement of local-consistency, turned out to be insufficient to support decision in practical problems that include differential equations. The domain pruning achieved is not, in general, sufficient to allow safe decisions and the main reason derives from the non-linearity of the differential equations. 
Consequently, a complementary goal of this work proposes a new strong consistency criterion, Global Hull-consistency, particularly suited to decision support with differential models, presenting an adequate trade-off between domain pruning and computational effort. Several alternative algorithms are proposed for enforcing Global Hull-consistency and, due to their complexity, an effort was made to provide implementations able to supply anytime pruning results. Since the consistency criterion depends on the existence of canonical solutions, a local search approach is proposed that can be integrated with constraint propagation in continuous domains and, in particular, with the enforcing algorithms, to anticipate the finding of canonical solutions. The last goal of this work is the validation of the approach as an important contribution to the integration of biophysical models within decision support. Consequently, a prototype application integrating all the proposed extensions to the interval constraints framework was developed and used for solving problems in different biophysical domains.
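The CSDP formalism described above is far richer than can be shown here, but the underlying idea of hull consistency (narrowing interval domains so their bounds are consistent with a constraint) can be illustrated on a single sum constraint. A toy sketch with illustrative names, not the thesis's algorithms:

```python
def narrow_sum(x, y, z):
    """One propagation step enforcing hull consistency on x + y = z,
    where each variable is an interval (lo, hi). Returns narrowed intervals,
    or raises if the constraint cannot be satisfied."""
    def meet(a, b):
        # intersection of two intervals
        lo, hi = max(a[0], b[0]), min(a[1], b[1])
        if lo > hi:
            raise ValueError("inconsistent constraint")
        return (lo, hi)
    z = meet(z, (x[0] + y[0], x[1] + y[1]))   # z narrowed by x + y
    x = meet(x, (z[0] - y[1], z[1] - y[0]))   # x narrowed by z - y
    y = meet(y, (z[0] - x[1], z[1] - x[0]))   # y narrowed by z - x
    return x, y, z
```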

Relevance:

10.00%

Publisher:

Abstract:

The main objective of this work was to investigate the application of experimental design techniques to the identification of Michaelis-Menten kinetic parameters. More specifically, this study attempts to elucidate the relative advantages and disadvantages of employing complex experimental design techniques, in relation to equidistant sampling, when applied to different reactor operation modes. All studies were supported by simulation data of a generic enzymatic process that obeys the Michaelis-Menten kinetic equation. Different aspects were investigated, such as the influence of the reactor operation mode (batch, fed-batch with pulse-wise feeding and fed-batch with continuous feeding) and of the experimental design optimality criteria on the effectiveness of kinetic parameter identification. The following experimental design optimality criteria were investigated: 1) minimization of the sum of the diagonal of the inverse of the Fisher information matrix (FIM) (A-criterion), 2) maximization of the determinant of the FIM (D-criterion), 3) maximization of the smallest eigenvalue of the FIM (E-criterion) and 4) minimization of the quotient between the largest and the smallest eigenvalues (modified E-criterion). The comparison and assessment of the different methodologies were made on the basis of the Cramér-Rao lower bound (CRLB) errors with respect to the parameters vmax and Km of the Michaelis-Menten kinetic equation. Regarding the reactor operation mode, it was concluded that fed-batch (pulses) is better than batch operation for parameter identification. When the former operation mode is adopted, the vmax CRLB error is lowered by 18.6% while the Km CRLB error is lowered by 26.4% when compared to the batch operation mode.
Regarding the optimality criteria, the best method was the A-criterion, with an average vmax CRLB of 6.34% and 5.27% for batch and fed-batch (pulses), respectively, while presenting a Km CRLB of 25.1% and 18.1% for batch and fed-batch (pulses), respectively. As a general conclusion of the present study, it can be stated that experimental design is justified if the starting parameter CRLB errors are below 19.5% (vmax) and 45% (Km) for batch processes, and below 42% and 50% for fed-batch (pulses) processes. Otherwise, equidistant sampling is the more rational decision. This conclusion clearly supports that, for fed-batch operation, the use of experimental design is likely to largely improve the identification of Michaelis-Menten kinetic parameters.
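The four alphabetic optimality criteria compared above are standard functions of the Fisher information matrix; a minimal sketch, with illustrative names, assuming a symmetric positive-definite FIM:

```python
import numpy as np

def design_criteria(fim):
    """The four alphabetic optimality criteria, computed from a symmetric
    positive-definite Fisher information matrix (FIM)."""
    eig = np.linalg.eigvalsh(fim)  # eigenvalues in ascending order
    return {
        "A": np.trace(np.linalg.inv(fim)),  # minimize
        "D": np.linalg.det(fim),            # maximize
        "E": eig[0],                        # maximize smallest eigenvalue
        "modE": eig[-1] / eig[0],           # minimize (condition number)
    }
```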

Relevance:

10.00%

Publisher:

Abstract:

This project aims to intervene in the Human Resources area of the host organization. In this context, the Centro Social e Paroquial de S. Martinho de Brufe was identified as the setting for the project. The diagnosis carried out identified the Human Resources Management System as the opportunity for intervention. Considering the requirements defined by the Social Responses Quality Assessment Model (MAQRS), a diagnosis of the host organization was performed, followed by the exact configuration of the identified opportunity and the strategic and operational planning of the strategy. The next phase involved the implementation of the project. We conclude with the evaluation and the presentation of the measures needed to achieve the intended purpose. The evaluation results show that the planning and implementation of the project were efficient and effective, since the final audit found no non-conformities in the intervention project. As the purpose of the project was to ensure that the Centro Social e Paroquial de S. Martinho de Brufe meets all the requirements of Criterion 2 (People) of the MAQRS of the Instituto da Segurança Social, so as to submit the certification process successfully in July 2014, this document contains all the procedures needed to ensure that goal is achieved. The Centro Social e Paroquial de S. Martinho de Brufe has the next six months (January to June 2014) to present evidence of formalization, which is also a necessary condition preceding submission of the certification process.

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE To analyze evidence of the validity and reliability of a Brazilian Portuguese version of the Quality of Care Scale from the perspective of people with physical and intellectual disabilities. METHODS A total of 162 people with physical disabilities and 156 with intellectual disabilities from Porto Alegre and its metropolitan region participated in the study in 2008. Classical psychometrics was used to independently analyze the two samples. Hypotheses for evidence of criterion validity (concurrent type) were tested with the Mann-Whitney test for non-normal distributions. Principal components analysis was used to explore factorial models. Evidence of reliability was calculated with Cronbach's alpha for the scales and subscales. Test-retest reliability was analyzed for individuals with intellectual disabilities through the intra-class correlation coefficient and the Wilcoxon test. RESULTS The principal components in the group with physical disabilities replicated the original model presented as a solution to the international project data. Evidence of discriminant validity and test-retest reliability was found. CONCLUSIONS The transcultural factor model found within the international sample project seems appropriate for the samples investigated in this study, especially the physical disabilities group. Depression, pain, satisfaction with life and disability may play a mediating role in the evaluation of quality of care. Additional research is needed to add to the evidence of the validity of the instruments.
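Cronbach's alpha, used above for reliability, is a simple function of item and total-score variances; a minimal sketch with illustrative names:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)
```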

Relevance:

10.00%

Publisher:

Abstract:

Discrete data representations are necessary, or at least convenient, in many machine learning problems. While feature selection (FS) techniques aim at finding relevant subsets of features, the goal of feature discretization (FD) is to find concise (quantized) data representations, adequate for the learning task at hand. In this paper, we propose two incremental methods for FD. The first method belongs to the filter family, in which the quality of the discretization is assessed by a (supervised or unsupervised) relevance criterion. The second method is a wrapper, where discretized features are assessed using a classifier. Both methods can be coupled with any static (unsupervised or supervised) discretization procedure and can be used to perform FS as pre-processing or post-processing stages. The proposed methods attain efficient representations suitable for binary and multi-class problems with different types of data, being competitive with existing methods. Moreover, using well-known FS methods with the features discretized by our techniques leads to better accuracy than with the features discretized by other methods or with the original features.
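The static discretization procedures such methods can be coupled with include simple unsupervised ones. As an illustration of the kind of building block involved (not the paper's method), equal-frequency binning can be sketched as:

```python
import numpy as np

def equal_frequency_discretize(x, n_bins):
    """Unsupervised equal-frequency discretization: cut points are placed at
    quantiles, so each bin receives roughly the same number of samples.
    Returns an integer bin index per sample."""
    cuts = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])
    return np.searchsorted(cuts, x, side="right")
```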

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE To develop a model for evaluating the efficacy of the drug-dispensing service in primary health care. METHODS An efficacy criterion was adopted to determine the level of achievement of the service objectives. The evaluation model was developed on the basis of a literature search and discussions with experts. The applicability test of the model was conducted in 15 primary health care units in the city of Florianópolis, state of Santa Catarina, in 2010, and data were recorded in structured and pretested questionnaires. RESULTS The model was evaluated along five dimensions of analysis. It was suitable for evaluating service efficacy and helped to identify the critical points of each service dimension. CONCLUSIONS Adaptations to the data collection technique may be required to adjust for the reality and needs of each situation. The evaluation of the drug-dispensing service should promote adequate access to medications supplied through the public health system.

Relevance:

10.00%

Publisher:

Abstract:

This work reports on an experimental and finite element method (FEM) parametric study of adhesively-bonded single and double-strap repairs on carbon-epoxy structures under buckling-unrestrained compression. The influence of the overlap length and patch thickness was evaluated. This loading is particularly significant because of the additional failure mechanisms characteristic of structures under compression: fibre microbuckling in buckling-restrained structures, or global buckling of the assembly if no transverse restriction exists. The FEM analysis is based on the use of cohesive elements including mixed-mode criteria to simulate cohesive fracture of the adhesive layer. Trapezoidal laws in pure modes I and II were used to account for the ductility of most structural adhesives. These laws were estimated for the adhesive used from double cantilever beam (DCB) and end-notched flexure (ENF) tests, respectively, using an inverse technique. The pure mode III cohesive law was assumed equal to the pure mode II law. Compression failure in the laminates was predicted using a stress-based criterion. The accuracy of the FEM predictions opens a good prospect for reducing the extensive experimentation required in the design of carbon-epoxy repairs. Design principles were also established for these repairs under buckling.
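The trapezoidal cohesive (traction-separation) law mentioned above is piecewise linear in the opening displacement; a minimal sketch with illustrative parameter names (ramp end d1, plateau end d2, final opening df), not the paper's calibrated values:

```python
def trapezoidal_traction(delta, d1, d2, df, t_max):
    """Trapezoidal cohesive law: linear ramp to t_max at opening d1,
    plateau until d2, then linear softening to zero traction at df."""
    if delta <= 0:
        return 0.0
    if delta < d1:
        return t_max * delta / d1       # elastic ramp
    if delta <= d2:
        return t_max                    # ductile plateau
    if delta < df:
        return t_max * (df - delta) / (df - d2)  # softening
    return 0.0                          # fully failed
```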

Relevance:

10.00%

Publisher:

Abstract:

OBJECTIVE To validate a screening instrument using self-reported assessment of frailty syndrome in older adults. METHODS This cross-sectional study used data from the Saúde, Bem-estar e Envelhecimento study conducted in São Paulo, SP, Southeastern Brazil. The sample consisted of 433 older adults (≥ 75 years) assessed in 2009. The self-reported instrument can be applied to older adults or their proxy respondents and consists of dichotomous questions directly related to each component of the frailty phenotype, which is considered the gold-standard model: unintentional weight loss, fatigue, low physical activity, decreased physical strength, and decreased walking speed. The same classification proposed in the phenotype was used: not frail (no component identified); pre-frail (presence of one or two components); and frail (presence of three or more components). Because this is a screening instrument, “process of frailty” was included as a category (pre-frail and frail). In the psychometric analysis, Cronbach’s α was used to evaluate reliability, and criterion validity was evaluated through sensitivity, specificity, and positive and negative predictive values. Factor analysis was used to assess the suitability of the proposed number of components. RESULTS Decreased walking speed and decreased physical strength showed good internal consistency (α = 0.77 and 0.72, respectively); however, low physical activity was less satisfactory (α = 0.63). The sensitivity and specificity for identifying pre-frail individuals were 89.7% and 24.3%, respectively, while those for identifying frail individuals were 63.2% and 71.6%, respectively. In addition, 89.7% of the individuals from both evaluations were identified in the “process of frailty” category. CONCLUSIONS The self-reported assessment of frailty can identify the syndrome among older adults and can be used as a screening tool.
Its advantages include simplicity, rapidity, low cost, and ability to be used by different professionals.
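The sensitivity, specificity, and predictive values reported above follow directly from a confusion table; a minimal sketch with illustrative names:

```python
def screening_stats(tp, fp, fn, tn):
    """Sensitivity, specificity and predictive values of a screening test,
    from true/false positive and negative counts against the gold standard."""
    return {
        "sensitivity": tp / (tp + fn),  # proportion of cases detected
        "specificity": tn / (tn + fp),  # proportion of non-cases excluded
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```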

Relevance:

10.00%

Publisher:

Abstract:

The trajectory planning of redundant robots is an important area of research and efficient optimization algorithms are needed. The pseudoinverse control is not repeatable, causing drift in joint space which is undesirable for physical control. This paper presents a new technique that combines the closed-loop pseudoinverse method with genetic algorithms, leading to an optimization criterion for repeatable control of redundant manipulators, and avoiding the joint angle drift problem. Computer simulations performed based on redundant and hyper-redundant planar manipulators show that, when the end-effector traces a closed path in the workspace, the robot returns to its initial configuration. The solution is repeatable for a workspace with and without obstacles in the sense that, after executing several cycles, the initial and final states of the manipulator are very close.
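Plain closed-loop pseudoinverse control, the starting point that the genetic-algorithm criterion above improves upon, can be sketched for a toy planar 3R arm. The arm, gain, and all names are illustrative assumptions, not the paper's setup, and the repeatability fix itself is not shown:

```python
import numpy as np

def clp_step(q, target, fk, jacobian, gain=1.0):
    """One closed-loop pseudoinverse (CLP) step: the joint update is the
    Jacobian pseudoinverse applied to the task-space error. Plain CLP is
    not repeatable in joint space, which motivates the paper's criterion."""
    err = target - fk(q)
    return q + gain * (np.linalg.pinv(jacobian(q)) @ err)

# Toy planar 3R arm with unit link lengths (hypothetical example system)
def fk(q):
    a = np.cumsum(q)  # absolute link angles
    return np.array([np.cos(a).sum(), np.sin(a).sum()])

def jacobian(q):
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        # column i: derivative of end-effector position w.r.t. joint i
        J[0, i] = -np.sin(a[i:]).sum()
        J[1, i] = np.cos(a[i:]).sum()
    return J
```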

Relevance:

10.00%

Publisher:

Abstract:

In cluster analysis, it can be useful to interpret the partition built from the data in the light of external categorical variables which are not directly involved in clustering the data. An approach is proposed in the model-based clustering context to select a number of clusters which both fits the data well and takes advantage of the potential illustrative ability of the external variables. This approach makes use of the integrated joint likelihood of the data and the partitions at hand, namely the model-based partition and the partitions associated with the external variables. It is noteworthy that each mixture model is fitted by the maximum likelihood methodology to the data alone; the external variables are used only to select a relevant mixture model. Numerical experiments illustrate the promising behaviour of the derived criterion.

Relevance:

10.00%

Publisher:

Abstract:

European Transactions on Telecommunications, vol. 18