967 results for Modified truncation approach
Abstract:
Radial basis functions are used in different scientific areas to reproduce the geometrical model of an object or structure, as well as to predict its behavior. Due to their characteristics, these functions are well suited for meshfree modeling of physical quantities, which can, for instance, be associated with the data sets of 3D laser scanning point clouds. In the present work the geometry of a structure is modeled using multiquadric radial basis functions, and its configuration is further optimized to improve its static and dynamic behavior. For this purpose the authors consider the particle swarm optimization technique. A set of case studies is presented to illustrate the adequacy of the meshfree model used, as well as its link to the particle swarm optimization technique. © 2014 IEEE.
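As an illustration of the interpolation step underlying such meshfree models, the sketch below fits multiquadric RBF weights to scattered samples. This is a minimal toy in NumPy: the function names, the shape parameter value, and the test surface are assumptions for illustration, not the authors' implementation, and the particle swarm optimization stage is not shown.

```python
import numpy as np

def mq_rbf_interpolate(centers, values, c=1.0):
    """Fit multiquadric RBF weights by solving the interpolation system."""
    # Pairwise distances between the interpolation centers
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    phi = np.sqrt(d**2 + c**2)          # multiquadric kernel sqrt(r^2 + c^2)
    w = np.linalg.solve(phi, values)    # interpolation weights
    def evaluate(points):
        dq = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=-1)
        return np.sqrt(dq**2 + c**2) @ w
    return evaluate

# Scattered "point cloud" samples of a toy surface z = f(x, y)
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 2))
z = np.sin(np.pi * pts[:, 0]) * np.cos(np.pi * pts[:, 1])

f = mq_rbf_interpolate(pts, z, c=0.5)
print(np.max(np.abs(f(pts) - z)))  # interpolation error at the nodes (near zero)
```

Since the multiquadric system is square and nonsingular for distinct centers, the fitted surface passes through the sample points exactly, up to numerical conditioning.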
Abstract:
Master's dissertation in Informatics Engineering
Abstract:
Biodiesel production by methanolysis of semi-refined rapeseed oil was studied over lime-based catalysts. In order to improve the basicity of the catalysts, a commercial CaO material was impregnated with an aqueous solution of lithium nitrate (Li/Ca = 0.3 atomic ratio). The catalysts were calcined at 575 °C and 800 °C for 5 h to remove nitrate ions before reaction. The XRD patterns of the fresh catalysts, including the bare CaO, showed lines ascribable to CaO and Ca(OH)2. The absence of XRD lines belonging to Li phases confirms the efficient dispersion of Li over CaO. Under the tested conditions (Wcat/Woil = 5%; CH3OH/oil = 12 molar ratio), all the fresh catalysts provided similar biodiesel yields (FAME > 93% after 4 h), but the bare CaO catalyst was more stable. The activity decay of the Li-modified samples can be related to the formation of calcium diglyceroxide during methanolysis, enhanced by the higher basicity, which promotes calcium leaching. The calcination temperature of the Li-modified catalysts plays an important role, since it encourages crystal sintering, which appears to improve catalyst stability. (C) 2013 Elsevier B.V. All rights reserved.
Abstract:
Dissertation presented to obtain a Master's degree in Computer Science
Abstract:
OBJECTIVE To analyze the characteristics of health diagnosis according to the ecohealth approach in rural and urban communities in Mexico. METHODS Health diagnoses were conducted in La Nopalera, from December 2007 to October 2008, and in Atlihuayan, from December 2010 to October 2011. The research was based on three principles of the ecohealth approach: transdisciplinarity, community participation, and gender and equity. To collect information, a joint methodology and several techniques were used to stimulate the participation of inhabitants. The diagnostic exercise was carried out in five phases, from collecting information to prioritization of problems. RESULTS The constitution of the transdisciplinary team, as well as the participation of the population and the principle of gender/equity, differed between the communities. In the rural community, the active participation of inhabitants and authorities was achieved and the principles of transdisciplinarity and gender/equity were incorporated. CONCLUSIONS Despite the difficulties involved in fostering participation, the incorporation of gender/equity and transdisciplinarity in health diagnosis allowed a holistic public health approach closer to the needs of the population.
Abstract:
Between 2000/01 and 2006/07, the approval rate of a Thermodynamics course in a Mechanical Engineering degree program was 25%. However, a careful analysis of the results showed that 41% of the students chose not to attend or dropped out, missing the final examination. Thus, a continuous assessment methodology was developed, whose purpose was to reduce dropout by motivating students to attend this course, in the belief that what was observed was due not to an incapacity to pass, but to the students' anticipation of the inevitability of failure. If, on one hand, motivation is defined as a broad construct pertaining to the conditions and processes that account for the arousal, direction, magnitude, and maintenance of effort, on the other hand, assessment is one of the most powerful tools to change students' will to learn, motivating them to learn in a quicker and more permanent way. Some of the practices that were implemented included: promoting learning-goal orientation rather than performance-goal orientation; cultivating intrinsic interest in the subject and putting less emphasis on grades while making grading criteria explicit; emphasizing teaching approaches that encourage collaboration among students and cater for a range of teaching styles; explaining the reasons for, and the implications of, tests; providing feedback to students about their performance in a form that is non-ego-involving and non-judgemental, and helping students to interpret it; and broadening the range of information used in assessing the attainment of individual students. The continuous assessment methodology was applied in 2007/08 and 2008/09, yielding an increase in approval from 25% to 55% (+30 percentage points), accompanied by a decrease in dropout from 41% to 23.5% (−17.5 percentage points). Failing with a numerical grade fell from 34.4% to 22.0% (−12.4 percentage points). The students' perception of the relevance of continuous assessment was evaluated with a questionnaire: 70% of the students who failed the course responded that, nevertheless, they did not regret having done the continuous assessment.
Abstract:
Finding the structure of a confined liquid crystal is a difficult task since both the density and order parameter profiles are nonuniform. Starting from a microscopic model and density-functional theory, one has to either (i) solve a nonlinear, integral Euler-Lagrange equation, or (ii) perform a direct multidimensional free energy minimization. The traditional implementations of both approaches are computationally expensive and plagued with convergence problems. Here, as an alternative, we introduce an unsupervised variant of the multilayer perceptron (MLP) artificial neural network for minimizing the free energy of a fluid of hard nonspherical particles confined between planar substrates of variable penetrability. We then test our algorithm by comparing its results for the structure (density-orientation profiles) and equilibrium free energy with those obtained by standard iterative solution of the Euler-Lagrange equations and with Monte Carlo simulation results. Very good agreement is found and the MLP method proves competitively fast, flexible, and refinable. Furthermore, it can be readily generalized to the richer patterned-substrate geometries that are now experimentally realizable but remain very problematic for conventional theoretical treatments.
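For intuition on route (ii), direct free-energy minimization, the toy below minimizes an ideal-gas grand-potential functional on a grid via a mirror-descent-style update on ln ρ. The harmonic "substrate" potential, step size, and grid are illustrative assumptions; the real problem involves a far harder hard-particle functional and, in the paper, an MLP parametrization instead of this per-gridpoint update.

```python
import numpy as np

# Toy direct minimization of the ideal-gas grand potential
#   Ω[ρ] = ∫ [ ρ(ln ρ − 1) + ρ(V − μ) ] dx   (β = 1, discretized on a grid)
x = np.linspace(-3, 3, 301)
V = 0.5 * x**2          # external "substrate" potential (toy choice)
mu = 0.0                # chemical potential

u = np.zeros_like(x)    # optimize u = ln ρ, so the density stays positive
eta = 0.5               # step size of the entropic (mirror-descent) update
for _ in range(200):
    grad = u + V - mu   # functional derivative δΩ/δρ = ln ρ + V − μ
    u -= eta * grad     # ln ρ ← ln ρ − η δΩ/δρ

rho = np.exp(u)
rho_exact = np.exp(mu - V)  # analytic minimizer: ρ(x) = exp(μ − V(x))
print(np.max(np.abs(rho - rho_exact)))  # converges to the exact profile
```

For this linear fixed-point iteration the error contracts by a factor 1 − η per step, so 200 iterations reach machine precision; the interacting, orientation-dependent functional of the paper has no such closed form, which is what motivates the MLP approach.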
Abstract:
The trajectory planning of redundant robots is an important area of research and efficient optimization algorithms are needed. The pseudoinverse control is not repeatable, causing drift in joint space which is undesirable for physical control. This paper presents a new technique that combines the closed-loop pseudoinverse method with genetic algorithms, leading to an optimization criterion for repeatable control of redundant manipulators, and avoiding the joint angle drift problem. Computer simulations performed based on redundant and hyper-redundant planar manipulators show that, when the end-effector traces a closed path in the workspace, the robot returns to its initial configuration. The solution is repeatable for a workspace with and without obstacles in the sense that, after executing several cycles, the initial and final states of the manipulator are very close.
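A minimal sketch of the closed-loop pseudoinverse law that the paper builds on, q̇ = J⁺(ẋd + K e), for an assumed planar 3R redundant arm tracking a closed circle. The link lengths, gain, and path are illustrative choices, and the genetic-algorithm layer that restores repeatability is not shown; running this plain version lets one inspect the joint-space drift after a closed workspace cycle.

```python
import numpy as np

L = np.array([1.0, 0.8, 0.6])  # link lengths of a planar 3R (redundant) arm

def fk(q):
    """End-effector position of the planar arm."""
    a = np.cumsum(q)
    return np.array([np.sum(L * np.cos(a)), np.sum(L * np.sin(a))])

def jacobian(q):
    """2x3 end-effector Jacobian of the planar arm."""
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(a[i:]))
        J[1, i] = np.sum(L[i:] * np.cos(a[i:]))
    return J

def clik_track(q0, path, dt=0.01, K=20.0):
    """Closed-loop pseudoinverse tracking: qdot = pinv(J) (xd_dot + K e)."""
    q = q0.copy()
    for xd, xd_next in zip(path[:-1], path[1:]):
        xd_dot = (xd_next - xd) / dt
        e = xd - fk(q)                       # task-space tracking error
        q = q + dt * np.linalg.pinv(jacobian(q)) @ (xd_dot + K * e)
    return q

# Closed circular path inside the workspace
t = np.linspace(0, 2 * np.pi, 2001)
path = np.stack([1.5 + 0.3 * np.cos(t), 0.5 + 0.3 * np.sin(t)], axis=1)

q0 = np.array([0.3, 0.4, 0.5])
qf = clik_track(q0, path)
print(np.linalg.norm(fk(qf) - path[-1]))  # small end-effector tracking error
print(np.linalg.norm(qf - q0))            # joint-space drift after one cycle
```

The second printed quantity is exactly what the paper's GA-augmented criterion drives toward zero so that initial and final manipulator configurations coincide.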
Abstract:
Dependability is a critical factor in computer systems, requiring high-quality validation and verification procedures in the development stage. At the same time, digital devices are getting smaller and access to their internal signals and registers is increasingly complex, requiring innovative debugging methodologies. To address this issue, most recent microprocessors include an on-chip debug (OCD) infrastructure to facilitate common debugging operations. This paper proposes an enhanced OCD infrastructure with the objective of supporting the verification of fault-tolerant mechanisms through fault injection campaigns. This upgraded on-chip debug and fault injection (OCD-FI) infrastructure provides an efficient fault injection mechanism with improved capabilities and dynamic behavior. Preliminary results show that this solution provides flexibility in terms of fault triggering and allows high-speed, real-time fault injection in memory elements.
Abstract:
Nowadays, the metagenomic approach has been a very important tool in the discovery of new viruses in environmental and biological samples. Here we discuss how these discoveries may help to elucidate the etiology of diseases and the criteria necessary to establish a causal association between a virus and a disease.
Abstract:
OBJECTIVE To identify individual and hospital characteristics associated with the risk of readmission in older inpatients for proximal femoral fracture in the period of 90 days after discharge. METHODS Deaths and readmissions were obtained by a linkage of databases of the Hospital Information System of the Unified Health System and the System of Information on Mortality of the city of Rio de Janeiro from 2008 to 2011. The population of 3,405 individuals aged 60 or older, with non-elective hospitalization for proximal femoral fracture, was followed for 90 days after discharge. A Cox multilevel model was used for time from discharge until readmission, with the characteristics of the patients on the first level and the characteristics of the hospitals on the second level. RESULTS The risk of readmission was higher for men (hazard ratio [HR] = 1.37; 95%CI 1.08–1.73), individuals more than 79 years old (HR = 1.45; 95%CI 1.06–1.98), and patients who were hospitalized for more than two weeks (HR = 1.33; 95%CI 1.06–1.67), and lower for those who underwent arthroplasty when compared with those who underwent osteosynthesis (HR = 0.57; 95%CI 0.41–0.79). Besides, patients admitted to state hospitals had lower risk of readmission when compared with inpatients in municipal (HR = 1.71; 95%CI 1.09–2.68) and federal hospitals (HR = 1.81; 95%CI 1.00–3.27). The random effect of the hospitals in the adjusted model remained statistically significant (p < 0.05). CONCLUSIONS Hospitals have complex structures that are reflected in the quality of care. Thus, we propose that future studies include these complexities and the severity of the patients in the analysis of the data, also considering the correlation between readmission and mortality to reduce biases.
Abstract:
Many learning problems require handling high dimensional datasets with a relatively small number of instances. Learning algorithms are thus confronted with the curse of dimensionality, and need to address it in order to be effective. Examples of these types of data include the bag-of-words representation in text classification problems and gene expression data for tumor detection/classification. Usually, among the high number of features characterizing the instances, many may be irrelevant (or even detrimental) for the learning tasks. It is thus clear that there is a need for adequate techniques for feature representation, reduction, and selection, to improve both the classification accuracy and the memory requirements. In this paper, we propose combined unsupervised feature discretization and feature selection techniques, suitable for medium and high-dimensional datasets. The experimental results on several standard datasets, with both sparse and dense features, show the efficiency of the proposed techniques as well as improvements over previous related techniques.
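As a minimal, generic illustration of the unsupervised discretize-then-select idea (not the specific techniques proposed in the paper; the function names and synthetic data are assumptions), one can quantize each feature into equal-frequency bins and then rank features by the entropy of their discretized values, discarding near-uninformative ones:

```python
import numpy as np

def equal_freq_discretize(X, n_bins=4):
    """Unsupervised equal-frequency discretization: map each feature to bin indices."""
    Xd = np.empty_like(X, dtype=int)
    for j in range(X.shape[1]):
        # Interior quantile cut points give roughly equally populated bins
        edges = np.quantile(X[:, j], np.linspace(0, 1, n_bins + 1)[1:-1])
        Xd[:, j] = np.digitize(X[:, j], edges)
    return Xd

def select_by_entropy(Xd, m):
    """Keep the m features whose discretized values carry the most entropy."""
    H = np.zeros(Xd.shape[1])
    for j in range(Xd.shape[1]):
        _, counts = np.unique(Xd[:, j], return_counts=True)
        p = counts / counts.sum()
        H[j] = -np.sum(p * np.log2(p))
    return np.argsort(H)[::-1][:m]

rng = np.random.default_rng(1)
informative = rng.normal(size=(100, 5))  # varied features
constant = np.ones((100, 3))             # uninformative constant features
X = np.hstack([informative, constant])

Xd = equal_freq_discretize(X, n_bins=4)
print(sorted(select_by_entropy(Xd, 5).tolist()))  # → [0, 1, 2, 3, 4]
```

The constant columns collapse into a single bin (zero entropy) and are dropped, while the varied columns spread across bins and are retained; real criteria, as in the paper, are more elaborate than raw entropy.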
Abstract:
Behavioral biometrics is one of the areas with growing interest within the biosignal research community. A recent trend in the field is ECG-based biometrics, where electrocardiographic (ECG) signals are used as input to the biometric system. Previous work has shown this to be a promising trait, with the potential to serve as a good complement to other existing, and already more established modalities, due to its intrinsic characteristics. In this paper, we propose a system for ECG biometrics centered on signals acquired at the subject's hand. Our work is based on a previously developed custom, non-intrusive sensing apparatus for data acquisition at the hands, and involved the pre-processing of the ECG signals, and evaluation of two classification approaches targeted at real-time or near real-time applications. Preliminary results show that this system leads to competitive results both for authentication and identification, and further validate the potential of ECG signals as a complementary modality in the toolbox of the biometric system designer.
Abstract:
Dissertation presented to obtain the degree of Doctor in Conservation and Restoration, specialty in Theory, History and Techniques, at Universidade Nova de Lisboa, Faculdade de Ciências e Tecnologia
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies