96 results for Portlet-based application
Abstract:
This project aims to provide a targeted solution for the efficient management of the services offered by a small company, in the form of a Web 2.0 application based on J2EE. J2EE comprises Java-based technologies that deliver enterprise solutions, taking full advantage of object orientation, with design patterns and frameworks such as the Spring Framework, and with data access management (DAO) handled through the Hibernate framework. Presentation to the user is done through JSP using JSTL, incorporating jQuery and Ajax to give the application the visual feel of modern web applications.
Abstract:
This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable-tracking task. The learning system is characterized by the use of a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robots. In order to speed up the process, the learning phase was carried out in a simulated environment and, in a second step, the policy was transferred and tested successfully on a real robot. Future work plans to continue the learning process online on the real robot while performing the mentioned task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
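The abstract names the method but not the algorithm; the sketch below is a minimal REINFORCE-style direct policy search on an invented one-dimensional tracking toy, trained entirely in simulation as the paper describes. The environment, policy form, and constants are illustrative assumptions, not the ICTINEU AUV code.

```python
import numpy as np

# Minimal REINFORCE-style direct policy search on a toy 1-D "cable following"
# task: the policy is linear-Gaussian and is trained purely in simulation,
# mirroring the two-step scheme in the abstract. Everything here is invented.

rng = np.random.default_rng(0)
theta = np.zeros(2)            # policy weights: mean action = theta . [offset, 1]
alpha, sigma = 0.002, 0.3      # learning rate and exploration noise

def episode(theta, steps=50):
    offset, states, actions, rewards = rng.normal(0.0, 1.0), [], [], []
    for _ in range(steps):
        s = np.array([offset, 1.0])
        a = theta @ s + rng.normal(0.0, sigma)      # stochastic policy
        offset += -0.5 * a + rng.normal(0.0, 0.05)  # simulated plant dynamics
        states.append(s); actions.append(a); rewards.append(-offset**2)
    return states, actions, rewards

for _ in range(2000):                               # learning phase in simulation
    S, A, R = episode(theta)
    G = sum(R)                                      # episodic return
    for s, a in zip(S, A):
        grad = (a - theta @ s) / sigma**2 * s       # grad of log Gaussian policy
        theta += alpha * grad * G / len(S)

print("learned policy weights:", theta)
```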
Abstract:
Precision of released figures is not only an important quality feature of official statistics, it is also essential for a good understanding of the data. In this paper we show a case study of how precision could be conveyed if the multivariate nature of the data has to be taken into account. In the official release of the Swiss earnings structure survey, the total salary is broken down into several wage components. We follow Aitchison's approach for the analysis of compositional data, which is based on logratios of components. We first present different multivariate analyses of the compositional data whereby the wage components are broken down by economic activity classes. Then we propose a number of ways to assess precision.
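As a concrete illustration of Aitchison's logratio approach mentioned above, here is a minimal sketch of the centred logratio (clr) transform applied to an invented salary composition; the wage shares are made up.

```python
import numpy as np

# Minimal sketch of Aitchison's centred logratio (clr) transform, the kind of
# logratio analysis the abstract refers to. The wage shares are invented.

def clr(x):
    """Centred logratio: log of each part over the geometric mean of the parts."""
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x), axis=-1, keepdims=True))
    return np.log(x / g)

# One salary broken into three components (shares of the total).
shares = [0.70, 0.20, 0.10]        # base pay, bonus, allowances (invented)
print(clr(shares))                 # coordinates in unconstrained real space
```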
Abstract:
Fault location has been studied in depth for transmission lines due to its importance in power systems. Nowadays the problem of fault location on distribution systems is receiving special attention, mainly because of power quality regulations. In this context, this paper presents a software application developed in Matlab that automatically calculates the location of a fault in a distribution power system, starting from the voltages and currents measured at the line terminal and the model of the distribution power system. The application is based on an N-ary tree structure, which is well suited to this application due to the highly branched and non-homogeneous nature of distribution systems, and has been developed for single-phase, two-phase, two-phase-to-ground, and three-phase faults. The implemented application is tested using fault data from a real electrical distribution power system.
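A minimal sketch of the N-ary tree idea: each node represents a line section, and the branched feeder is walked to enumerate the paths a fault could lie on. This is illustrative Python, not the paper's Matlab application, and the feeder topology is invented.

```python
# Hypothetical sketch of the N-ary tree idea: each node is a line section of a
# branched distribution feeder, and candidate fault locations are collected by
# walking every branch. Illustrative only; not the paper's Matlab code.

class Section:
    def __init__(self, name, length_km, children=None):
        self.name, self.length_km = name, length_km
        self.children = children or []       # N children => N-ary tree

def candidate_paths(node, path=()):
    """Enumerate every terminal-to-leaf path a fault could lie on."""
    path = path + (node.name,)
    if not node.children:
        yield path
    for child in node.children:
        yield from candidate_paths(child, path)

feeder = Section("main", 2.0, [
    Section("lateral-A", 0.8),
    Section("lateral-B", 1.5, [Section("spur-B1", 0.4)]),
])
for p in candidate_paths(feeder):
    print(" -> ".join(p))
```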
Abstract:
The paper focuses on taking advantage of the large amounts of data that are systematically stored in plants (by means of SCADA systems) but not exploited enough to achieve supervisory goals (fault detection, diagnosis and reconfiguration). The methodology of case-based reasoning (CBR) is proposed to perform supervisory tasks in industrial processes by re-using the stored data. The goal is to take advantage of experiences, registered in a suitable structure as cases, avoiding the tedious task of knowledge acquisition and representation required by other reasoning techniques such as expert systems. An overview of CBR terminology and basic concepts is presented. The adaptation of CBR to performing expert supervisory tasks, taking into account the particularities and difficulties derived from dynamic systems, is discussed. Special attention is paid to proposing a general case definition suitable for supervisory tasks. Finally, this structure and the whole methodology are tested in an application example for monitoring a real drier chamber.
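A minimal sketch of the CBR retrieve-and-reuse cycle described above, assuming a toy case structure for a drier chamber: a case couples a signal snapshot with the diagnosis registered for it. Signals, distance measure, and diagnoses are invented.

```python
import numpy as np

# Minimal, hypothetical sketch of the CBR retrieve step for supervision: a case
# stores a process-signal snapshot plus the diagnosis registered with it, and a
# new situation reuses the diagnosis of the nearest stored case.

case_base = [
    {"signals": np.array([80.0, 1.2, 0.3]), "diagnosis": "normal"},
    {"signals": np.array([95.0, 0.4, 0.9]), "diagnosis": "heater fault"},
]   # invented drier-chamber cases: [temperature, airflow, humidity error]

def retrieve(query):
    """Return the stored case closest to the new situation (Euclidean distance)."""
    return min(case_base, key=lambda c: np.linalg.norm(c["signals"] - query))

new_situation = np.array([93.0, 0.5, 0.8])
print(retrieve(new_situation)["diagnosis"])   # reuse step: propose 'heater fault'
```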
Abstract:
The composition of the labour force is an important economic factor for a country. Often the changes in proportions of different groups are of interest. In this paper we study a monthly compositional time series from the Swedish Labour Force Survey from 1994 to 2005. Three models are studied: the ILR-transformed series, the ILR transformation of the compositionally differenced series of order 1, and the ILR transformation of the compositionally differenced series of order 12. For each of the three models a VAR model is fitted based on the data from 1994-2003. We predict the time series 15 steps ahead and calculate 95% prediction regions. The predictions of the three models are compared with actual values using MAD and MSE, and the prediction regions are compared graphically in a ternary time series plot. We conclude that the first, and simplest, model possesses the best predictive power of the three.
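A sketch of the isometric logratio (ILR) transform on which all three models rest: a D-part composition is mapped to D−1 unconstrained coordinates (here via a Helmert-type basis), which can then be fitted with a standard VAR. The basis choice and the labour-force shares are illustrative.

```python
import numpy as np

# Sketch of the isometric logratio (ILR) transform: a D-part composition
# becomes D-1 unconstrained coordinates that a standard VAR can be fitted to.
# Basis choice and shares are illustrative.

def ilr(x):
    x = np.asarray(x, dtype=float)
    D = x.shape[-1]
    clr = np.log(x) - np.log(x).mean(axis=-1, keepdims=True)
    V = np.zeros((D, D - 1))              # Helmert-type orthonormal basis
    for j in range(D - 1):
        V[: j + 1, j] = 1.0 / (j + 1)
        V[j + 1, j] = -1.0
        V[:, j] /= np.linalg.norm(V[:, j])
    return clr @ V

# Invented monthly shares of three labour-force groups (employed, unemployed,
# not in the labour force); real input would be the survey series 1994-2005.
print(ilr([0.55, 0.05, 0.40]))
```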
Abstract:
This paper addresses the application of PCA to categorical data prior to diagnosing a patient data set using a Case-Based Reasoning (CBR) system. The particularity is that standard PCA techniques are designed to deal with numerical attributes, but our medical data set contains many categorical attributes, so alternative methods such as RS-PCA are required. Thus, we propose to hybridize RS-PCA (Regular Simplex PCA) and a simple CBR. Results show that the hybrid system produces results similar to those obtained using the original attributes when diagnosing a medical data set. These results are quite promising since they allow diagnosis with less computational effort and memory storage.
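A hedged sketch of the hybrid: each categorical level is encoded as a vertex of a regular simplex (one plausible reading of RS-PCA), the encoded data are projected with ordinary PCA, and a nearest-neighbour lookup stands in for the simple CBR. Patients, levels, and diagnoses are invented.

```python
import numpy as np

# Hedged sketch: categorical levels encoded as regular-simplex vertices (one
# plausible reading of RS-PCA), projected with ordinary PCA, then diagnosed by
# nearest neighbour as a minimal CBR retrieve. All data are invented.

def simplex_vertices(k):
    """Coordinates of k regular-simplex vertices in k-1 dimensions (centred)."""
    H = np.eye(k) - np.ones((k, k)) / k             # centre the one-hot vectors
    U, s, _ = np.linalg.svd(H, full_matrices=False)
    return U[:, : k - 1] * s[: k - 1]               # rows = vertex coordinates

levels = {"mild": 0, "moderate": 1, "severe": 2}    # invented categorical attribute
V = simplex_vertices(3)
X = np.array([V[levels[s]] for s in ["mild", "severe", "moderate"]])
labels = ["healthy", "ill", "follow-up"]            # invented diagnoses per patient

Xc = X - X.mean(axis=0)                             # PCA on the encoded data
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[0]                                 # first principal component

query = (V[levels["severe"]] - X.mean(axis=0)) @ Vt[0]
print(labels[int(np.argmin(np.abs(scores - query)))])   # nearest-case diagnosis
```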
Abstract:
In CoDaWork'05, we presented an application of discriminant function analysis (DFA) to 4 different compositional datasets and modelled the first canonical variable using a segmented regression model solely based on an observation about the scatter plots. In this paper, multiple linear regressions are applied to different datasets to confirm the validity of our proposed model. In addition to dating the unknown tephras by calibration as discussed previously, another method is proposed for mapping the unknown tephras onto samples of the reference set or onto missing samples between consecutive reference samples. The application of these methodologies is demonstrated with both simulated and real datasets. This new methodology provides an alternative, more acceptable approach for geologists, as their focus is on mapping the unknown tephra to relevant eruptive events rather than estimating its age. Key words: Tephrochronology; Segmented regression.
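A minimal sketch of a segmented (broken-stick) regression of the kind used for the first canonical variable: candidate breakpoints are scanned and a hinge term is fitted by least squares. The data are synthetic, not tephra measurements.

```python
import numpy as np

# Minimal segmented (broken-stick) regression sketch: scan candidate
# breakpoints and fit intercept, slope and hinge term by least squares.
# The data are synthetic, not tephra compositions.

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 80)
y = np.where(x < 4, x, 4 + 0.2 * (x - 4)) + rng.normal(0, 0.2, x.size)

def fit(break_at):
    """Least-squares fit of y = b0 + b1*x + b2*max(x - break_at, 0)."""
    X = np.column_stack([np.ones_like(x), x, np.maximum(x - break_at, 0)])
    beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
    return (res[0] if res.size else np.inf), beta

best_b, best_sse, best_beta = min(
    ((b, *fit(b)) for b in np.linspace(1, 9, 81)), key=lambda t: t[1]
)
print(f"breakpoint ~ {best_b:.2f}, coefficients: {best_beta}")
```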
Abstract:
Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the instigation of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation models (SEM) because it does not rely on strict assumptions about the data. However, this choice was based on some misconceptions about the use of SEMs and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, the SEM and PLS approaches are compared by evaluating perceptions of the Isle of Man Post Office products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.
Abstract:
Epoxidization is an interesting way to develop new applications of lignin and thereby improve its application potential. In this work, kraft lignin-based epoxy resins were obtained by an epoxidization reaction, using kraft lignin recovered directly from pulping liquor and modified by a methylolation reaction. The methylolated lignins were obtained by the reaction of the original kraft lignin with formaldehyde and with glyoxal, a less volatile and less toxic aldehyde. 1H-NMR spectroscopy showed that methylolated kraft lignin has more hydroxymethyl groups than glyoxalated kraft lignin. For the epoxidization reaction we studied the influence of the lignin:NaOH (w/w) ratio, temperature, and reaction time on the properties of the prepared epoxidized lignins. The structures of the lignin-based epoxy resins were followed by an epoxy index test and FTIR spectroscopy. Optimal conditions were obtained for the lignin-based epoxy resin produced at a lignin/NaOH ratio of 1/3 at 70 ºC for 3 h. Thermogravimetric analysis (TGA) revealed that epoxidization enhances the thermal stability of lignins and may allow a wider temperature range for applications with lignin epoxy-PF blends.
Abstract:
Black-box optimization problems (BBOP) are defined as optimization problems in which the objective function does not have an algebraic expression but is the output of a system (usually a computer program). This paper focuses on BBOPs that arise in the field of insurance, and more specifically in reinsurance problems. In this area, the complexity of the models and assumptions considered to define the reinsurance rules and conditions produces hard black-box optimization problems that must be solved in order to obtain the optimal output of the reinsurance. The application of traditional optimization approaches is not possible in BBOP, so new computational paradigms must be applied to solve these problems. In this paper we show the performance of two evolutionary techniques (Evolutionary Programming and Particle Swarm Optimization). We provide an analysis of three BBOPs in reinsurance, where the evolutionary approaches exhibit excellent behaviour, finding the optimal solution within a fraction of the computational cost used by inspection or enumeration methods.
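A minimal Particle Swarm Optimization sketch of the kind evaluated in the paper; the quadratic objective stands in for the black-box reinsurance evaluator (in practice an external program), and all swarm constants are illustrative assumptions.

```python
import numpy as np

# Minimal particle swarm optimisation sketch: the objective below stands in
# for a black-box reinsurance evaluator (in practice, an external program).
# Swarm parameters and the toy objective are illustrative assumptions.

rng = np.random.default_rng(0)

def black_box(x):                       # placeholder for the simulator call
    return np.sum((x - 0.7) ** 2)

dim, n, iters = 5, 30, 200
pos = rng.uniform(-1, 2, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_f = pos.copy(), np.array([black_box(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    f = np.array([black_box(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print("best found:", gbest, "objective:", black_box(gbest))
```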
Abstract:
Selected configuration interaction (SCI) for atomic and molecular electronic structure calculations is reformulated in a general framework encompassing all CI methods. The linked cluster expansion is used as an intermediate device to approximate CI coefficients B_K of disconnected configurations (those that can be expressed as products of combinations of singly and doubly excited ones) in terms of CI coefficients of lower-excited configurations, where each K is a linear combination of configuration state functions (CSFs) over all degenerate elements of K. Disconnected configurations up to sextuply excited ones are selected by Brown's energy formula, ΔE_K = (E − H_KK) B_K² / (1 − B_K²), with B_K determined from coefficients of singly and doubly excited configurations. The truncation energy error from disconnected configurations, ΔE_dis, is approximated by the sum of the ΔE_K of all discarded Ks. The remaining (connected) configurations are selected by thresholds based on natural orbital concepts. Given a model CI space M, a usual upper bound E_S is computed by CI in a selected space S, and E_M = E_S + ΔE_dis + δE, where δE is a residual error which can be calculated by well-defined sensitivity analyses. An SCI calculation on the Ne ground state featuring 1077 orbitals is presented. Convergence to within near-spectroscopic accuracy (0.5 cm⁻¹) is achieved in a model space M of 1.4 × 10⁹ CSFs (1.1 × 10¹² determinants) containing up to quadruply excited CSFs. Accurate energy contributions of quintuples and sextuples in a model space of 6.5 × 10¹² CSFs are obtained. The impact of SCI on various orbital methods is discussed. Since ΔE_dis can readily be calculated for very large basis sets without the need for a CI calculation, it can be used to estimate the orbital basis incompleteness error. A method for precise and efficient evaluation of E_S is taken up in a companion paper.
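A small numerical illustration of Brown's selection formula as quoted above; the energies and coefficients are invented, purely to show how configurations would be ranked and a truncation error ΔE_dis accumulated.

```python
# Numerical illustration of Brown's selection formula quoted in the abstract:
# ΔE_K = (E − H_KK) · B_K² / (1 − B_K²). The values below are invented, purely
# to show how configurations would be ranked and a truncation error summed.

E = -128.90                # current CI energy (toy value, hartree)
configs = {                # K: (H_KK, B_K) -- invented diagonal elements/coefficients
    "K1": (-128.20, 0.020),
    "K2": (-127.95, 0.004),
    "K3": (-128.40, 0.001),
}

threshold = 1e-6           # keep configurations contributing more than this
discarded_sum = 0.0
for name, (H_KK, B_K) in configs.items():
    dE = (E - H_KK) * B_K**2 / (1 - B_K**2)
    if abs(dE) > threshold:
        print(f"select  {name}: dE = {dE:.2e}")
    else:
        discarded_sum += dE        # contributes to the ΔE_dis estimate
print(f"ΔE_dis estimate from discarded configurations: {discarded_sum:.2e}")
```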
Abstract:
Research project carried out during a stay at the University of British Columbia, Canada, between 2010 and 2012. Alzheimer's disease (AD) is today the most common form of dementia in the ageing population. Although it was discovered 100 years ago, there is still no preventive and/or curative treatment, nor any diagnostic agent that allows the progression of the disease to be assessed quantitatively. The aim of this work is to contribute solutions to the lack of unambiguous, rigorous therapeutic and diagnostic agents for AD. From the field of bioinorganic chemistry, the excessive concentration of Zn(II) and Cu(II) ions in the brains of AD patients stands out, suggesting their use as therapeutic targets and, consequently, the search for chelating agents that prevent the formation of senile plaques or contribute to their dissolution. Although this was the starting point of the project, the multiple factors involved in the pathogenesis of AD mean that the classical paradigm of "one molecule, one target" limits a molecule's ability to combat such a complex disease. Considerable effort has therefore been devoted to the design of multifunctional agents that address the multiple factors characterizing the development of AD. In the present work, multifunctional agents were designed based on two molecular scaffolds well established and known in medicinal chemistry: thioflavin-T (ThT) and deferiprone (DFP). The use of in silico techniques, including pharmacokinetic calculations and molecular modelling, was a key step in the evaluation of the best candidates against the following requirements: (a) compliance with the pharmacokinetic properties that would support possible use as a drug, (b) suitable hydrophobicity to cross the BBB, and (c) interaction with the Aβ peptide in solution.
Abstract:
The information provided by the alignment-independent GRid INdependent Descriptors (GRIND) can be condensed by the application of principal component analysis, obtaining a small number of principal properties (GRIND-PP) that are more suitable for describing molecular similarity. The objective of the present study is to optimize the diverse parameters involved in obtaining the GRIND-PP and to validate their suitability for applications requiring a biologically relevant description of molecular similarity. With this aim, GRIND-PP computed with a collection of diverse settings were used to carry out ligand-based virtual screening (LBVS) under standard conditions. The quality of the results obtained was remarkable and comparable with other LBVS methods, and a detailed statistical analysis allowed us to identify the method settings most determinant for the quality of the results, together with their optimum values. Remarkably, some of these optimum settings differ significantly from those used in previously published applications, revealing their unexplored potential. Their applicability to large compound databases was also explored by comparing the equivalence of the results obtained using either computed or projected principal properties. In general, the results of the study confirm the suitability of GRIND-PP for practical applications and provide useful hints about how they should be computed to obtain optimum results.
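A sketch of the principal-properties idea: a descriptor matrix is compressed with PCA and database molecules are ranked by distance to a query in the reduced space, the core of an LBVS run. The descriptor matrix and settings are invented, not actual GRIND values.

```python
import numpy as np

# Sketch of the GRIND-PP idea: compress a descriptor matrix with PCA into a
# few principal properties, then rank database molecules by distance to an
# active query in that space (ligand-based virtual screening). Data invented.

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 40))            # 100 molecules x 40 GRIND-like descriptors

Xc = X - X.mean(axis=0)                   # centre before PCA
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
n_pp = 5                                  # number of principal properties kept
pp = Xc @ Vt[:n_pp].T                     # principal properties of the database

query = pp[0]                             # pretend molecule 0 is a known active
dist = np.linalg.norm(pp - query, axis=1)
print("closest molecules:", np.argsort(dist)[1:6])   # screening hit list
```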
Abstract:
The emergence of Web 2.0 technologies in recent years has changed the way people interact with knowledge. Services for cooperation and collaboration have placed the user at the centre of a new knowledge-building space. The development of new second-generation learning environments can benefit from the potential of these Web 2.0 services when applied to an educational context. We propose a methodology for designing learning environments that relates Web 2.0 services to the functional requirements of these environments. In particular, we concentrate on the design of the KRSM system to discuss the components of this methodology and its application.