816 results for penalty-based aggregation functions
Abstract:
The thesis is entitled Personnel Management Practices in the Kerala-Based Scheduled Commercial Banks. In a labour-intensive service industry like banking, the personnel management function is of cardinal importance and requires a sophisticated and scientific approach. The productivity and ultimate profitability of the entire organization depend considerably on the effectiveness with which the personnel management function is executed and the prudence with which personnel problems are handled. The main objectives of the study are to understand the current status of personnel management functions in the banks and to evaluate the practices in the light of the principles and theories of personnel management so as to identify strengths and weaknesses. The universe of this study is the eight Scheduled Commercial Banks based in Kerala. The major limitation of the study is that, as the State Bank of Travancore, the lone public sector bank based in Kerala, did not grant permission for collection of data, the study had to be confined to private sector banks only. Almost all the data used for this study are primary and were collected from the files and other records of the concerned banks. The report has chapters dealing with the functional areas of personnel management such as determination of human resource requirements, recruitment and selection, training and development, performance appraisal, promotions and compensation. The findings reveal that the practice of personnel management in the Kerala-based private sector scheduled commercial banks has not gained a degree of sophistication compatible with its role in modern business management.
Abstract:
Quantile functions are efficient and equivalent alternatives to distribution functions in the modeling and analysis of statistical data (see Gilchrist, 2000; Nair and Sankaran, 2009). Motivated by this, in the present paper we introduce a quantile-based Shannon entropy function. We also introduce the residual entropy function in the quantile setup and study its properties. Unlike the residual entropy function due to Ebrahimi (1996), the residual quantile entropy function determines the quantile density function uniquely through a simple relationship. The measure is used to define two nonparametric classes of distributions.
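For reference, the change of variable x = Q(u) connects the classical Shannon entropy to its quantile form; the following is a sketch in standard notation (the paper's conventions may differ), with Q the quantile function and q(u) = dQ(u)/du the quantile density, so that f(Q(u)) = 1/q(u):

\[ H(X) \;=\; -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,\mathrm{d}x \;=\; \int_{0}^{1} \log q(u)\,\mathrm{d}u , \]
\[ H(u) \;=\; \log(1-u) \;+\; \frac{1}{1-u}\int_{u}^{1} \log q(p)\,\mathrm{d}p , \qquad 0 \le u < 1 , \]

where the second expression is the residual entropy written in quantile terms, conditioning on survival beyond the 100u-th percentile Q(u).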
Abstract:
Partial moments are extensively used in the literature for modeling and analysis of lifetime data. In this paper, we study the properties of partial moments using quantile functions. The quantile-based measure determines the underlying distribution uniquely. We then characterize certain lifetime quantile function models. The proposed measure provides alternative definitions for ageing criteria. Finally, we explore the utility of the measure to compare the characteristics of two lifetime distributions.
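As a worked reference (standard notation; the paper's exact definition may differ), the r-th partial moment about a threshold t can be rewritten in quantile terms by taking t = Q(u):

\[ \alpha_r(t) \;=\; E\bigl[(X-t)_+^{\,r}\bigr] \;=\; \int_t^{\infty} (x-t)^r f(x)\,\mathrm{d}x , \qquad
   P_r(u) \;=\; \int_u^1 \bigl(Q(p)-Q(u)\bigr)^r \mathrm{d}p , \]

so that P_r(u) = α_r(Q(u)); this is the quantile-based form that, as the abstract notes, determines the underlying distribution uniquely.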
Abstract:
The standard separable two-dimensional wavelet transform has achieved great success in image denoising applications due to its sparse representation of images. However, it fails to capture efficiently the anisotropic geometric structures, such as edges and contours, in images, as these intersect too many wavelet basis functions and lead to a non-sparse representation. In this paper a novel denoising scheme based on a multidirectional and anisotropic wavelet transform, called the directionlet transform, is presented. Image denoising in the wavelet domain is extended to the directionlet domain so that image features concentrate on fewer coefficients and more effective thresholding is possible. The image is first segmented and the dominant direction of each segment is identified to build a directional map. Then, according to the directional map, the directionlet transform is taken along the dominant direction of each selected segment. The decomposed images with directional energy are used for scale-dependent, subband-adaptive optimal threshold computation based on the SURE risk. This threshold is then applied to all subbands except the LLL subband. The threshold-corrected subbands, together with the unprocessed first subband (LLL), are given as input to the inverse directionlet algorithm to obtain the denoised image. Experimental results show that the proposed method outperforms standard wavelet-based denoising methods in terms of numerical and visual quality.
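The subband-adaptive SURE thresholding step can be sketched as follows. Directionlets are not available in common libraries, so this illustration (an assumption, not the authors' implementation) uses a standard separable wavelet decomposition from PyWavelets as a stand-in; only the detail subbands are thresholded and the low-pass band is left untouched, mirroring the treatment of the LLL subband described above.

import numpy as np
import pywt  # standard 2-D wavelet transform used here as a stand-in for directionlets

def sure_threshold(coeffs):
    """Pick the soft threshold that minimises Stein's Unbiased Risk Estimate (SURE)."""
    c = np.abs(np.asarray(coeffs, dtype=float).ravel())
    n = c.size
    if n == 0:
        return 0.0
    sigma = max(np.median(c) / 0.6745, 1e-12)          # rough robust noise estimate
    c2 = np.sort((c / sigma) ** 2)                      # sorted squared (normalised) coefficients
    cum = np.cumsum(c2)
    # SURE risk of soft thresholding at t = sqrt(c2[i]) for each candidate i
    risks = [n - 2 * (i + 1) + cum[i] + c2[i] * (n - (i + 1)) for i in range(n)]
    return sigma * np.sqrt(c2[int(np.argmin(risks))])

def denoise(image, wavelet="db4", levels=3):
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    out = [coeffs[0]]                                   # keep the low-pass band untouched
    for detail in coeffs[1:]:                           # one adaptive threshold per detail subband
        out.append(tuple(pywt.threshold(d, sure_threshold(d), mode="soft")
                         for d in detail))
    return pywt.waverec2(out, wavelet)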
Abstract:
The basic thermodynamic functions, the entropy, free energy, and enthalpy, for element 105 (hahnium) in electronic configurations d^3 s^2, d^3 sp, and d^4s^1 and for its +5 ionized state (5f^14) have been calculated as a function of temperature. The data are based on the results of the calculations of the corresponding electronic states of element 105 using the multiconfiguration Dirac-Fock method.
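For context, such tabulations typically follow from the electronic partition function built over the calculated levels; a sketch of the standard ideal-gas relations (the paper's exact conventions and reference states may differ):

\[ Q_{\mathrm{el}}(T) \;=\; \sum_i g_i\, e^{-\varepsilon_i/(k_B T)} , \qquad
   F_{\mathrm{el}} \;=\; -RT \ln Q_{\mathrm{el}} , \]
\[ S_{\mathrm{el}} \;=\; R \ln Q_{\mathrm{el}} + RT\,\frac{\partial \ln Q_{\mathrm{el}}}{\partial T} , \qquad
   H_{\mathrm{el}} \;\approx\; U_{\mathrm{el}} \;=\; RT^2\,\frac{\partial \ln Q_{\mathrm{el}}}{\partial T} , \]

where g_i and ε_i are the degeneracies and energies of the electronic states, here obtained from multiconfiguration Dirac-Fock calculations.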
Abstract:
This thesis investigates a method for human-robot interaction (HRI) that upholds the productivity of industrial robots, e.g. by minimizing operation time, while ensuring human safety, e.g. through collision avoidance. To solve such problems, an online motion planning approach for robotic manipulators with HRI has been proposed. The approach is based on model predictive control (MPC) with embedded mixed-integer programming. The planning strategies for the robotic manipulators considered in the thesis are performed directly in the workspace for easy obstacle representation. The non-convex optimization problem is approximated by a mixed-integer program (MIP). It is further reformulated so that the number of binary variables and the number of feasible integer solutions are drastically decreased. Safety-relevant regions, which are potentially occupied by the human operators, can be generated online by a proposed method based on hidden Markov models. In contrast to previous approaches, which derive predictions based on probability density functions in the form of single points, such as most likely or expected human positions, the proposed method computes safety-relevant subsets of the workspace as a region which may be occupied by the human at future instants of time. The method is further enhanced by combining it with reachability analysis to increase the prediction accuracy. These safety-relevant regions can subsequently serve as safety constraints when the motion is planned by optimization. In this way one arrives at motion plans that are safe, i.e. plans that avoid collision with a probability not less than a predefined threshold. The developed methods have been successfully applied to a demonstrator in which an industrial robot works in the same space as a human operator. The task of the industrial robot is to drive its end-effector through a nominal sequence of gripping-motion-releasing operations while no collision with the human arm occurs.
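To make the MIP approximation concrete, the sketch below (an illustration with assumed data, not the thesis's formulation or code) solves one planning step in a 2-D workspace: a point must approach a goal while staying outside an axis-aligned safety-relevant region, encoded with big-M binary variables and solved with PuLP's default CBC solver.

import pulp

H, M = 5, 100.0                      # horizon length, big-M constant
box = (0.4, 0.6, 0.4, 0.6)           # xmin, xmax, ymin, ymax of the occupied region
start, goal = (0.0, 0.0), (1.0, 1.0)

prob = pulp.LpProblem("mpc_step", pulp.LpMinimize)
x = [pulp.LpVariable(f"x{k}", 0, 1) for k in range(H + 1)]
y = [pulp.LpVariable(f"y{k}", 0, 1) for k in range(H + 1)]
# absolute deviations from the goal, used as a linear objective
dx = [pulp.LpVariable(f"dx{k}", 0) for k in range(H + 1)]
dy = [pulp.LpVariable(f"dy{k}", 0) for k in range(H + 1)]

prob += pulp.lpSum(dx) + pulp.lpSum(dy)
prob += x[0] == start[0]
prob += y[0] == start[1]
for k in range(H + 1):
    prob += dx[k] >= x[k] - goal[0]
    prob += dx[k] >= goal[0] - x[k]
    prob += dy[k] >= y[k] - goal[1]
    prob += dy[k] >= goal[1] - y[k]
    # "stay outside the box": at least one face constraint must remain active
    b = [pulp.LpVariable(f"b{k}_{i}", cat="Binary") for i in range(4)]
    prob += x[k] <= box[0] + M * b[0]
    prob += x[k] >= box[1] - M * b[1]
    prob += y[k] <= box[2] + M * b[2]
    prob += y[k] >= box[3] - M * b[3]
    prob += pulp.lpSum(b) <= 3        # at least one binary stays 0
for k in range(H):                    # bounded step length (velocity limit)
    prob += x[k + 1] - x[k] <= 0.3
    prob += x[k] - x[k + 1] <= 0.3
    prob += y[k + 1] - y[k] <= 0.3
    prob += y[k] - y[k + 1] <= 0.3

prob.solve(pulp.PULP_CBC_CMD(msg=False))
plan = [(x[k].value(), y[k].value()) for k in range(H + 1)]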
Abstract:
The increasing interconnection of information and communication systems leads to a further increase in complexity and thus also to a further increase in security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer protection against intrusion attempts into IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument for protection against cyber attacks. Such systems collect and analyse information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already known attack patterns, anomaly-based IDS are also able to recognise new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and in the development of an adaptive detection model that works in real time. To meet these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the large volume of incoming network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on the Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of the normal network state (NNB), and an update model. In OptiFilter, Tcpdump and SNMP traps are used to continuously aggregate network packets and host events. These aggregated network packets and host events are further analysed and converted into connection vectors. To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is studied intensively and substantially extended. In this dissertation different approaches are proposed and discussed: a classification-confidence margin threshold is defined to reveal unknown malicious connections, the stability of the growing topology is increased by novel approaches for initialising the weight vectors and by strengthening the winner neurons, and a self-adaptive procedure is introduced so that the model can be updated continuously. In addition, the main task of the NNB model is the further examination of the unknown connections detected by the EGHSOM and the verification of whether they are normal. However, due to the concept-drift phenomenon, the network traffic data change constantly, which leads to the generation of non-stationary network data in real time. This phenomenon is controlled better by the update model. The EGHSOM model can detect the new anomalies effectively, and the NNB model adapts optimally to the changes in the network data. In the experimental evaluation, the framework showed promising results. In the first experiment the framework was evaluated in offline mode. OptiFilter was evaluated with offline, synthetic, and realistic data. The adaptive classifier was evaluated with 10-fold cross validation to estimate its accuracy.
In the second experiment the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous volume of network data into structured connection vectors, and the adaptive classifier classified them precisely. The comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all other approaches. This can be attributed to the following key points: processing of the collected network data, achieving the best performance (e.g. overall accuracy), detecting unknown connections, and developing a real-time intrusion detection model.
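As an illustration of the classification-confidence margin idea described above (a minimal sketch with assumed data structures, not the EGHSOM implementation), a connection vector can be flagged as unknown and handed to the NNB model when the margin between its best and second-best matching prototypes is too small:

import numpy as np

def classify_with_margin(x, prototypes, labels, margin_threshold=0.15):
    """prototypes: (n, d) array of trained unit weight vectors;
    labels: per-prototype class labels ('normal', 'attack', ...)."""
    d = np.linalg.norm(prototypes - x, axis=1)          # distance to every unit
    order = np.argsort(d)
    best, second = d[order[0]], d[order[1]]
    margin = (second - best) / (second + 1e-12)         # confidence margin in [0, 1)
    if margin < margin_threshold:
        return "unknown"                                # hand over to the NNB model
    return labels[order[0]]

# usage on a single connection vector (toy data)
rng = np.random.default_rng(0)
prototypes = rng.random((20, 8))
labels = ["normal"] * 10 + ["attack"] * 10
print(classify_with_margin(rng.random(8), prototypes, labels))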
Abstract:
This paper describes a trainable system capable of tracking faces and facial features, such as eyes and nostrils, and estimating basic mouth features, such as degree of openness and smile, in real time. In developing this system, we have addressed the twin issues of image representation and algorithms for learning. We have used the invariance properties of image representations based on Haar wavelets to robustly capture various facial features. Similarly, unlike previous approaches, this system is entirely trained from examples and does not rely on a priori (hand-crafted) models of facial features based on optical flow or facial musculature. The system works in several stages that begin with face detection, followed by localization of facial features and estimation of mouth parameters. Each of these stages is formulated as a problem in supervised learning from examples. We apply the new and robust technique of support vector machines (SVM) for classification in the stages of skin segmentation, face detection, and eye detection. Estimation of mouth parameters is modeled as a regression from a sparse subset of coefficients (basis functions) of an overcomplete dictionary of Haar wavelets.
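A minimal sketch of the kind of pipeline described (assumed feature extraction and toy data, not the paper's trained system): Haar-wavelet coefficients of a grayscale patch are fed to an SVM that separates face from non-face patches.

import numpy as np
import pywt
from sklearn.svm import SVC

def haar_features(patch, levels=2):
    """Flatten a 2-level Haar wavelet decomposition of a grayscale patch."""
    coeffs = pywt.wavedec2(patch, "haar", level=levels)
    parts = [coeffs[0].ravel()]
    for (h, v, d) in coeffs[1:]:
        parts += [h.ravel(), v.ravel(), d.ravel()]
    return np.concatenate(parts)

# toy training data; in practice these would be labelled face / non-face patches
rng = np.random.default_rng(0)
patches = rng.random((200, 16, 16))
y = rng.integers(0, 2, size=200)               # 1 = face, 0 = non-face
X = np.stack([haar_features(p) for p in patches])

clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print(clf.predict(X[:5]))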
Abstract:
Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently, when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined and that the probability density would then represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities obtained by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
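For reference, the Aitchison inner product and distance on the D-part simplex can be written through the centred log-ratio (clr) transform (standard notation, which may differ slightly from the paper's):

\[ \operatorname{clr}(\mathbf{x}) = \Bigl(\ln\frac{x_1}{g(\mathbf{x})},\dots,\ln\frac{x_D}{g(\mathbf{x})}\Bigr), \qquad g(\mathbf{x}) = \Bigl(\prod_{i=1}^{D} x_i\Bigr)^{1/D}, \]
\[ \langle \mathbf{x},\mathbf{y}\rangle_a = \sum_{i=1}^{D} \operatorname{clr}_i(\mathbf{x})\,\operatorname{clr}_i(\mathbf{y}), \qquad d_a(\mathbf{x},\mathbf{y}) = \lVert \operatorname{clr}(\mathbf{x})-\operatorname{clr}(\mathbf{y})\rVert . \]

Replacing sums over parts by integrals over a density's support gives the Hilbert-space analogue for probability densities sketched in the abstract.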
Abstract:
In this paper, we address the problem of mitigating seismic structural vibrations through the design of a semiactive controller based on mixed H2/H∞ control theory. The vibrations caused by seismic motions are mitigated by a semiactive damper installed at the bottom of the structure. By a semiactive damper we mean a device that can absorb but cannot inject energy into the system. Sufficient conditions for the design of the desired control are given in terms of linear matrix inequalities (LMIs). A controller that guarantees asymptotic stability and a mixed H2/H∞ performance is then developed. An algorithm is proposed to handle the semiactive nature of the actuator. The performance of the controller is experimentally evaluated in a real-time hybrid testing facility that consists of a physical specimen (a small-scale magnetorheological damper) and a numerical model (a large-scale three-story building).
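One common way to handle the semiactive nature of the actuator is a clipping rule; the sketch below is a generic illustration of that idea (not necessarily the paper's algorithm): the force requested by the mixed H2/H∞ controller is applied only when it is dissipative, i.e. when it opposes the relative velocity across the damper, and is otherwise set to zero.

import numpy as np

def semiactive_clip(f_desired, rel_velocity, f_max):
    """Return a realizable damper force given the controller's desired force.
    Sign convention: f_desired * rel_velocity < 0 means the force opposes motion."""
    if f_desired * rel_velocity < 0.0:        # dissipative -> realizable
        return float(np.clip(f_desired, -f_max, f_max))
    return 0.0                                # would inject energy -> switch off

# usage: desired force +150 N while the damper piston moves at -0.02 m/s
print(semiactive_clip(150.0, -0.02, f_max=200.0))   # -> 150.0 (realizable)
print(semiactive_clip(150.0, +0.02, f_max=200.0))   # -> 0.0  (not realizable)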
Abstract:
The aim of this paper is to provide an estimation and decomposition of the motherhood wage penalty in Colombia. Our empirical strategy was based on the matching procedure designed by Ñopo (The Review of Economics and Statistics, 90(2), 290–299, 2008a) for the case of gender wage gaps. This is an alternative to the well-known Blinder–Oaxaca decomposition method. Using cross-sectional data from the Colombian Living Standard Survey, the wage gap was decomposed into four components according to the characteristics of mothers and non-mothers. Three of the components are explained by differences in the observable characteristics of women, while the fourth is the unexplained part of the gap. We found that mothers earn, on average, 1.73% less than their counterparts without children and that this gap slightly decreased when the group included older women. The results show that, once schooling was included as a matching variable, the unexplained part of the gap decreased considerably and became non-significant. Thus, we did not find evidence of wage discrimination against mothers in the Colombian labor market. Copyright Springer Science+Business Media New York 2013
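A minimal sketch of the exact-matching idea behind the Ñopo decomposition (illustrative only; the column names and survey file are hypothetical, and the full decomposition also includes the out-of-support components): mothers are matched to non-mothers on a set of discrete characteristics, and the wage gap over the common support, weighted by the mothers' distribution, gives the unexplained component.

import pandas as pd

def nopo_unexplained_gap(df, covariates, group="mother", wage="log_wage"):
    """df: one row per woman; `group` is 1 for mothers, 0 for non-mothers."""
    cells = df.groupby(covariates + [group])[wage].mean().unstack(group)
    common = cells.dropna()                       # cells with both groups present
    weights = (df[df[group] == 1]
               .groupby(covariates).size()
               .reindex(common.index, fill_value=0))
    gap = ((common[1] - common[0]) * weights).sum() / weights.sum()
    return gap                                    # unexplained gap over the common support

# usage with hypothetical column names
# df = pd.read_csv("ecv_women.csv")
# print(nopo_unexplained_gap(df, ["education", "age_group", "urban"]))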
Abstract:
We report here a new empirical density functional that is constructed based on the performance of OPBE and PBE for spin states and SN2 reaction barriers, and on how these are affected by different regions of the reduced gradient expansion. In a previous study [Swart, Solà, and Bickelhaupt, J. Comput. Methods Sci. Eng. 9, 69 (2009)] we already reported how, by switching between OPBE and PBE, one could obtain both the good performance of OPBE for spin states and reaction barriers and that of PBE for weak interactions within one and the same (SSB-sw) functional. Here we fine-tuned this functional and included a portion of the KT functional and Grimme's dispersion correction to account for π-π stacking. Our new SSB-D functional is found to be a clear improvement and functions very well for biological applications (hydrogen bonding, π-π stacking, spin-state splittings, accuracy of geometries, reaction barriers).
Abstract:
Quality management; self-evaluation of the organisation; citizens/customers satisfaction; impact on society evaluation; key performance evaluation; good practices comparison (benchmarking); continuous improvement. In professional environments, when the quality assessment of museums is discussed, one immediately thinks of the honourableness of the directors and curators, the erudition and specialisation of knowledge, the diversity of the gathered material and the study of the collections, the methods of collection conservation and environmental control, the regularity and notoriety of the exhibitions and artists, the building’s architecture and site, the recreation of environments, and the design of the museographic equipment. We admit that the roles and attributes listed above can contribute to the definition of a specificity of museological good practice within a hierarchised functional perspective (the museum functions) and to the classification of museums according to a scale, validated between peers, based on “installed” appreciation criteria, enforced from above downwards, according to the “prestige” of the products and of those who conceive them, but these say nothing about the effective satisfaction of the citizens/customers and the real impact on society. There is a lack of evaluation instruments that would give us feedback on all that the museum is and represents in contemporary society, focused on being and on the relation with the other, to the detriment of ostentatious possession and of doing merely to meet one’s duties. But it is only possible to evaluate something by measurement and comparison, on the basis of well-defined criteria, from a common grid, involving all of the actors in the self-evaluation, in the definition of the aims to fulfil, and in the obtaining of results.
Abstract:
The formation of hydrogen-bonded interpolymer complexes between poly(acrylic acid) and poly(N-vinyl pyrrolidone), as well as amphiphilic copolymers of N-vinyl pyrrolidone with vinyl propyl ether, has been studied in aqueous and organic solutions. It was demonstrated that the introduction of vinyl propyl ether units into the macromolecules of the nonionic polymer enhances their ability to form complexes in aqueous solutions due to a more significant contribution of hydrophobic effects. The complexation was found to be a multistage process that involves the formation of primary polycomplex particles, which further aggregate to form spherical nanoparticles. Depending on the environmental factors (pH, solvent nature), these nanoparticles may either form stable colloidal solutions or undergo further aggregation, resulting in precipitation of interpolymer complexes. In organic solvents, the intensity of complex formation increases in the following order: methanol < ethanol < isopropanol < dioxane. Multilayered coatings were developed using layer-by-layer deposition of interpolymer complexes on glass surfaces. It was demonstrated that the solvent nature affects the efficiency of coating deposition.
Abstract:
Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis on which learners expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners’ competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user’s behaviour in the domain of knowledge construction. The users’ requirements can be represented as a case in the defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled. However, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing individual users’ requirements and transforming those requirements into personalised information provision specifications. Hence the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulates, maps, configures, and learning content. The results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support functions for profiling users’ requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications. Conclusion: The current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance RE approaches for modelling individual users’ needs and discovering users’ requirements.