933 results for Functions of real variables
Abstract:
Services in smart environments aim to improve people's quality of life. One of the most important issues when developing such environments is testing and validating these services, tasks that usually imply high costs and inconvenient or unfeasible real-world testing. In such cases, artificial societies may be used to simulate the smart environment (i.e. the physical environment, equipment and humans). To this end, the CHROMUBE methodology guides test engineers in modeling human beings. Such models reproduce behaviors that are highly similar to real ones. Originally, these models are based on automata whose transitions are governed by random variables. The automaton's structure and the probability distribution function of each random variable are determined through a manual trial-and-error process. This paper presents an extension of this methodology which avoids that manual process. It is based on learning human behavior patterns automatically from sensor data using machine learning techniques. The approach has been tested on a real scenario, where this extension has produced highly accurate human behavior models.
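The learned models described above are automata with probabilistic transitions. As a minimal illustrative sketch (not CHROMUBE's actual algorithm; the state names, the sensor encoding and the Laplace smoothing are assumptions), transition probabilities of such an automaton can be estimated from observed sensor-event sequences by smoothed counting:

```python
from collections import defaultdict

def learn_transition_probs(sequences, smoothing=1.0):
    """Estimate transition probabilities of a behavior automaton
    from observed sensor-event sequences (Laplace-smoothed counts)."""
    states = sorted({s for seq in sequences for s in seq})
    counts = defaultdict(lambda: defaultdict(float))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1.0
    probs = {}
    for a in states:
        total = sum(counts[a][b] + smoothing for b in states)
        probs[a] = {b: (counts[a][b] + smoothing) / total for b in states}
    return probs

# Hypothetical example: room-visit sequences inferred from presence sensors
seqs = [["bed", "bath", "kitchen"], ["bed", "kitchen", "bed"]]
P = learn_transition_probs(seqs)
```

Each row of `P` is a proper probability distribution over successor states, so the result can be used directly as the transition function of a stochastic automaton.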
Abstract:
The extraordinary growth of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has provided great benefits in all areas of society. Alongside these benefits come new challenges for the protection and privacy of information and its content, such as identity theft and the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal and business worlds with regard to access to information. In both worlds, Cryptography has played a key role by providing the tools needed to ensure the confidentiality, integrity and availability of personal data and information. Biometrics, in turn, has proposed and offered different techniques to authenticate individuals through their biometric traits, such as fingerprints, iris, hand geometry, voice, gait, etc. Each of these sciences, Cryptography and Biometrics, provides solutions to specific problems of data protection and user authentication, solutions which would be greatly strengthened if certain characteristics of both sciences were combined towards common objectives. It is therefore imperative to intensify research in this area by combining the mathematical algorithms and primitives of Cryptography with Biometrics, to meet the growing demand for more secure and usable techniques that simultaneously improve data protection and user authentication.
In this combination, cancelable biometrics is a cornerstone of the user authentication and identification process, since it provides revocation and cancellation properties to biometric traits. The contributions of this thesis address the main aspect of Biometrics, i.e. the secure and efficient authentication of users through their biometric templates, from three different approaches. The first is the design of a fuzzy crypto-biometric scheme that applies cancelable biometric principles to exploit the fuzziness of biometric templates while dealing with intra- and inter-user variability, without compromising the templates extracted from legitimate users. The second is the design of a new Similarity Preserving Hash Function (SPHF); such functions are currently widely used in digital forensics to find similarities in the content of different but related files and to quantify their similarity level. The function designed in this work improves on the results of the two main functions currently used in this field and extends their use to iris template comparison. The third approach is a new mechanism for handling iris templates that treats them as signals and compares them using the Walsh-Hadamard transform (complemented with three other algorithms). The results obtained are excellent with respect to the security and privacy requirements mentioned above.
Each of the three schemes has been implemented to test its operational efficacy in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF were implemented in Java, while the Walsh-Hadamard-based process was implemented in Matlab. The experiments used a database of iris images (CASIA-IrisV2) to simulate a population of system users. The new SPHF is a special case: before being applied to Biometrics, it was also tested in the digital forensics field by comparing similar and dissimilar files and images. For each scheme, the efficiency and effectiveness ratios for user authentication, i.e. the False Non-Match and False Match Rates, were calculated with different parameters and cases to analyse their behaviour.
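The thesis's third approach compares iris templates as signals via the Walsh-Hadamard transform. The following sketches that idea only: a textbook fast Walsh-Hadamard transform plus a simple spectral cosine similarity. It is not the thesis's full mechanism (which combines the transform with three further algorithms not detailed in the abstract), and the similarity measure used here is an assumption for illustration:

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard transform (input length must be a power of 2)."""
    a = np.array(a, dtype=float)
    n, h = len(a), 1
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y  # butterfly step
        h *= 2
    return a

def wht_similarity(a, b):
    """Cosine similarity of two equal-length templates in the WHT domain."""
    A, B = fwht(a), fwht(b)
    return float(A @ B / (np.linalg.norm(A) * np.linalg.norm(B)))

spec = fwht([1, 0, 1, 0])  # spectrum of a tiny binary "template"
```

Because the transform is orthogonal up to scaling, identical templates give similarity 1, and small bit flips degrade it gracefully, which is the property a signal-based comparison relies on.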
Abstract:
In the context of real-valued functions defined on metric spaces, it is known that the locally Lipschitz functions are uniformly dense in the continuous functions and that the Lipschitz in the small functions - the locally Lipschitz functions where both the local Lipschitz constant and the size of the neighborhood can be chosen independent of the point - are uniformly dense in the uniformly continuous functions. Between these two basic classes of continuous functions lies the class of Cauchy continuous functions, i.e., the functions that map Cauchy sequences in the domain to Cauchy sequences in the target space. Here, we exhibit an intermediate class of Cauchy continuous locally Lipschitz functions that is uniformly dense in the real-valued Cauchy continuous functions. In fact, our result is valid when our target space is an arbitrary Banach space.
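For reference, the three function classes compared above can be stated precisely (standard definitions, matching the abstract's terminology, for f : (X,d) → (Y,ρ)):

```latex
\textbf{Locally Lipschitz: } \forall x \in X \;\exists\, \delta_x > 0,\; K_x \ge 0 :\quad
u, v \in B(x, \delta_x) \;\Longrightarrow\; \rho\bigl(f(u), f(v)\bigr) \le K_x\, d(u, v).

\textbf{Lipschitz in the small: } \exists\, \delta > 0,\; K \ge 0 :\quad
d(u, v) < \delta \;\Longrightarrow\; \rho\bigl(f(u), f(v)\bigr) \le K\, d(u, v).

\textbf{Cauchy continuous: } (x_n)\ \text{Cauchy in } X \;\Longrightarrow\; \bigl(f(x_n)\bigr)\ \text{Cauchy in } Y.
```

Note that in the second definition both K and δ are uniform over X, whereas in the first they may depend on the point x; Cauchy continuity sits strictly between continuity and uniform continuity on general metric spaces.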
Abstract:
In this paper, a novel approach for exploiting multitemporal remote sensing data, focused on real-time monitoring of agricultural crops, is presented. The methodology is defined in a dynamical-system context using state-space techniques, which enables past temporal information to be merged with an update at each new acquisition. The dynamical-system context allows us to exploit classical tools from this domain to estimate the relevant variables. A general methodology is proposed, and a particular instance is defined in this study based on polarimetric radar data to track the phenological stages of a set of crops. A model generated from empirical data through principal component analysis is presented, and an extended Kalman filter is adapted to perform phenological stage estimation. Results employing quad-pol Radarsat-2 data over three different cereals are analyzed. The potential of this methodology to retrieve vegetation variables in real time is shown.
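The estimator described above is an extended Kalman filter driven by a state-space crop model. A generic EKF predict/update cycle looks like the following sketch; the models `f`, `h` and their Jacobians `F`, `H` are placeholders here, not the paper's PCA-derived phenology model:

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One predict/update cycle of an extended Kalman filter.
    x, P: state estimate and covariance; z: new observation;
    f/F, h/H: transition and observation models with their Jacobians;
    Q, R: process and measurement noise covariances."""
    # Predict: propagate state and covariance through the model
    x_pred = f(x)
    P_pred = F(x) @ P @ F(x).T + Q
    # Update: correct with the new acquisition
    y = z - h(x_pred)                               # innovation
    S = H(x_pred) @ P_pred @ H(x_pred).T + R        # innovation covariance
    K = P_pred @ H(x_pred).T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H(x_pred)) @ P_pred
    return x_new, P_new

# Toy scalar example (linear models, so the EKF reduces to a Kalman filter)
ident = lambda x: x
jac = lambda x: np.eye(1)
x1, P1 = ekf_step(np.array([0.0]), np.eye(1), np.array([1.0]),
                  ident, jac, ident, jac, 0.01 * np.eye(1), 0.1 * np.eye(1))
```

Each new radar acquisition plays the role of `z`, so past temporal information (carried in `x`, `P`) is merged with the update exactly as the abstract describes.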
Abstract:
We analyze the pricing determinants of Brazilian Real Estate Receivables Certificates (Certificados de Recebíveis Imobiliários, CRIs) with respect to the underlying asset and levels of collateral, controlling for size, maturity and rating. We find an additional average premium of 1.0 p.p. in CRIs compared with debentures of similar maturity and the same rating. This premium is examined on two fronts: (a) although CRIs follow a relatively standardized format, we find that the security can represent different risk levels and underlying assets; and (b) this lack of standardization leads to different pricing levels according to the issue's specific risk characteristics. The different risk levels are reflected in the variety of guarantees used: 41% of the issues carry personal guarantees from the originators (surety or guarantee). We conclude that there is, in general, a positive return difference (the average spread at issuance of inflation-indexed CRIs was 321 bps above the market yield curve), more pronounced in some segments (a premium for the residential and land-development segments) and mitigated by the level of collateral offered. An average premium of 1.4 p.p. is found for the residential and land-development segments. Some characteristics of the issues were analyzed as controls (size, maturity and, finally, the grades and origin of the rating agency). Larger and longer-dated CRIs show lower spreads. As for rating, the effects differ by segment: for residential CRIs the effect is positive (a spread reduction) when the issue is rated by a rating agency, whereas for commercial CRIs the effect is negative. The effect can be positive for commercial CRIs (a spread reduction) when they are rated by an international rating agency or carry ratings above 'A'.
Abstract:
As part of the Governor's effort to streamline State government through improvements in the efficiency and effectiveness of operations, Executive Order 2004-06 ("EO6") provided for the reorganization (consolidation) of the Department of Insurance, Office of Banks and Real Estate, Department of Professional Regulation and Department of Financial Institutions. Through EO6 the four predecessor agencies were abolished and a single new agency, the Department of Financial and Professional Regulation (hereafter "IDFPR"), was created. The purpose of consolidating the four regulatory agencies was to allow certain economies of scale to be realized, primarily within the executive management and administrative functions. Additionally, the consolidation would increase the effectiveness of operations through the integration of certain duplicative functions within the four predecessor agencies without degrading frontline functions. Beginning on or about July 1, 2004, the IDFPR began consolidation activities focusing primarily on the administrative functions of Executive Management, Fiscal and Accounting, General Counsel, Human Resources, Information Technology and Other Administrative Services. The underlying premise of the reorganization was that all improvements could be accomplished without degrading the frontline functions of the predecessor agencies. Accordingly, all powers, duties, rights, responsibilities and functions of the predecessor agencies migrated to IDFPR and the reorganization activities commenced July 1, 2004.
Abstract:
This study was concerned with the structure, functions and development, especially the performance, of some rural small firms associated with the Council for Small Industries in Rural Areas (CoSIRA) of England. Forty firms were used as the main basis of analysis. For some aspects of the investigation, however, data from another 54 firms, obtained indirectly through nine CoSIRA Organisers, were also used. For the performance analysis, the 40 firms were first ranked according to their growth and profitability rates, calculated from their financial data. Each of the variables hypothesised to be related to performance was then tested for its relationship with performance using Spearman's rank correlation. The analysis indicated that each of four factors - the principal, the firm itself, its management, and the environment - had a bearing upon the performance of the firm. Within the first factor, the owner-manager's background and attitudes were found to be most important; in the second, the firm's size, age and scope of activities were found to be correlated with performance; with respect to the third, firms which practised some form of system in planning, control and costing performed better than those which did not; and, finally, with respect to the fourth factor, some of the services provided by CoSIRA, especially credit finance, were found to facilitate the firms' performance. Another significant facet of the firms highlighted by the study was their multifarious roles. These, meeting economic, psychological, sociological and political needs, were considered to be most useful to man and his society. Finally, the study has shed light on the structural characteristics of the sampled firms, including various aspects of their development, orientation and organisation, as well as their various structural strengths and weaknesses.
Abstract:
Isotropic scattering Raman spectra of liquid acetonitrile (AN) solutions of LiBF4 and NaI at various temperatures and concentrations have been investigated. For the first time imaginary as well as real parts of the solvent vibrational correlation functions have been extracted from the spectra. Such imaginary parts are currently an important component of modern theories of vibrational relaxation in liquids. This investigation thus provides the first experimental data on imaginary parts of a correlation function in AN solutions. Using the fitting algorithm we recently developed, statistically confident models for the Raman spectra were deduced. The parameters of the band shapes, with an additional correction, of the ν2 AN vibration (CN stretching), together with their confidence intervals are also reported for the first time. It is shown that three distinct species, with lifetimes greater than ∼10−13 s, of the AN molecules can be detected in solutions containing Li+ and Na+. These species are attributed to AN molecules directly solvating cations; the single oriented and polarised molecules interleaving the cation and anion of a Solvent Shared Ion Pair (SShIP); and molecules solvating anions. These last are considered to be equivalent to the next layer of solvent molecules, because the CN end of the molecule is distant from the anion and thus less affected by the ionic charge compared with the anion situation. Calculations showed that at the concentrations employed, 1 and 0.3 M, there were essentially no other solvent molecules remaining that could be considered as bulk solvent. Calculations also showed that the internuclear distance in these solutions supported the proposal that the ionic entity dominating in solution was the SShIP, and other evidence was adduced that confirmed the absence of Contact Ion Pairs at these concentrations. The parameters of the shape of the vibrational correlation functions of all three species are reported. 
The parameters of intramolecular anharmonic coupling between the potential surfaces in AN and the dynamics of the intermolecular environment fluctuations and intermolecular energy transfer are presented. These results will assist investigations made at higher and lower concentrations, when additional species and interactions with AN molecules will be present.
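A vibrational correlation function of the kind discussed above is obtained as the Fourier transform of the normalized isotropic Raman band, and its imaginary part vanishes for a perfectly symmetric band, which is why band asymmetry carries the information of interest. The following is a minimal numerical sketch of that relationship (uniform frequency grid assumed, sign convention chosen arbitrarily; this is not the authors' fitting algorithm):

```python
import numpy as np

def vib_correlation(omega, band, t):
    """Complex vibrational correlation function C(t) as the Fourier
    transform of a normalized isotropic Raman band I(omega).
    A symmetric band yields a purely real C(t)."""
    dw = omega[1] - omega[0]            # uniform grid spacing
    I = band / (band.sum() * dw)        # normalize the band to unit area
    return np.array([(I * np.exp(1j * omega * tau)).sum() * dw for tau in t])

# Symmetric (Gaussian) test band centred on the transition frequency
w = np.linspace(-5.0, 5.0, 2001)
C = vib_correlation(w, np.exp(-w**2), np.array([0.0, 0.5]))
```

By construction C(0) = 1, and for the symmetric test band the imaginary part is numerically zero; an asymmetric band would produce the nonzero imaginary components reported in the study.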
Abstract:
2000 Math. Subject Classification: 33E12, 65D20, 33F05, 30E15
Abstract:
Ivo Y. Damyanov - The manipulation of Boolean functions is fundamental to theoretical computer science, including logic optimization, and circuit validation and synthesis. This paper discusses some initial results on the relationship between graph-based representations of Boolean functions and the properties of their variables.
Abstract:
Adjoint methods have proven to be an efficient way of calculating the gradient of an objective function with respect to a shape parameter for optimisation, with a computational cost nearly independent of the number of the design variables [1]. The approach in this paper links the adjoint surface sensitivities (gradient of objective function with respect to the surface movement) with the parametric design velocities (movement of the surface due to a CAD parameter perturbation) in order to compute the gradient of the objective function with respect to CAD variables.
For a successful implementation of shape optimisation strategies in practical industrial cases, the choice of design variables or parameterisation scheme used for the model to be optimised plays a vital role. Where the goal is to base the optimisation on a CAD model, the choices are to use a NURBS geometry generated from CAD modelling software, where the positions of the NURBS control points are the optimisation variables [2], or to use the feature-based CAD model with all of its construction history to preserve the design intent [3]. The main advantage of using the feature-based model is that the optimised model produced can be directly used for downstream applications including manufacturing and process planning.
This paper presents an approach for optimization based on the feature based CAD model, which uses CAD parameters defining the features in the model geometry as the design variables. In order to capture the CAD surface movement with respect to the change in design variable, the “Parametric Design Velocity” is calculated, which is defined as the movement of the CAD model boundary in the normal direction due to a change in the parameter value.
The approach presented here for calculating design velocities represents an advance in capability and robustness over that described by Robinson et al. [3]. The process can be easily integrated into most industrial optimisation workflows and is immune to the topology and labelling issues highlighted by other CAD-based optimisation processes. It considers every continuous ("real-valued") parameter as an optimisation variable, and it can be adapted to work with any CAD modelling software, as long as it has an API which provides access to the values of the parameters which control the model shape and allows the model geometry to be exported. To calculate the movement of the boundary, the methodology employs finite differences on the shape of the 3D CAD model before and after the parameter perturbation. The implementation includes calculating the geometric movement along a normal direction between two discrete representations of the original and perturbed geometries. Parametric design velocities can then be directly linked with adjoint surface sensitivities to extract the gradients used in a gradient-based optimisation algorithm.
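The finite-difference design-velocity computation described above can be sketched as follows. This is an illustrative fragment only: it assumes point-matched discretizations of the original and perturbed surfaces, and all function and variable names are hypothetical:

```python
import numpy as np

def design_velocity(verts0, verts1, normals0, dp):
    """Parametric design velocity: normal component of surface movement
    per unit change of a CAD parameter, via finite differences.
    verts0/verts1: (n, 3) matched surface points before/after perturbing
    the parameter by dp; normals0: (n, 3) unit normals at verts0."""
    disp = verts1 - verts0                       # pointwise movement
    vn = np.einsum("ij,ij->i", disp, normals0)   # project onto normals
    return vn / dp                               # movement per unit parameter

# Toy example: a plane z = 0 whose parameter translates it to z = dp
pts0 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
pts1 = pts0 + np.array([0.0, 0.0, 1e-3])
nrm = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
vel = design_velocity(pts0, pts1, nrm, 1e-3)
```

The cost-function gradient for a parameter then follows from the chain rule as a surface sum of (adjoint surface sensitivity) x (design velocity) x (face area) over the discretized boundary, which is the linking of the two quantities described in the introduction.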
A flow optimisation problem is presented, in which the power dissipation of the flow in an automotive air duct is reduced by changing the parameters of the CAD geometry created in CATIA V5. The flow sensitivities are computed with the continuous adjoint method for laminar and turbulent flow [4] and are combined with the parametric design velocities to compute the cost-function gradients. A line-search algorithm is then used to update the design variables and proceed with the optimisation.
Abstract:
The ultimate problem considered in this thesis is modeling a high-dimensional joint distribution over a set of discrete variables. For this purpose, we consider classes of context-specific graphical models and the main emphasis is on learning the structure of such models from data. Traditional graphical models compactly represent a joint distribution through a factorization justified by statements of conditional independence which are encoded by a graph structure. Context-specific independence is a natural generalization of conditional independence that only holds in a certain context, specified by the conditioning variables. We introduce context-specific generalizations of both Bayesian networks and Markov networks by including statements of context-specific independence which can be encoded as a part of the model structures. For the purpose of learning context-specific model structures from data, we derive score functions, based on results from Bayesian statistics, by which the plausibility of a structure is assessed. To identify high-scoring structures, we construct stochastic and deterministic search algorithms designed to exploit the structural decomposition of our score functions. Numerical experiments on synthetic and real-world data show that the increased flexibility of context-specific structures can more accurately emulate the dependence structure among the variables and thereby improve the predictive accuracy of the models.
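The score functions mentioned above are derived from Bayesian statistics; the standard building block for such scores is the Dirichlet-multinomial marginal likelihood of one variable given its parent configurations. The sketch below shows that generic building block only; the thesis's context-specific scores additionally coarsen parent configurations into contexts, which this sketch does not do:

```python
import math
from collections import Counter

def family_log_score(child, parents, data, alpha=1.0):
    """Log marginal likelihood (Dirichlet-multinomial) of `child` given
    a parent set, the per-family term of score-based structure learning.
    `data` is a list of dicts mapping variable names to discrete values."""
    child_vals = sorted({row[child] for row in data})
    r = len(child_vals)
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    config_totals = Counter(tuple(row[p] for p in parents) for row in data)
    score = 0.0
    for cfg, n_cfg in config_totals.items():
        # Normalizing constant of the Dirichlet posterior for this configuration
        score += math.lgamma(r * alpha) - math.lgamma(r * alpha + n_cfg)
        for v in child_vals:
            score += math.lgamma(alpha + counts[(cfg, v)]) - math.lgamma(alpha)
    return score

data = [{"X": 0, "Y": 0}, {"X": 1, "Y": 1}, {"X": 0, "Y": 0}]
s = family_log_score("Y", ["X"], data)
```

Because the total score decomposes into one such term per family, search algorithms can re-score only the families a candidate move touches, which is the structural decomposition the search procedures exploit.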
Abstract:
320 p.
Abstract:
Logistic regression is a statistical tool widely used for predicting species' potential distributions from presence/absence data and a set of independent variables. However, logistic regression equations compute probability values based not only on the values of the predictor variables but also on the relative proportion of presences and absences in the dataset, which does not adequately describe the environmental favourability for or against species presence. A few strategies have been used to circumvent this, but they usually imply an alteration of the original data or the discarding of potentially valuable information. We propose a way to obtain from logistic regression an environmental favourability function whose results are not affected by an uneven proportion of presences and absences. We tested the method on the distribution of virtual species in an imaginary territory. The favourability models yielded similar values regardless of the variation in the presence/absence ratio. We also illustrate the method with the distribution of the Pyrenean desman (Galemys pyrenaicus) in Spain. The favourability model yielded more realistic potential distribution maps than the logistic regression model. Favourability values can be regarded as the degree of membership of the fuzzy set of sites whose environmental conditions are favourable to the species, which enables the rules of fuzzy logic to be applied to distribution modelling. They also allow direct comparisons between models for species with different presence/absence ratios in the study area. This makes them more useful for estimating the conservation value of areas, designing ecological corridors, or selecting appropriate areas for species reintroductions.
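A favourability function with the property described above exists in published form as F = (P / (1 - P)) / (n1/n0 + P / (1 - P)), where P is the logistic-regression probability and n1, n0 are the numbers of presences and absences in the training data; assuming this is the form intended, a minimal implementation is:

```python
def favourability(p, n1, n0):
    """Environmental favourability from a logistic regression probability p,
    correcting for the presence/absence ratio n1/n0 in the training data.
    Assumes the published form F = odds / (n1/n0 + odds)."""
    odds = p / (1.0 - p)
    return odds / (n1 / n0 + odds)

# At p equal to the sample prevalence, favourability is exactly 0.5,
# regardless of how unbalanced the dataset is.
f_balanced = favourability(0.5, 50, 50)    # prevalence 0.5
f_skewed = favourability(0.2, 20, 80)      # prevalence 0.2
```

This is what makes favourability values directly comparable across species with different presence/absence ratios: the prevalence-dependent baseline of the logistic output is divided out.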
Abstract:
Universidade Estadual de Campinas . Faculdade de Educação Física