936 results for Functions of Use
Abstract:
The extraordinary rise of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has brought great benefits to all areas of society. Alongside these benefits come new challenges for the protection and privacy of information and its content, such as identity theft and the loss of confidentiality and integrity of electronic documents and communications. This can be aggravated by the lack of a clear boundary separating the personal world from the professional world with regard to access to information. In all these areas of personal and professional activity, Cryptography has played a fundamental role by providing the tools needed to guarantee the confidentiality, integrity, and availability of both personal data and information in general. Biometrics, for its part, has proposed and offered different techniques to guarantee the authentication of individuals through the use of personal characteristics such as fingerprints, the iris, hand geometry, voice, gait, etc. Each of these two sciences, Cryptography and Biometrics, provides solutions to specific problems of data protection and user authentication, and both would be greatly strengthened if certain of their characteristics were combined toward common goals. It is therefore imperative to intensify research in these areas, combining the algorithms and mathematical primitives of Cryptography with Biometrics, to meet the growing demand for new solutions that are more technical, secure, and easy to use and that simultaneously strengthen data protection and user identification.
In this combination, the concept of cancelable biometrics has become a cornerstone of the user authentication and identification process, as it provides revocation and cancelation properties for biometric traits. The contribution of this thesis centers on the main aspect of Biometrics, namely the secure and efficient authentication of users through their biometric traits, using three distinct approaches: 1. The design of a fuzzy crypto-biometric scheme that implements the principles of cancelable biometrics to identify users while dealing with the problems arising from intra- and inter-user variability. 2. The design of a new Similarity Preserving Hash Function (SPHF). Such functions are currently used in digital forensics to search for similarities in the content of distinct but similar files, so as to determine the extent to which those files can be considered equal. The function defined in this research work, besides improving on the results of the main functions developed to date, aims to extend their use to the comparison of iris templates. 3. The development of a new iris template comparison mechanism that treats the templates as signals and then compares them using the Walsh-Hadamard transform. The results obtained are excellent in light of the security and privacy requirements mentioned above. Each of the three schemes has been implemented in order to run experiments and test its operational effectiveness in scenarios that simulate real situations: the fuzzy crypto-biometric scheme and the SPHF were implemented in Java, while the process based on the Walsh-Hadamard transform was implemented in Matlab.
The experiments used a database of iris images (CASIA) to simulate a population of system users. In the particular case of the SPHF, experiments were also carried out to verify its usefulness in digital forensics by comparing files and images with similar and with different content. For each of the schemes, the false negative and false positive rates were calculated.
ABSTRACT The extraordinary growth of new information technologies, the development of the Internet of Things, electronic commerce, social networks, mobile telephony, and cloud computing and storage has provided great benefits in all areas of society. Alongside these benefits, there are new challenges for the protection and privacy of information and its content, such as the loss of confidentiality and integrity of electronic documents and communications. This is exacerbated by the lack of a clear boundary between the personal world and the business world, whose differences are becoming ever narrower. In both worlds, i.e. the personal and the business one, Cryptography has played a key role by providing the tools necessary to ensure the confidentiality, integrity, and availability of both personal data and information. On the other hand, Biometrics has offered and proposed different techniques with the aim of authenticating individuals through their biometric traits, such as fingerprints, iris, hand geometry, voice, gait, etc. Each of these sciences, Cryptography and Biometrics, provides tools for specific problems of data protection and user authentication, which would be greatly strengthened if certain characteristics of both sciences were combined to achieve common objectives.
Therefore, it is imperative to intensify research in this area, combining the basic mathematical algorithms and primitives of Cryptography with Biometrics, to meet the growing demand for more secure and usable techniques that improve both data protection and user authentication. In this combination, the use of cancelable biometrics is a cornerstone of the user authentication and identification process, since it provides revocation and cancelation properties to biometric traits. The contributions of this thesis address the main aspect of Biometrics, i.e., the secure and efficient authentication of users through their biometric templates, from three different approaches. The first is the design of a fuzzy crypto-biometric scheme that uses cancelable biometric principles to take advantage of the fuzziness of biometric templates while dealing with intra- and inter-user variability, without compromising the templates extracted from legitimate users. The second is the design of a new Similarity Preserving Hash Function (SPHF); such functions are currently widely used in digital forensics to find similarities among different files and to quantify their similarity level. The function designed in this research work, besides improving on the results of the two main functions currently in use in this field, aims to extend their application to iris template comparison. Finally, the last approach is the development of a new mechanism for handling iris templates that treats them as signals and compares them using the Walsh-Hadamard transform (complemented with three other algorithms). The results obtained are excellent given the security and privacy requirements mentioned above.
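The core idea behind a similarity-preserving hash can be illustrated with a deliberately simplified toy in Python: digest the input one fixed-size window at a time, so that a local edit perturbs only a few digest symbols instead of the whole hash. This is only a sketch under assumed simplifications, not the thesis's SPHF; production tools such as ssdeep use far more robust constructions (rolling hashes and content-defined chunk boundaries).

```python
def sphf(data, window=16):
    """Toy similarity-preserving digest: one base-64 symbol per fixed-size
    window of bytes, so a local edit changes only a few symbols."""
    alphabet = ("ABCDEFGHIJKLMNOPQRSTUVWXYZ"
                "abcdefghijklmnopqrstuvwxyz0123456789+/")
    return "".join(alphabet[sum(data[i:i + window]) % 64]
                   for i in range(0, len(data), window))

def match_score(h1, h2):
    """Percentage of digest positions on which the two hashes agree."""
    if not h1 or not h2:
        return 0
    agree = sum(1 for a, b in zip(h1, h2) if a == b)
    return 100 * agree // max(len(h1), len(h2))
```

Flipping one byte of a 256-byte input changes a single window, so the digests still agree on the remaining 15 of 16 positions, whereas a cryptographic hash would change completely.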
Each of the three schemes has been implemented to test its operational efficacy in situations that simulate real scenarios: the fuzzy crypto-biometric scheme and the SPHF were implemented in Java, while the process based on the Walsh-Hadamard transform was implemented in Matlab. The experiments were performed using a database of iris templates (CASIA-IrisV2) to simulate a user population. The new SPHF is a special case: before being applied in the biometric field, it was also tested to determine its applicability in digital forensics by comparing similar and dissimilar files and images. The efficiency and effectiveness ratios for user authentication, i.e., the False Non-Match and False Match Rates, were calculated for each scheme with different parameters and cases to analyse their behaviour.
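The Walsh-Hadamard comparison of iris templates can be sketched as follows: map each binary template to a ±1 signal, apply the fast Walsh-Hadamard transform, and compare the resulting spectra. This is a minimal illustration, not the thesis's Matlab implementation (which combines the transform with three further algorithms the abstract does not specify); the cosine-similarity comparison is an assumption made for the sketch.

```python
import math

def fwht(a):
    """In-place fast Walsh-Hadamard transform; len(a) must be a power of two."""
    h, n = 1, len(a)
    while h < n:
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def spectrum(bits):
    """Map a binary template {0,1} to a ±1 signal and transform it."""
    return fwht([2 * b - 1 for b in bits])

def similarity(t1, t2):
    """Cosine similarity of the Walsh-Hadamard spectra of two equal-length
    templates: 1.0 for identical, -1.0 for complementary templates."""
    s1, s2 = spectrum(t1), spectrum(t2)
    dot = sum(x * y for x, y in zip(s1, s2))
    norm = math.sqrt(sum(x * x for x in s1)) * math.sqrt(sum(x * x for x in s2))
    return dot / norm
```

Because the transform is orthogonal up to scaling, comparing spectra this way is equivalent to comparing the ±1 signals themselves; the transform domain becomes useful once only selected coefficients are kept or masked.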
Abstract:
Recently, a new method to analyze biological nonstationary stochastic variables has been presented. The method is especially suitable to analyze the variation of one biological variable with respect to changes of another variable. Here, it is illustrated by the change of the pulmonary blood pressure in response to a step change of oxygen concentration in the gas that an animal breathes. The pressure signal is resolved into the sum of a set of oscillatory intrinsic mode functions, which have zero “local mean,” and a final nonoscillatory mode. With this device, we obtain a set of “mean trends,” each of which represents a “mean” in a definitive sense, and together they represent the mean trend systematically with different degrees of oscillatory content. Correspondingly, the oscillatory content of the signal about any mean trend can be represented by a set of partial sums of intrinsic mode functions. When the concept of “indicial response function” is used to describe the change of one variable in response to a step change of another variable, we now have a set of indicial response functions of the mean trends and another set of indicial response functions to describe the energy or intensity of oscillations about each mean trend. Each of these can be represented by an analytic function whose coefficients can be determined by a least-squares curve-fitting procedure. In this way, experimental results are stated sharply by analytic functions.
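The final step described above, representing an indicial response by an analytic function whose coefficients come from least-squares curve fitting, can be sketched in Python. The single-exponential form a·(1 − e^(−t/τ)) and the grid search over candidate time constants are illustrative assumptions, not the paper's actual model.

```python
import math

def fit_indicial(t, y, taus):
    """Least-squares fit of y(t) ≈ a * (1 - exp(-t / tau)).
    For each candidate tau the optimal amplitude a has a closed form,
    so a simple grid search over tau suffices."""
    best = None
    for tau in taus:
        g = [1.0 - math.exp(-ti / tau) for ti in t]
        a = sum(gi * yi for gi, yi in zip(g, y)) / sum(gi * gi for gi in g)
        err = sum((a * gi - yi) ** 2 for gi, yi in zip(g, y))
        if best is None or err < best[0]:
            best = (err, a, tau)
    return best[1], best[2]  # amplitude, time constant
```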
Abstract:
p53 is a multifunctional tumor suppressor protein involved in the negative control of cell growth. Mutations in p53 cause alterations in cellular phenotype, including immortalization, neoplastic transformation, and resistance to DNA-damaging drugs. To help dissect distinct functions of p53, a set of genetic suppressor elements (GSEs) capable of inducing different p53-related phenotypes in rodent embryo fibroblasts was isolated from a retroviral library of random rat p53 cDNA fragments. All the GSEs were 100-300 nucleotides long and were in the sense orientation. They fell into four classes, corresponding to the transactivator (class I), DNA-binding (class II), and C-terminal (class III) domains of the protein and the 3'-untranslated region of the mRNA (class IV). GSEs in all four classes promoted immortalization of primary cells, but only members of classes I and III cooperated with activated ras to transform cells, and only members of class III conferred resistance to etoposide and strongly inhibited transcriptional transactivation by p53. These observations suggest that processes related to control of senescence, response to DNA damage, and transformation involve different functions of the p53 protein and furthermore indicate a regulatory role for the 3'-untranslated region of p53 mRNA.
Abstract:
Purpose: To evaluate parent use of functional communication training (FCT) to replace and enhance prelinguistic behaviours in six young children with developmental and physical disabilities. Method: Initially, the communicative functions of the children's prelinguistic behaviours were assessed by parent interviews. Three communication functions were identified for each child and intervention goals to replace or enhance the child's existing prelinguistic behaviours were developed in consultation with parents. After a baseline phase, parents received training on implementation of FCT. Intervention was staggered across the three communicative functions in a multiple-probe design. Results: Intervention was associated with increases in the replacement communication behaviour. Treatment gains were generally maintained at the monthly follow-ups. Conclusion: The results suggest that parents can use FCT to enhance communication skills in children with developmental and physical disabilities.
Abstract:
Manufacturing firms are driven by competitive pressures to continually improve the effectiveness and efficiency of their organisations. For this reason, manufacturing engineers often implement changes to existing processes, or design new production facilities, with the expectation of making further gains in manufacturing system performance. This thesis relates to how the likely outcome of this type of decision should be predicted prior to its implementation. The thesis argues that since manufacturing systems must also interact with many other parts of an organisation, the expected performance improvements can often be significantly hampered by constraints that arise elsewhere in the business. As a result, decision-makers should attempt to predict just how well a proposed design will perform when these other factors, or 'support departments', are taken into consideration. However, the thesis also demonstrates that, in practice, where quantitative analysis is used to evaluate design decisions, the analysis model invariably ignores the potential impact of support functions on a system's overall performance. A more comprehensive modelling approach is therefore required. A study of how various business functions interact establishes that to properly represent the kind of delays that give rise to support department constraints, a model should actually portray the dynamic and stochastic behaviour of entities in both the manufacturing and non-manufacturing aspects of a business. This implies that computer simulation be used to model design decisions but current simulation software does not provide a sufficient range of functionality to enable the behaviour of all of these entities to be represented in this way. The main objective of the research has therefore been the development of a new simulator that will overcome limitations of existing software and so enable decision-makers to conduct a more holistic evaluation of design decisions. 
It is argued that the application of object-oriented techniques offers a potentially better way of fulfilling both the functional and ease-of-use issues relating to development of the new simulator. An object-oriented analysis and design of the system, called WBS/Office, is therefore presented, extending to the modelling of a firm's administrative and other support activities in the context of the manufacturing system design process. A particularly novel feature of the design is the ability for decision-makers to model how a firm's specific information and document processing requirements might hamper shop-floor performance. The simulator is primarily intended for modelling make-to-order batch manufacturing systems, and the thesis presents example models created using a working version of WBS/Office that demonstrate the feasibility of using the system to analyse manufacturing system designs in this way.
Abstract:
OBJECTIVES: The objective of this research was to design a clinical decision support system (CDSS) that supports heterogeneous clinical decision problems and runs on multiple computing platforms. Meeting this objective required a novel design to create an extendable and easy-to-maintain clinical CDSS for point-of-care support. The proposed solution was evaluated in a proof-of-concept implementation. METHODS: Based on our earlier research with the design of a mobile CDSS for emergency triage, we used ontology-driven design to represent essential components of a CDSS. Models of clinical decision problems were derived from the ontology and processed into executable applications at runtime. This allowed scaling applications' functionality to the capabilities of computing platforms. A prototype of the system was implemented using the extended client-server architecture and Web services to distribute the functions of the system and to make it operational in limited connectivity conditions. RESULTS: The proposed design provided a common framework that facilitated development of diversified clinical applications running seamlessly on a variety of computing platforms. It was prototyped for two clinical decision problems and settings (triage of acute pain in the emergency department and postoperative management of radical prostatectomy on the hospital ward) and implemented on two computing platforms: desktop and handheld computers. CONCLUSIONS: The requirement of CDSS heterogeneity was satisfied with ontology-driven design. Processing application models described with the help of ontological models made it possible to run a complex system on multiple computing platforms with different capabilities. Finally, separation of models and runtime components contributed to improved extensibility and maintainability of the system.
Abstract:
Few studies have evaluated the profile of use of disease modifying drugs (DMD) in Brazilian patients with spondyloarthritis (SpA). A common research protocol was applied prospectively in 1505 patients classified as SpA by criteria of the European Spondyloarthropathies Study Group (ESSG), followed at 29 referral centers in Rheumatology in Brazil. Demographic and clinical variables were obtained and evaluated, analyzing their correlation with the use of the DMDs methotrexate (MTX) and sulfasalazine (SSZ). At least one DMD was used by 73.6% of patients: MTX by 29.2% and SSZ by 21.7%, while 22.7% used both drugs. The use of MTX was significantly associated with peripheral involvement, SSZ was associated with axial involvement, and the two drugs were administered more often, separately or in combination, in mixed involvement (p < 0.001). The use of a DMD was significantly associated with Caucasian ethnicity (MTX, p = 0.014), inflammatory back pain (SSZ, p = 0.002), buttock pain (SSZ, p = 0.030), neck pain (MTX, p = 0.042), arthritis of the lower limbs (MTX, p < 0.001), arthritis of the upper limbs (MTX, p < 0.001), enthesitis (p = 0.007), dactylitis (MTX, p < 0.001), inflammatory bowel disease (SSZ, p < 0.001) and nail involvement (MTX, p < 0.001). The use of at least one DMD was reported by more than 70% of patients in a large cohort of Brazilian patients with SpA, with MTX use more associated with peripheral involvement and SSZ use more associated with axial involvement.
Abstract:
Hsp90 is a molecular chaperone essential for cell viability in eukaryotes that is associated with the maturation of proteins involved in important cell functions and implicated in the stabilization of the tumor phenotype of various cancers, making this chaperone a notably interesting therapeutic target. Celastrol is a plant-derived pentacyclic triterpenoid compound with potent antioxidant, anti-inflammatory and anticancer activities; however, celastrol's action mode is still elusive. In this work, we investigated the effect of celastrol on the conformational and functional aspects of Hsp90α. Interestingly, celastrol appeared to target Hsp90α directly as the compound induced the oligomerization of the chaperone via the C-terminal domain as demonstrated by experiments using a deletion mutant. The nature of the oligomers was investigated by biophysical tools demonstrating that a two-fold excess of celastrol induced the formation of a decameric Hsp90α bound throughout the C-terminal domain. When bound, celastrol destabilized the C-terminal domain. Surprisingly, standard chaperone functional investigations demonstrated that neither the in vitro chaperone activity of protecting against aggregation nor the ability to bind a TPR co-chaperone, which binds to the C-terminus of Hsp90α, were affected by celastrol. Celastrol interferes with specific biological functions of Hsp90α. Our results suggest a model in which celastrol binds directly to the C-terminal domain of Hsp90α causing oligomerization. However, the ability to protect against protein aggregation (supported by our results) and to bind to TPR co-chaperones are not affected by celastrol. Therefore celastrol may act primarily by inducing specific oligomerization that affects some, but not all, of the functions of Hsp90α. 
To the best of our knowledge, this study is the first work to use multiple probes to investigate the effect that celastrol has on the stability and oligomerization of Hsp90α and on the binding of this chaperone to Tom70. This work provides a novel mechanism by which celastrol binds Hsp90α.
Abstract:
The objective of this study was to verify factors associated with the use of medication by adults, with emphasis on the differences between men and women. It was a population-based, cross-sectional study with cluster sampling conducted in two stages in Campinas in the state of São Paulo in 2008. Among the 2,413 individuals aged 20 or older, the prevalence of use of at least one drug in the three days before the survey was 45.4% (95% CI: 41.3 - 49.4) in men and 64.6% (95% CI: 59.8 - 69.2) in women. Adult men over 40 years old who were not working, were former smokers, had one or more chronic diseases, had two or more health problems, and had sought health care or a health professional in the two weeks preceding the survey showed a higher prevalence of medication use. Among women, a higher prevalence of use was observed in those over 40 who were obese, were former smokers, reported a short sleep pattern, had one or more chronic diseases and two or more health problems, and reported seeking a health care service or professional in the past 15 days. The findings showed some differences in the determinants of drug use in relation to gender, revealing the greater importance of health-related behavior among women.
Abstract:
There is an urgent need to make drug discovery cheaper and faster. This will enable the development of treatments for diseases currently neglected for economic reasons, such as tropical and orphan diseases, and generally increase the supply of new drugs. Here, we report the Robot Scientist 'Eve' designed to make drug discovery more economical. A Robot Scientist is a laboratory automation system that uses artificial intelligence (AI) techniques to discover scientific knowledge through cycles of experimentation. Eve integrates and automates library-screening, hit-confirmation, and lead generation through cycles of quantitative structure activity relationship learning and testing. Using econometric modelling we demonstrate that the use of AI to select compounds economically outperforms standard drug screening. For further efficiency Eve uses a standardized form of assay to compute Boolean functions of compound properties. These assays can be quickly and cheaply engineered using synthetic biology, enabling more targets to be assayed for a given budget. Eve has repositioned several drugs against specific targets in parasites that cause tropical diseases. One validated discovery is that the anti-cancer compound TNP-470 is a potent inhibitor of dihydrofolate reductase from the malaria-causing parasite Plasmodium vivax.
Abstract:
With the increase in life expectancy, biomaterials have become an increasingly important focus of research because they are used to replace parts and functions of the human body, thus contributing to improved quality of life. In the development of new biomaterials, the Ti-15Mo alloy is particularly significant. In this study, the Ti-15Mo alloy was produced using an arc-melting furnace and then characterized by density, X-ray diffraction, optical microscopy, hardness and dynamic elasticity modulus measurements, and cytotoxicity tests. The microstructure was obtained with β predominance. Microhardness, elasticity modulus, and cytotoxicity testing results showed that this material has great potential for use as biomaterial, mainly in orthopedic applications.
Abstract:
We study the evolution of dense clumps and provide an argument that the existence of the clumps is not limited by their crossing times. We claim that the lifetimes of the clumps are determined by turbulent motions on a larger scale, and we predict the correlation of clump lifetime with column density. We use numerical simulations to successfully test this relation. In addition, we study the morphological asymmetry and the magnetization of the clumps as functions of their masses.
Abstract:
The properties of recycled aggregate produced from mixed (masonry and concrete) construction and demolition (C&D) waste are highly variable, and this restricts the use of such aggregate in structural concrete production. The development of classification techniques capable of reducing this variability is instrumental for quality control purposes and the production of high quality C&D aggregate. This paper investigates how the classification of C&D mixed coarse aggregate according to porosity influences the mechanical performance of concrete. Concretes using a variety of C&D aggregate porosity classes and different water/cement ratios were produced and the mechanical properties measured. For concretes produced with constant volume fractions of water, cement, natural sand and coarse aggregate from recycled mixed C&D waste, the compressive strength and Young's modulus are direct exponential functions of the aggregate porosity. The sink-and-float technique is a simple laboratory density separation tool that facilitates the separation of cement particles of lower porosity, a difficult task when done only by visual sorting. For this experiment, separation using a 2.2 kg/dm³ suspension produced recycled aggregate (porosity less than 17%), which yielded good performance in concrete production. Industrial gravity separators may lead to the production of high quality recycled aggregate from mixed C&D waste for structural concrete applications.
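The exponential strength-porosity relation reported above can be illustrated with a log-linear least-squares fit. The model form f_c = f0·e^(−k·p) and the sample values below are assumptions for illustration, not the paper's data.

```python
import math

def fit_exponential(porosity, strength):
    """Fit strength = f0 * exp(-k * porosity) by ordinary least squares
    on the log-linear form ln(strength) = ln(f0) - k * porosity."""
    n = len(porosity)
    ly = [math.log(s) for s in strength]
    mx = sum(porosity) / n
    my = sum(ly) / n
    sxx = sum((x - mx) ** 2 for x in porosity)
    sxy = sum((x - mx) * (y - my) for x, y in zip(porosity, ly))
    k = -sxy / sxx               # decay rate with porosity
    f0 = math.exp(my + k * mx)   # strength extrapolated to zero porosity
    return f0, k
```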
Abstract:
We present models for the optical functions of 11 metals used as mirrors and contacts in optoelectronic and optical devices: noble metals (Ag, Au, Cu), aluminum, beryllium, and transition metals (Cr, Ni, Pd, Pt, Ti, W). We used two simple phenomenological models, the Lorentz-Drude (LD) and the Brendel-Bormann (BB), to interpret both the free-electron and the interband parts of the dielectric response of metals in a wide spectral range from 0.1 to 6 eV. Our results show that the BB model was needed to describe appropriately the interband absorption in noble metals, while for Al, Be, and the transition metals both models exhibit good agreement with the experimental data. A comparison with measurements on surface normal structures confirmed that the reflectance and the phase change on reflection from semiconductor-metal interfaces (including the case of metallic multilayers) can be accurately described by use of the proposed models for the optical functions of metallic films and the matrix method for multilayer calculations. (C) 1998 Optical Society of America.
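The Lorentz-Drude model named above has a compact closed form that is easy to evaluate numerically: a free-electron Drude term plus a sum of interband Lorentz oscillators. The sketch below assumes that form with placeholder parameter values; the fitted oscillator strengths, resonance energies, and damping constants for the 11 metals are given in the paper's tables, not here.

```python
def lorentz_drude(omega, omega_p, f0, gamma0, oscillators):
    """Relative permittivity eps(omega) in the Lorentz-Drude model.
    omega_p is the plasma frequency, f0 and gamma0 the free-electron
    (Drude) oscillator strength and damping, and oscillators a list of
    interband terms (f_j, omega_j, gamma_j); all frequencies share one
    unit system (e.g. eV)."""
    eps = 1 - f0 * omega_p ** 2 / (omega * (omega + 1j * gamma0))
    for f_j, w_j, g_j in oscillators:
        eps += f_j * omega_p ** 2 / (w_j ** 2 - omega ** 2 - 1j * g_j * omega)
    return eps
```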
Abstract:
The integral of the Wigner function of a quantum-mechanical system over a region or its boundary in the classical phase plane is called a quasiprobability integral. Unlike a true probability integral, its value may lie outside the interval [0, 1]. It is characterized by a corresponding self-adjoint operator, to be called a region or contour operator as appropriate, which is determined by the characteristic function of that region or contour. The spectral problem is studied for commuting families of region and contour operators associated with concentric discs and circles of given radius a. Their respective eigenvalues are determined as functions of a, in terms of the Gauss-Laguerre polynomials. These polynomials provide a basis of vectors in a Hilbert space carrying the positive discrete series representation of the algebra su(1, 1) ≅ so(2, 1). The explicit relation between the spectra of operators associated with discs and circles with proportional radii is given in terms of the discrete variable Meixner polynomials.
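The Laguerre polynomials in which the disc and circle eigenvalues above are expressed satisfy a standard three-term recurrence, which the sketch below uses to evaluate L_n(x); how the paper assembles these values into operator spectra is not reproduced here.

```python
def laguerre(n, x):
    """Evaluate the Laguerre polynomial L_n(x) via the recurrence
    (k + 1) L_{k+1}(x) = (2k + 1 - x) L_k(x) - k L_{k-1}(x)."""
    if n == 0:
        return 1.0
    lm, l = 1.0, 1.0 - x  # L_0 and L_1
    for k in range(1, n):
        lm, l = l, ((2 * k + 1 - x) * l - k * lm) / (k + 1)
    return l
```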