716 results for Mathematical computing
Abstract:
One of the major problems when using non-dedicated volunteer resources in a distributed network is the high volatility of these hosts, since they can go offline or become unavailable at any time without control. Furthermore, the use of volunteer resources raises security issues because they are generally anonymous entities about which we know nothing. So, how can we trust someone we do not know? Over the last years, a considerable number of reputation-based trust solutions have been designed to evaluate the participants' behavior in a system. However, most of these solutions are addressed to P2P and ad-hoc mobile networks and may not fit well with other kinds of distributed systems that could take advantage of volunteer resources, such as recent cloud computing infrastructures. In this paper we propose a first approach to the design of an anonymous reputation mechanism for CoDeS [1], a middleware for building fogs where services are deployed using volunteer resources. The participants are reputation clients (RC), a reputation authority (RA) and a certification authority (CA). Users need a valid public key certificate from the CA to register with the RA and obtain the data needed to participate in the system, namely an opaque identifier that we call here a pseudonym and an initial reputation value that users provide to other users when interacting with them. The mechanism prevents not only the manipulation of the provided reputation values but also any disclosure of the users' identities to other users or authorities, so anonymity is guaranteed.
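A minimal sketch of the registration flow described above, in Python; the class and field names (ReputationAuthority, initial_reputation) and the pseudonym construction are illustrative assumptions, not taken from CoDeS or the paper:

```python
import hashlib
import secrets

class ReputationAuthority:
    """Toy RA: issues pseudonyms and an initial reputation to certified users."""

    def __init__(self, initial_reputation=0.5):
        self.initial_reputation = initial_reputation
        self.registry = {}  # pseudonym -> current reputation value

    def register(self, ca_certificate: bytes):
        # A real RA would verify the CA signature here; this sketch only
        # checks that a certificate was supplied at all.
        if not ca_certificate:
            raise ValueError("a valid CA certificate is required")
        # Opaque pseudonym: a random salt makes it unlinkable to the certificate.
        salt = secrets.token_bytes(16)
        pseudonym = hashlib.sha256(ca_certificate + salt).hexdigest()
        self.registry[pseudonym] = self.initial_reputation
        return pseudonym, self.initial_reputation

ra = ReputationAuthority()
print(ra.register(b"-----BEGIN CERTIFICATE-----..."))
```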
Abstract:
In this work, we have developed the first free software for mobile devices running the Android operating system that can preventively reduce the number of contagions of sexually transmitted infections (STIs) associated with risk behavior. The software runs in two modes: the normal mode allows the user to see the alerts and nearby health centers, while the second mode enables the service to work in the background. The software reports the health risks as well as the location of the different test centers.
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small-sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: (i) a linear transformation of the PLS- or PCA-reduced data, (ii) a feature reduction technique, and (iii) a classifier (with Euclidean, Mahalanobis or energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes this rate more stable. Another advance is their generalisation ability, since the experiments were performed on two image modalities (SPECT and PET).
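A hedged sketch of the core pipeline (supervised PLS reduction followed by a linear SVM, evaluated with k-fold cross-validation) using scikit-learn on synthetic stand-in data; the LMNN step and the real NMSE features are omitted, and all sizes and parameters are illustrative:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 500))    # stand-in for per-scan NMSE feature vectors
y = rng.integers(0, 2, size=80)   # 0 = normal, 1 = AD (synthetic labels)

accuracies = []
for train, test in StratifiedKFold(n_splits=5, shuffle=True, random_state=0).split(X, y):
    pls = PLSRegression(n_components=10)       # supervised dimensionality reduction
    pls.fit(X[train], y[train])
    clf = SVC(kernel="linear")                 # linear kernel SVM on the PLS scores
    clf.fit(pls.transform(X[train]), y[train])
    accuracies.append(accuracy_score(y[test], clf.predict(pls.transform(X[test]))))

print(f"mean cross-validated accuracy: {np.mean(accuracies):.3f}")
```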
Abstract:
Despite their limited proliferation capacity, regulatory T cells (T(regs)) constitute a population maintained over the entire lifetime of a human organism. The means by which T(regs) sustain a stable pool in vivo are controversial. Using a mathematical model, we address this issue by evaluating several biological scenarios of the origins and the proliferation capacity of two subsets of T(regs): precursor CD4(+)CD25(+)CD45RO(-) and mature CD4(+)CD25(+)CD45RO(+) cells. The lifelong dynamics of T(regs) are described by a set of ordinary differential equations, driven by a stochastic process representing the major immune reactions involving these cells. The model dynamics are validated using data from human donors of different ages. Analysis of the data led to the identification of two properties of the dynamics: (1) the equilibrium in the CD4(+)CD25(+)FoxP3(+) T(regs) population is maintained over both the precursor and mature T(regs) pools together, and (2) the ratio between precursor and mature T(regs) is inverted in the early years of adulthood. Then, using the model, we identified three biologically relevant scenarios that have the above properties: (1) the unique source of mature T(regs) is the antigen-driven differentiation of precursors that acquire the mature profile in the periphery, and the proliferation of T(regs) is essential for the development and maintenance of the pool; or there exist other sources of mature T(regs), such as (2) a homeostatic density-dependent regulation or (3) thymus- or effector-derived T(regs), and in both of these cases antigen-induced proliferation is not necessary for the development of a stable pool of T(regs). This is the first time that a mathematical model built to describe the in vivo dynamics of regulatory T cells has been validated using human data. The application of this model provides an invaluable tool for estimating the number of regulatory T cells as a function of time in the blood of patients who have received a solid organ transplant or are suffering from an autoimmune disease.
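A minimal sketch, in Python, of the kind of two-compartment ODE system described above (precursor and mature T(regs) pools); the rate constants, the logistic homeostatic term and the initial conditions are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rate constants (per year); not the values estimated in the paper.
s     = 1.0    # thymic output of precursor Tregs
mu_p  = 0.2    # death rate of precursors
delta = 0.1    # antigen-driven differentiation precursor -> mature
rho   = 0.15   # peripheral proliferation of mature Tregs
mu_m  = 0.25   # death rate of mature Tregs
K     = 50.0   # homeostatic carrying capacity of the mature pool

def treg_dynamics(t, y):
    P, M = y                                       # precursor and mature pools
    dP = s - (mu_p + delta) * P
    dM = delta * P + rho * M * (1 - M / K) - mu_m * M
    return [dP, dM]

sol = solve_ivp(treg_dynamics, (0, 80), [5.0, 1.0], dense_output=True)
ages = np.linspace(0, 80, 9)
print(sol.sol(ages).T)                             # pool sizes at sampled ages
```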
Abstract:
A fully automated 3D image analysis method is proposed to segment lung nodules in HRCT. A specific gray-level mathematical morphology operator, the SMDC-connection cost, acting in the 3D space of the thorax volume, is defined in order to discriminate lung nodules from other dense (vascular) structures. Applied to clinical data concerning patients with pulmonary carcinoma, the proposed method detects isolated, juxtavascular and peripheral nodules with sizes ranging from 2 to 20 mm in diameter. The segmentation accuracy was objectively evaluated on real and simulated nodules. The method showed a sensitivity and a specificity ranging from 85% to 97% and from 90% to 98%, respectively.
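For illustration only, a generic morphological filtering step in Python with scikit-image that keeps compact dense blobs and suppresses thin vascular branches; this is a plain opening with a ball structuring element, not the SMDC-connection cost operator defined in the paper, and the threshold and radius are arbitrary:

```python
import numpy as np
from skimage.morphology import ball, binary_opening
from skimage.measure import label, regionprops

volume = np.random.rand(64, 64, 64)        # stand-in for a HRCT sub-volume
dense = volume > 0.95                      # crude density threshold

# Opening with a small ball keeps compact blobs and erodes thin, elongated vessels.
candidates = binary_opening(dense, ball(2))

for region in regionprops(label(candidates)):
    print(region.label, region.area, region.centroid)
```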
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
In the first part of this research, three stages were defined for a program to increase the information extracted from ink evidence and maximise its usefulness to the criminal and civil justice system. These stages are (a) to develop a standard methodology for analysing ink samples by high-performance thin-layer chromatography (HPTLC) in a reproducible way, even when ink samples are analysed at different times, in different locations and by different examiners; (b) to compare ink samples automatically and objectively; and (c) to define and evaluate a theoretical framework for the use of ink evidence in a forensic context. This report focuses on the second of the three stages. Using the calibration and acquisition process described in the previous report, mathematical algorithms are proposed to compare ink samples automatically and objectively. The performance of these algorithms is systematically studied for various chemical and forensic conditions using standard performance tests commonly used in biometrics studies. The results show that different algorithms are best suited for different tasks. Finally, this report demonstrates how modern analytical and computer technology can be used in the field of ink examination, and how tools developed and successfully applied in other fields of forensic science can help maximise its impact within the field of questioned documents.
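As a hedged illustration of what such a comparison algorithm might look like, the Python sketch below scores two hypothetical 1-D HPTLC densitometric traces with a normalised correlation; the data, the threshold and the decision rule are assumptions, not the algorithms evaluated in the report:

```python
import numpy as np

def normalised_correlation(profile_a, profile_b):
    """Pearson correlation between two densitometric traces of equal length."""
    a = (profile_a - profile_a.mean()) / profile_a.std()
    b = (profile_b - profile_b.mean()) / profile_b.std()
    return float(np.mean(a * b))

# Two hypothetical intensity traces sampled along the migration axis.
rng = np.random.default_rng(1)
ink_questioned = rng.random(300)                           # questioned ink
ink_known = ink_questioned + rng.normal(0, 0.05, 300)      # known ink + noise

score = normalised_correlation(ink_questioned, ink_known)
same_source = score > 0.95        # threshold to be set from a reference collection
print(score, same_source)
```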
Abstract:
Implementation and evaluation of a hybrid algorithm that selects the lowest-cost set of nodes that allows a service to be deployed, with a given availability, in a volunteer computing environment.
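The abstract does not specify the hybrid algorithm itself; as a hedged baseline, the Python sketch below greedily adds the cheapest candidate nodes until a replicated-service availability target is met, with all node costs and availabilities invented for illustration:

```python
# Candidate nodes: (name, cost, availability); values are illustrative only.
nodes = [("n1", 4.0, 0.90), ("n2", 2.5, 0.75), ("n3", 3.0, 0.80), ("n4", 5.0, 0.95)]
target = 0.999                                   # required service availability

def replica_availability(avails):
    """Probability that at least one replica is up, assuming independence."""
    down = 1.0
    for a in avails:
        down *= (1.0 - a)
    return 1.0 - down

selected, avails = [], []
# Greedy: keep adding the cheapest remaining node until the target is reached.
for name, cost, avail in sorted(nodes, key=lambda n: n[1]):
    if replica_availability(avails) >= target:
        break
    selected.append(name)
    avails.append(avail)

print(selected, replica_availability(avails))
```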
Abstract:
The purpose of this final degree project is to research and carry out a from-scratch installation of a business model based on web hosting using Cloud Computing technologies. The open-source software used for this purpose is Openstack, which follows an Infrastructure as a Service (IaaS) model. Our goal is to implement the IaaS model based on Openstack. To carry out this deployment, two hypervisors (KVM and VMware ESXi) are used in order to test different hypervisor systems working together.
Abstract:
An analytic method to evaluate nuclear contributions to the electrical properties of polyatomic molecules is presented. Such contributions control the changes induced by an electric field on the equilibrium geometry (nuclear relaxation contribution) and on the vibrational motion (vibrational contribution) of a molecular system. Expressions to compute the nuclear contributions have been derived from a power series expansion of the potential energy. These contributions to the electrical properties are given in terms of energy derivatives with respect to normal coordinates, electric field intensity or both. Only one calculation of such derivatives at the field-free equilibrium geometry is required. To illustrate the efficiency of the analytical evaluation of electrical properties (the so-called AEEP method), results of calculations on water and pyridine at the SCF/TZ2P and MP2/TZ2P levels of theory are reported. The results obtained are compared with previous theoretical calculations and with experimental values.
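For context, a textbook double-harmonic expression for one such nuclear (vibrational) contribution, the static vibrational polarizability written in terms of dipole derivatives along normal coordinates $Q_i$ with harmonic frequencies $\omega_i$; this is a standard illustration and not necessarily the exact working equations of the AEEP method:

```latex
\alpha^{\mathrm{vib}}_{\alpha\beta}
  \;=\; \sum_{i}\frac{1}{\omega_i^{2}}
  \left(\frac{\partial \mu_{\alpha}}{\partial Q_i}\right)_{0}
  \left(\frac{\partial \mu_{\beta}}{\partial Q_i}\right)_{0}
```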
Abstract:
This document introduces small and medium-sized enterprises to the world of virtualization and cloud computing. Starting from a presentation of both technologies, it walks through the different phases of a technology project consisting of the installation of a virtualized platform that hosts the basic IT systems of an SME.
Abstract:
Different procedures to obtain atom-condensed Fukui functions are described. It is shown how the resulting values may differ depending on the exact approach to atom-condensed Fukui functions. The condensed Fukui function can be computed using either the fragment-of-molecular-response approach or the response-of-molecular-fragment approach. The two approaches are nonequivalent; only the latter corresponds in general with a population difference expression. The Mulliken approach does not depend on the approach taken but has some computational drawbacks. The different resulting expressions are tested for a wide set of molecules. In practice one must make seemingly arbitrary choices about how to compute condensed Fukui functions, which suggests questioning the role of these indicators in conceptual density functional theory.
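For reference, the population-difference expressions usually associated with the response-of-molecular-fragment approach, where $p_A(N)$ denotes the electron population of atom $A$ in the $N$-electron system (standard finite-difference forms, shown here for illustration):

```latex
f_A^{+} = p_A(N+1) - p_A(N), \qquad
f_A^{-} = p_A(N) - p_A(N-1), \qquad
f_A^{0} = \tfrac{1}{2}\,\bigl(p_A(N+1) - p_A(N-1)\bigr)
```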
Abstract:
The alignment between competences, teaching-learning methodologies and assessment is a key element of the European Higher Education Area. This paper presents the efforts carried out by six Telematics, Computer Science and Electronic Engineering Education teachers towards achieving this alignment in their subjects. In joint work with pedagogues, a set of recommended actions was identified. A selection of these actions was applied and evaluated in the six subjects. The cross-analysis of the results indicates that the actions allow students to better understand the methodologies and assessment planned for the subjects, facilitate (self-) regulation and increase students’ involvement in the subjects.
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system’s architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system’s ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs that shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There still remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first one is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, e.g., vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose here an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of the variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
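As a minimal illustration of the piecewise affine representation, the Python sketch below fits a least-squares plane z = a·x + b·y + c to elevation samples from one hypothetical segmented region; the data are synthetic and the perceptual-grouping segmentation itself is not shown:

```python
import numpy as np

def fit_affine_patch(points):
    """Least-squares fit of z = a*x + b*y + c to (x, y, z) samples of one region."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs                                  # (a, b, c)

# Hypothetical elevation samples from one segmented roof region.
rng = np.random.default_rng(2)
xy = rng.uniform(0, 10, size=(200, 2))
z = 0.3 * xy[:, 0] - 0.1 * xy[:, 1] + 25.0 + rng.normal(0, 0.05, 200)
print(fit_affine_patch(np.column_stack([xy, z])))
```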