961 results for: laboratorio remoto, robotica mobile, web applications, model-driven software architecture
Abstract:
Intelligent systems are now pervasive in society, supporting synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and poor software behavior can undo any improvement effort. Moreover, data-driven machine learning algorithms underpin human-centered applications, and their interpretability is one of the most important features of computational systems. Software maintenance is a critical discipline for supporting automatic, life-long system operation. Since most software registers its internal events in logs, log analysis is a natural approach to keeping systems operational. Logs are Big Data assembled in high-volume streams, and are unstructured, heterogeneous, imprecise, and uncertain. This thesis develops fuzzy and neuro-granular methods that provide maintenance solutions for anomaly detection (AD) and log parsing (LP), handling data uncertainty and identifying ideal time periods for detailed software analysis. LP provides deeper semantic interpretation of anomalous occurrences. The solutions evolve over time and are general-purpose, making them highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed for the automatic parsing of system logs. All the methods use recursive mechanisms to create, update, merge, and delete information granules according to the data behavior, as sketched below. For the first time in the evolving intelligent systems literature, the proposed eLP method is able to process streams of words and sentences. In terms of AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP also generates a log grammar and offers a higher level of model interpretability.
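The granule life cycle described above (create a granule when no existing one is compatible, otherwise update the winner) can be pictured with a minimal Python sketch. The Gaussian-style membership, the spread update, and the threshold `rho` are simplified illustrations of the general idea, not the exact FBeM, eGNN, or eGFC formulations.

```python
import numpy as np

class Granule:
    """An information granule with a recursively updated center and spread."""
    def __init__(self, x, label):
        self.center = x.astype(float)
        self.spread = np.full(x.shape, 0.1)  # initial per-feature spread (illustrative)
        self.label = label
        self.n = 1

    def membership(self, x):
        # Gaussian-style compatibility between a sample and this granule
        return np.exp(-0.5 * np.sum(((x - self.center) / self.spread) ** 2))

    def update(self, x):
        # Recursive (incremental) update of center and spread
        self.n += 1
        delta = x - self.center
        self.center += delta / self.n
        self.spread = np.maximum(0.05, 0.9 * self.spread + 0.1 * np.abs(delta))

class EvolvingClassifier:
    """Creates a granule when no same-label granule is compatible; else updates it."""
    def __init__(self, rho=0.3):
        self.granules = []
        self.rho = rho  # compatibility threshold (assumed value)

    def learn(self, x, label):
        pred = self.predict(x)  # prequential: predict first, then adapt
        winner = max((g for g in self.granules if g.label == label),
                     key=lambda g: g.membership(x), default=None)
        if winner is not None and winner.membership(x) >= self.rho:
            winner.update(x)
        else:
            self.granules.append(Granule(x, label))  # create a new granule
        return pred

    def predict(self, x):
        if not self.granules:
            return None
        return max(self.granules, key=lambda g: g.membership(x)).label
```

A full evolving method would also merge overlapping granules and delete stale ones, as the abstract notes; those mechanisms are omitted here for brevity.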
Abstract:
Nuclear cross sections are the pillars on which the transport simulation of particles and radiation is built. Since the nuclear data library production chain is extremely complex and made of several steps, stringent verification and validation (V&V) procedures must be applied to it. The work presented here focuses on the development of a new Python-based software tool called JADE, whose objective is to significantly increase the automation and standardization of these procedures, reducing the time between new library releases while increasing their quality. After an introduction to nuclear fusion (the field where most of the V&V effort has been concentrated so far) and to the simulation of particle and radiation transport, the motivations behind JADE's development are discussed. Subsequently, the code's general architecture and the implemented benchmarks (both experimental and computational) are described. The results from the major applications of JADE during the research years are then presented. Finally, after a discussion of the objectives JADE has reached, the possible short-, mid-, and long-term developments of the project are outlined.
Abstract:
Imaging technologies are widely used in fields such as the natural sciences, engineering, medicine, and the life sciences. A broad class of imaging problems reduces to solving ill-posed inverse problems (IPs). Traditional strategies for solving these ill-posed IPs rely on variational regularization methods, which are based on the minimization of suitable energies and make use of knowledge about the image formation model (the forward operator) and prior knowledge about the solution, but do not incorporate knowledge directly from data. The more recent learned approaches, on the other hand, can easily learn the intricate statistics of images from a large data set, but lack a systematic way to incorporate prior knowledge about the image formation model. The main purpose of this thesis is to discuss data-driven image reconstruction methods that combine the benefits of these two reconstruction strategies for the solution of highly nonlinear ill-posed inverse problems. Mathematical formulations and numerical approaches for imaging IPs, covering linear as well as strongly nonlinear problems, are described. More specifically, we address the Electrical Impedance Tomography (EIT) reconstruction problem by unrolling the regularized Gauss-Newton method and integrating regularization learned by a data-adaptive neural network (see the sketch below). Furthermore, we investigate the solution of nonlinear ill-posed IPs by introducing a deep-PnP framework that integrates a graph convolutional denoiser into the proximal Gauss-Newton method, with a practical application to EIT, a recently introduced and promising imaging technique. Efficient algorithms are then applied to the limited-electrode problem in EIT, combining compressive sensing techniques and deep learning strategies. Finally, a transformer-based neural network architecture is adapted to restore noisy solutions of the Computed Tomography problem recovered with the filtered back-projection method.
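To make the unrolling idea concrete, the following NumPy sketch runs a fixed number of regularized Gauss-Newton iterations on a toy nonlinear forward operator. In the learned variant described above, the fixed Tikhonov term `lam` would be replaced by the output of a data-adaptive neural network; the forward map, its Jacobian, and all sizes here are illustrative stand-ins.

```python
import numpy as np

def gauss_newton_step(x, y, forward, jacobian, lam=1e-2):
    """One regularized Gauss-Newton update for min ||F(x)-y||^2 + lam*||x||^2.

    In the unrolled, learned scheme, lam*I would be replaced by a trained,
    data-adaptive regularizer; a fixed Tikhonov term is used here instead.
    """
    r = forward(x) - y                      # residual
    J = jacobian(x)                         # Jacobian of the forward operator
    H = J.T @ J + lam * np.eye(x.size)      # Gauss-Newton Hessian approximation
    g = J.T @ r + lam * x                   # gradient of the regularized objective
    return x - np.linalg.solve(H, g)

# Toy nonlinear forward operator F(x) = A @ (x + 0.1*x^3), illustrative only
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 4))
forward = lambda x: A @ (x + 0.1 * x**3)
jacobian = lambda x: A @ np.diag(1.0 + 0.3 * x**2)

x_true = rng.standard_normal(4)
y = forward(x_true)
x = np.zeros(4)
for _ in range(20):                         # a fixed number of unrolled iterations
    x = gauss_newton_step(x, y, forward, jacobian)
print(np.round(x - x_true, 3))              # small residual bias from lam remains
```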
Abstract:
Modern High-Performance Computing (HPC) systems are steadily growing in size and complexity due to the corresponding demand for larger simulations with more complicated tasks and higher accuracy. However, as a side effect of Dennard scaling approaching its ultimate power limit, software efficiency also plays an important role in the overall performance of a computation. Tools that measure application performance in these increasingly complex environments provide insight into the intricate ways in which software and hardware interact. Monitoring power consumption in order to save energy is possible through processor interfaces such as Intel's Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool such as the Performance Application Programming Interface (PAPI). Since problems in many heterogeneous fields can be represented as large linear systems, an optimized and scalable linear solver can significantly decrease the time to solution. One of the most widely used algorithms for such large problems is Gaussian Elimination, whose most popular HPC implementation is found in the Scalable Linear Algebra PACKage (ScaLAPACK) library. Another relevant algorithm, gaining popularity in the academic field, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of ScaLAPACK's Gaussian Elimination, profiling their execution during the resolution of linear systems on the HPC architecture offered by CINECA (a minimal stand-in measurement sketch follows). It also collates the energy and power values for different rank, node, and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, integrated with the parallel execution of the algorithms managed via the Message Passing Interface (MPI).
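The thesis couples RAPL with PAPI from compiled MPI code; as an accessible stand-in for that measurement loop, the sketch below reads the Linux powercap interface directly around a dense solve. The sysfs path, the elevated permissions it typically requires, and the presence of the intel-rapl driver are machine-dependent assumptions, and counter wrap-around is ignored.

```python
import time
import numpy as np

RAPL = "/sys/class/powercap/intel-rapl:0"   # CPU package 0; path varies by system

def read_energy_uj():
    # Cumulative package energy in microjoules; wraps at max_energy_range_uj
    with open(f"{RAPL}/energy_uj") as f:
        return int(f.read())

def energy_of(fn, *args):
    """Return (result, joules, seconds) for one call, ignoring counter wrap."""
    e0, t0 = read_energy_uj(), time.perf_counter()
    result = fn(*args)
    e1, t1 = read_energy_uj(), time.perf_counter()
    return result, (e1 - e0) / 1e6, t1 - t0

# Energy profile of a dense linear solve, standing in for one solver kernel
n = 2000
A = np.random.rand(n, n) + n * np.eye(n)    # diagonally dominant test system
b = np.random.rand(n)
x, joules, secs = energy_of(np.linalg.solve, A, b)
print(f"solve: {joules:.2f} J in {secs:.2f} s (avg {joules/secs:.1f} W)")
```

In an MPI run, each rank would perform this read on its own node (one reader per socket to avoid double counting) and the per-rank energies would be reduced to a total, which is the shape of the rank/node/socket comparison the thesis reports.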
Abstract:
The idea of Grid Computing originated in the nineties and found concrete application in contexts like the SETI@home project, where many computers (offered by volunteers) cooperated within the Grid environment, performing distributed computations that analyzed radio signals in search of extraterrestrial life. The Grid was composed of traditional personal computers, but with the emergence of the first mobile devices, such as Personal Digital Assistants (PDAs), researchers began theorizing the inclusion of mobile devices in Grid Computing; although impressive theoretical work was done, the idea was discarded due to the (mainly technological) limitations of the mobile devices available at the time. Decades later, mobile devices are far more capable and numerous than before, leaving a great amount of resources on smartphones and tablets untapped. Here we propose a solution for performing distributed computations in a Grid Computing environment that utilizes both desktop and mobile devices, exploiting resources from day-to-day mobile users that would otherwise go unused. The work starts with an introduction to Grid Computing, the evolution of mobile devices, the idea of integrating such devices into the Grid, and how to convince device owners to participate. The discussion then becomes more technical, starting with an explanation of how Grid Computing actually works, followed by the technical challenges of integrating mobile devices into the Grid. Next, the model that constitutes the solution offered by this study is explained, followed by a chapter on the realization of a prototype that proves the feasibility of distributed computations over a Grid composed of both mobile and desktop devices. To conclude, future developments and ideas to improve the project are presented.
Abstract:
We report the observation of multiple harmonic generation in electric dipole spin resonance in an InAs nanowire double quantum dot. The harmonics display a remarkable detuning dependence: near the interdot charge transition as many as eight harmonics are observed, while at large detunings we only observe the fundamental spin resonance condition. The detuning dependence indicates that the observed harmonics may be due to Landau-Zener transition dynamics at anticrossings in the energy level spectrum.
Abstract:
The objective of this study was to illustrate the applicability and significance of the novel Lewis urothelial cancer model compared with the classic Fischer 344 model. Female Fischer 344 and Lewis rats, 7 weeks old, received intravesical instillations of N-methyl-N-nitrosourea (1.5 mg/kg) every other week, for a total of four doses. After 15 weeks, the animals were sacrificed and their bladders analyzed by histopathology (tumor grade and stage), immunohistochemistry (apoptotic and proliferative indices), and blotting (Toll-like receptor 2, TLR2; Uroplakin III, UP III; and C-Myc). Control groups received placebo. Macroscopic neoplastic lesions appeared in 20% of the Lewis strain and 70% of the Fischer 344 strain. Lewis rats showed hyperplasia in 50% of animals and normal bladders in the other 50%. All Fischer 344 rats had lesions: 20% papillary hyperplasia, 30% dysplasia, 40% neoplasia, and 10% squamous metaplasia. Proliferative and apoptotic indices were significantly lower in the Lewis strain (p < 0.01). TLR2 and UP III protein levels were significantly higher in the Lewis strain than in the Fischer 344 strain (70.8% and 46.5% vs. 49.5% and 16.9%, respectively). In contrast, C-Myc protein levels were significantly higher in the Fischer 344 strain (22.5%) than in the Lewis strain (13.7%). This innovative carcinogen-resistant Lewis urothelial model represents a new strategy for translational research. Preservation of the TLR2 and UP III defense mechanisms might drive diverse urothelial phenotypes during carcinogenesis in differently susceptible individuals.
Abstract:
This article compares the accuracy of the linear measurement tools of different commercial software packages. Eight fully edentulous dry mandibles were selected for the study, and the incisor, canine, premolar, first molar, and second molar regions were examined. Cone beam computed tomography (CBCT) images were obtained with an i-CAT Next Generation scanner. Linear bone measurements were performed by one observer on the cross-sectional images using three software packages able to read DICOM images: XoranCat®, OnDemand3D®, and KDIS3D®. In addition, 25% of the sample was re-evaluated to assess reproducibility. The mandibles were then sectioned to obtain the gold standard for each region. Intraclass correlation coefficients (ICC) were calculated to examine the agreement between the two evaluation periods, and one-way analysis of variance with the post-hoc Dunnett test was used to compare each set of software-derived measurements with the gold standard, as sketched below. The ICC values were excellent for all software packages. The smallest differences between the software-derived measurements and the gold standard were obtained with OnDemand3D and KDIS3D (-0.11 and -0.14 mm, respectively), and the largest with XoranCAT (+0.25 mm). However, there was no statistically significant difference between the measurements obtained with the different software packages and the gold standard (p > 0.05). In conclusion, linear bone measurements were not influenced by the software package used to reconstruct the image from CBCT DICOM data.
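For readers unfamiliar with this statistical pipeline, the following SciPy sketch reproduces its shape: a one-way ANOVA across groups, then Dunnett's post-hoc test comparing each software package against the gold standard as control. The numbers are placeholders (the study's real design pairs measurements by region), and scipy.stats.dunnett requires SciPy 1.11 or later.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gold = rng.normal(10.0, 0.5, 8)              # gold-standard measurements (placeholder)
xorancat = gold + rng.normal(0.25, 0.3, 8)   # software-derived values (placeholders)
ondemand = gold + rng.normal(-0.11, 0.3, 8)
kdis = gold + rng.normal(-0.14, 0.3, 8)

# One-way ANOVA across all four groups
print(stats.f_oneway(gold, xorancat, ondemand, kdis))

# Dunnett's test: each software package against the gold standard (control)
print(stats.dunnett(xorancat, ondemand, kdis, control=gold))
```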
Abstract:
This paper proposes an architecture for machining process and production monitoring to be applied to machine tools with open Computer Numerical Control (CNC). A brief description of the advantages of using open CNC for machining process and production monitoring is presented, with an emphasis on a CNC architecture using a personal computer (PC)-based human-machine interface. The proposed architecture uses CNC data and sensors to gather information about the machining process and production. It allows the development of different levels of monitoring systems with minimum investment, minimal need for sensor installation, and low intrusiveness to the process (a schematic example follows below). Successful examples of the use of this architecture in a laboratory environment are briefly described. In conclusion, it is shown that a wide range of monitoring solutions can be implemented in production processes using the proposed architecture.
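As a schematic of the lowest monitoring level (CNC data only, no added sensors), the sketch below polls a single process variable and raises an alert on a threshold violation. Both `read_cnc_variable` and the spindle-load threshold are hypothetical: the actual open-CNC API is vendor-specific and not given in the abstract, so the accessor is simulated here.

```python
import random
import time

SPINDLE_LOAD_LIMIT = 85.0   # assumed alarm threshold, percent of rated load

def read_cnc_variable(name):
    """Hypothetical accessor for an open-CNC data item; the real call depends
    on the vendor's open-CNC interface. Returns simulated values here."""
    return {"spindle_load": random.uniform(40.0, 95.0)}.get(name, 0.0)

def monitor(poll_s=0.5, cycles=10):
    """Lowest-investment monitoring level: poll CNC data, flag violations."""
    for _ in range(cycles):
        load = read_cnc_variable("spindle_load")
        if load > SPINDLE_LOAD_LIMIT:
            print(f"ALERT: spindle load {load:.1f}% exceeds {SPINDLE_LOAD_LIMIT}%")
        time.sleep(poll_s)

if __name__ == "__main__":
    monitor()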
Abstract:
Nucleoside hydrolases (NHs) show homology among parasitic protozoa, fungi, and bacteria. They are vital protagonists in the establishment of early infection and are therefore excellent candidates for pathogen recognition by adaptive immune responses. Immune protection against NHs would prevent disease at the early stage of infection by several pathogens. We have identified the domain of the NH of L. donovani (NH36) responsible for its immunogenicity and protective efficacy against murine visceral leishmaniasis (VL). Using recombinant peptides covering the whole NH36 sequence, together with saponin, we demonstrate that protection against L. chagasi is related to its C-terminal domain (amino acids 199-314) and is mediated mainly by a CD4+ T cell driven response, with a lower contribution of CD8+ T cells. Immunization with this peptide exceeds by 36.73 ± 12.33% the protective response induced by the cognate NH36 protein. Increases in IgM, IgG2a, IgG1, and IgG2b antibodies, CD4+ T cell proportions, IFN-gamma secretion, ratios of IFN-gamma/IL-10-producing CD4+ and CD8+ T cells, and percentages of antibody-binding inhibition by synthetic predicted epitopes were detected in F3-vaccinated mice. The increases in DTH and in the ratios of TNF-alpha/IL-10-producing CD4+ cells were, however, the strongest correlates of protection, which was confirmed by in vivo depletion with monoclonal antibodies, algorithm-predicted CD4 and CD8 epitopes, and a pronounced, long-lasting decrease in parasite load (90.5-88.23%; p = 0.011). No decrease in parasite load was detected after vaccination with the N-terminal domain of NH36, despite the induction of IFN-gamma/IL-10 expression by CD4+ T cells after challenge. Both peptides reduced the size of footpad lesions, but only the C-terminal domain reduced the parasite load of mice challenged with L. amazonensis. The identification of the target of the immune response to NH36 provides a basis for the rational development of a bivalent vaccine against leishmaniasis and of multivalent vaccines against NH-dependent pathogens.
Abstract:
The interplay between biocolloidal characteristics (especially size and charge), pH, salt concentration, and thermal energy results in a unique collection of mesoscopic forces important to molecular organization and function in biological systems. By means of Monte Carlo simulations and semi-quantitative analysis in terms of perturbation theory, we describe a general electrostatic mechanism that produces attraction at low electrolyte concentrations. This charge regulation mechanism, due to titrating amino acid residues, is discussed in a purely electrostatic framework (a minimal sketch of the underlying titration move is given below). The complexation data reported here for the interaction between a polyelectrolyte chain and the proteins albumin, goat and bovine alpha-lactalbumin, beta-lactoglobulin, insulin, kappa-casein, lysozyme, and pectin methylesterase illustrate the importance of the charge regulation mechanism. Special attention is given to pH close to pI, where ion-dipole and charge regulation interactions can overcome the repulsive ion-ion interaction. By means of protein mutations, we confirm the importance of the charge regulation mechanism and quantify when the complexation is dominated by charge regulation and when by the ion-dipole term.
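The titration (charge regulation) move at the heart of such Monte Carlo simulations can be sketched in a few lines: a protonation-state flip is accepted with a Metropolis criterion that combines the electrostatic energy change with the ideal chemical term ln(10)(pH - pKa). The single, isolated site below is purely illustrative; in a real simulation the electrostatic term is computed from all charges in the system.

```python
import math
import random

LN10 = math.log(10.0)

def attempt_flip(protonated, pH, pKa, dU_elec_kT):
    """Metropolis move for one titratable site (charge regulation).

    dU_elec_kT: electrostatic energy change (in kT) of the attempted flip;
    a full simulation computes it from all charges, here it is passed in.
    """
    # Ideal (chemical) contribution: deprotonation costs ln10*(pKa - pH) kT,
    # protonation costs ln10*(pH - pKa) kT
    ideal = LN10 * (pKa - pH) if protonated else LN10 * (pH - pKa)
    dU = dU_elec_kT + ideal
    if dU <= 0 or random.random() < math.exp(-dU):
        return not protonated   # accept: state flips
    return protonated           # reject: state unchanged

# Sanity check: with no electrostatics, the sampled protonation fraction
# follows Henderson-Hasselbalch, i.e. ~0.5 at pH == pKa
state, hits = False, 0
for _ in range(100_000):
    state = attempt_flip(state, pH=5.0, pKa=5.0, dU_elec_kT=0.0)
    hits += state
print(hits / 100_000)
```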
Abstract:
Stingless bees exhibit extraordinary variation in nest architecture within and among species. To test for phylogenetic association of behavioral traits in species of the Neotropical stingless bee genus Trigona s.s., a phylogenetic hypothesis was generated by combining sequence data from 24 taxa for one mitochondrial gene (16S rRNA) and four nuclear gene fragments (long-wavelength rhodopsin copy 1 (opsin), elongation factor-1 alpha copy F2, arginine kinase, and 28S rRNA). Fifteen characteristics of nest architecture were coded and tested for phylogenetic association. Several characters show significant phylogenetic signal, including type of nesting substrate, nest construction material, and hemipterophily, the tending of hemipteroid insects in exchange for sugar excretions. Phylogenetically independent habits encountered in Trigona s.s. include coprophily and necrophagy.
Abstract:
Background: A relative inability to capture a sufficiently large patient population in any one geographic location has traditionally limited research into rare diseases. Methods and Results: Clinicians interested in the rare disease lymphangioleiomyomatosis (LAM) have worked with the LAM Treatment Alliance, the MIT Media Lab, and Clozure Associates to cooperate in the design of a state-of-the-art data coordination platform that can be used for clinical trials and other research focused on the global LAM patient population. This platform is one component of a set of web-based resources, including a patient self-report data portal, aimed at accelerating research in rare diseases in a rigorous fashion. Conclusions: Collaboration among clinicians, researchers, advocacy groups, and patients can create the essential community resource infrastructure needed to accelerate rare disease research. The International LAM Registry is an example of such an effort.
Abstract:
This paper presents SMarty, a variability management approach for UML-based software product lines (PLs). SMarty is supported by a UML profile, the SMartyProfile, and by a process for managing variabilities, the SMartyProcess. The SMartyProfile represents variabilities, variation points, and variants in UML models by applying a set of stereotypes. The SMartyProcess consists of a set of activities that are systematically executed to trace, identify, and control variabilities in a PL based on SMarty. It also identifies variability implementation mechanisms and analyzes specific product configurations. In addition, a more comprehensive application of SMarty is presented using the SEI's Arcade Game Maker PL. Finally, an evaluation of SMarty and related work are discussed.