930 results for System Computing
Abstract:
Plants of the genus Schinus are native to South America and were introduced into Mediterranean countries a long time ago. Some Schinus species have been used in folk medicine, and Essential Oils of Schinus spp. (EOs) have been reported to have antimicrobial, anti-tumoural and anti-inflammatory properties. Such properties are related to the chemical composition of the EOs, which depends largely on the species, the geographic and climatic region, and the part of the plant used. Considering the difficulty of inferring the pharmacological properties of the EOs of Schinus species without a demanding experimental setting, this work will focus on the development of an Artificial Intelligence grounded Decision Support System to predict the pharmacological properties of Schinus EOs. The computational framework was built on top of a Logic Programming Case-Based approach to knowledge representation and reasoning, which caters to the handling of incomplete, unknown, or even self-contradictory information. New clustering methods centered on an analysis of attribute similarities were used to distinguish and aggregate historical data according to the context under which it was added to the Case Base, thereby enhancing the prediction process.
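As an illustration only, a minimal sketch of grouping Case Base entries by attribute similarity before prediction; the attributes, similarity measure, and threshold below are hypothetical, and the paper's actual clustering method is not reproduced here.

```python
# Illustrative sketch only: grouping Case Base entries by attribute similarity.
# Attributes, similarity measure, and threshold are hypothetical assumptions.
import numpy as np

def similarity(a, b):
    """Similarity in [0, 1]: 1 minus the mean absolute difference of
    min-max normalized attribute values."""
    return 1.0 - float(np.mean(np.abs(a - b)))

def cluster_cases(cases, threshold=0.85):
    """Single-pass clustering: a case joins the first cluster whose centroid
    it resembles above the threshold, otherwise it starts a new cluster."""
    clusters = []                      # each cluster is a list of case vectors
    for case in cases:
        for cluster in clusters:
            centroid = np.mean(cluster, axis=0)
            if similarity(case, centroid) >= threshold:
                cluster.append(case)
                break
        else:
            clusters.append([case])
    return clusters

# Each case: normalized chemical-composition attributes of an EO sample (made up).
cases = [np.array([0.70, 0.10, 0.20]),
         np.array([0.68, 0.12, 0.20]),
         np.array([0.30, 0.50, 0.20])]
print(len(cluster_cases(cases)))       # -> 2 groups under these assumptions
```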
Abstract:
This paper presents an integrated model for an offshore wind turbine that takes into consideration the contribution of marine waves and wind speed perturbations to the power quality of the current injected into the electric grid. The paper deals with the simulation of one floating offshore wind turbine equipped with a permanent magnet synchronous generator and a two-level converter connected to an onshore electric grid. The use of discrete mass modeling is assessed in order to reveal, by computing the total harmonic distortion, how the perturbations of the captured energy are attenuated at the electric grid injection point. Two torque actions are considered for the three-mass modeling: the aerodynamic torque on the flexible part and on the rigid part of the blades. Also, a torque due to the influence of marine waves in deep water is considered. Proportional integral fractional-order control supports the control strategy. A comparison between the drive train models is presented.
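Since the drive-train models are compared through the total harmonic distortion (THD) of the injected current, a minimal sketch of how THD can be estimated from a sampled current waveform via the FFT; the sampling rate, grid frequency, and harmonic content below are illustrative values, not the paper's simulation data.

```python
# Illustrative sketch: estimating THD of a sampled current waveform with the FFT.
# The waveform, sampling rate, and harmonic content are invented for illustration.
import numpy as np

def thd(signal, fs, f0):
    """THD = sqrt(sum of harmonic magnitudes squared) / fundamental magnitude."""
    n = len(signal)
    spectrum = np.abs(np.fft.rfft(signal)) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    fund = spectrum[np.argmin(np.abs(freqs - f0))]
    harmonics = [spectrum[np.argmin(np.abs(freqs - h * f0))] for h in range(2, 11)]
    return np.sqrt(np.sum(np.square(harmonics))) / fund

fs, f0 = 10_000, 50                            # sampling rate [Hz], grid frequency [Hz]
t = np.arange(0, 1, 1 / fs)
i_grid = np.sin(2 * np.pi * f0 * t) + 0.05 * np.sin(2 * np.pi * 5 * f0 * t)
print(f"THD = {100 * thd(i_grid, fs, f0):.1f} %")   # about 5 % for this toy waveform
```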
Abstract:
The AntiPhospholipid Syndrome (APS) is an acquired autoimmune disorder induced by high levels of antiphospholipid antibodies, whose clinical manifestations are arterial and venous thrombosis as well as pregnancy-related complications and morbidity. This autoimmune hypercoagulable state, usually known as Hughes syndrome, has severe consequences for patients, being one of the main causes of thrombotic disorders and death. It is therefore necessary to be preventive, that is, to be aware of how probable it is that a patient has this kind of syndrome. Despite the update of the antiphospholipid syndrome classification, the diagnosis remains difficult to establish. Additional research on clinically relevant antibodies and standardization of their quantification are required in order to improve antiphospholipid syndrome risk assessment. Thus, this work will focus on the development of a diagnosis decision support system in terms of a formal agenda built on a Logic Programming approach to knowledge representation and reasoning, complemented with a computational framework based on Artificial Neural Networks. The proposed model improves the diagnosis, properly classifying the patients who really present this pathology (sensitivity higher than 85%) as well as classifying the absence of APS (specificity close to 95%).
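For reference, the reported sensitivity and specificity follow the usual confusion-matrix definitions; a minimal sketch with invented counts, not the study's data.

```python
# Illustrative sketch: sensitivity and specificity from a confusion matrix.
# The counts below are invented for illustration, not the study's results.
def sensitivity(tp, fn):
    """True positive rate: fraction of APS patients correctly classified."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True negative rate: fraction of non-APS patients correctly classified."""
    return tn / (tn + fp)

tp, fn, tn, fp = 44, 6, 95, 5
print(f"sensitivity = {sensitivity(tp, fn):.2%}")   # 88.00%
print(f"specificity = {specificity(tn, fp):.2%}")   # 95.00%
```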
Abstract:
Internet of Things systems are pervasive systems that have evolved from cyber-physical to large-scale systems. Due to the number of technologies involved, their software development poses several integration challenges. Among them, the ones preventing proper integration are those related to system heterogeneity, and thus to interoperability issues. From a software engineering perspective, developers mostly experience the lack of interoperability in two phases of software development: programming and deployment. On the one hand, modern software tends to be distributed over several components, each adopting its most appropriate technology stack, pushing programmers to code in a protocol- and data-agnostic way. On the other hand, each software component should run in the most appropriate execution environment and, as a result, system architects strive to automate deployment in distributed infrastructures. This dissertation aims to improve the development process by introducing proper tools to handle certain aspects of system heterogeneity. Our effort focuses on three of these aspects and, for each one, we propose a tool addressing the underlying challenge. The first tool handles heterogeneity at the transport and application protocol level, the second manages different data formats, and the third obtains optimal deployments. To realize the tools, we adopted a linguistic approach, i.e. we provided specific linguistic abstractions that help developers increase the expressive power of the programming language they use, writing better solutions in more straightforward ways. To validate the approach, we implemented use cases showing that the tools can be used in practice and that they help to achieve the expected level of interoperability. In conclusion, to move a step towards the realization of an integrated Internet of Things ecosystem, we target programmers and architects and propose that they use the presented tools to ease the software development process.
Abstract:
Modern scientific discoveries are driven by an insatiable demand for computational resources. High-Performance Computing (HPC) systems are an aggregation of computing power that delivers considerably higher performance than a typical desktop computer can provide, in order to solve large problems in science, engineering, or business. An HPC room in the datacenter is a complex controlled environment that hosts thousands of computing nodes consuming electrical power in the range of megawatts, which is completely transformed into heat. Although a datacenter contains sophisticated cooling systems, our studies provide quantitative evidence of thermal bottlenecks in real-life production workloads, showing the presence of significant spatial and temporal thermal and power heterogeneity. Therefore, minor thermal issues or anomalies can potentially start a chain of events that leads to an imbalance between the amount of heat generated by the computing nodes and the heat removed by the cooling system, originating thermal hazards. Although thermal anomalies are rare events, detecting or predicting them in time is vital to avoid IT and facility equipment damage and datacenter outages, with severe societal and business losses. For this reason, automated approaches to detect thermal anomalies in datacenters have considerable potential. This thesis analyzed and characterized the power and thermal behavior of a Tier-0 datacenter (CINECA) during production and under abnormal thermal conditions. Then, a Deep Learning (DL)-powered thermal hazard prediction framework is proposed. The proposed models are validated against real thermal hazard events reported for the studied HPC cluster while in production. To the best of my knowledge, this thesis is the first empirical study of thermal anomaly detection and prediction techniques on a real large-scale HPC system. For this thesis, I used a large-scale dataset comprising monitoring data from tens of thousands of sensors collected over around 24 months at a sampling interval of around 20 seconds.
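The abstract does not detail the DL models; as a hedged illustration of one common formulation of the task (flagging windows of sensor readings whose reconstruction error is unusually high), a minimal PyTorch sketch of an autoencoder-based detector. The architecture, window length, and threshold are assumptions, not the framework proposed in the thesis.

```python
# Illustrative sketch only: anomaly detection on windows of thermal sensor
# readings via autoencoder reconstruction error. Architecture, window length,
# and threshold are assumptions, not the thesis's actual framework.
import torch
import torch.nn as nn

WINDOW = 30  # 30 consecutive readings (~10 min at a 20 s collection interval)

class AutoEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(WINDOW, 8), nn.ReLU())
        self.dec = nn.Sequential(nn.Linear(8, WINDOW))

    def forward(self, x):
        return self.dec(self.enc(x))

def train(model, normal_windows, epochs=200):
    """Fit the autoencoder on windows recorded during normal operation."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(normal_windows), normal_windows)
        loss.backward()
        opt.step()

def is_anomalous(model, window, threshold=0.05):
    """Flag a window whose reconstruction error exceeds the threshold."""
    with torch.no_grad():
        err = torch.mean((model(window) - window) ** 2).item()
    return err > threshold
```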
Abstract:
Intelligent systems are currently inherent to society, supporting a synergistic human-machine collaboration. Beyond economic and climate factors, energy consumption is strongly affected by the performance of computing systems, and the quality of software functioning may invalidate any improvement attempt. In addition, data-driven machine learning algorithms are the basis for human-centered applications, with interpretability being one of the most important features of computational systems. Software maintenance is a critical discipline to support automatic and life-long system operation. As most software registers its inner events by means of logs, log analysis is an approach to keep systems operating. Logs are characterized as Big Data assembled in large-flow streams, being unstructured, heterogeneous, imprecise, and uncertain. This thesis addresses fuzzy and neuro-granular methods to provide maintenance solutions applied to anomaly detection (AD) and log parsing (LP), dealing with data uncertainty and identifying ideal time periods for detailed software analyses. LP provides a deeper semantic interpretation of the anomalous occurrences. The solutions evolve over time and are general purpose, being highly applicable, scalable, and maintainable. Granular classification models, namely the Fuzzy set-Based evolving Model (FBeM), the evolving Granular Neural Network (eGNN), and the evolving Gaussian Fuzzy Classifier (eGFC), are compared on the AD problem. The evolving Log Parsing (eLP) method is proposed to approach automatic parsing of system logs. All the methods perform recursive mechanisms to create, update, merge, and delete information granules according to the data behavior. For the first time in the evolving intelligent systems literature, the proposed method, eLP, is able to process streams of words and sentences. Regarding AD accuracy, FBeM achieved (85.64 ± 3.69)%, eGNN reached (96.17 ± 0.78)%, eGFC obtained (92.48 ± 1.21)%, and eLP reached (96.05 ± 1.04)%. Besides being competitive, eLP in particular generates a log grammar and presents a higher level of model interpretability.
Abstract:
The idea of Grid Computing originated in the nineties and found concrete application in contexts like the SETI@home project, where a large number of computers offered by volunteers cooperated inside the Grid environment, performing distributed computations that analyzed radio signals in search of extraterrestrial life. The Grid was composed of traditional personal computers but, with the emergence of the first mobile devices such as Personal Digital Assistants (PDAs), researchers started theorizing the inclusion of mobile devices in Grid Computing; although impressive theoretical work was done, the idea was discarded due to the (mainly technological) limitations of the mobile devices available at the time. Decades have passed, and mobile devices are now far more powerful and numerous than before, leaving a great amount of resources on mobile devices such as smartphones and tablets untapped. Here we propose a solution for performing distributed computations over a Grid Computing environment that uses both desktop and mobile devices, exploiting resources from day-to-day mobile users that would otherwise end up unused. The work starts with an introduction to what Grid Computing is, the evolution of mobile devices, the idea of integrating such devices into the Grid, and how to convince device owners to participate in the Grid. Then the tone becomes more technical, starting with an explanation of how Grid Computing actually works, followed by the technical challenges of integrating mobile devices into the Grid. Next, the model that constitutes the solution offered by this study is explained, followed by a chapter on the realization of a prototype that proves the feasibility of distributed computations over a Grid composed of both mobile and desktop devices. To conclude, future developments and ideas to improve this project are presented.
Abstract:
Bone marrow is organized in specialized microenvironments known as 'marrow niches'. These are important for the maintenance of stem cells and their hematopoietic progenitors, whose homeostasis also depends on other cell types present in the tissue. Extrinsic factors, such as infection and inflammatory states, may affect this system by causing cytokine dysregulation (an imbalance in cytokine production) and changes in cell proliferation and self-renewal rates, and may also induce changes in metabolism and the cell cycle. Known to be related to chronic inflammation, obesity is responsible for systemic changes that are best studied in the cardiovascular system. Little is known about the changes in the hematopoietic system induced by the inflammatory state associated with obesity, or about the cellular and molecular mechanisms involved. Understanding the biological behavior of hematopoietic stem cells under obesity-induced chronic inflammation could help elucidate the pathophysiological mechanisms involved in other inflammatory processes, such as neoplastic diseases and bone marrow failure syndromes.
Abstract:
To compare time and risk to biochemical recurrence (BR) after radical prostatectomy in two chronologically different groups of patients using the standard and the modified Gleason system (MGS). Cohort 1 comprised biopsies of 197 patients graded according to the standard Gleason system (SGS) in the period 1997-2004, and cohort 2 comprised 176 biopsies graded according to the modified system in the period 2005-2011. Time to BR was analyzed with the Kaplan-Meier product-limit method, and prediction of shorter time to recurrence with univariate and multivariate Cox proportional hazards models. Patients in cohort 2 reflected time-related changes: a striking increase in clinical stage T1c, the systematic use of extended biopsies, and a lower percentage of total length of cancer in millimeters across all cores. The MGS used in cohort 2 yielded fewer biopsies with Gleason score ≤ 6 and more biopsies with the intermediate Gleason score 7. Time to BR on Kaplan-Meier curves was statistically significant using the MGS in cohort 2, but not the SGS in cohort 1. Only the MGS predicted shorter time to BR on univariate analysis, and it was an independent predictor on multivariate analysis. The results support the view that the 2005 International Society of Urological Pathology modified system is a refinement of the Gleason grading and valuable for contemporary clinical practice.
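For readers unfamiliar with the two analyses, a hedged sketch of how time to biochemical recurrence is commonly analyzed with the Python lifelines library; the column names and toy data are placeholders, not the study's cohorts.

```python
# Illustrative sketch: Kaplan-Meier and Cox proportional hazards analysis of
# time to biochemical recurrence using the lifelines library. Column names and
# the toy data are placeholders, not the study's cohorts.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "months_to_br":  [12, 30, 45, 60, 22, 55, 18, 40],  # follow-up time [months]
    "recurred":      [1, 0, 1, 0, 1, 0, 1, 1],          # 1 = BR observed
    "gleason_7plus": [1, 0, 1, 1, 0, 0, 1, 0],          # modified Gleason score >= 7
})

kmf = KaplanMeierFitter()
kmf.fit(df["months_to_br"], event_observed=df["recurred"], label="all patients")
print(kmf.median_survival_time_)          # Kaplan-Meier median time to BR

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_br", event_col="recurred")
cph.print_summary()                       # hazard ratio for the Gleason covariate
```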
Abstract:
Mesoporous SBA-15 silica with uniform hexagonal pores, a narrow pore size distribution and a tuneable pore diameter was organofunctionalized with a glutaraldehyde-bridged silylating agent. The precursor and its derivative silicas were loaded with ibuprofen for controlled delivery in simulated biological fluids. The synthesized silicas were characterized by elemental analysis, infrared spectroscopy, ¹³C and ²⁹Si solid state NMR spectroscopy, nitrogen adsorption, X-ray diffractometry, thermogravimetry and scanning electron microscopy. Surface functionalization with the amine-containing bridged hydrophobic structure significantly decreased the surface area from 802.4 to 63.0 m² g⁻¹ and the pore diameter from 8.0 to 6.0 nm, which ultimately increased the drug-loading capacity from 18.0% up to 28.3% and gave a very slow release of ibuprofen over a period of 72.5 h. The in vitro drug release demonstrated that SBA-15 presented the fastest release, from 25% to 27%, while SBA-15GA gave nearly 10% of drug release in all fluids over 72.5 h. The Korsmeyer-Peppas model best fits the release data, with a Fickian diffusion mechanism and zero-order kinetics for the synthesized mesoporous silicas. Both pore size and hydrophobicity influenced the rate of the release process, indicating that the chemically modified silica can be suggested for designing formulations of slow and constant release over a defined period, avoiding repeated administration.
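For context, the Korsmeyer-Peppas model describes the fractional release as Mt/M∞ = k·t^n, where an exponent n at or below roughly 0.43-0.5 (depending on geometry) is commonly read as Fickian diffusion; a hedged sketch of fitting it with scipy, using invented release data rather than the paper's measurements.

```python
# Illustrative sketch: fitting the Korsmeyer-Peppas release model
# Mt/Minf = k * t**n with scipy. The release data below are invented,
# not the measurements reported for SBA-15 / SBA-15GA.
import numpy as np
from scipy.optimize import curve_fit

def korsmeyer_peppas(t, k, n):
    return k * np.power(t, n)

t_h = np.array([0.5, 1, 2, 4, 8, 24, 48, 72.5])                         # time [h]
released = np.array([0.03, 0.04, 0.05, 0.06, 0.07, 0.08, 0.09, 0.10])   # Mt/Minf

(k, n), _ = curve_fit(korsmeyer_peppas, t_h, released, p0=(0.05, 0.5))
print(f"k = {k:.3f}, n = {n:.2f}")   # small n suggests a Fickian diffusion mechanism
```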
Abstract:
Two single-crystalline Au surfaces vicinal to the (111) plane were modified with Pt and studied using scanning tunneling microscopy (STM) and X-ray photoemission spectroscopy (XPS) in an ultra-high vacuum environment. The vicinal surfaces studied are Au(332) and Au(887), and different Pt coverages (θPt) were deposited on each surface. From STM images we determine that Pt deposits on both surfaces as nanoislands with heights ranging from 1 ML to 3 ML depending on θPt. On both surfaces the early growth of Pt ad-islands occurs at the lower part of the step edge, with Pt ad-atoms being incorporated into the steps in some cases. XPS results indicate that partial alloying of Pt occurs at the interface at room temperature and at all coverages, as suggested by the negative chemical shift of the Pt 4f core line, indicating an upward shift of the d-band center of the alloyed Pt. The existence of a segregated Pt phase, especially at higher coverages, is also detected by XPS. Sample annealing indicates that the temperature rise promotes further incorporation of Pt atoms into the Au substrate, as supported by STM and XPS results. Additionally, the catalytic activity of different PtAu systems reported in the literature for some electrochemical reactions is discussed in light of our findings.
Abstract:
To evaluate the antimicrobial efficacy of Clearfil SE Protect (CP) and Clearfil SE Bond (CB), after curing and rinsing, against five individual oral microorganisms as well as a mixed bacterial culture prepared from the selected test organisms. Bacterial suspensions were prepared from single species of Streptococcus mutans, Streptococcus sobrinus, Streptococcus gordonii, Actinomyces viscosus and Lactobacillus lactis, as well as mixed bacterial suspensions of these organisms. Dentin bonding system discs (6 mm × 2 mm) were prepared, cured, washed and placed in the bacterial suspension of single or multispecies bacteria for 15, 30 and 60 min. MTT, Live/Dead bacterial viability (antibacterial effect), and XTT (metabolic activity) assays were used to test the two dentin bonding systems' antibacterial effect. All assays were done in triplicate and each experiment was repeated at least three times. Data were submitted to ANOVA and Scheffé's F-test (5%). More than 40% bacterial killing was seen within 15 min, and the killing progressed with increasing incubation time with CP discs. However, a longer (60 min) period of incubation was required by CP to achieve a similar antimicrobial effect against the mixed bacterial suspension. CB had no significant effect on the viability or metabolic activity of the test microorganisms when compared to the control bacterial culture. CP was significantly effective in reducing the viability and metabolic activity of the test organisms. The results demonstrated the antimicrobial efficacy of CP on both single- and multispecies bacterial cultures. CP may be beneficial in reducing bacterial infections in cavity preparations in clinical dentistry.
Abstract:
An association between hypertension and bladder symptoms has been described. We hypothesized that micturition dysfunction may be associated with the renin-angiotensin system (RAS) acting in the urethra. The effects of the anti-hypertensive drugs losartan (an AT1 antagonist) and captopril (an angiotensin-converting enzyme inhibitor), in comparison with atenolol (a β1-adrenoceptor antagonist acting independently of RAS blockade), were investigated in bladder and urethral dysfunction during renovascular hypertension in rats. Two-kidney, one-clip (2K-1C) rats were treated with losartan (30 mg/kg/day), captopril (50 mg/kg/day) or atenolol (90 mg/kg/day) for eight weeks. Cystometric studies, bladder and urethral smooth muscle reactivities, cAMP levels and p38 MAPK phosphorylation in the urinary tract were determined. Losartan and captopril markedly reduced blood pressure in 2K-1C rats. The increases in non-voiding contractions, voiding frequency and bladder capacity in 2K-1C rats were prevented by treatment with both drugs. Likewise, losartan and captopril prevented the enhanced bladder contractions to electrical-field stimulation (EFS) and carbachol, along with the impaired relaxations to β-adrenergic-cAMP stimulation. Enhanced neurogenic contractions and impaired nitrergic relaxations were observed in the urethra of 2K-1C rats. Angiotensin II also produced greater urethral contractions, accompanied by higher phosphorylation of p38 MAPK in urethral tissues of 2K-1C rats. Losartan and captopril normalized the urethral dysfunctions in 2K-1C rats. In contrast, atenolol treatment largely reduced blood pressure in 2K-1C rats but failed to affect the urinary tract smooth muscle dysfunction. The urinary tract smooth muscle dysfunction in 2K-1C rats thus takes place through local RAS activation, irrespective of arterial blood pressure levels.
Abstract:
One of the great challenges for the scientific community in theories of genetic information, genetic communication and genetic coding is to determine a mathematical structure related to DNA sequences. In this paper we propose a model of an intra-cellular transmission system of genetic information, similar to a model of a power- and bandwidth-efficient digital communication system, in order to identify a mathematical structure in biologically relevant DNA sequences. The model of a transmission system of genetic information is concerned with the identification, reproduction and mathematical classification of the nucleotide sequence of single-stranded DNA by the genetic encoder. Hence, a genetic encoder is devised in which labelings and cyclic codes are established. Establishing the algebraic structure of the corresponding code alphabets, mappings, labelings, primitive polynomials p(x) and code generator polynomials g(x) is quite important in characterizing error-correcting code subclasses of G-linear codes. These latter codes are useful for the identification, reproduction and mathematical classification of DNA sequences. The characterization of this model may contribute to the development of a methodology that can be applied in mutational and polymorphism analysis, the production of new drugs and genetic improvement, among other things, resulting in reduced time and laboratory costs.
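As background for the coding-theoretic terms above, one common (and here merely illustrative) setup labels the nucleotides with ring elements and defines a cyclic code through a generator polynomial; the specific labeling and polynomials below are examples, not necessarily those used in the paper.

```latex
% Illustrative only: a common Z_4 labeling of nucleotides and the standard
% definition of a cyclic code of length n with generator polynomial g(x).
\[
  A \mapsto 0, \quad C \mapsto 1, \quad G \mapsto 2, \quad T \mapsto 3
  \qquad (\text{labels taken in } \mathbb{Z}_4)
\]
\[
  \mathcal{C} \;=\; \bigl\{\, c(x) = m(x)\,g(x) \bmod (x^{n}-1) \,\bigr\},
  \qquad g(x) \mid x^{n}-1 ,
\]
% so a length-n DNA sequence is identified and classified by checking whether
% its labeled polynomial is a codeword of C, i.e. a multiple of g(x).
```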