897 results for Service Oriented Computing


Relevance: 30.00%

Abstract:

Modern networks are undergoing a fast and drastic evolution, with software taking a more prominent role. Virtualization and cloud-like approaches are replacing physical network appliances, reducing the management burden on operators. Furthermore, networks now expose programmable interfaces for fast and dynamic control over traffic forwarding. This evolution is backed by standards organizations such as ETSI, 3GPP, and IETF. This thesis describes the main trends in this evolution. It then presents solutions developed during the three years of the Ph.D. to exploit the capabilities these new technologies offer and to study their possible limitations, pushing the state of the art further. Namely, it deals with programmable network infrastructure, introducing the concept of Service Function Chaining (SFC) and presenting two possible solutions, one based on OpenStack and OpenFlow and the other on Segment Routing and IPv6. It then continues with network service provisioning, presenting concepts from Network Function Virtualization (NFV) and Multi-access Edge Computing (MEC). These concepts are applied to network slicing for mission-critical communications and the Industrial IoT (IIoT). Finally, it deals with network abstraction, with a focus on Intent-Based Networking (IBN). In summary, the thesis includes solutions for data-plane programming with evaluations on well-known platforms, performance metrics for virtual resource allocation, a novel practical application of network slicing to mission-critical communications, an architectural proposal and its implementation for edge technologies in Industrial IoT scenarios, and a formal definition of intent using a category-theory approach.
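As an aside, the Segment Routing side of SFC can be sketched in a few lines: a service chain becomes an ordered list of IPv6 Segment Identifiers (SIDs) that the packet must traverse before reaching its destination. The sketch below is illustrative only; the function names and addresses are invented, not taken from the thesis.

```python
# Illustrative sketch: a Service Function Chain expressed as an SRv6
# segment list. All SIDs and addresses below are hypothetical examples.

from dataclasses import dataclass

@dataclass(frozen=True)
class ServiceFunction:
    name: str
    sid: str  # IPv6 Segment Identifier of the node hosting the function

def build_segment_list(chain, destination):
    """Return the SRv6 segment list steering a packet through `chain`.

    In the Segment Routing Header the list is encoded in reverse order;
    here we return it in traversal order for readability.
    """
    return [sf.sid for sf in chain] + [destination]

firewall = ServiceFunction("firewall", "fc00:1::f1")
dpi = ServiceFunction("dpi", "fc00:2::d1")
segments = build_segment_list([firewall, dpi], "2001:db8::10")
print(segments)  # firewall SID, then DPI SID, then the final destination
```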

Relevance: 30.00%

Abstract:

The pervasive availability of connected devices in every industrial and societal sector is pushing for an evolution of the well-established cloud computing model. The emerging paradigm of the cloud continuum embraces this decentralization trend and envisions virtualized computing resources physically located between traditional datacenters and data sources. By executing totally or partially closer to the network edge, applications can react more quickly to events, thus enabling advanced forms of automation and intelligence. However, these applications also induce new data-intensive workloads with low-latency constraints that require the adoption of specialized resources, such as high-performance communication options (e.g., RDMA, DPDK, or XDP). Unfortunately, cloud providers still struggle to integrate these options into their infrastructures. This risks undermining the principle of generality that underlies the economy of scale of cloud computing, by forcing developers to tailor their code to low-level APIs, non-standard programming models, and static execution environments. This thesis proposes a novel system architecture to empower cloud platforms across the whole cloud continuum with Network Acceleration as a Service (NAaaS). To provide commodity yet efficient access to acceleration, this architecture defines a layer of agnostic high-performance I/O APIs, exposed to applications and clearly separated from the heterogeneous protocols, interfaces, and hardware devices that implement it. A novel system component embodies this decoupling by offering a set of agnostic OS features to applications: memory management for zero-copy transfers, asynchronous I/O processing, and efficient packet scheduling. The thesis also explores the design space of possible implementations of this architecture by proposing two reference middleware systems and by adopting them to support interactive use cases in the cloud continuum: a serverless platform and an Industry 4.0 scenario. A detailed discussion and a thorough performance evaluation demonstrate that the proposed architecture is suitable for enabling the easy-to-use, flexible integration of modern network acceleration into next-generation cloud platforms.
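The decoupling idea at the core of this architecture can be sketched as follows: applications program against a backend-agnostic I/O API, while the platform selects the concrete transport. All class and function names in this sketch are invented stand-ins, not the thesis' actual API; a real implementation would plug DPDK, XDP, or RDMA backends behind the same interface.

```python
# Minimal sketch, assuming hypothetical names: applications see only the
# agnostic Channel API; the backend choice is a platform concern.

from abc import ABC, abstractmethod

class Channel(ABC):
    """Backend-agnostic message channel exposed to applications."""
    @abstractmethod
    def send(self, payload: bytes) -> None: ...
    @abstractmethod
    def recv(self) -> bytes: ...

class LoopbackChannel(Channel):
    """Trivial in-process backend standing in for a real transport."""
    def __init__(self):
        self._queue = []
    def send(self, payload: bytes) -> None:
        self._queue.append(payload)
    def recv(self) -> bytes:
        return self._queue.pop(0)

def make_channel(accelerated: bool) -> Channel:
    # A real platform would return a DPDK/XDP/RDMA-backed channel when
    # acceleration is available; application code is unchanged either way.
    return LoopbackChannel()

ch = make_channel(accelerated=False)
ch.send(b"hello")
print(ch.recv())  # b'hello'
```

The point of the sketch is that swapping `LoopbackChannel` for an accelerated backend requires no change to the application code, which is the generality the abstract argues for.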

Relevance: 30.00%

Abstract:

The scientific success of the LHC experiments at CERN depends heavily on the availability of computing resources that efficiently store, process, and analyse the data collected every year. This is ensured by the Worldwide LHC Computing Grid, an infrastructure that connects computing centres distributed all over the world through high-performance networks. The LHC has an ambitious experimental programme for the coming years, which includes large investments and improvements in both detector hardware and software and computing systems, in order to cope with the huge increase in the event rate expected from the High-Luminosity LHC (HL-LHC) phase and, consequently, with the huge amount of data that will be produced. In recent years, Artificial Intelligence has become increasingly relevant in the High Energy Physics (HEP) world. Machine Learning (ML) and Deep Learning algorithms have been successfully used in many areas of HEP, such as online and offline reconstruction programs, detector simulation, object reconstruction and identification, and Monte Carlo generation, and they will surely be crucial in the HL-LHC phase. This thesis contributes to a CMS R&D project on a Machine Learning "as a Service" solution for HEP needs (MLaaS4HEP). It consists of a data service able to perform an entire ML pipeline (reading data, processing data, training ML models, serving predictions) in a completely model-agnostic fashion, directly using ROOT files of arbitrary size from local or distributed data sources. The framework has been extended with new features in the data-preprocessing phase, giving more flexibility to the user. Since MLaaS4HEP is experiment-agnostic, the ATLAS Higgs Boson ML challenge was chosen as the physics use case, with the aim of testing MLaaS4HEP and the contributions made in this work.
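The model-agnostic pipeline idea can be illustrated with a toy sketch: the service only assumes the model honours a fit/predict contract, so any framework can be plugged in. The reader and model below are invented stand-ins, not MLaaS4HEP code (which streams ROOT files of arbitrary size).

```python
# Sketch of a model-agnostic ML pipeline: read -> preprocess -> train
# -> predict. Everything here is a toy stand-in for illustration.

def read_events():
    # stand-in for streaming events from ROOT files
    return [[0.1, 1.0], [0.9, 3.0], [0.2, 1.2], [0.8, 2.9]], [0, 1, 0, 1]

def preprocess(events):
    # stand-in for user-configurable preprocessing (feature selection)
    return [row[:1] for row in events]  # keep only the first feature

class ThresholdModel:
    """Toy model honouring the fit/predict contract."""
    def fit(self, X, y):
        pos = [x[0] for x, label in zip(X, y) if label == 1]
        neg = [x[0] for x, label in zip(X, y) if label == 0]
        self.threshold = (min(pos) + max(neg)) / 2
    def predict(self, X):
        return [int(x[0] > self.threshold) for x in X]

def run_pipeline(model):
    # the pipeline never inspects the model beyond fit/predict,
    # which is what makes it model-agnostic
    X, y = read_events()
    X = preprocess(X)
    model.fit(X, y)
    return model.predict(X)

print(run_pipeline(ThresholdModel()))  # [0, 1, 0, 1]
```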

Relevance: 30.00%

Abstract:

In recent years, the need to process and store data of any nature has grown considerably; in addition, the obsolescence of the centralized model has contributed to the increasingly frequent adoption of the distributed model. An increase in the traffic crossing infrastructure nodes is therefore inevitable: traffic that keeps growing and that, with the advent of IoT, Big Data, Cloud Computing, Serverless Computing, etc., has reached very high peaks. Whereas data used to be kept on premises, today it is common for data storage to be entirely entrusted to third parties. As the traffic crossing the nodes of an infrastructure grows, so does the need for that traffic to be filtered and managed by the nodes themselves. The goal of this thesis is to extend a Message-Oriented Middleware, capable of guaranteeing different qualities of service for message delivery, so as to accelerate the routing phase towards destination nodes. The extension consists of adding to the previously implemented Message-Oriented Middleware the ability to intercept incoming packets (which, in the case of this middleware, may represent the propagation of events) and redirect them to a new node based on certain parameters. The Message-Oriented Middleware considered in this thesis acts as the message broker of a pub/sub model, so the redirection must happen with very low latency and, to that end, without leaving the kernel space of the operating system. For this reason, eBPF was chosen, and in particular its XDP hook, which allows writing programs that execute inside the kernel.
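The redirect decision itself is simple to model. The sketch below is a user-space illustration of the logic that, in the thesis, runs inside the kernel as an eBPF/XDP program (a BPF map lookup followed by an XDP_REDIRECT action); the topic names and addresses are invented.

```python
# User-space model of the broker's redirect decision. In the real
# system this logic lives in an XDP program: the routing table is a
# BPF map (consulted via bpf_map_lookup_elem) and the forward is an
# XDP_REDIRECT, so packets never leave kernel space.

ROUTING_TABLE = {
    "sensors/temperature": "10.0.0.2",
    "sensors/pressure": "10.0.0.3",
}

def redirect(packet_topic, default_node="10.0.0.1"):
    """Return the destination node for an event's topic; unknown
    topics fall back to the broker itself."""
    return ROUTING_TABLE.get(packet_topic, default_node)

print(redirect("sensors/temperature"))  # 10.0.0.2
print(redirect("unknown/topic"))        # 10.0.0.1 (fallback to broker)
```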

Relevance: 20.00%

Abstract:

This article analyzed whether hearing health care practices were consistent with the principles of universality, comprehensiveness, and equity from the standpoint of professionals. It involved qualitative research conducted at a Medium-Complexity Hearing Health Care Center. The study subjects were a social worker, three speech therapists, a physician, and a psychologist. Interviews were conducted, and observations were registered in a field diary. The thematic analysis technique was used to analyze the material. The analysis of the interviews resulted in the construction of the following themes: Universality and Access to Hearing Health, Comprehensive Hearing Health Care, and Hearing Health and Equity. The study identified issues that interfere with the quality of the service and run counter to the principles of the Brazilian Unified Health System. The conclusion reached was that a relatively simple investment in training and professional qualification can bring about significant changes towards a more universal, comprehensive, and equitable health service.

Relevance: 20.00%

Abstract:

Epidemiologic aspects of traumatic dental injuries (TDI) to the permanent dentition were evaluated in a sample of 847 patients treated at the Dental Urgency Service of the Dental School of the Federal University of Goiás, Brazil, between May 2000 and May 2008. The statistical treatment analyzed the data by frequency distribution and the chi-square test, with the level of significance set at 5% for all analyses. The results showed a higher incidence among males (610; 72.01%), most commonly at 6-10 years of age. Uncomplicated crown fracture (without pulp exposure) (502; 26.95%), avulsion (341; 18.30%), and complicated crown fracture (with pulp exposure) (330; 17.71%) were the most prevalent TDI. The prevalence of trauma was roughly constant across the years, with a larger number of cases observed between July and September (249; 29.39%). The most affected teeth were the maxillary central incisors (65.65%), followed by the maxillary left lateral incisors (19.67%). In 311 patients (18.25%), only one tooth was involved, while in most patients (536; 81.75%) TDI occurred in more than one tooth. A significant proportion (82.27%) of traumatized teeth presented a completely formed root apex. The main etiologic factors involved in TDI were falls (51.71%), traffic accidents (22.90%), and violence (5.67%). Based on these data, it may be concluded that accurate TDI-prevention policies must be established, capable of promoting appropriate protocols for the management of these lesions. The prevalence of TDI in this Goiânia subpopulation is comparable to the prevalence reported in epidemiological studies of other populations.
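The chi-square test mentioned above can be illustrated with a goodness-of-fit computation on the seasonal distribution of cases. The quarterly counts below are invented for the example (they sum to the 847 patients of the sample, with the 249 July-September cases taken from the abstract); the real study's contingency tables are not reproduced here.

```python
# Illustrative chi-square goodness-of-fit statistic against a uniform
# seasonal distribution. Quarterly counts are hypothetical except for
# the 249 cases in July-September reported in the abstract.

def chi_square(observed):
    expected = sum(observed) / len(observed)
    return sum((o - expected) ** 2 / expected for o in observed)

quarterly_cases = [200, 210, 249, 188]  # Jan-Mar, Apr-Jun, Jul-Sep, Oct-Dec
stat = chi_square(quarterly_cases)
print(round(stat, 2))  # 9.88 -- compared against the chi-square critical value
```

With 3 degrees of freedom, a statistic this large would indicate a seasonal imbalance at the 5% significance level used in the study.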

Relevance: 20.00%

Abstract:

Objective: To assess trends in the prevalence and social distribution of child stunting in Brazil in order to evaluate the effect of income and basic-service redistribution policies implemented in that country in the recent past. Methods: The prevalence of stunting (height-for-age z-score below −2 using the Child Growth Standards of the World Health Organization) among children aged less than 5 years was estimated from data collected during national household surveys carried out in Brazil in 1974–75 (n = 34 409), 1989 (n = 7374), 1996 (n = 4149) and 2006–07 (n = 4414). Absolute and relative socioeconomic inequality in stunting was measured by means of the slope index and the concentration index of inequality, respectively. Findings: Over a 33-year period, we documented a steady decline in the national prevalence of stunting from 37.1 per cent to 7.1 per cent. Prevalence dropped from 59.0 per cent to 11.2 per cent in the poorest quintile and from 12.1 per cent to 3.3 per cent in the wealthiest quintile. The decline was particularly steep in the last 10 years of the period (1996 to 2007), when the gaps between poor and wealthy families with children under 5 were also reduced in terms of purchasing power; access to education, health care, and water and sanitation services; and reproductive health indicators. Conclusion: In Brazil, socioeconomic development coupled with equity-oriented public policies has been accompanied by marked improvements in living conditions and a substantial decline in child undernutrition, as well as a reduction of the gap in nutritional status between children in the highest and lowest socioeconomic quintiles. Future studies will show whether these gains will be maintained under the current global economic crisis.
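The slope index of inequality used in this study can be sketched numerically: quintile prevalences are regressed on each quintile's midpoint of the cumulative socioeconomic ranking, and the slope estimates the absolute prevalence gap between the extremes of the distribution. In the example below, only the poorest- and wealthiest-quintile values for 1974-75 come from the abstract; the three middle quintiles are interpolated purely for illustration.

```python
# Sketch of the slope index of inequality (SII) via ordinary least
# squares. Middle-quintile values are invented; the endpoints (59.0
# and 12.1 per cent) are the 1974-75 figures from the abstract.

def slope_index(prevalences):
    # quintile midpoints on the cumulative rank scale (0 = poorest end)
    ranks = [0.1, 0.3, 0.5, 0.7, 0.9]
    n = len(ranks)
    mean_x = sum(ranks) / n
    mean_y = sum(prevalences) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(ranks, prevalences))
    den = sum((x - mean_x) ** 2 for x in ranks)
    return num / den  # slope: absolute change across the full ranking

# poorest -> wealthiest; the three middle values are illustrative
stunting_1974 = [59.0, 48.0, 36.0, 24.0, 12.1]
print(round(slope_index(stunting_1974), 1))  # negative: stunting falls with wealth
```

The magnitude of the slope (close to 59.0 − 12.1 here) is read as the prevalence difference, in percentage points, between the hypothetical bottom and top of the socioeconomic ranking.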

Relevance: 20.00%

Abstract:

Two case studies are presented to describe the process of public-school teachers authoring and creating chemistry simulations. They are part of the Virtual Didactic Laboratory for Chemistry, a project developed by the School of the Future of the University of São Paulo. The documental analysis of the material produced by two groups of teachers reflects different selection processes, for both the themes and the problem-situations, when creating simulations. The study demonstrates the potential for chemistry learning of an approach that takes students' everyday lives into account and is based on collaborative work among teachers and researchers. Also, from the teachers' perspectives, the possibilities for interaction that a simulation offers in classroom activities are considered.

Relevance: 20.00%

Abstract:

With the advent and development of technology, especially the Internet, more and more electronic services are being offered to customers in all areas of business, particularly information services such as virtual libraries. This article proposes a new opportunity to provide services to virtual-library customers, presenting a methodology for implementing electronic services oriented by these customers' life situations. Analytical observation of several national virtual-library sites showed that offering services organized around life situations and situations of relationship interest can improve service to customers, providing greater satisfaction and, consequently, better quality in the offer of information services. The visits to those sites and the critical analysis of the data collected during the visits, supported by the results of bibliographic research, enabled the description of this methodology. The conclusion is that providing services in isolation, or merely according to the user's profile, on virtual-library sites is not always enough to meet the needs and expectations of customers; this suggests offering services organized around life situations and situations of relationship interest as a complement that adds value to the virtual-library business. The work is relevant because it indicates new opportunities to provide virtual-library services with quality, serving as a guide for managers of information providers and enabling new means of access to information services for such customers, aiming at proactivity and service integration in order to solve real problems definitively.

Relevance: 20.00%

Abstract:

Due to the widespread, multipurpose use of document images and the current availability of a large number of document-image repositories, robust information-retrieval mechanisms and systems are increasingly in demand. This paper presents an approach to support the automatic generation of relationships among document images by exploiting Latent Semantic Indexing (LSI) and Optical Character Recognition (OCR). We developed the LinkDI (Linking of Document Images) service, which extracts and indexes the content of document images, computes its latent semantics, and defines relationships among images as hyperlinks. LinkDI was tested on document-image repositories, and its performance was evaluated by comparing the quality of the relationships created among textual documents with that of the relationships created among their respective document images. On the same document images, we ran further experiments to compare the performance of LinkDI with and without the LSI technique. Experimental results showed that LSI can mitigate the effects of the usual OCR misrecognition, which reinforces the feasibility of LinkDI even when the OCR output is highly degraded.
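The problem LSI addresses here can be shown with a toy example: plain term-vector matching is brittle against OCR misrecognition, because a garbled token ("servlce" below) no longer matches its correct form, lowering the similarity between documents that are in fact related. LSI's low-rank SVD projection recovers such associations through co-occurring terms; the sketch below shows only the term-vector/cosine step and omits the SVD, and all documents and tokens are invented.

```python
# Toy illustration of document linking by cosine similarity over term
# vectors. LinkDI additionally projects vectors onto a low-rank latent
# space via SVD (the LSI step), omitted here for brevity.

import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

docs = {
    "d1": "service oriented computing for digital libraries",
    "d2": "servlce oriented computing in the cloud",  # OCR error: "servlce"
    "d3": "fracture behaviour of polyethylene resins",
}
vecs = {k: Counter(v.split()) for k, v in docs.items()}
print(round(cosine(vecs["d1"], vecs["d2"]), 2))  # 0.33 -- dragged down by the OCR error
print(round(cosine(vecs["d1"], vecs["d3"]), 2))  # 0.0  -- genuinely unrelated
```

Without the misrecognition, d1 and d2 would share three terms instead of two; LSI's latent space is what lets the degraded pair still be linked more strongly than the unrelated one.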

Relevance: 20.00%

Abstract:

This paper presents case studies in power systems using Sensitivity Analysis (SA) guided by Optimal Power Flow (OPF) problems in different operation scenarios. The case studies start from a known optimal solution obtained by OPF. This optimal solution is called the base case, and from it new operation points may be evaluated by SA when perturbations occur in the system. The SA is based on Fiacco's theorem and has the advantage of not being an iterative process. To show the good performance of the proposed technique, tests were carried out on the IEEE 14-, 118-, and 300-bus systems. (C) 2010 Elsevier Ltd. All rights reserved.
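The non-iterative idea behind Fiacco-style sensitivity analysis can be illustrated numerically: starting from a known optimum (the base case), the new operating point after a perturbation is estimated by solving a single linear system built from the KKT conditions, instead of re-running the optimization. The toy problem below (sharing demand d between two quadratic-cost generators) merely mimics the structure of an OPF and is not taken from the paper.

```python
# Toy sensitivity analysis: minimize 0.5*(x1^2 + x2^2) s.t. x1 + x2 = d.
# Base-case optimum: x1 = x2 = d/2, multiplier lam = -d/2. A perturbation
# of d is propagated through one linear solve on the KKT Jacobian.

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    A = [row[:] for row in A]; b = b[:]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]; b[i], b[p] = b[p], b[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * ai for a, ai in zip(A[r], A[i])]
            b[r] -= f * b[i]
    x = [0.0] * 3
    for i in range(2, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, 3))) / A[i][i]
    return x

d = 10.0                       # base-case demand
x1, x2, lam = d / 2, d / 2, -d / 2

# KKT Jacobian at the base case; right-hand side is d(KKT)/d(demand)
K = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0],
     [1.0, 1.0, 0.0]]
dx1, dx2, dlam = solve3(K, [0.0, 0.0, 1.0])

delta = 2.0                    # demand perturbation
print(x1 + dx1 * delta, x2 + dx2 * delta)  # 6.0 6.0 -- no re-optimization needed
```

Because this toy problem is quadratic, the sensitivity prediction is exact (the true optimum for d = 12 is indeed x1 = x2 = 6); for a nonlinear OPF it is a first-order estimate around the base case.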

Relevance: 20.00%

Abstract:

Product lifecycle management (PLM) innovates by defining both the product as a central element for aggregating enterprise information and the lifecycle as a new time dimension for information integration and analysis. Because of its potential to shorten innovation lead times and to reduce costs, PLM has attracted a lot of attention in industry and in research. However, the current stage of PLM implementation at most organisations still does not apply lifecycle-management concepts thoroughly. To close this realisation gap, this article presents a process-oriented framework to support effective PLM implementation. The central point of the framework is a set of lifecycle-oriented business-process reference models that link the necessary fundamental concepts, enterprise knowledge, and software solutions to deploy PLM effectively. (c) 2007 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

How does knowledge management (KM) by a government agency responsible for environmental impact assessment (EIA) potentially contribute to better environmental assessment and management practice? Staff members at government agencies in charge of the EIA process are knowledge workers who perform judgement-oriented tasks highly reliant on individual expertise, but also grounded in the agency's knowledge accumulated over the years. Part of an agency's knowledge can be codified and stored in an organizational memory, but it is subject to decay or loss if not properly managed. The EIA agency operating in Western Australia was used as a case study. Its KM initiatives were reviewed, knowledge repositories were identified, and staff were surveyed to gauge the utilisation and effectiveness of those repositories in enabling them to perform EIA tasks. Key elements of KM are the preparation of substantive guidance and spatial information management. It was found that the treatment of cumulative impacts on the environment is very limited and that information derived from project follow-up is not properly captured and stored, and thus not used to create new knowledge and to improve practice and effectiveness. Other opportunities for improving organizational learning include the use of after-action reviews. The lessons about knowledge management in EIA practice gained from the Western Australian experience should be of value to agencies worldwide seeking to understand where best to direct their resources for their own knowledge repositories and environmental management practice. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

High-density polyethylene resins have increasingly been used in the production of pipes for pressurized water- and gas-distribution systems and are expected to remain in service for many years, but they eventually fail prematurely by creep fracture. The usual standard methods for ranking resins by their resistance to fracture are expensive and impractical for quality-control purposes, justifying the search for alternative methods. The essential work of fracture (EWF) method provides a relatively simple procedure for characterizing the fracture behavior of ductile polymers such as polyethylene resins. In the present work, six resins were analyzed using the EWF methodology. The results show that the plastic work dissipation factor, βw_p, is the most reliable parameter for evaluating performance. Attention must be paid to specimen preparation, which might otherwise result in excessive dispersion in the results, especially for the specific essential work of fracture, w_e.
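The EWF data reduction behind the two parameters named above is a straight-line fit: the measured specific work of fracture w_f is modeled as w_f = w_e + (βw_p)·l in the ligament length l, so the intercept estimates the essential work w_e and the slope the plastic dissipation term βw_p. The data points in the sketch below are synthetic, chosen only to make the fit exact.

```python
# Worked sketch of the EWF fit: intercept -> essential work w_e,
# slope -> plastic work dissipation term beta*w_p. Synthetic data.

def linear_fit(xs, ys):
    """Ordinary least-squares fit y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return my - slope * mx, slope

ligament_mm = [5.0, 10.0, 15.0, 20.0]   # ligament lengths l
wf_kj_m2 = [60.0, 90.0, 120.0, 150.0]   # synthetic specific work of fracture
w_e, beta_wp = linear_fit(ligament_mm, wf_kj_m2)
print(w_e, beta_wp)  # 30.0 6.0 -> w_e = 30 kJ/m^2, beta*w_p = 6 kJ/m^2 per mm
```

The abstract's point about specimen preparation follows directly from this scheme: scatter in the w_f measurements propagates into the extrapolated intercept w_e much more strongly than into the slope.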

Relevance: 20.00%

Abstract:

View of the service centre during Expo 1988.