964 results for pacs: neural computing technologies


Relevance: 30.00%

Publisher:

Abstract:

At the Large Hadron Collider (LHC), more than 30 petabytes of collision data are collected in each year of data taking. Processing these data requires producing a large volume of simulated events through Monte Carlo techniques. In addition, physics analysis requires daily access to derived data formats for hundreds of users. The Worldwide LHC Computing Grid (WLCG) is an international collaboration of scientists and computing centres that has met the technological challenges of the LHC, making its scientific programme possible. As data taking continues, and with the recent approval of ambitious projects such as the High-Luminosity LHC, the limits of current computing capacity will soon be reached. One of the keys to overcoming these challenges in the coming decade, also in light of the budget constraints of the various national funding agencies, is to make efficient use of the available computing resources. This work aims to develop and evaluate tools that improve the understanding of how both production and analysis data are monitored in CMS. For this reason the work comprises two parts. The first, concerning distributed analysis, is the development of a tool that quickly analyses the log files produced by completed job submissions, so that on the next submission the user can make better use of the computing resources. The second part, concerning the monitoring of both production and analysis jobs, exploits Big Data technologies to provide a more efficient and flexible monitoring service. A noteworthy aspect of these improvements is the possibility of avoiding a high level of data aggregation at an early stage, and of collecting monitoring data at a high granularity that nevertheless allows later reprocessing and on-demand aggregation.
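A minimal sketch of the kind of log analysis the first tool performs, summarising per-site resource usage from finished-job logs so the user can tune the next submission. The log format, field names and paths are illustrative assumptions, not the actual CMS job log structure:

```python
import re
from pathlib import Path
from collections import defaultdict

# Hypothetical log line format:
# "site=T2_IT_Bari cpu_s=5321 wall_s=6100 peak_rss_mb=1850 status=OK"
LOG_PATTERN = re.compile(
    r"site=(?P<site>\S+)\s+cpu_s=(?P<cpu>\d+)\s+wall_s=(?P<wall>\d+)\s+"
    r"peak_rss_mb=(?P<rss>\d+)\s+status=(?P<status>\S+)"
)

def summarize_jobs(log_dir: str) -> None:
    """Aggregate per-site CPU efficiency, memory usage and failures from job logs."""
    per_site = defaultdict(lambda: {"jobs": 0, "cpu": 0, "wall": 0, "rss_max": 0, "failed": 0})
    for log_file in Path(log_dir).glob("*.log"):
        for line in log_file.read_text().splitlines():
            match = LOG_PATTERN.search(line)
            if not match:
                continue
            stats = per_site[match["site"]]
            stats["jobs"] += 1
            stats["cpu"] += int(match["cpu"])
            stats["wall"] += int(match["wall"])
            stats["rss_max"] = max(stats["rss_max"], int(match["rss"]))
            stats["failed"] += match["status"] != "OK"
    for site, s in sorted(per_site.items()):
        efficiency = s["cpu"] / s["wall"] if s["wall"] else 0.0
        print(f"{site}: {s['jobs']} jobs, CPU efficiency {efficiency:.0%}, "
              f"peak RSS {s['rss_max']} MB, {s['failed']} failures")

if __name__ == "__main__":
    summarize_jobs("finished_job_logs")  # hypothetical directory of completed-job logs
```

Keeping the raw per-job records, as in this sketch, is what allows the high-granularity, aggregate-on-demand monitoring described for the second part of the work.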

Relevance: 30.00%

Publisher:

Abstract:

Postprint

Relevance: 30.00%

Publisher:

Abstract:

Energy-efficient computing remains a critical challenge across the wide range of future data-processing engines — from ultra-low-power embedded systems to servers, mainframes, and supercomputers. In addition, the advent of cloud and mobile computing as well as the explosion of IoT technologies have created new research challenges in the already complex, multidimensional space of modern and future computer systems. These new research challenges led to the establishment of the IEEE Rebooting Computing Initiative, which specifically addresses novel low-power solutions and technologies as one of the main areas of concern. With this in mind, we thought it timely to survey the state of the art of energy-efficient computing.

Relevance: 30.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 30.00%

Publisher:

Abstract:

The Computing Division of the Business School at University College Worcester provides computing and information technology education to a range of undergraduate students. Topics include various approaches to programming, artificial intelligence, operating systems and digital technologies. Each of these has its own potentially conflicting requirements for a pedagogically sound programming environment. This paper describes an endeavor to develop a common programming paradigm across all topics. This involves the combined use of autonomous robots and Java simulations.

Relevance: 30.00%

Publisher:

Abstract:

This paper provides an overview of IDS types and how they work, as well as configuration considerations and issues that affect them. Advanced methods of increasing the performance of an IDS are explored, such as specification-based IDS for protecting Supervisory Control And Data Acquisition (SCADA) and cloud networks. By reviewing varied studies, ranging from configuration issues and specific problems to custom techniques and cutting-edge research, the paper also provides a reference for others interested in learning about and developing IDS solutions. Intrusion detection is an area in which much further study is required in order to provide solutions that keep pace with evolving services and the networks and systems that support them. This paper aims to be a reference on IDS technologies for other researchers and developers interested in the field of intrusion detection.
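A minimal sketch of the specification-based approach mentioned above, applied to SCADA traffic: each message is compared against an explicit whitelist of permitted operations, and anything outside the specification is flagged. The protocol fields, rules and thresholds are illustrative assumptions, not taken from the surveyed studies:

```python
from dataclasses import dataclass
from collections import defaultdict
import time

@dataclass
class ModbusMessage:
    master: str          # IP address of the requesting master
    slave_id: int        # target slave unit
    function_code: int   # e.g. 3 = read holding registers, 6 = write single register
    register: int

# Specification (hypothetical): which operations each master/slave pair may perform.
SPEC = {
    ("10.0.0.5", 1): {"functions": {3, 4}, "registers": range(0, 100), "max_per_min": 60},
}

_request_times = defaultdict(list)

def check(msg: ModbusMessage) -> list[str]:
    """Return a list of alerts; an empty list means the message conforms to the spec."""
    alerts = []
    rule = SPEC.get((msg.master, msg.slave_id))
    if rule is None:
        return [f"unknown master/slave pair {msg.master}->{msg.slave_id}"]
    if msg.function_code not in rule["functions"]:
        alerts.append(f"function code {msg.function_code} not permitted for {msg.master}")
    if msg.register not in rule["registers"]:
        alerts.append(f"register {msg.register} outside permitted range")
    now = time.time()
    window = _request_times[(msg.master, msg.slave_id)]
    window.append(now)
    window[:] = [t for t in window if now - t < 60]  # keep a one-minute sliding window
    if len(window) > rule["max_per_min"]:
        alerts.append("request rate exceeds specified maximum")
    return alerts

# An out-of-spec write to a register outside the allowed range triggers two alerts.
print(check(ModbusMessage("10.0.0.5", 1, function_code=6, register=150)))
```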

Relevance: 30.00%

Publisher:

Abstract:

This is a long-term study of the use of information and communication technologies by 30 older adults (ages 70–97) living in a large retirement community. The study spanned the years of 1996 to 2008, during which time the research participants grappled with the challenges of computer use while aging 12 years. The researcher, herself a ‘mature learner,’ used a qualitative research design which included observations and open-ended interviews. Using a strategy of “intermittent immersion,” she spent an average of two weeks per visit on site and participated in the lives of the research population in numerous ways, including service as their computer tutor. With e-mail and telephone contact, she was able to continue her interactions with participants throughout the 12-year period. A long-term perspective afforded the view of the evolution, devolution or cessation of the technology use by these older adults, and this process is chronicled in detail through five individual “profiles.” Three research questions dominated the inquiry: What function do computers serve in the lives of older adults? Does computer use foster or interfere with social ties? Is social support necessary for success in the face of challenging learning tasks? In answer to the first question, it became clear that computers were valued as a symbol of competence and intelligence. Some individuals brought their computers with them when transferred to the single-room residences of assisted living or nursing care facilities. Even when use had ceased, their computers were displayed to signal that their owners were or had once been keeping up to date. In answer to the second question, computer owners socialized around computing use (with in-person family members or friends) more than, or as much as, they socialized through their computers in the digital realm of the Internet. And in answer to the third question, while the existence of social support did facilitate computer exploration, more important was the social support network generated and developed among fellow computer users.

Relevance: 30.00%

Publisher:

Abstract:

Dissertation (Master's)—Universidade de Brasília, Faculdade Gama, Graduate Program in Biomedical Engineering, 2015.

Relevance: 30.00%

Publisher:

Abstract:

Virtual screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. Most VS methods assume a single binding site on the target, but it has been demonstrated that diverse ligands interact with unrelated parts of the target, a relevant fact that many VS methods do not take into account. This problem is circumvented by a novel VS methodology named BINDSURF, which scans the whole protein surface in order to find new hotspots where ligands might potentially bind, and which is implemented on last-generation massively parallel GPU hardware, allowing fast processing of large ligand databases. BINDSURF can thus be used in drug discovery, drug design and drug repurposing, and therefore helps considerably in clinical research. However, the accuracy of most VS methods, and of BINDSURF in particular, is constrained by limitations in the scoring function that describes biomolecular interactions, and even nowadays these uncertainties are not completely understood. In order to improve the accuracy of the scoring functions used in BINDSURF, we propose a novel hybrid approach in which neural network (NNET) and support vector machine (SVM) methods are trained on databases of known active (drug) and inactive compounds, and this information is then exploited to improve BINDSURF VS predictions.
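A minimal sketch of the hybrid NNET + SVM rescoring idea, using scikit-learn in place of the unspecified implementations. The descriptors, the synthetic dataset and the way the two models are combined are illustrative assumptions; the abstract does not detail them:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# X: per-compound interaction descriptors (e.g. docking-derived features),
# y: 1 = known active (drug), 0 = known inactive. Synthetic data for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 32))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train the two models on the known actives/inactives.
nnet = make_pipeline(StandardScaler(),
                     MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0))
svm = make_pipeline(StandardScaler(), SVC(probability=True, random_state=0))
nnet.fit(X_train, y_train)
svm.fit(X_train, y_train)

# Hybrid score: average the two activity probabilities and use it to
# re-rank candidate ligands produced by the surface-scanning docking stage.
hybrid_score = 0.5 * (nnet.predict_proba(X_test)[:, 1] + svm.predict_proba(X_test)[:, 1])
print("mean hybrid score for true actives:", hybrid_score[y_test == 1].mean())
```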

Relevance: 30.00%

Publisher:

Abstract:

Virtual Screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. However, the accuracy of most VS methods is constrained by limitations in the scoring function that describes biomolecular interactions, and even nowadays these uncertainties are not completely understood. In order to improve the accuracy of the scoring functions used in most VS methods, we propose a novel hybrid approach in which neural network (NNET) and support vector machine (SVM) methods are trained on databases of known active (drug) and inactive compounds, and this information is then exploited to improve VS predictions.

Relevance: 30.00%

Publisher:

Abstract:

Current trends in broadband mobile networks point towards placing different capabilities at the edge of the mobile network in a centralised way. On one hand, the split of the eNB between baseband processing units and remote radio heads makes it possible to process some of the protocols in centralised premises, likely with virtualised resources. On the other hand, mobile edge computing makes use of processing and storage capabilities close to the air interface in order to deploy optimised services with minimum delay. The confluence of both trends is a hot topic in the definition of future 5G networks. The full centralisation of both technologies in cloud data centres imposes stringent requirements on the fronthaul connections in terms of throughput and latency. Therefore, cells with limited network access would not be able to offer these types of services. This paper proposes a solution for these cases, based on placing processing and storage capabilities close to the remote units, which is especially well suited for the deployment of clusters of small cells. The proposed cloud-enabled small cells include a highly efficient microserver with a limited set of virtualised resources offered to the cluster of small cells. As a result, a light data centre is created and shared for deploying centralised eNB and mobile edge computing functionalities. The paper covers the proposed architecture, with special focus on the integration of both aspects, and possible scenarios of application.
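A minimal sketch of the placement reasoning implied above: a function stays in the central cloud only if the fronthaul link satisfies its latency and throughput requirements, and otherwise is deployed on the local microserver of the cloud-enabled small-cell cluster. All names and numbers are illustrative assumptions, not values from the paper:

```python
from dataclasses import dataclass

@dataclass
class Fronthaul:
    latency_ms: float       # round-trip latency towards the central cloud
    throughput_mbps: float  # usable capacity towards the central cloud

@dataclass
class Function:
    name: str
    max_latency_ms: float
    required_mbps: float

def place(function: Function, fronthaul: Fronthaul) -> str:
    """Choose between the central cloud data centre and the local edge microserver."""
    if (fronthaul.latency_ms <= function.max_latency_ms
            and fronthaul.throughput_mbps >= function.required_mbps):
        return "central cloud data centre"
    return "local microserver (cloud-enabled small cell)"

# A cluster of small cells with limited fronthaul capacity (hypothetical values).
link = Fronthaul(latency_ms=8.0, throughput_mbps=200.0)
functions = [
    Function("centralised baseband (eNB split) processing", max_latency_ms=0.5, required_mbps=2500.0),
    Function("low-latency mobile edge computing service", max_latency_ms=2.0, required_mbps=50.0),
]
for fn in functions:
    print(f"{fn.name}: {place(fn, link)}")
```

With a constrained fronthaul both functions end up on the local microserver, which is the scenario the cloud-enabled small-cell architecture is designed to serve.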