937 results for "Distributed non-coherent shared memory"


Relevance: 30.00%

Abstract:

Power system policies are broadly on track to expand the use of renewable energy resources in electric power generation. Integrating dispersed generation into the utility network not only amplifies the benefits of renewable generation but also brings further advantages such as power quality enhancement and freedom of power generation for consumers. However, the issues arising from the integration of distributed generators into the existing utility grid are as significant as its benefits, and they are aggravated as the number of grid-connected distributed generators increases. Power quality demands therefore become stricter to ensure a safe and orderly advance towards the emerging smart grid. In this regard, system protection is the area most affected as the share of grid-connected distributed generation in electricity generation increases. Among all protection issues, islanding detection is the most important concern for a power system with high penetration of distributed sources. Islanding occurs when a portion of the distribution network that includes one or more distributed generation units and local loads is disconnected from the rest of the grid; once formed, the power island remains energized by the distributed sources within it. This thesis introduces a new islanding detection technique based on an enhanced multi-layer scheme that outperforms existing techniques. It provides improved solutions for the safety and protection of power systems and of distributed sources capable of operating in grid-connected mode. The proposed active method offers a negligible non-detection zone and is applicable to micro-grids with several distributed generation sources without sacrificing the dynamic response of the system. In addition, the information obtained from the proposed scheme allows a smooth transition to stand-alone operation when required. The technique paves the way towards a comprehensive protection solution for future power networks. The proposed method is converter-resident, so any power conversion system based on power electronics converters can benefit from it. The theoretical analysis is presented, and extensive simulation results confirm the validity of the analytical work.
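The abstract stays at a high level, so the thesis's active, converter-resident multi-layer scheme cannot be reproduced here; the sketch below only illustrates the general detection idea with a minimal passive check, assuming a 50 Hz system and hypothetical frequency-window and ROCOF thresholds.

```python
import numpy as np

def islanding_flag(freq, dt, f_nom=50.0, band=0.5, rocof_max=1.0):
    """Trip if frequency leaves [f_nom - band, f_nom + band] or if its
    rate of change (ROCOF) exceeds rocof_max. freq in Hz, dt in s."""
    freq = np.asarray(freq)
    rocof = np.abs(np.diff(freq)) / dt
    out_of_band = np.abs(freq - f_nom) > band
    return bool(out_of_band.any() or (rocof > rocof_max).any())

# Example: after a (simulated) grid disconnection at t = 0.5 s the local
# frequency starts to wander away from nominal.
t = np.arange(0.0, 1.0, 0.02)
freq = 50.0 + np.where(t > 0.5, 6.0 * (t - 0.5), 0.0)
print(islanding_flag(freq, dt=0.02))  # True: island detected
```

Passive checks like this one suffer a non-detection zone when local generation closely matches the local load, which is precisely what active methods such as the one proposed in the thesis are designed to avoid.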

Relevance: 30.00%

Abstract:

Recent developments have led researchers to reconsider Lagrangian measurement techniques as an alternative to their Eulerian counterparts when investigating non-stationary flows. This thesis advances the state of the art of Lagrangian measurement techniques by pursuing three objectives: (i) developing new Lagrangian measurement techniques for difficult-to-measure, in situ flow environments; (ii) developing new post-processing strategies designed for unstructured Lagrangian data, together with guidelines for their use; and (iii) demonstrating the advantages the Lagrangian framework has over its Eulerian counterpart in various non-stationary flow problems. Towards the first objective, a large-scale particle tracking velocimetry apparatus is designed for atmospheric surface layer measurements. Towards the second objective, two techniques are developed: one for identifying Lagrangian Coherent Structures (LCS) and one for characterizing entrainment directly from unstructured Lagrangian data. Finally, towards the third objective, the advantages of Lagrangian-based measurements are showcased in two unsteady flow problems: the atmospheric surface layer, and entrainment in a non-stationary turbulent flow. By developing new experimental and post-processing strategies for Lagrangian data, and by showcasing the advantages of such data in various non-stationary flows, the thesis helps investigators adopt Lagrangian-based measurement techniques more easily.
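The abstract does not name the LCS algorithm used, but a common route from unstructured Lagrangian data to LCS candidates is the finite-time Lyapunov exponent (FTLE), obtained by fitting a local linear flow map over each particle's nearest neighbours. The following sketch assumes that approach; the function name and the neighbour count k are illustrative choices, not the thesis's method.

```python
import numpy as np
from scipy.spatial import cKDTree

def ftle_unstructured(x0, x1, T, k=8):
    """FTLE from scattered tracks: x0, x1 are (N, 2) particle positions
    at times t and t + T; a local flow-map gradient is least-squares
    fitted over each particle's k nearest neighbours."""
    tree = cKDTree(x0)
    ftle = np.empty(len(x0))
    for i, p in enumerate(x0):
        _, idx = tree.query(p, k=k + 1)      # query returns self + k others
        dx0 = x0[idx[1:]] - x0[idx[0]]       # initial separations
        dx1 = x1[idx[1:]] - x1[idx[0]]       # final separations
        A, *_ = np.linalg.lstsq(dx0, dx1, rcond=None)
        F = A.T                              # deformation-gradient estimate
        C = F.T @ F                          # Cauchy-Green strain tensor
        ftle[i] = np.log(np.linalg.eigvalsh(C)[-1]) / (2.0 * abs(T))
    return ftle
```

Ridges of the resulting FTLE field are then commonly taken as LCS candidates.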

Relevance: 30.00%

Abstract:

LysR-type transcriptional regulators (LTTRs) are emerging as key circuit components in regulating microbial stress responses and are implicated in modulating oxidative stress in the human opportunistic pathogen Pseudomonas aeruginosa. The oxidative stress response encapsulates several strategies to overcome the deleterious effects of reactive oxygen species. However, many of the regulatory components and associated molecular mechanisms underpinning this key adaptive response remain to be characterised. Comparative analysis of publicly available transcriptomic datasets led to the identification of a novel LTTR, PA2206, whose expression was altered in response to a range of host signals in addition to oxidative stress. PA2206 was found to be required for tolerance to H2O2 in vitro and for lethality in vivo in the zebrafish embryo model of infection. Transcriptomic analysis in the presence of H2O2 showed that PA2206 altered the expression of 58 genes, including a large repertoire of oxidative-stress- and iron-responsive genes, independently of the master regulator of oxidative stress, OxyR. Contrary to the classic mechanism of LysR regulation, PA2206 did not autoregulate its own expression and did not influence the expression of adjacent or divergently transcribed genes. The PA2214-15 operon was identified as a direct target of PA2206, with truncated promoter fragments revealing binding to the 5'-ATTGCCTGGGGTTAT-3' LysR box adjacent to the predicted -35 region. PA2206 also interacted with the pvdS promoter, suggesting a global dimension to the PA2206 regulon and indicating that PA2206 is an important regulatory component of P. aeruginosa adaptation during oxidative stress.
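As a small, hypothetical illustration of the promoter analysis described, a scan for approximate matches to the reported 5'-ATTGCCTGGGGTTAT-3' LysR box might look as follows in Python; the mismatch tolerance and the toy input sequence are assumptions, not data from the study.

```python
def find_lysr_box(seq, motif="ATTGCCTGGGGTTAT", max_mismatch=2):
    """Return (position, site, mismatches) for every window of seq
    within max_mismatch substitutions of the LysR-box motif."""
    seq = seq.upper()
    hits = []
    for i in range(len(seq) - len(motif) + 1):
        window = seq[i:i + len(motif)]
        mm = sum(a != b for a, b in zip(window, motif))
        if mm <= max_mismatch:
            hits.append((i, window, mm))
    return hits

# Toy promoter fragment containing one exact site:
print(find_lysr_box("GGCGATTGCCTGGGGTTATCCAA"))
```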

Relevance: 30.00%

Abstract:

Background: Falls are common events in older people and cause considerable morbidity and mortality. Non-pharmacological interventions are an important approach to preventing falls, and a large number of systematic reviews of such interventions exist whose evidence needs to be synthesized to facilitate evidence-based clinical decision-making. Objectives: To systematically examine reviews and meta-analyses that evaluated non-pharmacological interventions to prevent falls in older adults in the community, care facilities and hospitals. Methods: We searched the electronic databases PubMed, the Cochrane Database of Systematic Reviews, EMBASE, CINAHL, PsycINFO, PEDro and TRIP from January 2009 to March 2015 for systematic reviews that included at least one comparative study evaluating any non-pharmacological intervention to prevent falls amongst older adults. The quality of the reviews was assessed using AMSTAR, and the ProFaNE taxonomy was used to organize the interventions. Results: Fifty-nine systematic reviews were identified, covering single, multiple and multifactorial non-pharmacological interventions to prevent falls in older people. The most frequent ProFaNE-defined interventions were exercises, either alone or combined with other interventions, followed by environment/assistive-technology interventions comprising environmental modifications, assistive and protective aids, staff education and vision assessment/correction. Knowledge, in the form of patient education, was the third principal class of interventions. Exercise and multifactorial interventions were the most effective treatments for reducing falls in older adults, although not all types of exercise were equally effective in all subjects and all settings; effective exercise programs combined balance and strength training. Reviews with a higher AMSTAR score were more likely to contain more primary studies, to be updated and to perform meta-analysis. Conclusions: This overview of reviews of non-pharmacological interventions to prevent falls in older people in different settings supports clinicians and other healthcare workers in clinical decision-making by providing a comprehensive perspective of the findings.

Relevance: 30.00%

Abstract:

The increasing emphasis on aid effectiveness, accountability and impact measurement in international development and humanitarian work has generated a requirement for high-quality internal systems for the management of programmes. To help address this requirement, Trócaire adopted Results-Based Management (RBM) in the 20 countries in which it works. This paper provides an overview of Trócaire's RBM journey, including the process of embedding the new approach in the organisation, the lessons learnt from this process, the benefits subsequently emerging at field programme level, and the challenges going forward.

Relevance: 30.00%

Abstract:

The radical nature of cultural change has provoked a crisis of the forma ecclesiae and raises the question of which form of the Church is best suited to proclaiming the gospel. The Italian Church, which this work takes as its particular point of reference, is equally affected: unlike in the past, even the recent past, the Christian faith is no longer a heritage "of all". The question arises because, in the name of the gospel, the Church's action nevertheless cannot squander the universal character of the faith itself (its being "for all"). Within this scenario lies the aim this thesis pursues on the level of theological-pastoral thought: to accompany the Church as it stands within the tension between the theological claim of a faith that is "for all" and the sociological datum showing that it is no longer "of all". Many contemporary projects of pastoral reform intend to confront the transformations of culture in order to prevent any unjustified domestication. However, as this thesis tries to show, they often risk suggesting a rupture with the recent past of the ecclesial body. Their polemical reference is the figure of "popular Catholicism" through which, in the context of "parish civilisation", the Christian experience succeeded in taking root in the social fabric. In these projects it is, in effect, reduced to a kind of "mass Catholicism", based only on religious processes of socialisation and on the uniformisation of experience. By contrast, the aim of this work is an attempt at a renewed understanding of this figure of Christian life. It is considered not only in the particular historical form it adopted in the recent past, marked by a situation of cultural homogeneity and a Church of the people, but also as an operative principle designating Christianity's capacity to realise itself as the elevation and transfiguration of basic anthropological forms. This dynamic perspective makes it possible to find in "popular Catholicism" an ecclesio-genetic principle that exalts the interaction between ecclesial initiative and the sensibility of believers and that, while defending the theologal quality of Christian experience, does not despise the pedagogical value of its religious rootedness. The dynamic at work in "popular Catholicism", tested against a field study, leads to the identification of certain provocations concerning the structure of the ecclesial body, regarding the representations, actions, subjects and boundaries that characterise it. These are transposed so as to envisage a reform of the Church that proves workable for the present and that seeks to preserve the universal, non-formal character of the faith, that is, its being "for all".

Relevance: 30.00%

Abstract:

Min/max autocorrelation factor analysis (MAFA) and dynamic factor analysis (DFA) are complementary techniques for analysing short (> 15-25 y), non-stationary, multivariate data sets. We illustrate the two techniques using catch-rate (cpue) time series (1982-2001) for 17 species caught during trawl surveys off Mauritania, with the NAO index, an upwelling index (UPW), sea surface temperature (SST), and an index of fishing effort as explanatory variables. Both techniques gave coherent results, the most important common trend being a decrease in cpue during the latter half of the time series, and the next most important an increase during the first half. A DFA model with SST and UPW as explanatory variables and two common trends gave good fits to most of the cpue time series. © 2004 International Council for the Exploration of the Sea. Published by Elsevier Ltd. All rights reserved.
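To make the DFA setup concrete (this is not the authors' code or data), a model with two common trends and explanatory variables can be sketched with statsmodels; the random numbers below merely stand in for the 17 standardized cpue series and the SST and UPW covariates.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

rng = np.random.default_rng(0)
years = pd.period_range("1982", "2001", freq="Y")
cpue = pd.DataFrame(rng.normal(size=(len(years), 17)), index=years)
exog = pd.DataFrame({"SST": rng.normal(size=len(years)),
                     "UPW": rng.normal(size=len(years))}, index=years)

cpue = (cpue - cpue.mean()) / cpue.std()   # standardize before DFA

# Two common trends plus explanatory variables, as in the paper
model = DynamicFactor(cpue, k_factors=2, factor_order=1, exog=exog)
result = model.fit(disp=False)
print(result.factors.filtered.shape)       # the estimated common trends
```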

Relevance: 30.00%

Abstract:

Children with trisomy 21 have a set of specific physical, neurological and neuropsychological characteristics, which have been investigated in depth in different countries, leading to the development of assessment protocols for these children according to their nationality (García, 2010). Although Colombia is one of the countries in which Down syndrome occurs most frequently, to date there are no studies focusing on the neuropsychological abilities of this specific population, and consequently no suitable assessment protocols have been developed for children with this syndrome. This study was carried out with a population of 88 children, to whom the Battelle Developmental Inventory was administered; it found that children with Down syndrome aged 5 to 12 obtain scores four standard deviations below the typical mean. This reveals a population-specific developmental pattern in which the most significant difficulties appear in the areas of cognition and expressive communication. Across the age intervals, performance in the assessed areas was found to decrease, which may be related to the greater complexity of the developmental milestones expected at each age. Since expected developmental milestones vary across the periods of the human life cycle and tend to become more complex at later developmental stages, and since these children have difficulties in executive functions and cognition, they do not manage to reach those milestones.
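For orientation, assuming the usual developmental-quotient scaling of mean 100 and SD 15 (an assumption about the metric used here), a result four standard deviations below the typical mean corresponds to a standard score of about 40:

```python
mean, sd, z = 100, 15, -4   # assumed scale; z from the study's finding
print(mean + z * sd)        # -> 40
```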

Relevance: 30.00%

Abstract:

In this article we use an autoregressive fractionally integrated moving average (ARFIMA) approach to measure the degree of fractional integration of aggregate world CO2 emissions and its five components: coal, oil, gas, cement, and gas flaring. We find that all variables are stationary and mean-reverting, but exhibit long-term memory. Our results suggest that coal and oil combustion emissions have the weakest degree of long-range dependence, while emissions from gas and gas flaring have the strongest. Given this evidence of long memory, we conclude that transitory policy shocks are likely to have long-lasting, but not permanent, effects; permanent effects on CO2 emissions therefore require a more permanent policy stance. In this context, had one relied only on testing for stationarity versus non-stationarity, one would likely have concluded in favour of non-stationarity, and therefore that even transitory policy shocks have permanent effects.
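The estimator is not specified in the abstract; one standard, numpy-only way to gauge the fractional integration order d (the d in the ARFIMA filter (1 - L)^d) is the Geweke-Porter-Hudak (GPH) log-periodogram regression, sketched below together with the binomial weights that define fractional differencing. The bandwidth m = sqrt(n) is a conventional assumption.

```python
import numpy as np

def frac_diff_weights(d, n):
    """First n binomial weights of the fractional filter (1 - L)**d."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def gph_estimate(x, m=None):
    """GPH estimate of d: regress the log periodogram on
    log(4 sin^2(freq / 2)) over the first m Fourier frequencies."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    m = int(np.sqrt(n)) if m is None else m
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    periodogram = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)
    regressor = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return -slope                # d is minus the regression slope

rng = np.random.default_rng(0)
print(gph_estimate(rng.normal(size=2000)))   # ~0 for white noise
```

Estimates with 0 < d < 0.5 correspond to the stationary, mean-reverting, long-memory behaviour reported for the emission series.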

Relevance: 30.00%

Abstract:

Several decision and control tasks in cyber-physical networks can be formulated as large-scale optimization problems with coupling constraints. In these "constraint-coupled" problems, each agent is associated with a local decision variable subject to individual constraints. This thesis explores the use of primal decomposition techniques to develop tailored distributed algorithms for this challenging set-up over graphs. We first develop a distributed scheme for convex problems over random time-varying graphs with non-uniform edge probabilities, and then extend the approach to unknown cost functions estimated online. Subsequently, we consider Mixed-Integer Linear Programs (MILPs), which are of great interest in smart grid control and cooperative robotics. We propose a distributed methodological framework to compute a feasible solution to the original MILP with guaranteed suboptimality bounds, and extend it to general nonconvex problems. Monte Carlo simulations highlight that the approach represents a substantial advance over the state of the art, making it a valuable solution for new toolboxes addressing large-scale MILPs. We then propose a distributed Benders decomposition algorithm for asynchronous unreliable networks. This framework is used as the starting point for distributed methodologies in a microgrid optimal control scenario: we develop an ad-hoc distributed strategy for a stochastic set-up with renewable energy sources and show a case study with samples generated using Generative Adversarial Networks (GANs). We then introduce a software toolbox named ChoiRbot, based on the novel Robot Operating System 2, and show how it facilitates simulations and experiments in distributed multi-robot scenarios. Finally, we consider a Pickup-and-Delivery Vehicle Routing Problem for which we design a distributed method inspired by the approach developed for general MILPs, and demonstrate its efficacy through simulations and experiments in ChoiRbot with ground and aerial robots.
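To make the constraint-coupled structure and the primal decomposition idea concrete, here is a toy, centralised sketch (not one of the thesis's distributed algorithms): a master allocates shares y_i of a common budget, each agent solves its local problem for its own x_i, and the local multipliers drive a projected subgradient update of the allocation. The targets and budget are hypothetical.

```python
import numpy as np

# Toy problem: min sum_i (x_i - t_i)^2  s.t.  x_i >= 0,  sum_i x_i <= b.
t = np.array([3.0, 1.0, 2.0])        # hypothetical local targets
b = 4.0                              # shared budget (coupling constraint)
y = np.full(len(t), b / len(t))      # master's initial equal allocation

for it in range(500):
    x = np.clip(t, 0.0, y)           # agents' closed-form local solutions
    mu = 2.0 * np.maximum(t - y, 0)  # multipliers of the local x_i <= y_i
    step = 0.1 / np.sqrt(it + 1)
    y = y + step * mu                # shift budget toward constrained agents
    y += (b - y.sum()) / len(y)      # re-project onto sum(y) = b

print("allocation:", np.round(y, 3))   # approx [2.333 0.333 1.333]
print("solution:  ", np.round(np.clip(t, 0.0, y), 3))
```

In the distributed setting the thesis targets, this allocation update is itself carried out by the agents over a communication graph rather than by a central master.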

Relevance: 30.00%

Abstract:

Reinforcement learning has now proved highly effective for machine learning in a variety of fields, such as games, speech recognition and many others. We therefore decided to apply reinforcement learning to allocation problems, both because they are a research area not yet studied with this technique and because their formulation encompasses a broad set of sub-problems with similar characteristics, so that a solution for one of them extends to each of these sub-problems. In this project we built an application called Service Broker which, through reinforcement learning, learns how to distribute the execution of tasks across asynchronous, distributed workers. The analogy is that of a cloud data center, which owns internal resources, possibly distributed across the server farm, receives tasks from its clients and executes them on those resources. The goal of the application, and hence of the data center, is to allocate these tasks so as to minimize execution cost. In addition, to test the reinforcement learning agents we developed, an environment, a simulator, was created so that we could concentrate on developing the components the agents need, rather than also having to deal with the implementation details of a real data center, such as communication with the various nodes and the associated latency. The results obtained confirmed the theory studied, achieving better performance than some of the classical task-allocation methods.
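The Service Broker's internals are not given here, so the following toy sketch only conveys the flavour of such a learning loop, reduced to a bandit-style problem: the state is the incoming task type, the action is the worker it is assigned to, and the learning signal is the observed execution cost. All sizes and cost figures are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_workers, n_task_types = 3, 4
# Hidden ground truth: cost[w, t] of running a type-t task on worker w
cost = rng.uniform(1.0, 10.0, size=(n_workers, n_task_types))

Q = np.zeros((n_task_types, n_workers))  # value estimates per (task, worker)
alpha, eps = 0.1, 0.1                    # learning rate, exploration rate

for _ in range(5000):
    task = rng.integers(n_task_types)            # a task arrives
    if rng.random() < eps:
        worker = rng.integers(n_workers)         # explore
    else:
        worker = int(np.argmin(Q[task]))         # exploit cheapest estimate
    observed = cost[worker, task] + rng.normal(0.0, 0.5)  # noisy cost
    Q[task, worker] += alpha * (observed - Q[task, worker])

print("learned assignment:", Q.argmin(axis=1))   # per task type
print("optimal assignment:", cost.argmin(axis=0))
```

With enough exploration the learned per-type assignment matches the hidden cheapest worker, which is the kind of behaviour the simulator environment described above is meant to verify.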

Relevance: 30.00%

Abstract:

The Museum-Monument to the Political and Racial Deportee in the Castello dei Pio in Carpi (Modena), a short distance from the Fossoli national camp of racial and political deportation, is the result of a national public competition announced in 1963, the fruit of civic commitment among institutions, associations and intellectuals. Among them was the BBPR group, which, in collaboration with the artist Renato Guttuso, won the competition. The winning project, despite some changes made during construction, retained its anti-rhetorical approach, using a rigorous, abstract language. Starting from the characteristics that make this work unique of its kind, the main objective of this doctoral research is to provide a genealogy of the BBPR Museum-Monument to the Political and Racial Deportee, reconstructing the Italian cultural and political context from the end of the Second World War (1945) to the year of the museum's inauguration (1973). This methodological approach constitutes the novel aspect of the research: a hitherto unexplored point of view from which to look at the Museum-Monument, thereby setting it apart from the most recent publications on the subject, which concentrate above all on the design logic of the Museum. In conclusion, the aim of this thesis is to offer a new interpretative key to the Museum, one that not only enriches our knowledge of it but also attests to the existence of an equally unique and unrepeatable identity of the memory of deportation within the Italian architectural culture under examination, the fruit of a "shared culture" among architects, artists, writers, politicians and intellectuals united by the tragic events this work seeks to narrate.

Relevance: 30.00%

Abstract:

Analog In-Memory Computing (AIMC) has been proposed in the context of beyond-von-Neumann architectures as a valid strategy to reduce the energy consumption and latency of internal data transfers and to improve compute efficiency. The aim of AIMC is to perform computations within the memory unit, typically by leveraging the physical features of memory devices. Among resistive Non-Volatile Memories (NVMs), Phase-Change Memory (PCM) has become a promising technology due to its intrinsic capability to store multilevel data, and it is currently being investigated to broaden the possibilities and applications of AIMC. This thesis explores the potential of new PCM-based architectures as in-memory computational accelerators. As a first step, a preliminary experimental characterization of PCM devices was carried out from an AIMC perspective. PCM cell non-idealities, such as time drift, noise, and non-linearity, were studied to develop a dedicated multilevel programming algorithm. Measurement-based simulations were then employed to evaluate the feasibility of PCM-based operations in the fields of Deep Neural Networks (DNNs) and Structural Health Monitoring (SHM). Moreover, a first test chip was designed and tested to evaluate the hardware implementation of Multiply-and-Accumulate (MAC) operations employing PCM cells. This prototype experimentally demonstrates that 95% MAC accuracy can be reached with circuit-level compensation of the cells' time drift and non-linearity. Finally, empirical circuit-behavior models were included in simulations to assess the use of this technology in specific DNN applications and to enhance the potential of this innovative computation approach.
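As a rough numerical illustration of the non-idealities mentioned (not the test chip's actual compensation circuitry), PCM conductance drift is commonly modelled as a power law G(t) = G(t0)(t/t0)^(-nu); the sketch below applies drift and read noise to an analog MAC and removes the drift with a single global scale factor. The drift exponent, noise level, and vector sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
nu = 0.05                  # assumed power-law drift exponent
t0, t = 1.0, 3600.0        # programming time and read time [s]

w = rng.uniform(-1.0, 1.0, size=16)   # one row of weights
x = rng.uniform(0.0, 1.0, size=16)    # input activations

# Differential mapping of signed weights onto two conductances
g_pos, g_neg = np.maximum(w, 0.0), np.maximum(-w, 0.0)

drift = (t / t0) ** (-nu)                          # conductance decay
noise = 1.0 + rng.normal(0.0, 0.02, size=w.shape)  # read noise
g_read = (g_pos - g_neg) * drift * noise           # non-ideal stored weights

mac_ideal = x @ w                # target multiply-and-accumulate result
mac_raw = x @ g_read             # drifted, noisy analog MAC
mac_comp = mac_raw / drift       # global drift compensation

print(f"ideal {mac_ideal:.3f}  raw {mac_raw:.3f}  compensated {mac_comp:.3f}")
```

A single scale factor of this kind corrects the common drift term; the residual error comes from noise and per-cell non-linearity, which is why the prototype also applies circuit-level non-linearity compensation.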

Relevance: 30.00%

Abstract:

The advent of Bitcoin suggested a disintermediated economy in which Internet users can take part directly. The conceptual disruption brought about by this Internet of Money (IoM) mirrors the cross-industry impact of blockchain and distributed ledger technologies (DLTs). While related instances of non-centralisation thwart regulatory efforts to establish accountability, in the financial domain further challenges arise from the presence in the IoM of two seemingly opposing traits: anonymity and transparency. Indeed, DLTs are often described as architecturally transparent, but the perceived anonymity of cryptocurrency transfers fuels fears of illicit exploitation. This is a primary concern for the framework to prevent money laundering and the financing of terrorism and proliferation (AML/CFT/CPF), and a top priority both globally and at the EU level. Nevertheless, the anonymous and transparent features of the IoM are far from clear-cut, and the same is true of its levels of disintermediation and non-centralisation. Almost fifteen years after the first Bitcoin transaction, the IoM today comprises a diverse set of socio-technical ecosystems. Building on an analysis of their phenomenology, this dissertation shows that there is more to their traits of anonymity and transparency than meets the eye, and that these features range across a spectrum of combinations and degrees. In this context, trade-offs can be evaluated against techno-legal benchmarks established through socio-technical assessments grounded in teleological interpretation. Against this backdrop, this work provides framework-level recommendations for the EU to respond to the twofold nature of the IoM legitimately and effectively. The methodology embraces the mutual interaction between regulation and technology, drafting regulation whose compliance can be eased by design. This approach mitigates the risk of overfitting in a fast-changing environment, while acknowledging specificities in compliance with the risk-based approach that sits at the core of the AML/CFT/CPF regime.

Relevance: 30.00%

Abstract:

The application of modern ICT is radically changing many fields, pushing toward more open and dynamic value chains that foster the cooperation and integration of many connected partners, sensors, and devices. A notable example is the emerging Smart Tourism field, which applies ICT to tourism to create richer and more integrated experiences that are more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of these sources exacerbate the complexity of developing integration solutions, with consequent high effort and costs for the partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a serverless platform allowing fast prototyping of business logic, lowering the barrier of entry and development costs for newcomers, (zero) fine-grained scaling of the resources servicing end users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communications between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments. In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities; ii) the creation of end-to-end differentiated QoS slices minimizing interference among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; and iv) a decentralized approach for the verification of access rights to resources.
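APERTO FaaS's API is not detailed in this abstract, so the following Python sketch is only a hypothetical illustration of the function-composition mechanism described: simple, ready-to-use functions are chained into a workflow and served to many users asynchronously.

```python
import asyncio

async def fetch_weather(ctx):
    ctx["weather"] = "sunny"                  # stand-in for a data-source call
    return ctx

async def rank_attractions(ctx):
    ctx["plan"] = ["museum"] if ctx["weather"] == "rainy" else ["park"]
    return ctx

def compose(*fns):
    """Chain simple functions into one workflow, FaaS-composition style."""
    async def workflow(ctx):
        for fn in fns:
            ctx = await fn(ctx)               # asynchronous hand-off
        return ctx
    return workflow

async def main():
    itinerary = compose(fetch_weather, rank_attractions)
    # Concurrent invocations mimic fine-grained scaling across end users
    print(await asyncio.gather(*(itinerary({"user": u})
                                 for u in ("alice", "bob"))))

asyncio.run(main())
```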