5 results for distributed agency
in CORA - Cork Open Research Archive - University College Cork - Ireland
Abstract:
A massive change is currently taking place in the manner in which power networks are operated. Traditionally, power networks consisted of large power stations which were controlled from centralised locations. The trend in modern power networks is for generated power to be produced by a diverse array of energy sources spread over a large geographical area. As a result, controlling these systems from a centralised controller is impractical. Thus, future power networks will be controlled by a large number of intelligent distributed controllers which must work together to coordinate their actions. Smart Grid is the umbrella term used to denote this combination of power systems, artificial intelligence, and communications engineering. This thesis focuses on the application of optimal control techniques to Smart Grids, with a particular focus on iterative distributed MPC. A novel convergence and stability proof for iterative distributed MPC based on the Alternating Direction Method of Multipliers is derived. The performance of distributed MPC, centralised MPC, and an optimised PID controller is then compared on a highly interconnected, nonlinear, MIMO testbed based on part of the Nordic power grid. Finally, a novel tuning algorithm is proposed for iterative distributed MPC which simultaneously optimises both closed-loop performance and the communication overhead associated with the desired control.
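The Alternating Direction Method of Multipliers underlying the iterative distributed scheme can be illustrated in its simplest consensus form. The sketch below is a generic illustration, not the thesis's power-system controller: several agents each hold a local least-squares objective and iterate local solves, averaging, and dual updates until their local copies agree.

```python
import numpy as np

def consensus_admm(As, bs, rho=1.0, iters=300):
    """Each agent i holds (A_i, b_i) and minimises ||A_i x - b_i||^2;
    consensus ADMM drives all local copies towards a common z that
    minimises the sum of the local objectives."""
    n = As[0].shape[1]
    xs = [np.zeros(n) for _ in As]
    us = [np.zeros(n) for _ in As]   # scaled dual variables
    z = np.zeros(n)
    for _ in range(iters):
        # Local step: closed-form solve of the rho-regularised problem
        xs = [np.linalg.solve(2 * A.T @ A + rho * np.eye(n),
                              2 * A.T @ b + rho * (z - u))
              for A, b, u in zip(As, bs, us)]
        # Coordination step: average the shifted local estimates
        z = np.mean([x + u for x, u in zip(xs, us)], axis=0)
        # Dual update: accumulate each agent's consensus error
        us = [u + x - z for u, x in zip(us, xs)]
    return z

rng = np.random.default_rng(0)
As = [rng.normal(size=(8, 3)) for _ in range(4)]
x_true = np.array([1.0, -2.0, 0.5])
bs = [A @ x_true for A in As]        # noiseless, so agents should agree on x_true
z = consensus_admm(As, bs)
```

Each agent only needs its own data plus the shared averages `z` and its dual `u`, which is what makes the scheme attractive for geographically distributed controllers.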
Abstract:
We review recent advances in all-optical OFDM technologies and discuss the performance of a field trial of 2 Tbit/s Coherent WDM over 124 km with distributed Raman amplification. The results indicate that careful optimisation of the Raman pumps is essential. We also consider how all-optical OFDM systems perform favourably in terms of energy consumption when compared with alternative coherent detection schemes. We argue that, in an energy-constrained high-capacity transmission system, directly detected all-optical OFDM with 'ideal' Raman amplification is an attractive candidate for metro-area datacentre interconnects with ~100 km fibre spans, with an overall energy requirement at least three times lower than that of coherent detection techniques.
Abstract:
Petrochemical plastics/polymers are a common feature of day to day living, occurring in packaging, furniture, mobile phones, computers, construction equipment, etc. However, these materials are produced from non-renewable resources and are resistant to microbial degradation in the environment. Considerable research has therefore been carried out into the production of sustainable, biodegradable polymers amenable to microbial catabolism to CO2 and H2O. A key group of microbial polyesters, widely considered optimal replacement polymers, are the polyhydroxyalkanoates (PHAs). Primary research in this area has focused on using recombinant pure cultures to optimise PHA yields; however, despite considerable success, the high costs of pure culture fermentation have thus far hindered the commercial viability of PHAs produced in this way. More recently, work has begun to focus on mixed cultures for the optimisation of PHA production, with the incorporation of waste streams offering the greatest reductions in production cost. The scale of dairy processing in Ireland, and the high organic load wastewaters it generates, represent an excellent potential substrate for bioconversion to PHAs in a mixed culture system. The current study sought to investigate the potential for such bioconversion in a laboratory scale biological system and to establish key operational and microbial characteristics of same. Two sequencing batch reactors were set up and operated along the lines of an enhanced biological phosphate removal (EBPR) system, which has PHA accumulation as a key step within repeated rounds of anaerobic/aerobic cycling. Influents to the reactors varied only in the carbon sources provided. Reactor 1 received artificial wastewater with acetate alone, which is known to be readily converted to PHA in the anaerobic step of EBPR. Reactor 2 influent contained acetate and skim milk to imitate a dairy processing effluent.
Chemical monitoring of nutrient remediation within the reactors was continuously applied, and performances consistent with EBPR were observed. Qualitative analysis of the sludge was carried out using fluorescence microscopy with the lipophilic stain Nile Blue A, and PHA production was confirmed in both reactors. Quantitative analysis via HPLC detection of crotonic acid derivatives revealed the fluorescence to be short chain length polyhydroxybutyrate, with biomass dry weight accumulations of 11% and 13% observed in reactors 1 and 2, respectively. Gas chromatography-mass spectrometry for medium chain length methyl ester derivatives revealed the presence of hydroxyoctanoic, -decanoic and -dodecanoic acids in reactor 1. Similar analyses in reactor 2 revealed monomers of 3-hydroxydodecenoic and 3-hydroxytetradecanoic acids. Investigation of the microbial ecology of both reactors was conducted in an attempt to identify key species potentially contributing to reactor performance. Culture dependent investigations indicated that quite different communities were present in the two reactors. Reactor 1 isolates demonstrated the following species distribution: Pseudomonas (82%), Delftia acidovorans (3%), Acinetobacter sp. (5%), Aminobacter sp. (3%), Bacillus sp. (3%), Thauera sp. (3%) and Cytophaga sp. (3%). Relative species distributions among reactor 2 profiled isolates were more even: Pseudoxanthomonas (32%), Thauera sp. (24%), Acinetobacter (24%), Citrobacter sp. (8%), Lactococcus lactis (5%), Lysinibacillus (5%) and Elizabethkingia (2%). In both reactors Gammaproteobacteria dominated the cultured isolates. Culture independent 16S rRNA gene analyses revealed differing profiles for the two reactors. Reactor 1 clone distribution was as follows: Zoogloea resiniphila (83%), Zoogloea oryzae (2%), Pedobacter composti (5%), Neisseriaceae sp. (2%), Rhodobacter sp. (2%), Runella defluvii (3%) and Streptococcus sp. (3%).
RFLP-based species distribution among the reactor 2 clones was as follows: Runella defluvii (50%), Zoogloea oryzae (20%), Flavobacterium sp. (9%), Simplicispira sp. (6%), uncultured Sphingobacteria sp. (6%), Arcicella (6%) and Leadbetterella byssophila (3%). Betaproteobacteria dominated the 16S rRNA gene clones identified in both reactors. FISH analysis with Nile Blue dual staining resolved these divergent findings, identifying the Betaproteobacteria as the dominant PHA accumulators within the reactor sludges, although species/strain specific allocations could not be made. GC analysis of the sludge had indicated the presence of both medium chain length and short chain length PHAs accumulating in both reactors. In addition, the cultured isolates from the reactors had previously been identified as mcl and scl PHA producers, respectively. Characterisations of the PHA monomer profiles of the individual isolates were therefore performed to screen for potentially novel scl-mcl PHAs. Nitrogen limitation driven PHA accumulation in E2 minimal media revealed a greater propensity among isolates for mcl-PHA production. HPLC analysis indicated that PHB production was not a major feature of the reactor isolates, and this was supported by the low presence of scl phaC1 genes among PCR screened isolates. A high percentage distribution of phaC2 mcl-PHA synthase genes was recorded, with the majority sharing high homology with class II synthases from Pseudomonas sp. The common presence of a phaC2 homologue was not reflected in the production of a common polymer: considerable variation was noted in both monomer composition and ratios following GC analysis. While co-polymer production could not be demonstrated, potentially novel synthase substrate specificities were noted which could be exploited further in the future.
Abstract:
‘The ecological emergency’ describes both our emergence into, and the way we relate within, a set of globally urgent circumstances brought about through anthropogenic impact. I identify two phases to this emergency. The first is the anthropogenic impact itself, interpreted through various conceptual models. The second, however, is the increasingly entrenched commitment to divergent conceptual positions, which leads to a growing disparateness in attitudes and a concurrent difficulty in finding any grounds for convergence in response. I begin by reviewing the environmental ethics literature in order to clarify which components of the implicit narratives and beliefs of different positions create the foundations for such disparateness of views. I identify the conceptual frameworks through which moral agency and human responsibility are viewed, and that justify an ethical response to the ecological emergency. In particular, I focus on Paul Taylor's thesis of 'respect for nature' as a framework for revising both the idea that we are ‘moral’ and the idea that we are ‘agents’ in this unique way, and I open to question the idea that any response to the ecological emergency need be couched in ethical terms. This revision leads me to formulate an alternative conceptual model that makes use of Timothy Morton’s idea of enmeshment. I propose that we dramatically revise our idea of moral agency using the idea of enmeshment as a starting point. I develop an alternative framework that locates our capacity for responsibility within our capacity for realisation, both in the sense of understanding, and of making real, sets of conditions within our enmeshment. I draw parallels between this idea of ‘realisation as agency’ and the work of Dōgen and other non-dualists. I then propose a revised understanding of ‘the good’ of systems from a biophysical perspective, and compare this with certain features of Asian traditions of thought.
I consider the practical implications of these revisions, and I conclude that the act of paying close attention, or realising, contains our agency, as does the attitude, or manner, with which we focus. This gives us the basis for a convergent response to the ecological emergency: the way of our engagement is the key to responding to it.
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to grow by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising such systems have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. This multiplicity of techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is asked: can a generic solution for the monitoring and analysis of data be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner?
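The storage figures quoted above can be sanity-checked with simple arithmetic, using the binary 2^70-byte zettabyte convention the abstract itself uses:

```python
# Back-of-envelope check of the data-volume and storage-capacity figures
# quoted in the abstract (illustrative arithmetic only).
ZB = 2 ** 70                      # one zettabyte in bytes, binary convention

data_now = 4.4 * ZB               # estimated digital data in existence
data_2020 = data_now * 10         # projected tenfold growth -> 44 ZB

storable_now = 0.33 * data_now    # 33% can be stored at any one time
storable_2020 = 0.15 * data_2020  # fraction falls to 15% by 2020

# Although the storable *fraction* falls, absolute capacity still grows:
print(round(storable_now / ZB, 2))   # 1.45
print(round(storable_2020 / ZB, 2))  # 6.6
```

So the projected shortfall is not one of shrinking capacity but of data growth outpacing it, which is why timely identification of exploitable data matters.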
The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating any hidden bias that may exist. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real world example involving the near real-time capture and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, and thus the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real world need to provide a solution for this domain.
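A production/interpretation/consumption workflow with a maintainable provenance record could be sketched as follows. All class and method names here are illustrative assumptions, not the platform's actual API: the point is only that every derived result keeps a hash-linked record of its inputs and the analysis step that produced it, so a third party can re-verify the chain.

```python
import hashlib
import json
import time

class ProvenanceStore:
    """Minimal sketch (hypothetical names) of a hash-linked provenance
    record: raw data is produced, analyses interpret registered data,
    and every result can be traced back to its origins."""

    def __init__(self):
        self.records = {}

    def _register(self, payload, parents, step):
        entry = {
            "payload": payload,
            "parents": parents,          # hashes of the inputs
            "step": step,                # which step produced this
            "timestamp": time.time(),
        }
        # Canonical serialisation so the key is reproducible
        key = hashlib.sha256(
            json.dumps({"payload": payload, "parents": parents,
                        "step": step}, sort_keys=True).encode()
        ).hexdigest()
        self.records[key] = entry
        return key

    def produce(self, raw_data):
        # Raw data enters the system with no parents
        return self._register(raw_data, parents=[], step="acquisition")

    def interpret(self, analysis_name, fn, *input_keys):
        # Apply an analysis technique to registered data; the result
        # is linked back to the hashes of its inputs
        inputs = [self.records[k]["payload"] for k in input_keys]
        return self._register(fn(*inputs), list(input_keys), analysis_name)

    def lineage(self, key):
        # Walk the provenance chain back towards the raw data
        steps, frontier = [], [key]
        while frontier:
            k = frontier.pop()
            steps.append(self.records[k]["step"])
            frontier.extend(self.records[k]["parents"])
        return steps

store = ProvenanceStore()
raw = store.produce([3, 1, 4, 1, 5])
mean = store.interpret("mean", lambda xs: sum(xs) / len(xs), raw)
```

Because each record is keyed by a hash of its payload and parents, two analysts applying different techniques to the same raw data produce distinct, independently verifiable chains over one shared acquisition record, which is the transparency property the dissertation argues for.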