938 results for electricity distribution network
Abstract:
Growing demand for electricity, shrinking forecasts of fossil fuel reserves, and mounting environmental concern over their use have raised concerns about the quality of electricity generation, making new investment in generation from alternative, clean and renewable sources welcome. Distributed generation is one of the main solutions for independent, self-sufficient generating systems such as the sugarcane industry, a sector that has grown considerably and now contributes significantly to the electricity supplied to distribution networks. In this context, one of the main objectives of this study is to propose an algorithm to detect islanding disturbances in the electrical system, characterized by under- or overvoltage conditions. The algorithm should also quantify the time the system spent operating under these conditions, in order to assess the possible consequences for the electric power system. To achieve this, the technique of wavelet multiresolution analysis (MRA) was used to detect the generated disturbances. The resulting data can be processed to support predictive maintenance of the network's protection equipment, which is prone to damage under prolonged operation at abnormal frequency and voltage.
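As a rough sketch of this detection step (not the study's implementation), the fragment below applies a one-level wavelet decomposition to a synthetic 60 Hz voltage record containing an undervoltage event and estimates how long the system operated in that condition; the PyWavelets library, the db4 wavelet and the threshold are assumptions.

```python
# Wavelet-MRA disturbance detection sketch: the detail band of a DWT
# reacts to the abrupt edges of a voltage sag, which lets us locate the
# event and quantify its duration. Signal, wavelet and threshold are
# illustrative; the record spans an integer number of cycles, so
# periodization adds no boundary artifact.
import numpy as np
import pywt

FS = 3840                                  # 64 samples per 60 Hz cycle
t = np.arange(0.0, 1.0, 1.0 / FS)
v = np.sin(2 * np.pi * 60 * t)             # 1 p.u. nominal voltage
v[(t >= 0.30) & (t < 0.55)] *= 0.6         # simulated undervoltage (sag)

_, detail = pywt.dwt(v, "db4", mode="periodization")
edges = np.where(np.abs(detail) > 0.1 * np.abs(detail).max())[0]

# Detail coefficients sit at half the original sampling rate.
t0, t1 = edges[0] * 2 / FS, edges[-1] * 2 / FS
print(f"disturbance from {t0:.3f} s to {t1:.3f} s ({t1 - t0:.3f} s)")
```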
Abstract:
Recent research into resting-state functional magnetic resonance imaging (fMRI) has shown that the brain is very active during rest. This thesis uses blood oxygenation level dependent (BOLD) signals to investigate the spatial and temporal functional network information found within resting-state data, examining the feasibility of extracting functional connectivity networks with different methods as well as the dynamic variability within some of them. Furthermore, this work looks into producing valid networks using a sparsely sampled subset of the original data.
In this work we utilize four main methods: independent component analysis (ICA), principal component analysis (PCA), correlation, and a point-processing technique. Each method comes with unique assumptions, as well as strengths and limitations, in exploring how the resting-state components interact in space and time.
Correlation is perhaps the simplest technique. Here, resting-state patterns are identified according to how similar each voxel's time profile is to a seed region's time profile. However, this method requires a seed region and can only identify one resting-state network at a time. This simple correlation technique is able to reproduce the resting-state network using data from a single subject's scan session as well as from 16 subjects.
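A minimal sketch of this seed-based approach (not the thesis code) is given below: every voxel's time series is correlated with a seed time course. The array shapes and the random placeholder data are assumptions for illustration.

```python
# Seed-based correlation mapping sketch; assumes BOLD data already
# reshaped to (voxels, timepoints). Data here are random placeholders,
# so the resulting map is noise.
import numpy as np

def seed_correlation_map(bold, seed_ts):
    """Pearson r between each voxel's time series and the seed course."""
    bold = bold - bold.mean(axis=1, keepdims=True)
    seed = seed_ts - seed_ts.mean()
    num = bold @ seed
    den = np.linalg.norm(bold, axis=1) * np.linalg.norm(seed)
    return num / den                          # one r value per voxel

rng = np.random.default_rng(0)
data = rng.standard_normal((5000, 240))       # 5000 voxels, 240 volumes
seed = data[:100].mean(axis=0)                # seed = mean of a "region"
r_map = seed_correlation_map(data, seed)      # thresholding would follow
print(r_map.shape)
```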
The second technique, independent component analysis, is supported by established software packages and can extract multiple components from a data set in a single analysis. The disadvantage is that the resting-state networks it produces are all independent of each other, under the assumption that the spatial pattern of functional connectivity is the same across all time points. ICA successfully reproduces resting-state connectivity patterns for both one subject and a 16-subject concatenated data set.
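A minimal sketch of spatial ICA in the same spirit follows, assuming scikit-learn's FastICA rather than the dedicated neuroimaging packages; the component count and placeholder data are illustrative.

```python
# Spatial ICA sketch: with a (voxels x time) matrix, FastICA returns
# spatially independent maps plus their mixing time courses.
# Placeholder random data; real use would feed preprocessed BOLD volumes.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
data = rng.standard_normal((5000, 240))    # voxels x timepoints

ica = FastICA(n_components=20, random_state=0, max_iter=500)
spatial_maps = ica.fit_transform(data)     # (5000, 20) independent maps
time_courses = ica.mixing_                 # (240, 20) one course per map
print(spatial_maps.shape, time_courses.shape)
```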
Using principal component analysis, the dimensionality of the data is reduced to find the directions in which the variance of the data is greatest. This method relies on the same basic matrix mathematics as ICA, with a few important differences that are outlined later in this text. Using this method, different functional connectivity patterns are sometimes identifiable, but with a large amount of noise and variability.
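For comparison, a sketch of the PCA step via the singular value decomposition, which exposes the shared matrix math with ICA; again the data are random placeholders.

```python
# PCA sketch via SVD on the centred (voxels x time) matrix: left
# singular vectors are spatial components ordered by explained variance.
import numpy as np

rng = np.random.default_rng(2)
data = rng.standard_normal((5000, 240))          # voxels x timepoints
data -= data.mean(axis=1, keepdims=True)         # centre each series

U, s, Vt = np.linalg.svd(data, full_matrices=False)
explained = s**2 / np.sum(s**2)                  # variance fraction per PC
print(explained[:5])
```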
To begin to investigate the dynamics of functional connectivity, the correlation technique is used to compare the first and second halves of a scan session. Minor differences are discernible between the correlation results of the two halves. Further, a sliding-window technique is implemented to study the correlation coefficients across different window sizes over time. From this technique it is apparent that the level of correlation with the seed region is not static throughout the scan.
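A minimal sketch of the sliding-window step, assuming a seed course and a single voxel course; the window length and the synthetic series are illustrative.

```python
# Sliding-window correlation sketch: Pearson r between a seed course and
# a voxel course inside a moving window.
import numpy as np

def sliding_window_corr(x, y, win):
    r = np.empty(len(x) - win + 1)
    for i in range(len(r)):
        r[i] = np.corrcoef(x[i:i + win], y[i:i + win])[0, 1]
    return r

rng = np.random.default_rng(3)
seed = rng.standard_normal(240)
voxel = 0.5 * seed + rng.standard_normal(240)   # partially coupled series
r_t = sliding_window_corr(seed, voxel, win=30)
print(r_t.min(), r_t.max())                     # correlation is not static
```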
The last method introduced, a point-processing method, is one of the more novel techniques because it does not require analysis of the continuous time course. Here, network information is extracted from brief occurrences of high- or low-amplitude signals within a seed region. Because point processing uses fewer time points from the data, the statistical power of the results is lower, and there are larger variations in default mode network (DMN) patterns between subjects. In addition to improved computational efficiency, the benefit of a point-process method is that the patterns produced for different seed regions do not have to be independent of one another.
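A minimal sketch of this idea follows: only the volumes where the z-scored seed course exceeds a high-amplitude threshold are kept, and the brain pattern is averaged over those instants. The threshold and placeholder data are assumptions.

```python
# Point-process sketch: extract network information from brief
# suprathreshold events in the seed course rather than the full series.
import numpy as np

rng = np.random.default_rng(4)
bold = rng.standard_normal((5000, 240))          # voxels x timepoints
seed = bold[:100].mean(axis=0)
seed = (seed - seed.mean()) / seed.std()         # z-score the seed course

events = np.where(seed > 1.0)[0]                 # high-amplitude instants
pattern = bold[:, events].mean(axis=1)           # mean map over events
print(f"{events.size} events retained of 240 volumes")
```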
This work compares four distinct methods of identifying functional connectivity patterns. ICA is currently used by many scientists studying functional connectivity patterns. The PCA technique is not well suited to the noise level and distribution of these data sets. The correlation technique is simple and obtains good results; however, a seed region is needed and the method assumes that the DMN regions are correlated throughout the entire scan. When the more dynamic aspects of correlation were examined, changing patterns of correlation were evident. The final point-processing method produces promising results, identifying functional connectivity networks using only low- and high-amplitude BOLD signals.
Abstract:
Oscillating Water Column (OWC) devices are a promising type of wave energy converter thanks to an obvious advantage over many others: no moving component in sea water. Two types of OWCs (bottom-fixed and floating) have been widely investigated, and bottom-fixed OWCs have been very successful in several practical applications. Recently, proposals for massive wave energy production and the availability of wave energy have pushed OWC applications from near-shore to deeper water regions, where floating OWCs are a better choice. For an OWC under sea waves, the air flow driving the air turbine to generate electricity is a random process, so a single design/operating point does not exist. To improve energy extraction and optimise the performance of the device, a system capable of controlling the air turbine rotation speed is desirable. To that end, this paper presents a short-term prediction of the random process by an artificial neural network (ANN), which can provide near-future information to the control system. In this research, the ANN is explored and tuned for better prediction of the airflow (as well as of the device motions, for wider application). It is found that, by carefully constructing the ANN platform and optimizing the relevant parameters, the ANN is capable of predicting the random process a few steps ahead of real time with good accuracy. More importantly, the tuned ANN works for a wide range of different types of random process.
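As a rough sketch of the idea (not the paper's tuned network), the fragment below trains a small feed-forward ANN to predict a surrogate random process a few steps ahead from lagged samples; scikit-learn's MLPRegressor, the lag count and the horizon are assumptions.

```python
# Multi-step-ahead prediction sketch: a small ANN maps the last LAGS
# samples of a surrogate random process to its value HORIZON steps
# later. Architecture and parameters are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 2000
x = np.cumsum(rng.standard_normal(n))            # surrogate random process
x = (x - x.mean()) / x.std()

LAGS, HORIZON = 10, 3                            # predict 3 steps ahead
idx = np.arange(n - LAGS - HORIZON + 1)
X = np.stack([x[i:i + LAGS] for i in idx])       # lagged input windows
y = x[idx + LAGS - 1 + HORIZON]                  # future target values

split = int(0.8 * len(idx))                      # train on the first 80%
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
net.fit(X[:split], y[:split])
print("test R^2:", round(net.score(X[split:], y[split:]), 3))
```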
Abstract:
The GloboLakes project, a global observatory of lake responses to environmental change, aims to exploit current satellite missions and long remote-sensing archives to study multiple lake ecosystems synoptically, assess their current condition, reconstruct past trends in system trajectories, and assess lake sensitivity to multiple drivers of change. Here we describe the protocol for selecting lakes for the global observatory based upon remote-sensing techniques, starting from an initial pool of the 3721 largest lakes and reservoirs in the world, as listed in the Global Lakes and Wetlands Database. An 18-year archive of satellite data was used to create spatial and temporal filters to identify waterbodies suitable for remote-sensing methods. Further criteria were applied and tested to ensure the candidate sites span a wide range of ecological settings and characteristics; a total of 960 lakes, lagoons, and reservoirs were selected. The methodology proposed here is applicable to new-generation satellites, such as the European Space Agency Sentinel series.
Abstract:
Ticket distribution channels for live music events have been revolutionised by the increased take-up of internet technologies, and the music supply chain has evolved into a multi-channel value network. The assumption that this creates increased consumer autonomy and improved service quality is explored here through a case study of the ticket pre-sale for the US leg of the Depeche Mode 2005–06 World Tour, which utilised an innovative virtual channel strategy promoted as a service to loyal fans. A multi-method analysis, adopting Kozinets' (2002) netnography methodology, is employed to map responses of the band's serious fan base on an internet message board (IMB) throughout the tour pre-sale. The analysis focuses on concerns of pricing, ethics, scope of the offer, use of technology, service quality, and perceived brand performance fit of channel partners. Findings indicate that fans' behaviour is unpredictable in response to channel partners' performance, and that such offers need careful management to avoid alienating loyal consumers.
Abstract:
Based on an original and comprehensive database of all feature fiction films produced in Mercosur between 2004 and 2012, the paper analyses whether the Mercosur film industry has evolved towards an integrated and culturally more diverse market. It provides a summary of policy opportunities in terms of integration and diversity, emphasizing the limited role played by regional policies. It then shows that although the Mercosur film industry remains rather disintegrated, it is becoming more integrated and culturally more diverse. From a methodological point of view, the combination of Social Network Analysis and the Stirling Model opens up interesting research tracks for analysing creative industries in terms of their market integration and cultural diversity.
Abstract:
The Enred@te initiative, created by the Red Cross, the Vodafone Foundation and the TECSOS Foundation, emerged as an evolution of a previous project that developed and piloted a video-communication solution for older adults, using a system installed in their own televisions. Following the success of this first initiative, it was decided to advance toward a more flexible, robust, easy-to-use and high-quality solution: a social network accessible through tablets. Older adults can use the network to video-communicate with other older adults and stay informed on various topics of interest. As a further innovation, the network incorporates the participation of virtual volunteers, who promote its use in an inclusive and participative manner. This solution was piloted in 2014 with positive results, and work to turn it into a service that can reach older adults through the Red Cross is currently ongoing.
Abstract:
Future power systems are expected to integrate large-scale stochastic and intermittent generation and load due to reduced use of fossil fuel resources, including renewable energy sources (RES) and electric vehicles (EV). Inclusion of such resources poses challenges for the dynamic stability of synchronous transmission and distribution networks, not least in terms of generation where system inertia may not be wholly governed by large-scale generation but displaced by small-scale and localised generation. Energy storage systems (ESS) can limit the impact of dispersed and distributed generation by offering supporting reserve while accommodating large-scale EV connection; the latter (load) also participating in storage provision. In this paper, a local energy storage system (LESS) is proposed. The structure, requirement and optimal sizing of the LESS are discussed. Three operating modes are detailed, including: 1) storage pack management; 2) normal operation; and 3) contingency operation. The proposed LESS scheme is evaluated using simulation studies based on data obtained from the Northern Ireland regional and residential network.
Abstract:
A network-connected host is expected to generate and respond to application- and protocol-specific messages. Billions of euros' worth of electricity are wasted to keep idle hosts powered up 24/7 just to maintain network presence. This short paper describes the design of our cooperative Network Connectivity Proxy (NCP), which can impersonate sleeping hosts and respond to packets on their behalf as if they were connected and fully operational. The NCP is thus an efficient approach to reducing network energy waste.
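As a rough illustration of the impersonation idea (not the paper's NCP implementation), the sketch below answers ARP requests on behalf of a sleeping host so that it keeps network presence; it assumes the scapy library, root privileges, and placeholder interface, IP and MAC values.

```python
# Proxy-ARP sketch: reply to ARP who-has queries for a sleeping host.
# Interface, IP and MAC below are hypothetical placeholders.
from scapy.all import ARP, Ether, sendp, sniff

IFACE = "eth0"                                      # proxy's interface
SLEEPING = {"192.168.1.50": "aa:bb:cc:dd:ee:50"}    # host IP -> its MAC

def proxy_arp(pkt):
    # op == 1 is an ARP request; answer only for registered sleeping hosts.
    if ARP in pkt and pkt[ARP].op == 1 and pkt[ARP].pdst in SLEEPING:
        reply = (Ether(dst=pkt[Ether].src) /
                 ARP(op=2,
                     psrc=pkt[ARP].pdst, hwsrc=SLEEPING[pkt[ARP].pdst],
                     pdst=pkt[ARP].psrc, hwdst=pkt[ARP].hwsrc))
        sendp(reply, iface=IFACE, verbose=False)    # answer for the host

sniff(filter="arp", prn=proxy_arp, store=False, iface=IFACE)
```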
Abstract:
Near-surface air temperature is an important determinant of the surface energy balance of glaciers and is often represented in models by a constant linear temperature gradient (TG). Spatiotemporal variability in 2 m air temperature was measured across the debris-covered Miage Glacier, Italy, over an 89 d period during the 2014 ablation season using a network of 19 stations. Air temperature was found to be strongly dependent upon elevation for most stations, even under varying meteorological conditions and at different times of day, and its spatial variability was well explained by a locally derived mean linear TG (MG–TG) of −0.0088°C m−1. However, local temperature depressions occurred over areas of very thin or patchy debris cover. The MG–TG, together with other air TGs extrapolated from both on- and off-glacier sites, was applied in a distributed energy-balance model. Compared with piecewise air temperature extrapolation from all on-glacier stations, modelled ablation using the MG–TG increased by <1%, rising to >4% using the environmental 'lapse rate'. Ice melt under thick debris was relatively insensitive to air temperature, while the effects of different temperature extrapolation methods were strongest at high-elevation sites with thin and patchy debris cover.
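As a worked illustration of constant-gradient extrapolation, the sketch below applies the reported MG–TG (−0.0088°C m−1) and the standard environmental lapse rate (−0.0065°C m−1) to a hypothetical on-glacier reference station; the station temperature and elevations are invented.

```python
# Constant linear temperature-gradient extrapolation:
#     T(z) = T_ref + TG * (z - z_ref)
# contrasting the paper's MG-TG with the environmental lapse rate.
# Reference temperature and elevations are hypothetical.
MG_TG = -0.0088        # degC per m (locally derived mean gradient)
ELR = -0.0065          # degC per m (environmental lapse rate)

def extrapolate(t_ref, z_ref, z, gradient):
    return t_ref + gradient * (z - z_ref)

T_REF, Z_REF = 8.0, 2000.0                        # hypothetical station
for z in (2200.0, 2600.0):
    print(f"z = {z:4.0f} m: "
          f"MG-TG {extrapolate(T_REF, Z_REF, z, MG_TG):5.2f} degC, "
          f"lapse rate {extrapolate(T_REF, Z_REF, z, ELR):5.2f} degC")
```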
Abstract:
Most major cities in the eastern United States have air quality deemed unhealthy by the EPA under a set of regulations known as the National Ambient Air Quality Standards (NAAQS). The worst air quality in Maryland is measured in Edgewood, MD, a small community located along the Chesapeake Bay and generally downwind of Baltimore during hot summertime days. Direct measurements and numerical simulations were used to investigate how meteorology and chemistry conspire to create adverse levels of photochemical smog, especially at this coastal location. Ozone (O3) and oxidized reactive nitrogen (NOy), a family of ozone precursors, were measured over the Chesapeake Bay during a ten-day experiment in July 2011 to better understand the formation of ozone over the Bay and its impact on coastal communities such as Edgewood. Ozone over the Bay during the afternoon was 10% to 20% higher than at the closest upwind ground sites. A combination of complex boundary layer dynamics, deposition rates, and unaccounted-for marine emissions plays an integral role in the regional maximum of ozone over the Bay. The CAMx regional air quality model was assessed and enhanced through comparison with data from NASA's 2011 DISCOVER-AQ field campaign. Comparisons show a model overestimate of NOy by +86.2% and a model underestimate of formaldehyde (HCHO) by –28.3%. I present a revised model framework that better captures these observations and the response of ozone to reductions of precursor emissions. Incremental controls on electricity generating stations will produce greater benefits for surface ozone, while additional controls on mobile sources may yield less benefit because cars emit less pollution than expected. Model results also indicate that as ozone concentrations improve with decreasing anthropogenic emissions, the photochemical lifetime of tropospheric ozone increases. The lifetime of ozone lengthens because the two primary gas-phase sinks for odd oxygen (Ox ≈ NO2 + O3) – attack by hydroperoxyl radicals (HO2) on ozone and formation of nitrate – weaken with decreasing pollutant emissions. This unintended consequence of air quality regulation causes pollutants to persist longer in the atmosphere, and indicates that pollutant transport between states and countries will likely play a greater role in the future.
Abstract:
Schedules can be built in a similar way to a human scheduler by using a set of rules that involve domain knowledge. This paper presents an Estimation of Distribution Algorithm (EDA) for the nurse scheduling problem, which involves choosing a suitable scheduling rule from a set for the assignment of each nurse. Unlike previous work that used Genetic Algorithms (GAs) to implement implicit learning, the learning in the proposed algorithm is explicit, i.e. we identify and mix building blocks directly. The EDA is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, a new rule string has been obtained. Another set of rule strings will be generated in this way, some of which will replace previous strings based on fitness selection. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed approach might be suitable for other scheduling problems.
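A minimal sketch of the sampling loop described above follows, assuming a chain-shaped Bayesian network (each nurse's rule conditioned on the previous nurse's) and a toy fitness function; the real network structure, rule set and schedule evaluation belong to the scheduling domain and are not reproduced here.

```python
# EDA sketch: estimate conditional rule probabilities from promising
# rule strings, then sample new strings variable by variable.
import numpy as np

rng = np.random.default_rng(0)
N_NURSES, N_RULES, POP, ELITE = 30, 4, 100, 30

def fitness(rule_string):
    # Placeholder objective: real code would score the schedule built
    # by applying each nurse's chosen rule.
    return -np.abs(np.diff(rule_string)).sum()

pop = rng.integers(0, N_RULES, size=(POP, N_NURSES))
for generation in range(50):
    elite = pop[np.argsort([fitness(s) for s in pop])[-ELITE:]]

    # Estimate P(rule_0) and P(rule_i | rule_{i-1}) from promising strings.
    p0 = np.bincount(elite[:, 0], minlength=N_RULES) + 1.0      # Laplace
    p0 /= p0.sum()
    cond = np.ones((N_NURSES - 1, N_RULES, N_RULES))            # smoothing
    for s in elite:
        for i in range(1, N_NURSES):
            cond[i - 1, s[i - 1], s[i]] += 1
    cond /= cond.sum(axis=2, keepdims=True)

    # Sample a new population variable by variable along the chain.
    new = np.empty_like(pop)
    new[:, 0] = rng.choice(N_RULES, size=POP, p=p0)
    for i in range(1, N_NURSES):
        for k in range(POP):
            new[k, i] = rng.choice(N_RULES, p=cond[i - 1, new[k, i - 1]])
    pop = new

print("best fitness:", max(fitness(s) for s in pop))
```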
Abstract:
Hematopoiesis is the tightly controlled and complex process in which the entire blood system is formed and maintained by a rare pool of hematopoietic stem cells (HSCs), and its dysregulation results in leukaemia. TRIB2, a member of the Tribbles family of serine/threonine pseudokinases, has been implicated in a variety of cancers and is a potent murine oncogene that induces acute myeloid leukaemia (AML) in vivo via modulation of the essential myeloid transcription factor CCAAT-enhancer binding protein α (C/EBPα). C/EBPα, which is crucial for myeloid cell differentiation, is commonly dysregulated in a variety of cancers, including AML. Two isoforms of C/EBPα exist: the full-length p42 isoform and the truncated oncogenic p30 isoform. TRIB2 has been shown to selectively degrade the p42 isoform of C/EBPα and induce p30 expression in AML. In this study, overexpression of the p30 isoform in a bone marrow transplant (BMT) led to perturbation of myelopoiesis, and in the presence of physiological levels of p42 this oncogene exhibited weak transformative ability. It was also shown by BMT that, despite their degradative relationship, expression of C/EBPα was essential for TRIB2-mediated leukaemia. A conditional mouse model was used to demonstrate that oncogenic p30 cooperates with TRIB2 to reduce disease latency, only in the presence of p42. At the molecular level, a ubiquitination assay was used to show that TRIB2 degrades p42 by K48-mediated proteasomal ubiquitination and was unable to ubiquitinate p30. Mutation of a critical lysine residue in the C-terminus of C/EBPα abrogated TRIB2-mediated C/EBPα ubiquitination, suggesting that this site, which is frequently mutated in AML, is the site at which TRIB2 mediates its degradative effects. The TRIB2–C/EBPα axis was effectively targeted by proteasome inhibition. AML is a very difficult disease to target therapeutically due to the extensive array of chromosomal translocations and genetic aberrations that contribute to the disease. The cell from which a specific leukaemia arises, or leukaemia initiating cell (LIC), can affect the phenotype and chemotherapeutic response of the resultant disease. The LIC has been elucidated for some common oncogenes, but it is unknown for TRIB2. The data presented in this thesis investigate the ability of the oncogene TRIB2 to transform hematopoietic stem and progenitor cells in vitro and in vivo. TRIB2 overexpression conferred in vitro serial replating ability on all stem and progenitor cells studied. Upon transplantation, only TRIB2-overexpressing HSCs and granulocyte/macrophage progenitors (GMPs) generated leukaemia in vivo. TRIB2 induced a mature myeloid leukaemia from the GMP, and a mixed-lineage leukaemia from the HSC. As such, the role of TRIB2 in steady-state hematopoiesis was also explored using a Trib2-/- mouse, and it was determined that loss of Trib2 had no effect on lineage distribution in the hematopoietic compartment under steady-state conditions. The process of hematopoiesis is controlled by a host of lineage-restricted transcription factors. Recently, members of the Nuclear Factor 1 family of transcription factors (NFIA, NFIB, NFIC and NFIX) have been implicated in hematopoiesis. Little is known about the role of NFIX in lineage determination. Here we describe a novel role for NFIX in lineage fate determination. In human and murine datasets the expression of Nfix was shown to decrease as cells differentiated along the lymphoid pathway.
NFIX overexpression resulted in enhanced myelopoiesis in vivo and in vitro and a block in B cell development at the pre-pro-B cell stage. Loss of NFIX resulted in disruption of myeloid and lymphoid differentiation in vivo. These effects on stem and progenitor cell fate correlated with changes in the expression levels of key transcription factors involved in hematopoietic differentiation including a 15-fold increase in Cebpa expression in Nfix overexpressing cells. The data presented support a role for NFIX as an important transcription factor influencing hematopoietic lineage specification. The identification of NFIX as a novel transcription factor influencing lineage determination will lead to further study of its role in hematopoiesis, and contribute to a better understanding of the process of differentiation. Elucidating the relationship between TRIB2 and C/EBPα not only impacts on our understanding of the pathophysiology of AML but is also relevant in other cancer types including lung and liver cancer. Thus in summary, the data presented in this thesis provide important insights into key areas which will facilitate the development of future therapeutic approaches in cancer treatment.
Abstract:
Rimicaris exoculata is a deep-sea hydrothermal vent shrimp whose enlarged gill chamber houses a complex trophic epibiotic community. Its gut harbours an autochthonous and distinct microbial community. This species dominates the megafauna of hydrothermal ecosystems along the Mid-Atlantic Ridge, regardless of the contrasting geochemical conditions prevailing there. Here, the resident gut epibiont community at four contrasting hydrothermal vent sites (Rainbow/TAG/Logatchev/Ashadze) was analysed and compiled with previous data to evaluate the possible influence of site location, using 16S rRNA surveys and microscopic observations (TEM, SEM and FISH analyses). Filamentous epibionts inserted between the microvilli of the epithelial cells were observed on all examined samples. Results confirmed the affiliation of the resident gut community to Deferribacteres, Mollicutes, Epsilonproteobacteria and, to a lesser extent, Gammaproteobacteria lineages. A single Deferribacteres phylotype was nevertheless retrieved at all sites. Four Mollicutes-related OTUs were distinguished, one being identified only in Rainbow specimens. The topology of ribotype median-joining networks illustrated a community diversification possibly following demographic expansions, suggesting a more ancient evolutionary history and/or a larger effective population size at Rainbow. Finally, the gill chamber community distribution was also analysed through ribotype networks based on sequences from R. exoculata collected at the Rainbow/Snake Pit/TAG/Logatchev/Ashadze sites. The results allow us to refine hypotheses on the role of the epibionts and their transmission pathways.