941 results for Capacity Analysis, Capacity Expansion, Railways
Abstract:
Ioan Fazey, John A. Fazey, Joern Fischer, Kate Sherren, John Warren, Reed F. Noss, Stephen R. Dovers (2007) Adaptive capacity and learning to learn as leverage for social–ecological resilience. Frontiers in Ecology and the Environment 5(7), 375-380. RAE2008
Abstract:
Network traffic arises from the superposition of Origin-Destination (OD) flows. Hence, a thorough understanding of OD flows is essential for modeling network traffic, and for addressing a wide variety of problems including traffic engineering, traffic matrix estimation, capacity planning, forecasting and anomaly detection. However, to date, OD flows have not been closely studied, and there is very little known about their properties. We present the first analysis of complete sets of OD flow timeseries, taken from two different backbone networks (Abilene and Sprint-Europe). Using Principal Component Analysis (PCA), we find that the set of OD flows has small intrinsic dimension. In fact, even in a network with over a hundred OD flows, these flows can be accurately modeled in time using a small number (10 or less) of independent components or dimensions. We also show how to use PCA to systematically decompose the structure of OD flow timeseries into three main constituents: common periodic trends, short-lived bursts, and noise. We provide insight into how the various constituents contribute to the overall structure of OD flows and explore the extent to which this decomposition varies over time.
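A minimal sketch of the kind of PCA decomposition described above, applied to a synthetic OD-flow matrix rather than the Abilene or Sprint-Europe data; the array sizes, trend shapes and variance threshold are illustrative assumptions only.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_timesteps, n_flows = 2016, 121   # e.g. one week of 5-minute bins, 11x11 OD pairs

# Synthetic traffic: a few shared periodic trends plus sparse bursts and noise.
t = np.arange(n_timesteps)
diurnal = np.sin(2 * np.pi * t / 288)[:, None] * rng.uniform(1, 5, n_flows)
weekly = 0.5 * np.sin(2 * np.pi * t / 2016)[:, None] * rng.uniform(1, 5, n_flows)
bursts = (rng.random((n_timesteps, n_flows)) > 0.999) * rng.uniform(5, 20, n_flows)
noise = 0.3 * rng.standard_normal((n_timesteps, n_flows))
X = diurnal + weekly + bursts + noise   # rows: time bins, columns: OD flows

# Intrinsic dimension: how many components explain most of the temporal variance.
pca = PCA().fit(X)
explained = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(explained, 0.95)) + 1
print(f"{k} component(s) capture 95% of the variance across {n_flows} OD flows")

# Decomposition: common periodic trends live in the first k components; the
# residual X - X_hat holds the short-lived bursts and noise.
pca_k = PCA(n_components=k).fit(X)
X_hat = pca_k.inverse_transform(pca_k.transform(X))
residual = X - X_hat
```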
Abstract:
The last 30 years have seen Fuzzy Logic (FL) emerge as a method either complementing or challenging stochastic methods as the traditional approach to modelling uncertainty. However, the circumstances under which FL or stochastic methods should be used remain disputed, because the areas of application of statistical and FL methods overlap and opinions differ as to when each method should be used. Practically relevant case studies comparing the two methods are lacking. This work compares stochastic and FL methods for the assessment of spare capacity using the example of pharmaceutical high purity water (HPW) utility systems. The goal of this study was to find the most appropriate method for modelling uncertainty in industrial-scale HPW systems. The results provide evidence that stochastic methods are superior to FL methods for simulating uncertainty in chemical plant utilities, including HPW systems, in the typical case where extreme events (for example peaks in demand) or day-to-day variation, rather than average values, are of interest. The average production output or other statistical measures may, for instance, be of interest in the assessment of workshops. Furthermore, the results indicate that the stochastic model should be used only if a deterministic simulation shows it to be necessary. Consequently, this thesis concludes that either deterministic or stochastic methods should be used to simulate uncertainty in chemical plant utility systems and, by extension, some process systems, because extreme events or the modelling of day-to-day variation are important in capacity extension projects. Other reasons supporting the preference for stochastic HPW models over FL HPW models include:
1. The computer code for stochastic models is typically less complex than that of FL models, reducing code maintenance and validation effort.
2. In many respects FL models are similar to deterministic models. The need for a FL model over a deterministic model is therefore questionable in the case of industrial-scale HPW systems as presented here (as well as other similar systems), since the latter requires simpler models.
3. A FL model may be difficult to "sell" to an end-user, as its results represent "approximate reasoning", a definition of which is, however, lacking.
4. Stochastic models may be applied, with relatively minor modifications, to other systems, whereas FL models may not. For instance, the stochastic HPW model could be used to model municipal drinking water systems, whereas the FL HPW model could not. This is because the FL and stochastic model philosophies of a HPW system are fundamentally different: the stochastic model treats schedule and volume uncertainties as random phenomena described by statistical distributions based on either estimated or historical data, whereas the FL model simulates schedule uncertainties based on estimated operator behaviour, e.g. operator tiredness and working schedules. In a municipal drinking water distribution system the notion of "operator" breaks down.
5. Stochastic methods can account for uncertainties that are difficult to model with FL. The FL HPW system model does not account for dispensed volume uncertainty, as there appears to be no reasonable way to capture it with FL, whereas the stochastic model includes volume uncertainty.
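A minimal Monte Carlo sketch of the stochastic style of spare-capacity assessment described above; the demand distributions, draw-off schedule and plant capacity are illustrative assumptions, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 10_000                 # simulated operating days
capacity_l_per_day = 12_000     # assumed HPW generation capacity (litres/day)

def simulate_daily_demand(n_days):
    """Sum random draw-off events per day: uncertain count and uncertain volume."""
    n_draws = rng.poisson(lam=25, size=n_days)                     # draw-offs per day
    return np.array([
        rng.normal(loc=350, scale=80, size=k).clip(min=0).sum()    # litres per draw-off
        for k in n_draws
    ])

demand = simulate_daily_demand(n_days)
shortfall_prob = np.mean(demand > capacity_l_per_day)
print(f"P(daily demand exceeds capacity) ~ {shortfall_prob:.3%}")
print(f"99th-percentile demand: {np.percentile(demand, 99):,.0f} L/day")
```

The point of the stochastic formulation is visible here: the distribution tail (peaks in demand), not the mean, drives the spare-capacity conclusion.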
Abstract:
In 1966, Roy Geary, Director of the ESRI, noted “the absence of any kind of import and export statistics for regions is a grave lacuna” and further noted that if regional analyses were to be developed then regional Input-Output Tables must be put on the “regular statistical assembly line”. Forty-five years later, the lacuna lamented by Geary still exists and remains the most significant challenge to the construction of regional Input-Output Tables in Ireland. The continued paucity of sufficient regional data to compile effective regional Supply and Use and Input-Output Tables has retarded the capacity to construct sound regional economic models and provide a robust evidence base with which to formulate and assess regional policy. This study makes a first step towards addressing this gap by presenting the first set of fully integrated, symmetric, Supply and Use and domestic Input-Output Tables compiled for the NUTS 2 regions in Ireland: the Border, Midland and Western region and the Southern & Eastern region. These tables are general purpose in nature and are fully consistent with the official national Supply & Use and Input-Output Tables and the regional accounts. The tables are constructed using a survey-based or bottom-up approach rather than employing modelling techniques, yielding more robust and credible tables. These tables are used to present a descriptive statistical analysis of the two administrative NUTS 2 regions in Ireland, drawing particular attention to the underlying structural differences in regional trade balances and the composition of Gross Value Added in those regions. By deriving regional employment multipliers, Domestic Demand Employment matrices are constructed to quantify and illustrate the supply chain impact on employment. In the final part of the study, the predictive capability of the Input-Output framework is tested over two time periods. For both periods, the static Leontief production function assumptions are relaxed to allow for labour productivity. Comparative results from this experiment are presented.
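A minimal sketch of how employment multipliers are derived within an Input-Output framework of the kind described above; the three-sector technical-coefficients matrix and employment coefficients are hypothetical, not values from the Irish regional tables.

```python
import numpy as np

# A: technical coefficients (intermediate input per unit of output), 3 sectors.
A = np.array([
    [0.10, 0.05, 0.02],
    [0.15, 0.20, 0.10],
    [0.05, 0.10, 0.08],
])
# e: direct employment per unit of output in each sector (e.g. jobs per EUR million).
e = np.array([4.0, 7.5, 12.0])

# Leontief inverse: total (direct + indirect) output required per unit of final demand.
L = np.linalg.inv(np.eye(3) - A)

# Type-I employment multipliers: total jobs supported per unit of final demand.
employment_multipliers = e @ L
print("Leontief inverse:\n", L.round(3))
print("Employment multipliers:", employment_multipliers.round(2))

# Example: employment supported by a hypothetical final-demand vector d.
d = np.array([100.0, 50.0, 80.0])
print("Jobs supported by d:", float(e @ L @ d))
```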
Abstract:
The concept of police accountability is not susceptible to a universal or concise definition. In the context of this thesis, it is treated as embracing two fundamental components. First, it entails an arrangement whereby an individual, a minority and the whole community have the opportunity to participate meaningfully in the formulation of the principles and policies governing police operations. Second, it presupposes that those who have suffered as victims of unacceptable police behaviour should have an effective remedy. These ingredients, however, cannot operate in a vacuum. They must find an accommodation with the equally vital requirement that the burden of accountability should not be so demanding that the delivery of an effective police service is fatally impaired. While much of the current debate on police accountability in Britain and the USA revolves around the issue of where the balance should be struck in this accommodation, Ireland lacks the very foundation for such a debate, as it suffers from a serious deficit in research and writing on policing generally. This thesis aims to fill that gap by laying the foundations for an informed debate on police accountability and related aspects of policing in Ireland. Broadly speaking, the thesis contains three major interrelated components. The first is concerned with the concept of policing in Ireland and the legal, constitutional and political context in which it operates. This reveals that although the Garda Siochana is established as a national force, the legal prescriptions concerning its role and governance are very vague. Although a similar legislative format in Britain, and elsewhere, has been interpreted as conferring operational autonomy on the police, this has not stopped successive Irish governments from exercising close control over the police. The second component analyses the structure and operation of the traditional police accountability mechanisms in Ireland, namely the law and the democratic process. It concludes that some basic aspects of the peculiar legal, constitutional and political structures of policing seriously undermine their capacity to deliver effective police accountability. In the case of the law, for example, the status of, and the broad discretion vested in, each individual member of the force ensure that the traditional legal actions cannot always provide redress where individuals or collective groups feel victimised. In the case of the democratic process, the integration of the police into the excessively centralised system of executive government, coupled with the refusal of the Minister for Justice to accept responsibility for operational matters, projects a barrier between the police and their accountability to the public. The third component details proposals on how the current structures of police accountability in Ireland can be strengthened without interfering with the fundamentals of the law, the democratic process or the legal and constitutional status of the police. The key elements in these proposals are the establishment of an independent administrative procedure for handling citizen complaints against the police and the establishment of a network of local police-community liaison councils throughout the country, coupled with a centralised parliamentary committee on the police. While these proposals are analysed from the perspective of maximising the degree of police accountability to the public, they also take into account the need to ensure that the police capacity to deliver an effective police service is not unduly impaired as a result.
Abstract:
Although Common Pool Resources (CPRs) make up a significant share of total income for rural households in Ethiopia and elsewhere in the developing world, limited access to these resources and environmental degradation threaten local livelihoods. As a result, the issues of management and governance of CPRs and how to prevent their over-exploitation are of great importance for development policy. This study examines the current state and dynamics of CPRs and the overall resource governance system of the Lake Tana sub-basin. The research employed a modified form of the Institutional Analysis and Development (IAD) framework, which integrates the concept of Socio-Ecological Systems (SES) with Interactive Governance (IG) perspectives, considering social actors, institutions, the politico-economic context, discourses and ecological features across governance and government levels. It has been observed that overexploitation, degradation and encroachment of CPRs have increased dramatically, threatening the sustainability of the Lake Tana ecosystem. The stakeholder analysis reveals that there are multiple stakeholders with diverse interests in and power over CPRs. The analysis of institutional arrangements reveals that the existing formal rules and regulations governing access to and control over CPRs could not be implemented and were not effective in legally binding and governing CPR users' behavior at the operational level. The study also shows that a top-down, non-participatory policy formulation, law-making and decision-making process overlooks local contexts (local knowledge and informal institutions). Examining the participation of local resource users, as an alternative to a centralized, command-and-control and hierarchical approach to resource management and governance, points to the need for a fundamental shift in CPR use, management and governance to facilitate stakeholder participation in decision making. Therefore, establishing a multi-level stakeholder governance system as an institutional structure and process is necessary to sustain stakeholder participation in decision-making regarding CPR use, management and governance.
Abstract:
It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes) and this figure is expected to have grown by a factor of 10 to 44 zettabytes by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question asked is whether a generic solution for the monitoring and analysis of data can be identified that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner. The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques on the same raw data without the danger of incorporating hidden bias that may exist. To illustrate and realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, given the dearth of neurophysiologists, there is a real world need to provide a solution for this domain.
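A minimal sketch, under assumed design choices, of the general idea of pairing each analysis step with a maintainable provenance record; the function names and the moving-average step are hypothetical and not taken from the dissertation's platform.

```python
import hashlib
import json
import time
from typing import Any, Callable

def run_with_provenance(step: Callable[..., Any], name: str, inputs: dict, params: dict) -> dict:
    """Run one analysis step and return its result alongside a provenance record."""
    result = step(**inputs, **params)
    record = {
        "step": name,
        "params": params,
        "input_digest": hashlib.sha256(
            json.dumps(inputs, sort_keys=True, default=str).encode()
        ).hexdigest(),
        "timestamp": time.time(),
    }
    return {"result": result, "provenance": record}

# Hypothetical usage: a moving-average filter over a raw signal.
def moving_average(signal, window):
    return [sum(signal[i:i + window]) / window for i in range(len(signal) - window + 1)]

out = run_with_provenance(moving_average, "smooth", {"signal": [1, 4, 2, 8, 5, 7]}, {"window": 3})
print(out["provenance"])
```

The design intent illustrated here is that different analysis techniques can be swapped in for the step while the raw data and the provenance record remain unchanged, so a third party can re-run and verify any derived conclusion.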
Abstract:
While numerous studies find that deep-saline sandstone aquifers in the United States could store many decades' worth of the nation's current annual CO2 emissions, the likely cost of this storage (i.e. the cost of storage only, not capture and transport costs) has been harder to constrain. We use publicly available data on key reservoir properties to produce geo-referenced rasters of estimated storage capacity and cost for regions within 15 deep-saline sandstone aquifers in the United States. The rasters reveal the reservoir quality of these aquifers to be so variable that the cost estimates for storage span three orders of magnitude and average >$100/tonne CO2. However, when the cost and corresponding capacity estimates in the rasters are assembled into a marginal abatement cost curve (MACC), we find that ~75% of the estimated storage capacity could be available for <$2/tonne. Furthermore, ~80% of the total estimated storage capacity in the rasters is concentrated within just two of the aquifers: the Frio Formation along the Texas Gulf Coast and the Mt. Simon Formation in the Michigan Basin, which together make up only ~20% of the areas analyzed. While our assessment is not comprehensive, the results suggest there should be an abundance of low-cost storage for CO2 in deep-saline aquifers, but a majority of this storage is likely to be concentrated within specific regions of a smaller number of these aquifers.
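A minimal sketch of assembling a marginal abatement cost curve (MACC) from per-cell capacity and cost estimates of the kind described above; the synthetic arrays stand in for the geo-referenced rasters and all numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical raster cells: capacity in Mt CO2 and cost in $/tonne (log-normal,
# spanning several orders of magnitude as the abstract reports).
capacity_mt = rng.uniform(1, 50, size=5_000)
cost_per_t = rng.lognormal(mean=1.0, sigma=2.0, size=5_000)

# Sort cells by cost and accumulate capacity to obtain the MACC.
order = np.argsort(cost_per_t)
sorted_cost = cost_per_t[order]
cum_capacity = np.cumsum(capacity_mt[order])

# Share of total capacity available below a given cost threshold, e.g. $2/tonne.
idx = np.searchsorted(sorted_cost, 2.0)   # number of cells costing < $2/tonne
share_below_2 = (cum_capacity[idx - 1] / cum_capacity[-1]) if idx > 0 else 0.0
print(f"Capacity available below $2/tonne: {share_below_2:.1%}")
```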
Abstract:
Nolan and Temple Lang argue that “the ability to express statistical computations is an essential skill.” A key related capacity is the ability to conduct and present data analysis in a way that another person can understand and replicate. The copy-and-paste workflow that is an artifact of antiquated user-interface design makes reproducibility of statistical analysis more difficult, especially as data become increasingly complex and statistical methods become increasingly sophisticated. R Markdown is a new technology that makes creating fully-reproducible statistical analysis simple and painless. It provides a solution suitable not only for cutting edge research, but also for use in an introductory statistics course. We present experiential and statistical evidence that R Markdown can be used effectively in introductory statistics courses, and discuss its role in the rapidly-changing world of statistical computation.
Abstract:
Tissue-engineered skeletal muscle can serve as a physiological model of natural muscle and a potential therapeutic vehicle for rapid repair of severe muscle loss and injury. Here, we describe a platform for engineering and testing highly functional biomimetic muscle tissues with a resident satellite cell niche and capacity for robust myogenesis and self-regeneration in vitro. Using a mouse dorsal window implantation model and transduction with fluorescent intracellular calcium indicator, GCaMP3, we nondestructively monitored, in real time, vascular integration and the functional state of engineered muscle in vivo. During a 2-wk period, implanted engineered muscle exhibited a steady ingrowth of blood-perfused microvasculature along with an increase in amplitude of calcium transients and force of contraction. We also demonstrated superior structural organization, vascularization, and contractile function of fully differentiated vs. undifferentiated engineered muscle implants. The described in vitro and in vivo models of biomimetic engineered muscle represent enabling technology for novel studies of skeletal muscle function and regeneration.
Abstract:
It is commonly accepted that aerobic exercise increases hippocampal neurogenesis, learning and memory, as well as stress resiliency. However, human populations are widely variable in their inherent aerobic fitness as well as their capacity to show increased aerobic fitness following a period of regimented exercise. It is unclear whether these inherent or acquired components of aerobic fitness play a role in neurocognition. To isolate the potential role of inherent aerobic fitness, we exploited a rat model of high (HCR) and low (LCR) inherent aerobic capacity for running. At baseline, HCR rats have two- to three-fold higher aerobic capacity than LCR rats. We found that HCR rats also had two- to three-fold more young neurons in the hippocampus than LCR rats as well as rats from the heterogeneous founder population. We then asked whether this enhanced neurogenesis translates to enhanced hippocampal cognition, as is typically seen in exercise-trained animals. Compared to LCR rats, HCR rats performed with high accuracy on tasks designed to test neurogenesis-dependent pattern separation ability by examining investigatory behavior between very similar objects or locations. To investigate whether an aerobic response to exercise is required for exercise-induced changes in neurogenesis and cognition, we utilized a rat model of high (HRT) and low (LRT) aerobic response to treadmill training. At baseline, HRT and LRT rats have comparable aerobic capacity as measured by a standard treadmill fit test, yet after a standardized training regimen, HRT but not LRT rats robustly increase their aerobic capacity for running. We found that sedentary LRT and HRT rats had equivalent levels of hippocampal neurogenesis, but only HRT rats had an elevation in the number of young neurons in the hippocampus following training, which was positively correlated with accuracy on pattern separation tasks. Taken together, these data suggest that a significant elevation in aerobic capacity is necessary for exercise-induced hippocampal neurogenesis and hippocampal neurogenesis-dependent learning and memory. To investigate the potential for high aerobic capacity to be neuroprotective, doxorubicin chemotherapy was administered to LCR and HCR rats. While doxorubicin induces a progressive decrease in aerobic capacity as well as neurogenesis, HCR rats remain at higher levels on those measures compared to even saline-treated LCR rats. HCR and LCR rats that received exercise training throughout doxorubicin treatment demonstrated positive effects of exercise on aerobic capacity and neurogenesis, regardless of inherent aerobic capacity. Overall, these findings demonstrate that inherent and acquired components of aerobic fitness play a crucial role not only in the cardiorespiratory system but also in the fitness of the brain.
Abstract:
Dopamine is an important central nervous system transmitter that functions through two classes of receptors (D1 and D2) to influence a diverse range of biological processes in vertebrates. With roles in regulating neural activity, behavior, and gene expression, there has been great interest in understanding the function and evolution of dopamine and its receptors. In this study, we use a combination of sequence analyses, microsynteny analyses, and phylogenetic relationships to identify and characterize both the D1 (DRD1A, DRD1B, DRD1C, and DRD1E) and D2 (DRD2, DRD3, and DRD4) dopamine receptor gene families in 43 recently sequenced bird genomes representing the major ordinal lineages across the avian family tree. We show that the common ancestor of all birds possessed at least seven D1 and D2 receptors, followed by subsequent independent losses in some lineages of modern birds. Through comparisons with other vertebrate and invertebrate species we show that two of the D1 receptors, DRD1A and DRD1B, and two of the D2 receptors, DRD2 and DRD3, originated from a whole genome duplication event early in the vertebrate lineage, providing the first conclusive evidence of the origin of these highly conserved receptors. Our findings provide insight into the evolutionary development of an important modulatory component of the central nervous system in vertebrates, and will help further unravel the complex evolutionary and functional relationships among dopamine receptors.
Abstract:
The Miyun Reservoir, the only surface water source for Beijing city, has experienced a decline in water supply in recent decades. Previous studies suggest that both land use change and climate variability contribute to changes in water supply in this critical watershed. However, the specific causes of the decline in the Miyun Reservoir remain debatable under the non-stationary climate of the past four decades. The central objective of this study was to quantify the separate and collective contributions of land use change and climate variability to the decreasing inflow into the Miyun Reservoir during 1961–2008. Unlike previous studies on this watershed, we used a comprehensive approach to quantify the timing of changes in hydrology and associated environmental variables using long-term historical hydrometeorological and remote-sensing-based land use records. To effectively quantify the respective impacts of climate variation and land use change on streamflow during different sub-periods, an annual water balance model (AWB), the climate elasticity model (CEM), and a rainfall–runoff model (RRM) were employed jointly to conduct the attribution analysis. We found a significant (p < 0.01) decrease in annual streamflow, a significant positive trend in annual potential evapotranspiration (p < 0.01), and an insignificant (p > 0.1) negative trend in annual precipitation during 1961–2008. We identified two streamflow breakpoints, 1983 and 1999, by the sequential Mann–Kendall test and double-mass curve. Climate variability alone did not explain the decrease in inflow to the Miyun Reservoir. The reduction in water yield was closely related to an increase in actual evapotranspiration due to the expansion of forestland and reduction in cropland and grassland, and was likely exacerbated by increased water consumption for domestic and industrial uses in the basin. The contribution to the observed streamflow decline from land use change fell from 64–92 % during 1984–1999 to 36–58 % during 2000–2008, whereas the contribution from climate variation climbed from 8–36 % during 1984–1999 to 42–64 % during 2000–2008. Model uncertainty analysis further demonstrated that climate warming played a dominant role in streamflow reduction in the most recent decade (i.e., the 2000s). We conclude that future climate change and variability will further challenge the water supply capacity of the Miyun Reservoir to meet water demand. A comprehensive watershed management strategy needs to consider climate variations in addition to vegetation management in the study basin.
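A minimal sketch of a simplified, single-elasticity version of the climate-elasticity style of attribution described above, separating a streamflow change into climate-driven and land-use-driven components; the period means and the elasticity value are illustrative assumptions, not results from the study.

```python
# Mean annual values for a baseline period and a change period (mm/yr, hypothetical).
P1, Q1 = 600.0, 150.0        # baseline precipitation and streamflow
P2, Q2 = 560.0, 100.0        # post-change precipitation and streamflow

# Precipitation elasticity of streamflow (assumed here; normally estimated from
# the long-term record): a 1% change in P gives roughly a 2% change in Q.
eps_P = 2.0

dQ_total = Q2 - Q1
dQ_climate = eps_P * (P2 - P1) / P1 * Q1   # change expected from climate alone
dQ_landuse = dQ_total - dQ_climate         # residual attributed to land use change

print(f"Total change:   {dQ_total:6.1f} mm/yr")
print(f"Climate share:  {dQ_climate / dQ_total:.0%}")
print(f"Land-use share: {dQ_landuse / dQ_total:.0%}")
```

A full analysis of the kind summarized above would also include an elasticity term for potential evapotranspiration and cross-check the result against the water balance and rainfall-runoff models.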