Abstract:
This paper presents a stability analysis, based on bifurcation theory, for a distribution static compensator (DSTATCOM) operating in current control mode. Bifurcations delimit the operating zones of nonlinear circuits and, hence, the capability to compute these bifurcations is of considerable interest for practical design. A control design for the DSTATCOM is proposed. Along with this control, a suitable mathematical representation of the DSTATCOM is developed to carry out the bifurcation analysis efficiently. The stability regions in the Thevenin equivalent plane are computed for different power factors at the point of common coupling. In addition, the stability regions in the control gain space, as well as the contour lines for different Floquet multipliers, are computed. It is demonstrated through bifurcation analysis that the loss of stability in the DSTATCOM is due to the emergence of a Neimark bifurcation. The observations are verified through simulation studies.
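The abstract does not give the DSTATCOM state equations, so the sketch below uses a generic periodically forced oscillator only to illustrate the Floquet-multiplier computation it refers to: integrate the variational equations over one forcing period, form the monodromy matrix, and flag a Neimark(-Sacker) bifurcation when a complex-conjugate pair of multipliers leaves the unit circle. The system, gain, and forcing frequency are assumptions for illustration, not the paper's model.

```python
# Minimal Floquet-multiplier sketch for a periodically driven system.
# A generic forced oscillator stands in for the DSTATCOM model; the
# procedure (state + variational equations over one period, eigenvalues
# of the monodromy matrix) is the same.
import numpy as np
from scipy.integrate import solve_ivp

OMEGA = 2.0 * np.pi * 50.0           # assumed 50 Hz forcing
T = 2.0 * np.pi / OMEGA              # one forcing period

def f(t, x, gain):
    """Stand-in nonlinear state equations, x = [x1, x2]."""
    x1, x2 = x
    return [x2, -gain * x1 - 0.1 * x2 - x1**3 + np.cos(OMEGA * t)]

def jacobian(t, x, gain):
    """Jacobian of f with respect to the state."""
    x1, _ = x
    return np.array([[0.0, 1.0],
                     [-gain - 3.0 * x1**2, -0.1]])

def floquet_multipliers(gain, x0):
    """Integrate state + variational equations over one period and return
    the eigenvalues of the monodromy matrix."""
    n = len(x0)

    def augmented(t, y):
        x, phi = y[:n], y[n:].reshape(n, n)
        dphi = jacobian(t, x, gain) @ phi
        return np.concatenate([f(t, x, gain), dphi.ravel()])

    y0 = np.concatenate([x0, np.eye(n).ravel()])
    sol = solve_ivp(augmented, [0.0, T], y0, rtol=1e-9, atol=1e-9)
    monodromy = sol.y[n:, -1].reshape(n, n)
    return np.linalg.eigvals(monodromy)

# In practice x0 would first be placed on the periodic orbit (e.g. by a
# shooting method); a Neimark(-Sacker) bifurcation is flagged when a
# complex-conjugate pair of multipliers crosses the unit circle.
mults = floquet_multipliers(gain=5.0, x0=np.array([0.0, 0.0]))
print(mults, np.abs(mults))
```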
Abstract:
Agriculture's contribution to radiative forcing is principally through its historical release of carbon in soil and vegetation to the atmosphere and through its contemporary release of nitrous oxide (N2O) and methane (CH4). The sequestration of carbon in soils now depleted in soil organic matter is a well-known strategy for mitigating the buildup of CO2 in the atmosphere. Less well-recognized are other mitigation potentials. A full-cost accounting of the effects of agriculture on greenhouse gas emissions is necessary to quantify the relative importance of all mitigation options. Such an analysis shows nitrogen fertilizer, agricultural liming, fuel use, N2O emissions, and CH4 fluxes to have additional significant potential for mitigation. By evaluating all sources in terms of their global warming potential it becomes possible to directly evaluate greenhouse policy options for agriculture. A comparison of temperate and tropical systems illustrates some of these options.
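The "common currency" step mentioned above, expressing all sources in terms of global warming potential, can be sketched in a few lines. The GWP factors below are the widely used IPCC 100-year values; the flux numbers are hypothetical, not figures from the paper.

```python
# Converting N2O and CH4 fluxes to CO2 equivalents with 100-year global
# warming potentials (IPCC AR4 values). Flux numbers are made up.
GWP100 = {"CO2": 1.0, "CH4": 25.0, "N2O": 298.0}

def co2_equivalent(fluxes_kg_per_ha):
    """Sum gas fluxes (kg gas / ha / yr) weighted by their GWP."""
    return sum(GWP100[gas] * flux for gas, flux in fluxes_kg_per_ha.items())

cropping_system = {"CO2": 120.0, "N2O": 2.0, "CH4": -1.5}  # hypothetical
print(f"{co2_equivalent(cropping_system):.0f} kg CO2-eq / ha / yr")
```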
Abstract:
Global climate change may induce accelerated soil organic matter (SOM) decomposition through increased soil temperature, and thus impact the C balance in soils. We hypothesized that compartmentalization of substrates and decomposers in the soil matrix would decrease SOM sensitivity to temperature. We tested our hypothesis with three short-term laboratory incubations with differing physical protection treatments conducted at different temperatures. Overall, CO2 efflux increased with temperature, but responses among physical protection treatments were not consistently different. Similar respiration quotient (Q(10)) values across physical protection treatments did not support our original hypothesis that the largest Q(10) values would be observed in the treatment with the least physical protection. Compartmentalization of substrates and decomposers is known to reduce the decomposability of otherwise labile material, but the hypothesized attenuation of temperature sensitivity was not detected, and thus the sensitivity is probably driven by the thermodynamics of biochemical reactions as expressed by Arrhenius-type equations.
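For readers unfamiliar with the Q(10) index used above: it is the factor by which respiration changes for a 10 °C increase in temperature, estimated here from CO2 efflux measured at two incubation temperatures. The sketch below uses made-up rates purely to show the arithmetic.

```python
# Q10 temperature sensitivity from respiration rates at two temperatures.
# Rates and temperatures are illustrative values, not the study's data.
def q10(rate_warm, rate_cool, t_warm, t_cool):
    return (rate_warm / rate_cool) ** (10.0 / (t_warm - t_cool))

print(q10(rate_warm=3.2, rate_cool=1.5, t_warm=25.0, t_cool=15.0))  # ~2.1
```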
Abstract:
The relationship between organic matter (OM) lability and temperature sensitivity is disputed, with recent observations suggesting that responses of relatively more resistant OM to increased temperature could be greater than, equivalent to, or less than responses of relatively more labile OM. This lack of clear understanding limits the ability to forecast carbon (C) cycle responses to temperature changes. Here, we derive a novel approach (denoted Q(10-q)) that accounts for changes in OM quality during decomposition and use it to analyze data from three independent sources. Results from new laboratory soil incubations (labile Q(10-q)=2.1 +/- 0.2; more resistant Q(10-q)=3.8 +/- 0.3) and reanalysis of data from other soil incubations reported in the literature (labile Q(10-q)=2.3; more resistant Q(10-q)=3.3) demonstrate that temperature sensitivity of soil OM decomposition increases with decreasing soil OM lability. Analysis of data from a cross-site, field litter bag decomposition study (labile Q(10-q)=3.3 +/- 0.2; resistant Q(10-q)=4.9 +/- 0.2) shows that litter OM follows the same pattern, with greater temperature sensitivity for more resistant litter OM. Furthermore, the initial response of cultivated soils, presumably containing less labile soil OM (Q(10-q)=2.4 +/- 0.3) was greater than that for undisturbed grassland soils (Q(10-q)=1.7 +/- 0.1). Soil C losses estimated using this approach will differ from previous estimates as a function of the magnitude of the temperature increase and the proportion of whole soil OM comprised of compounds sensitive to temperature over that temperature range. It is likely that increased temperature has already prompted release of significant amounts of C to the atmosphere as CO2. Our results indicate that future losses of litter and soil C may be even greater than previously supposed.
Abstract:
The current paradigm in soil organic matter (SOM) dynamics is that the proportion of biologically resistant SOM will increase when total SOM decreases. Recently, several studies have focused on identifying functional pools of resistant SOM consistent with expected behaviours. Our objective was to combine physical and chemical approaches to isolate and quantify biologically resistant SOM by applying acid hydrolysis treatments to physically isolated silt- and clay-sized soil fractions. Microaggregate-derived and easily dispersed silt- and clay-sized fractions were isolated from surface soil samples collected from six long-term agricultural experiment sites across North America. These fractions were hydrolysed to quantify the non-hydrolysable fraction, which was hypothesized to represent a functional pool of resistant SOM. Organic C and total N concentrations in the four isolated fractions decreased in the order: native > no-till > conventional-till at all sites. Concentrations of non-hydrolysable C (NHC) and N (NHN) were strongly correlated with initial concentrations, and C hydrolysability was found to be invariant with management treatment. Organic C was less hydrolysable than N, and overall, resistance to acid hydrolysis was greater in the silt-sized fractions compared with the clay-sized fractions. The acid hydrolysis results are inconsistent with the expected behaviour of increasing recalcitrance with decreasing SOM content: while %NHN was greater in cultivated soils compared with their native analogues, %NHC did not increase with decreasing total organic C concentrations. The analyses revealed an interaction between biochemical and physical protection mechanisms that acts to preserve SOM in fine mineral fractions, but the inconsistency of the pool size with expected behaviour remains to be fully explained.
Impact of soil texture on the distribution of soil organic matter in physical and chemical fractions
Abstract:
Previous research on the protection of soil organic C from decomposition suggests that soil texture affects soil C stocks. However, different pools of soil organic matter (SOM) might be differently related to soil texture. Our objective was to examine how soil texture differentially alters the distribution of organic C within physically and chemically defined pools of unprotected and protected SOM. We collected samples from two soil texture gradients where other variables influencing soil organic C content were held constant. One texture gradient (16-60% clay) was located near Stewart Valley, Saskatchewan, Canada, and the other (25-50% clay) near Cygnet, OH. Soils were physically fractionated into coarse- and fine-particulate organic matter (POM), silt- and clay-sized particles within microaggregates, and easily dispersed silt- and clay-sized particles outside of microaggregates. Whole-soil organic C concentration was positively related to silt plus clay content at both sites. We found no relationship between soil texture and unprotected C (coarse- and fine-POM C). Biochemically protected C (nonhydrolyzable C) increased with increasing clay content in whole-soil samples, but the proportion of nonhydrolyzable C within silt- and clay-sized fractions was unchanged. As the amount of silt or clay increased, the amount of C stabilized within easily dispersed and microaggregate-associated silt or clay fractions decreased. Our results suggest that for a given level of C inputs, the relationship between mineral surface area and soil organic matter varies with soil texture for physically and biochemically protected C fractions. Because soil texture acts directly and indirectly on various protection mechanisms, it may not be a universal predictor of whole-soil C content.
Abstract:
The relationship between soil structure and the ability of soil to stabilize soil organic matter (SOM) is a key element in soil C dynamics that has either been overlooked or treated in a cursory fashion when developing SOM models. The purpose of this paper is to review current knowledge of SOM dynamics within the framework of a newly proposed soil C saturation concept. Initially, we distinguish SOM that is protected against decomposition by various mechanisms from that which is not protected from decomposition. Methods of quantification and characteristics of three SOM pools defined as protected are discussed. Soil organic matter can be: (1) physically stabilized, or protected from decomposition, through microaggregation, (2) protected through intimate association with silt and clay particles, or (3) biochemically stabilized through the formation of recalcitrant SOM compounds. In addition to the behavior of each SOM pool, we discuss implications of changes in land management on the processes by which SOM compounds undergo protection and release. The characteristics and responses to changes in land use or land management are described for the light fraction (LF) and particulate organic matter (POM). We define the LF and POM not occluded within microaggregates (53-250 μm sized aggregates) as unprotected. Our conclusions are illustrated in a new conceptual SOM model that differs from most SOM models in that its state variables are measurable SOM pools. We suggest that physicochemical characteristics inherent to soils define the maximum protective capacity of these pools, which limits increases in SOM (i.e. C sequestration) with increased organic residue inputs.
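The saturation idea in the closing sentences lends itself to a small numerical illustration. The sketch below is not the authors' model; it is a minimal, hypothetical one-pool version in which the efficiency with which new inputs become protected falls off as the protected pool approaches an assumed maximum protective capacity, so doubling inputs does not double steady-state storage.

```python
# Hypothetical one-pool illustration of the C-saturation concept.
# All parameter values are assumptions chosen only for illustration.
def simulate(inputs, years=500, c0=10.0, dt=1.0,
             c_max=40.0, k_out=0.02, efficiency=0.1):
    """Integrate dC/dt = eff*(1 - C/C_max)*I - k_out*C with forward Euler."""
    c = c0
    for _ in range(int(years / dt)):
        protection = efficiency * (1.0 - c / c_max) * inputs
        c += dt * (protection - k_out * c)
    return c

for i in (2.0, 4.0, 8.0):   # residue inputs, Mg C / ha / yr
    print(f"inputs {i:>3} -> protected C near steady state: {simulate(i):5.1f}")
```

With these assumed parameters the protected pool rises from roughly 8 to 13 to 20 units as inputs double twice, illustrating how a finite protective capacity limits C sequestration gains from increased residue inputs.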
Abstract:
Governments around the world are increasingly investing in information and communications technology (ICT) as a means of improving service delivery to citizens. Government ICT adoption is also being driven by a desire to streamline information accessibility and information flows within government - both between different levels of government and between different departments at the same level. Increasing the availability of information internally and to citizens has clear and compelling benefits but it also carries risks that must be carefully managed. This talk will examine the implications of such E-government initiatives for a range of compliance obligations, with a focus on information privacy. It will review recent developments in the area of systems-based enforcement of privacy policies and the particular privacy challenges presented by the aggregation of geospatial information.
Abstract:
Focuses on a study that introduced an iterative modeling method combining properties of ordinary least squares (OLS) with hierarchical tree-based regression (HTBR) in transportation engineering. Information on OLS and HTBR; comparison and contrast of OLS and HTBR; conclusions.
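The abstract does not spell out the iterative OLS/HTBR procedure itself, so the sketch below shows only one plausible way of combining the two ideas: let a regression tree partition the observations, then fit an ordinary least-squares model within each leaf. The data and variable names are hypothetical.

```python
# Tree partition + per-leaf OLS: an illustrative hybrid of HTBR and OLS.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(500, 2))          # e.g. traffic volume, speed
y = np.where(X[:, 0] > 5, 3.0, 1.0) * X[:, 1] + rng.normal(0, 1, 500)

tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=50).fit(X, y)
leaf_ids = tree.apply(X)                       # leaf index for each row

leaf_models = {}
for leaf in np.unique(leaf_ids):
    mask = leaf_ids == leaf
    leaf_models[leaf] = LinearRegression().fit(X[mask], y[mask])

def predict(x_new):
    """Route each observation to its leaf, then apply that leaf's OLS fit."""
    leaves = tree.apply(x_new)
    return np.array([leaf_models[l].predict(x_new[[i]])[0]
                     for i, l in enumerate(leaves)])

print(predict(X[:5]), y[:5])
```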
Abstract:
Measures and theories of information abound, but there are few formalised methods for treating the contextuality that can manifest in different information systems. Quantum theory provides one possible formalism for treating information in context. This paper introduces a quantum-like model of the human mental lexicon, and shows one set of recent experimental data suggesting that concept combinations can indeed behave non-separably. There is some reason to believe that the human mental lexicon displays entanglement.
Abstract:
Architecture for a Free Subjectivity reformulates the French philosopher Gilles Deleuze's model of subjectivity for architecture, by surveying the prolific effects of architectural encounter, and the spaces that figure in them. For Deleuze and his Lacanian collaborator Félix Guattari, subjectivity does not refer to a person, but to the potential for and event of matter becoming subject, and the myriad ways for this to take place. By extension, this book theorizes architecture as a self-actuating or creative agency for the liberation of purely "impersonal effects." Imagine a chemical reaction, a riot in the banlieues, indeed a walk through a city. Simone Brott declares that the architectural object does not merely take part in the production of subjectivity, but that it constitutes its own.
Abstract:
Many developing countries are afflicted by persistent inequality in the distribution of income. While a growing body of literature emphasizes differential fertility as a channel through which income inequality persists, this paper investigates differential child mortality – differences in the incidence of child mortality across socioeconomic groups – as a critical link in this regard. Using evidence from cross-country data to evaluate this linkage, we find that differential child mortality serves as a stronger channel than differential fertility in the transmission of income inequality over time. We use random effects and generalized estimating equations techniques to account for temporal correlation within countries. The results are robust to the use of an alternate definition of fertility that reflects parental preference for children instead of realized fertility.
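A generalized estimating equations fit with an exchangeable working correlation, of the kind the abstract mentions for handling temporal correlation within countries, can be set up roughly as below. The data frame and variable names are hypothetical placeholders, not the study's data.

```python
# GEE with an exchangeable working correlation for country panels.
# Synthetic data; variable names (gini, child_mort_gap, fertility_gap)
# are stand-ins for the study's measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_countries, n_years = 40, 10
df = pd.DataFrame({
    "country": np.repeat(np.arange(n_countries), n_years),
    "child_mort_gap": rng.normal(size=n_countries * n_years),
    "fertility_gap": rng.normal(size=n_countries * n_years),
})
df["gini"] = (0.4 * df["child_mort_gap"] + 0.1 * df["fertility_gap"]
              + rng.normal(scale=0.5, size=len(df)))

model = smf.gee("gini ~ child_mort_gap + fertility_gap", "country", df,
                family=sm.families.Gaussian(),
                cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```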
Abstract:
The need for the development of effective business curricula that meet the needs of the marketplace has created an increase in the adoption of core competencies lists identifying appropriate graduate skills. Many organisations and tertiary institutions have individual graduate capabilities lists including skills deemed essential for success. Skills recognised as ‘critical thinking’ are popular inclusions on core competencies and graduate capability lists. While there is literature outlining ‘critical thinking’ frameworks, methods of teaching it and calls for its integration into business curricula, few studies actually identify quantifiable improvements achieved in this area. This project sought to address the development of ‘critical thinking’ skills in a management degree program by embedding a process for critical thinking within a theory unit undertaken by students early in the program. Focus groups and a student survey were used to identify issues of both content and implementation and to develop a student perspective on their needs in thinking critically. A process utilising a framework of critical thinking was integrated through a workbook of weekly case studies for group analysis, discussions and experiential exercises. The experience included formative and summative assessment. Initial results indicate a greater valuation by students of their experience in the organisation theory unit, better marks for mid-semester essay assignments and higher evaluations on the university-administered survey of students’ satisfaction.
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows a Bernoulli trial with unequal probability of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to “excess” zeros frequently observed in crash data. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales and not an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
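The simulation argument summarized above, that low exposure alone can generate an apparent excess of zeros, can be reproduced in miniature. The per-vehicle crash probability and exposure range below are illustrative assumptions, not the paper's experimental settings.

```python
# Crashes as Poisson trials: many sites with a small per-trial crash
# probability and low exposure record zero crashes in an observation
# period, even though no site is "perfectly safe".
import numpy as np

rng = np.random.default_rng(42)
n_sites = 2000
exposure = rng.uniform(50, 500, n_sites)      # vehicles passing per period
p_crash = 1e-4                                # per-vehicle crash probability

counts = rng.binomial(exposure.astype(int), p_crash)  # Bernoulli/Poisson trials

print("share of zero-crash sites:", np.mean(counts == 0))
print("mean crash count per site:", counts.mean())
```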
Abstract:
Statisticians along with other scientists have made significant computational advances that enable the estimation of formerly complex statistical models. The Bayesian inference framework combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler enable the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thus relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. This paper then concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
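As a rough illustration of Bayesian MNL estimation by MCMC (not the paper's implementation, which the abstract associates with Gibbs sampling), the sketch below fits a two-coefficient multinomial logit to synthetic route-choice data with a random-walk Metropolis sampler and a vague normal prior. All data-generating values and tuning constants are assumptions.

```python
# Bayesian multinomial logit via random-walk Metropolis on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_alt, n_feat = 500, 3, 2
X = rng.normal(size=(n_obs, n_alt, n_feat))     # e.g. travel time, cost
beta_true = np.array([-1.0, -0.5])

util = X @ beta_true
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(n_alt, p=p) for p in prob])   # chosen route

def log_post(beta):
    """MNL log-likelihood plus a vague N(0, 10^2) prior on each coefficient."""
    u = X @ beta
    u -= u.max(axis=1, keepdims=True)            # numerical stability
    loglik = np.sum(u[np.arange(n_obs), y] - np.log(np.exp(u).sum(axis=1)))
    logprior = -0.5 * np.sum(beta**2) / 100.0
    return loglik + logprior

draws, beta, lp = [], np.zeros(n_feat), log_post(np.zeros(n_feat))
for _ in range(5000):
    prop = beta + 0.1 * rng.normal(size=n_feat)  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:     # Metropolis accept/reject
        beta, lp = prop, lp_prop
    draws.append(beta.copy())

print("posterior means:", np.mean(draws[1000:], axis=0))  # near beta_true
```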