907 results for minimal ontological overlap
Abstract:
Nitrous oxide (N2O) is primarily produced by the microbially-mediated nitrification and denitrification processes in soils. It is influenced by a suite of climate (i.e. temperature and rainfall) and soil (physical and chemical) variables, interacting soil and plant nitrogen (N) transformations (either competing or supplying substrates) as well as land management practices. It is not surprising that N2O emissions are highly variable both spatially and temporally. Computer simulation models, which can integrate all of these variables, are required for the complex task of providing quantitative determinations of N2O emissions. Numerous simulation models have been developed to predict N2O production. Each model has its own philosophy in constructing simulation components as well as performance strengths. The models range from those that attempt to comprehensively simulate all soil processes to more empirical approaches requiring minimal input data. These N2O simulation models can be classified into three categories: laboratory, field and regional/global levels. Process-based field-scale N2O simulation models, which simulate whole agroecosystems and can be used to develop N2O mitigation measures, are the most widely used. The current challenge is how to scale up the relatively more robust field-scale model to catchment, regional and national scales. This paper reviews the development history, main construction components, strengths, limitations and applications of N2O emissions models, which have been published in the literature. The three scale levels are considered and the current knowledge gaps and challenges in modelling N2O emissions from soils are discussed.
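The empirical end of the model spectrum described above, where minimal input data drive a small set of response functions, can be sketched as follows. This is a minimal illustration only: the response functions, coefficients and units are assumptions for demonstration, not taken from any published N2O model.

```python
import math

def n2o_flux(soil_temp_c, wfps, mineral_n):
    """Toy empirical N2O flux estimate (illustrative assumptions only).

    Multiplies independent response functions for temperature,
    water-filled pore space (WFPS, 0-1) and mineral N, a common
    pattern in simple empirical emission models.
    """
    f_temp = math.exp(0.07 * (soil_temp_c - 20.0))   # Q10-style temperature response
    f_water = max(0.0, (wfps - 0.4) / 0.6) ** 2      # denitrification favoured at high WFPS
    f_n = mineral_n / (mineral_n + 10.0)             # saturating substrate limitation
    base_rate = 0.5                                  # notional flux at reference conditions
    return base_rate * f_temp * f_water * f_n
```

Process-based field-scale models replace each of these lumped response functions with explicit simulations of the underlying nitrification and denitrification processes.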
Abstract:
With the increase in the level of global warming, renewable energy based distributed generators (DGs) will increasingly play a dominant role in electricity production. Distributed generation based on solar energy (photovoltaic and solar thermal), wind, biomass and mini-hydro, along with the use of fuel cells and micro turbines, will gain considerable momentum in the near future. A microgrid consists of clusters of loads and distributed generators that operate as a single controllable system. The interconnection of DGs to the utility grid through power electronic converters has raised concerns about the safe operation and protection of the equipment. Many innovative control techniques have been used for enhancing the stability of microgrids as well as for proper load sharing. The most common method is the use of droop characteristics for decentralized load sharing. Parallel converters have been controlled to deliver desired real power (and reactive power) to the system. Local signals are used as feedback to control the converters, since in a real system the distance between the converters may make inter-communication impractical. Real and reactive power sharing can be achieved by controlling two independent quantities: the frequency and the fundamental voltage magnitude. In this thesis, an angle droop controller is proposed to share power amongst converter-interfaced DGs in a microgrid. As the angle of the output voltage can be changed instantaneously in a voltage source converter (VSC), controlling the angle to control the real power is beneficial for quick attainment of steady state. Thus, in converter-based DGs, load sharing can be performed by drooping the converter output voltage magnitude and its angle instead of the frequency. Angle control results in much smaller frequency variation compared with frequency droop. An enhanced frequency droop controller is also proposed for better dynamic response and smooth transition between grid-connected and islanded modes of operation.
A modular controller structure with a modified control loop is proposed for better load sharing between the parallel-connected converters in a distributed generation system. Moreover, a method for smooth transition between grid-connected and islanded modes is proposed. Power quality enhanced operation of a microgrid in the presence of unbalanced and non-linear loads is also addressed, in which the DGs act as compensators. The compensator can perform load balancing, harmonic compensation and reactive power control while supplying real power to the grid. A frequency and voltage isolation technique between the microgrid and the utility is proposed by using a back-to-back converter. As the utility and microgrid are totally isolated, voltage or frequency fluctuations on the utility side do not affect the microgrid loads, and vice versa. Another advantage of this scheme is that bidirectional regulated power flow can be achieved by the back-to-back converter structure. For accurate load sharing, the droop gains have to be high, which has the potential of making the system unstable. Therefore the choice of droop gains is often a tradeoff between power sharing and stability. To improve this situation, a supplementary droop controller is proposed. A small signal model of the system is developed, based on which the parameters of the supplementary controller are designed. Two methods are proposed for load sharing in an autonomous microgrid in a rural network with high R/X ratio lines. The first method achieves power sharing without any communication between the DGs: the feedback quantities and the gain matrices are transformed with a transformation matrix based on the line R/X ratio. The second method involves minimal communication among the DGs: the converter output voltage angle reference is modified based on the active and reactive power flow in the line connected at the point of common coupling (PCC).
It is shown that a more economical and proper power sharing solution is possible with web-based communication of the power flow quantities. All the proposed methods are verified through PSCAD simulations. The converters are modeled with IGBT switches and anti-parallel diodes with associated snubber circuits. All the rotating machines are modeled in detail, including their dynamics.
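The angle droop idea described in this abstract can be sketched as a pair of reference equations: the VSC output-voltage angle is drooped with real power and the voltage magnitude with reactive power, in place of conventional frequency droop. The set-points and gains below are illustrative assumptions, not values from the thesis.

```python
def angle_droop(p, q, p_rated, q_rated, delta_rated, v_rated, m, n):
    """Angle droop sketch for a converter-interfaced DG.

    Droops the output-voltage angle (rad) with measured real power p
    and the voltage magnitude (pu) with measured reactive power q.
    m and n are the (illustrative) angle and voltage droop gains.
    """
    delta_ref = delta_rated - m * (p - p_rated)  # angle reference
    v_ref = v_rated - n * (q - q_rated)          # voltage magnitude reference
    return delta_ref, v_ref
```

With equal droop gains scaled inversely to the converter ratings, DGs share load in proportion to their ratings, and because the VSC angle can be changed instantaneously, the system frequency itself need not deviate as it does under frequency droop.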
Abstract:
Agricultural management affects soil organic matter, which is important for sustainable crop production and as a greenhouse gas sink. Our objective was to determine how tillage, residue management and N fertilization affect organic C in unprotected, and physically, chemically and biochemically protected soil C pools. Samples from Breton, Alberta were fractionated and analysed for organic C content. As in previous reports, N fertilization had a positive effect, tillage had a minimal effect, and straw management had no effect on whole-soil organic C. Tillage and straw management did not alter organic C concentrations in the isolated C pools, while N fertilization increased C concentrations in all pools. Compared with a woodlot soil, the cultivated plots had lower total organic C, and the C was redistributed among isolated pools. The free light fraction and coarse particulate organic matter responded positively to C inputs, suggesting that much of the accumulated organic C occurred in an unprotected pool. The easily dispersed silt-sized fraction was the mineral-associated pool most responsive to changes in C inputs, whereas the microaggregate-derived silt-sized fraction best preserved C upon cultivation. These findings suggest that the silt-sized fraction is important for the long-term stabilization of organic matter through both physical occlusion in microaggregates and chemical protection by mineral association.
Abstract:
Since land use change can have significant impacts on regional biogeochemistry, we investigated how conversion of forest and cultivation to pasture impacts soil C and N cycling. In addition to examining total soil C, we isolated soil physicochemical C fractions in order to understand the mechanisms by which soil C is sequestered or lost. Total soil C did not change significantly over time following conversion from forest, though coarse (250–2,000 μm) particulate organic matter C increased by a factor of 6 immediately after conversion. Aggregate mean weight diameter was reduced by about 50% after conversion, but values were similar to those under forest after 8 years under pasture. Samples collected from a long-term pasture that was converted from annual cultivation more than 50 years ago revealed that some soil physical properties negatively impacted by cultivation were very slow to recover. Finally, our results indicate that soil macroaggregates turn over more rapidly under pasture than under forest and are less efficient at stabilizing soil C, whereas microaggregates from pasture soils stabilize a larger concentration of C than forest microaggregates. Since conversion from forest to pasture has a minimal impact on total soil C content in the Piedmont region of Virginia, United States, a simple C stock accounting system could use the same base soil C stock value for either type of land use. However, since the effects of forest to pasture conversion are a function of grassland management following conversion, assessments of C sequestration rates require activity data on the extent of various grassland management practices.
Abstract:
In rural low-voltage networks, distribution lines are usually highly resistive. When many distributed generators are connected to such lines, power sharing among them is difficult using conventional droop control, as the real and reactive power are strongly coupled with each other. A high droop gain can alleviate this problem but may drive the system to instability. To overcome this, two droop control methods are proposed for accurate load sharing with a frequency droop controller. The first method uses no communication among the distributed generators and regulates the output voltage and frequency, ensuring acceptable load sharing. The droop equations are modified with a transformation matrix based on the line R/X ratio for this purpose. The second proposed method, with minimal low-bandwidth communication, modifies the reference frequency of the distributed generators based on the active and reactive power flow in the lines connected to the points of common coupling. The performance of the two proposed controllers is compared with that of a controller that relies on an expensive high-bandwidth communication system, through time-domain simulation of a test system. The magnitudes of the errors in power sharing under these three droop control schemes are evaluated and tabulated.
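The first method's R/X-based transformation can be sketched as a rotation of the measured powers by the line impedance angle, so that conventional P-f / Q-V droop decouples even on resistive lines. The gains, set-points and per-unit values below are illustrative assumptions, not figures from the paper.

```python
import math

def transformed_droop(p, q, r, x, f0, v0, m, n):
    """Droop on 'virtual' powers obtained by rotating (p, q)
    through the line impedance angle (sketch, illustrative gains).

    r, x : line resistance and reactance; f0, v0 : no-load set-points;
    m, n : frequency and voltage droop gains.
    """
    z = math.hypot(r, x)
    p_t = (x / z) * p - (r / z) * q   # virtual real power
    q_t = (r / z) * p + (x / z) * q   # virtual reactive power
    f_ref = f0 - m * p_t              # frequency droop on transformed P
    v_ref = v0 - n * q_t              # voltage droop on transformed Q
    return f_ref, v_ref
```

For a purely resistive line (x = 0) the rotation maps real power entirely onto the voltage droop channel, which matches the physical coupling of P with V on high R/X lines; for a purely inductive line it reduces to conventional droop.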
Abstract:
Embedded generalized markup, as applied by digital humanists to the recording and studying of our textual cultural heritage, suffers from a number of serious technical drawbacks. As a result of its evolution from early printer control languages, generalized markup can only express a document’s ‘logical’ structure via a repertoire of permissible printed format structures. In addition to the well-researched overlap problem, the embedding of markup codes into texts that never had them when written leads to a number of further difficulties: the inclusion of potentially obsolescent technical and subjective information into texts that are supposed to be archivable for the long term, the manual encoding of information that could be better computed automatically, and the obscuring of the text by highly complex technical data. Many of these problems can be alleviated by asserting a separation between the versions of which many cultural heritage texts are composed, and their content. In this way the complex inter-connections between versions can be handled automatically, leaving only simple markup for individual versions to be handled by the user.
Abstract:
Consider a person searching electronic health records: a search for the term ‘cracked skull’ should return documents that contain the term ‘cranium fracture’. An information retrieval system is required that matches concepts, not just keywords. Furthermore, determining the relevance of a query to a document requires inference; it is not simply a matter of matching concepts. For example, a document containing ‘dialysis machine’ should align with a query for ‘kidney disease’. Collectively, we describe this problem as the ‘semantic gap’: the difference between the raw medical data and the way a human interprets it. This paper presents an approach to semantic search of health records by combining two previous approaches: an ontological approach using the SNOMED CT medical ontology, and a distributional approach using semantic space vector space models. Our approach is applied to a specific problem in health informatics: the matching of electronic patient records to clinical trials.
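The combination of the two approaches can be sketched as follows. The synonym table and term vectors are hypothetical toy data standing in for SNOMED CT relations and corpus-derived semantic-space vectors; only the combination pattern (ontology match OR distributional similarity) reflects the abstract.

```python
import math

# Hypothetical stand-ins for SNOMED CT links and semantic-space vectors.
ONTOLOGY_SYNONYMS = {"cracked skull": {"cranium fracture"}}
VECTORS = {
    "kidney disease":   [0.9, 0.1, 0.3],
    "dialysis machine": [0.8, 0.2, 0.4],
    "bicycle":          [0.0, 0.9, 0.1],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def related(query, term, threshold=0.9):
    """Match via the ontology (exact concept equivalence) or via
    distributional similarity (associative inference)."""
    if term in ONTOLOGY_SYNONYMS.get(query, set()) or \
       query in ONTOLOGY_SYNONYMS.get(term, set()):
        return True
    if query in VECTORS and term in VECTORS:
        return cosine(VECTORS[query], VECTORS[term]) >= threshold
    return False
```

The ontological path handles strict concept matches (‘cracked skull’ → ‘cranium fracture’), while the vector path captures looser associations such as ‘dialysis machine’ → ‘kidney disease’ that no subsumption hierarchy would supply.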
Abstract:
The processes of digitization and deregulation have transformed the production, distribution and consumption of information and entertainment media over the past three decades. Today, researchers are confronted with profoundly different landscapes of domestic and personal media than the pioneers of qualitative audience research, which came to form much of the conceptual basis of Cultural Studies, first in Britain and North America and subsequently across all global regions. The process of media convergence, as a consequence of the dual forces of digitisation and deregulation, thus constitutes a central concept in the analysis of popular mass media. From the study of the internationalisation and globalisation of media content and changing regimes of media production, via the social shaping of communication technologies and, conversely, the impact of communication technology on social, cultural and political realities, to the emergence of transmedia storytelling, the interplay of intertextuality and genre and the formation of mediated social networks, convergence informs and shapes contemporary conceptual debates in the field of popular communication and beyond. However, media convergence challenges not only the conceptual canon of (popular) communication research, but also poses profound methodological challenges. As boundaries between producers and consumers become increasingly fluid, formerly stable fields and categories of research such as industries, texts and audiences intersect and overlap, requiring combined and new research strategies. This preconference aims to offer a forum to present and discuss methodological innovations in the study of contemporary media and the analysis of the social, cultural, and political impact and challenges arising through media convergence.
The preconference thus aims to focus on the following methodological questions and challenges:
* New strategies of audience research responding to the increasing individualisation of popular media consumption.
* Methods of data triangulation in and through the integrated study of media production, distribution and consumption.
* Bridging the methodological and often associated conceptual gap between qualitative and quantitative research in the study of popular media.
* The future of ethnographic audience and production research in light of blurring boundaries between media producers and consumers.
* A critical re-examination of which textual configurations can be meaningfully described and studied as text.
* Methodological innovations aimed at assessing the macro social, cultural and political impact of mediatization (including, but not limited to, "creative methods").
* Methodological responses to the globalisation of popular media and the practicalities of international and transnational comparative research.
* An exploration of new methods required in the study of media flow and intertextuality.
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to collect numerous business process models. Such models may be linked with each other or mutually overlap, supersede one another and evolve over time. Moreover, they may be represented at different abstraction levels depending on the target audience and modeling purpose, and may be available in multiple languages (e.g. due to company mergers). Thus, it is common that organizations struggle with keeping track of their process models. This demonstration introduces AProMoRe (Advanced Process Model Repository) which aims to facilitate the management of (large) process model collections.
Abstract:
In an age when escalating fuel prices, global warming and world resource depletion are of great concern, sustainable transport practices promise to define a new way of mobility into the future. With its comparatively minimal negative environmental impacts, non-reliance on fuels and positive health effects, the simple bicycle offers significant benefits to humankind. These benefits are evident worldwide where bicycles are successfully endorsed through improved infrastructure, supporting policies, public education and management. In Australia, the national, state and local governments are introducing measures to improve and support green transport. This is necessary as current bicycle infrastructure is not always sufficient and the longstanding conflict with motorized transport still exists. The aim for the future is to implement sustainable hard and soft bicycle infrastructure globally; the challenges of such a task can be illustrated by the city of Brisbane, Australia.
Abstract:
The use of porous structures as tissue engineering scaffolds imposes demands on structural parameters such as porosity, pore size and interconnectivity. For the structural analysis of porous scaffolds, micro-computed tomography (μCT) is an ideal tool. μCT is a 3D X-ray imaging method that has several advantages over scanning electron microscopy (SEM) and other conventional characterisation techniques:
• visualisation in 3D
• quantitative results
• non-destructiveness
• minimal sample preparation
Abstract:
As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. However, while being of great value, these repositories induce high management costs. Thus, it becomes essential to keep track of the various model versions, as they may mutually overlap, supersede one another and evolve over time. We propose an innovative versioning model and associated storage structure, specifically designed to maximize sharing across process model versions and to automatically handle change propagation. The focal point of this technique is to version single process model fragments rather than entire process models. Indeed, empirical evidence shows that real-life process model repositories contain numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
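The core idea of fragment-level versioning can be sketched with a content-addressed store: identical fragments are stored once and shared across all versions that reference them. The class, method names and example fragments below are illustrative assumptions, not the actual storage schema of the proposed repository.

```python
import hashlib

class FragmentStore:
    """Sketch of fragment-level versioning: fragments are keyed by a
    hash of their content, so duplicates across versions are stored once."""

    def __init__(self):
        self.fragments = {}   # content hash -> fragment content
        self.versions = {}    # (model, version) -> list of fragment hashes

    def _key(self, content):
        return hashlib.sha1(content.encode()).hexdigest()

    def save_version(self, model, version, fragment_contents):
        keys = []
        for content in fragment_contents:
            k = self._key(content)
            self.fragments.setdefault(k, content)  # duplicate fragments stored once
            keys.append(k)
        self.versions[(model, version)] = keys

store = FragmentStore()
store.save_version("claims", "v1", ["check identity", "assess claim", "pay out"])
store.save_version("claims", "v2", ["check identity", "assess claim", "reject or pay"])
# Two versions reference four distinct fragments; the two shared ones
# ("check identity", "assess claim") are physically stored only once.
```

Change propagation then falls out naturally: updating a shared fragment, and letting every version that references its hash pick up the new mapping, touches one stored object rather than every model that embeds it.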
Abstract:
Using sculpture and drawing as my primary methods of investigation, this research explores ways of shifting the emphasis of my creative visual arts practice from object to process whilst still maintaining a primacy of material outcomes. My motivation was to locate ways of developing a sustained practice shaped as much by new works, as by a creative flow between works. I imagined a practice where a logic of structure within discrete forms and a logic of the broader practice might be developed as mutually informed processes. Using basic structural components of multiple wooden curves and linear modes of deployment – in both sculptures and drawings – I have identified both emergence theory and the image of rhizomic growth (Deleuze and Guattari, 1987) as theoretically integral to this imagining of a creative practice, both in terms of critiquing and developing works. Whilst I adopt a formalist approach for this exegesis, the emergence and rhizome models allow it to work as a critique of movement, of becoming and changing, rather than merely a formalism of static structure. In these models, therefore, I have identified a formal approach that can be applied not only to objects, but to practice over time. The thorough reading and application of these ontological models (emergence and rhizome) to visual arts practice, in terms of processes, objects and changes, is the primary contribution of this thesis. The works that form the major component of the research develop, reflect and embody these notions of movement and change.
Abstract:
The network has emerged as a contemporary worldwide phenomenon, culturally manifested as a consequence of globalization and the knowledge economy. It is in this context that the internet revolution has prompted a radical re-ordering of social and institutional relations and the associated structures, processes and places which support them. Within the duality of virtual space and the augmentation of traditional notions of physical place, the organizational structures pose new challenges for the design professions. Technological developments increasingly permit communication anytime and anywhere, and provide the opportunity for both synchronous and asynchronous collaboration. The resultant ecology formed through the network enterprise has resulted in an often convoluted and complex world wherein designers are forced to consider the relevance and meaning of this new context. The role of technology and that of space are thus intertwined in the relation between the network and the individual workplace. This paper explores a way to inform the interior design process for contemporary workplace environments. It reports on both theoretical and practical outcomes through an Australia-wide case study of three collaborating, yet independent business entities. It further suggests a link between workplace design and successful business innovation being realized between partnering organizations in Great Britain. Evidence presented indicates that, for architects and interior designers, the scope of the problem has widened, the depth of knowledge required to provide solutions has increased, and the rules of engagement are required to change. The ontological and epistemological positions adopted in the study enabled the spatial dimensions to be examined from both within and beyond the confines of a traditional design-only viewpoint.
Importantly, it highlights the significance of trans-disciplinary collaboration in dealing with the multiple layers and complexity of the contemporary social and business world, from both a research and practice perspective.
Abstract:
Particulate pollution has been widely recognised as an important risk factor to human health. In addition to increases in respiratory and cardiovascular morbidity associated with exposure to particulate matter (PM), WHO estimates that urban PM causes 0.8 million premature deaths globally and that 1.5 million people die prematurely from exposure to indoor smoke generated from the combustion of solid fuels. Despite the availability of a huge body of research, the underlying toxicological mechanisms by which particles induce adverse health effects are not yet entirely understood. Oxidative stress caused by generation of free radicals and related reactive oxygen species (ROS) at the sites of deposition has been proposed as a mechanism for many of the adverse health outcomes associated with exposure to PM. In addition to particle-induced generation of ROS in lung tissue cells, several recent studies have shown that particles may also contain ROS. As such, they present a direct cause of oxidative stress and related adverse health effects. Cellular responses to oxidative stress have been widely investigated using various cell exposure assays. However, for a rapid screening of the oxidative potential of PM, less time-consuming and less expensive, cell-free assays are needed. The main aim of this research project was to investigate the application of a novel profluorescent nitroxide probe, synthesised at QUT, as a rapid screening assay in assessing the oxidative potential of PM. Considering that this was the first time that a profluorescent nitroxide probe was applied in investigating the oxidative stress potential of PM, the proof of concept regarding the detection of PM–derived ROS by using such probes needed to be demonstrated and a sampling methodology needed to be developed. 
Sampling through an impinger containing profluorescent nitroxide solution was chosen as a means of particle collection as it allowed particles to react with the profluorescent nitroxide probe during sampling, avoiding in that way any possible chemical changes resulting from delays between the sampling and the analysis of the PM. Among several profluorescent nitroxide probes available at QUT, bis(phenylethynyl)anthracene-nitroxide (BPEAnit) was found to be the most suitable probe, mainly due to relatively long excitation and emission wavelengths (λex= 430 nm; λem= 485 and 513 nm). These wavelengths are long enough to avoid overlap with the background fluorescence coming from light absorbing compounds which may be present in PM (e.g. polycyclic aromatic hydrocarbons and their derivatives). Given that combustion, in general, is one of the major sources of ambient PM, this project aimed at getting an insight into the oxidative stress potential of combustion-generated PM, namely cigarette smoke, diesel exhaust and wood smoke PM. During the course of this research project, it was demonstrated that the BPEAnit probe based assay is sufficiently sensitive and robust enough to be applied as a rapid screening test for PM-derived ROS detection. Considering that for all three aerosol sources (i.e. cigarette smoke, diesel exhaust and wood smoke) the same assay was applied, the results presented in this thesis allow direct comparison of the oxidative potential measured for all three sources of PM. In summary, it was found that there was a substantial difference between the amounts of ROS per unit of PM mass (ROS concentration) for particles emitted by different combustion sources. For example, particles from cigarette smoke were found to have up to 80 times less ROS per unit of mass than particles produced during logwood combustion. 
For both diesel and wood combustion it has been demonstrated that the type of fuel significantly affects the oxidative potential of the particles emitted. Similarly, the operating conditions of the combustion source were also found to affect the oxidative potential of particulate emissions. Moreover, this project has demonstrated a strong link between semivolatile (i.e. organic) species and ROS and therefore, clearly highlights the importance of semivolatile species in particle-induced toxicity.