440 results for minimal ontological overlap


Relevance:

10.00%

Publisher:

Abstract:

Agricultural management affects soil organic matter, which is important for sustainable crop production and as a greenhouse gas sink. Our objective was to determine how tillage, residue management and N fertilization affect organic C in unprotected, and in physically, chemically and biochemically protected, soil C pools. Samples from Breton, Alberta, were fractionated and analysed for organic C content. As in previous reports, N fertilization had a positive effect, tillage had a minimal effect, and straw management had no effect on whole-soil organic C. Tillage and straw management did not alter organic C concentrations in the isolated C pools, while N fertilization increased C concentrations in all pools. Compared with a woodlot soil, the cultivated plots had lower total organic C, and the C was redistributed among the isolated pools. The free light fraction and coarse particulate organic matter responded positively to C inputs, suggesting that much of the accumulated organic C occurred in an unprotected pool. The easily dispersed silt-sized fraction was the mineral-associated pool most responsive to changes in C inputs, whereas the microaggregate-derived silt-sized fraction best preserved C upon cultivation. These findings suggest that the silt-sized fraction is important for the long-term stabilization of organic matter, through both physical occlusion in microaggregates and chemical protection by mineral association.

Relevance:

10.00%

Publisher:

Abstract:

Since land use change can have significant impacts on regional biogeochemistry, we investigated how conversion of forest and cultivated land to pasture affects soil C and N cycling. In addition to examining total soil C, we isolated soil physicochemical C fractions in order to understand the mechanisms by which soil C is sequestered or lost. Total soil C did not change significantly over time following conversion from forest, though coarse (250–2,000 μm) particulate organic matter C increased by a factor of 6 immediately after conversion. Aggregate mean weight diameter was reduced by about 50% after conversion, but values were comparable to those under forest after 8 years under pasture. Samples collected from a long-term pasture that was converted from annual cultivation more than 50 years ago revealed that some soil physical properties negatively impacted by cultivation were very slow to recover. Finally, our results indicate that soil macroaggregates turn over more rapidly under pasture than under forest and are less efficient at stabilizing soil C, whereas microaggregates from pasture soils stabilize a larger concentration of C than forest microaggregates. Since conversion from forest to pasture has a minimal impact on total soil C content in the Piedmont region of Virginia, United States, a simple C stock accounting system could use the same base soil C stock value for either type of land use. However, since the effects of forest-to-pasture conversion are a function of grassland management following conversion, assessments of C sequestration rates require activity data on the extent of various grassland management practices.

Relevance:

10.00%

Publisher:

Abstract:

In rural low-voltage networks, distribution lines are usually highly resistive. When many distributed generators are connected to such lines, power sharing among them is difficult with conventional droop control, because real and reactive power are strongly coupled. A high droop gain can alleviate this problem but may drive the system to instability. To overcome this, two droop control methods are proposed for accurate load sharing with a frequency droop controller. The first method assumes no communication among the distributed generators and regulates the output voltage and frequency to ensure acceptable load sharing; for this purpose, the droop equations are modified with a transformation matrix based on the line R/X ratio. The second proposed method, requiring only minimal low-bandwidth communication, modifies the reference frequency of the distributed generators based on the active and reactive power flow in the lines connected to the points of common coupling. The performance of these two proposed controllers is compared, through time-domain simulation of a test system, with that of a controller that relies on an expensive high-bandwidth communication system. The magnitudes of the power-sharing errors of the three droop control schemes are evaluated and tabulated.
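
To make the coupling argument concrete, the relations at stake can be sketched as follows. The conventional P–f/Q–V droop laws and one common form of the R/X-based transformation are shown below; the abstract does not give the exact matrix used, so this particular rotation (a standard choice in the droop-control literature) is an assumption:

```latex
% Conventional droop (suited to dominantly inductive lines):
%   f = f* - m (P - P*),   V = V* - n (Q - Q*)
% For resistive lines, droop instead on "virtual" powers P', Q'
% obtained by an orthogonal rotation based on the line R/X ratio:
\begin{align}
  f &= f^{*} - m\,(P' - P'^{*}), &
  V &= V^{*} - n\,(Q' - Q'^{*}), \\
  \begin{bmatrix} P' \\ Q' \end{bmatrix}
    &= \frac{1}{Z}
       \begin{bmatrix} X & -R \\ R & X \end{bmatrix}
       \begin{bmatrix} P \\ Q \end{bmatrix},
  & Z &= \sqrt{R^{2} + X^{2}}.
\end{align}
```

When R ≫ X, the rotation maps the quantities that actually govern frequency and voltage in a resistive line onto the droop axes, restoring the decoupling that conventional droop assumes.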

Relevance:

10.00%

Publisher:

Abstract:

Embedded generalized markup, as applied by digital humanists to the recording and study of our textual cultural heritage, suffers from a number of serious technical drawbacks. As a result of its evolution from early printer control languages, generalized markup can only express a document's 'logical' structure via a repertoire of permissible printed format structures. In addition to the well-researched overlap problem, the embedding of markup codes into texts that never had them when written leads to a number of further difficulties: the inclusion of potentially obsolescent technical and subjective information into texts that are supposed to be archivable for the long term, the manual encoding of information that could better be computed automatically, and the obscuring of the text by highly complex technical data. Many of these problems can be alleviated by asserting a separation between the versions of which many cultural heritage texts are composed and their content. In this way the complex interconnections between versions can be handled automatically, leaving only simple markup for individual versions to be handled by the user.

Relevance:

10.00%

Publisher:

Abstract:

Consider a person searching electronic health records: a search for the term 'cracked skull' should return documents that contain the term 'cranium fracture'. An information retrieval system is required that matches concepts, not just keywords. Furthermore, determining the relevance of a query to a document requires inference – it is not simply a matter of matching concepts. For example, a document containing 'dialysis machine' should align with a query for 'kidney disease'. Collectively, we describe this problem as the 'semantic gap' – the difference between the raw medical data and the way a human interprets it. This paper presents an approach to semantic search of health records that combines two previous approaches: an ontological approach using the SNOMED CT medical ontology, and a distributional approach using semantic vector space models. Our approach will be applied to a specific problem in health informatics: the matching of electronic patient records to clinical trials.
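
As a concrete illustration of combining the two approaches, the sketch below scores a document term against a query by ontological match first, with distributional similarity as a fallback. It is a minimal sketch only: the toy vectors, the synonym table and the function names are illustrative assumptions, not the paper's implementation (a real system would query SNOMED CT and a corpus-trained semantic space).

```python
import numpy as np

# Toy distributional vectors standing in for a corpus-trained semantic space.
VECTORS = {
    "cranium fracture": np.array([0.9, 0.1, 0.0]),
    "cracked skull":    np.array([0.8, 0.2, 0.1]),
    "dialysis machine": np.array([0.1, 0.9, 0.2]),
    "kidney disease":   np.array([0.2, 0.8, 0.3]),
}

# Hypothetical synonym table standing in for SNOMED CT concept lookups.
ONTOLOGY_SYNONYMS = {
    "cracked skull": {"cranium fracture", "skull fracture"},
}

def cosine(a, b):
    """Cosine similarity between two term vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def score(query, doc_term):
    """Ontological (concept) match first; fall back to distributional
    similarity to bridge what the ontology alone cannot infer."""
    if doc_term == query or doc_term in ONTOLOGY_SYNONYMS.get(query, set()):
        return 1.0
    if query in VECTORS and doc_term in VECTORS:
        return cosine(VECTORS[query], VECTORS[doc_term])
    return 0.0

print(score("cracked skull", "cranium fracture"))   # 1.0 via the ontology
print(score("kidney disease", "dialysis machine"))  # ~0.98 via distribution
```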

Relevance:

10.00%

Publisher:

Abstract:

The processes of digitization and deregulation have transformed the production, distribution and consumption of information and entertainment media over the past three decades. Today, researchers are confronted with profoundly different landscapes of domestic and personal media than the pioneers of qualitative audience research that came to form much of the conceptual basis of Cultural Studies, first in Britain and North America and subsequently across all global regions. The process of media convergence, as a consequence of the dual forces of digitisation and deregulation, thus constitutes a central concept in the analysis of popular mass media. From the study of the internationalisation and globalisation of media content and changing regimes of media production, via the social shaping of communication technologies and, conversely, the impact of communication technology on social, cultural and political realities, to the emergence of transmedia storytelling, the interplay of intertextuality and genre and the formation of mediated social networks, convergence informs and shapes contemporary conceptual debates in the field of popular communication and beyond. However, media convergence challenges not only the conceptual canon of (popular) communication research but also poses profound methodological challenges. As boundaries between producers and consumers become increasingly fluid, formerly stable fields and categories of research such as industries, texts and audiences intersect and overlap, requiring combined and new research strategies. This preconference aims to offer a forum to present and discuss methodological innovations in the study of contemporary media and the analysis of the social, cultural and political impact and challenges arising through media convergence. The preconference thus aims to focus on the following methodological questions and challenges:
* New strategies of audience research responding to the increasing individualisation of popular media consumption.
* Methods of data triangulation in and through the integrated study of media production, distribution and consumption.
* Bridging the methodological and often associated conceptual gap between qualitative and quantitative research in the study of popular media.
* The future of ethnographic audience and production research in light of blurring boundaries between media producers and consumers.
* A critical re-examination of which textual configurations can be meaningfully described and studied as text.
* Methodological innovations aimed at assessing the macro social, cultural and political impact of mediatization (including, but not limited to, "creative methods").
* Methodological responses to the globalisation of popular media and the practicalities of international and transnational comparative research.
* An exploration of new methods required in the study of media flow and intertextuality.

Relevance:

10.00%

Publisher:

Abstract:

As organizations reach higher levels of Business Process Management maturity, they tend to collect numerous business process models. Such models may be linked with each other or mutually overlap, supersede one another and evolve over time. Moreover, they may be represented at different abstraction levels depending on the target audience and modeling purpose, and may be available in multiple languages (e.g. due to company mergers). Thus, it is common that organizations struggle to keep track of their process models. This demonstration introduces AProMoRe (Advanced Process Model Repository), which aims to facilitate the management of (large) process model collections.

Relevance:

10.00%

Publisher:

Abstract:

In an age when escalating fuel prices, global warming and world resource depletion are of great concern, sustainable transport practices promise to define a new way of mobility into the future. With its comparatively minimal negative environmental impacts, non-reliance on fuels and positive health effects, the simple bicycle offers significant benefits to humankind. These benefits are evident worldwide where bicycles are successfully endorsed through improved infrastructure, supporting policies, public education and management. In Australia, the national, state and local governments are introducing measures to improve and support green transport. This is necessary as current bicycle infrastructure is not always sufficient and the longstanding conflict with motorized transport still exists. The aim for the future is to implement sustainable hard and soft bicycle infrastructure globally; the challenges of such a task can be illustrated by the city of Brisbane, Australia.

Relevance:

10.00%

Publisher:

Abstract:

The use of porous structures as tissue engineering scaffolds imposes demands on structural parameters such as porosity, pore size and interconnectivity. For the structural analysis of porous scaffolds, micro-computed tomography (μCT) is an ideal tool. μCT is a 3D X-ray imaging method that has several advantages over scanning electron microscopy (SEM) and other conventional characterisation techniques:
• visualisation in 3D
• quantitative results
• non-destructiveness
• minimal sample preparation
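
As an indication of the kind of quantitative result μCT affords, the sketch below computes porosity and a simple interconnectivity measure from a binarised μCT volume. The random array stands in for a real thresholded scan, and the measures chosen are illustrative assumptions rather than the authors' protocol:

```python
import numpy as np
from scipy import ndimage

# Stand-in for a thresholded uCT scan: True = scaffold material, False = pore.
rng = np.random.default_rng(0)
volume = rng.random((64, 64, 64)) > 0.6

# Porosity: fraction of voxels that are pore space.
porosity = 1.0 - volume.mean()

# Interconnectivity proxy: share of pore volume contained in the largest
# connected pore network (scipy's default 6-connectivity is used).
labels, n = ndimage.label(~volume)
largest = np.bincount(labels.ravel())[1:].max() if n else 0
interconnectivity = largest / (~volume).sum()

print(f"porosity = {porosity:.2f}, interconnectivity = {interconnectivity:.2f}")
```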

Relevance:

10.00%

Publisher:

Abstract:

As organizations reach higher levels of Business Process Management maturity, they tend to accumulate large collections of process models. These repositories may contain thousands of activities and be managed by different stakeholders with varying skills and responsibilities. However, while being of great value, these repositories induce high management costs. Thus, it becomes essential to keep track of the various model versions as they may mutually overlap, supersede one another and evolve over time. We propose an innovative versioning model and associated storage structure, specifically designed to maximize sharing across process model versions and to automatically handle change propagation. The focal point of this technique is to version single process model fragments, rather than entire process models. Indeed, empirical evidence shows that real-life process model repositories contain numerous duplicate fragments. Experiments on two industrial datasets confirm the usefulness of our technique.
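
A minimal sketch of the fragment-level sharing idea follows: fragments are stored once, keyed by content hash, so versions that share fragments share storage. The class and its methods are illustrative assumptions, not the authors' actual storage structure:

```python
import hashlib

class FragmentStore:
    """Content-addressed store: each distinct fragment is kept once,
    and a model version is just a list of fragment keys."""

    def __init__(self):
        self.fragments = {}   # content hash -> fragment body
        self.versions = {}    # (model id, version) -> list of hashes

    def _key(self, fragment):
        return hashlib.sha1(fragment.encode()).hexdigest()

    def commit(self, model_id, version, fragments):
        keys = []
        for frag in fragments:
            k = self._key(frag)
            self.fragments.setdefault(k, frag)  # duplicates stored only once
            keys.append(k)
        self.versions[(model_id, version)] = keys

store = FragmentStore()
store.commit("claims", 1, ["receive claim", "assess claim", "pay out"])
store.commit("claims", 2, ["receive claim", "assess claim", "reject or pay"])
print(len(store.fragments))  # 4 distinct fragments stored, not 6
```

Change propagation then reduces to updating one shared fragment and recomputing the keys of the versions that reference it.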

Relevance:

10.00%

Publisher:

Abstract:

Using sculpture and drawing as my primary methods of investigation, this research explores ways of shifting the emphasis of my creative visual arts practice from object to process whilst still maintaining a primacy of material outcomes. My motivation was to locate ways of developing a sustained practice shaped as much by new works as by a creative flow between works. I imagined a practice where a logic of structure within discrete forms and a logic of the broader practice might be developed as mutually informed processes. Using basic structural components of multiple wooden curves and linear modes of deployment – in both sculptures and drawings – I have identified both emergence theory and the image of rhizomic growth (Deleuze and Guattari, 1987) as theoretically integral to this imagining of a creative practice, both in terms of critiquing and developing works. Whilst I adopt a formalist approach for this exegesis, the emergence and rhizome models allow it to work as a critique of movement, of becoming and changing, rather than merely a formalism of static structure. In these models, therefore, I have identified a formal approach that can be applied not only to objects but to practice over time. The thorough reading and application of these ontological models (emergence and rhizome) to visual arts practice, in terms of processes, objects and changes, is the primary contribution of this thesis. The works that form the major component of the research develop, reflect and embody these notions of movement and change.

Relevance:

10.00%

Publisher:

Abstract:

The network has emerged as a contemporary worldwide phenomenon, culturally manifested as a consequence of globalization and the knowledge economy. It is in this context that the internet revolution has prompted a radical re-ordering of social and institutional relations and the associated structures, processes and places which support them. Within the duality of virtual space and the augmentation of traditional notions of physical place, the resulting organizational structures pose new challenges for the design professions. Technological developments increasingly permit communication anytime and anywhere, and provide the opportunity for both synchronous and asynchronous collaboration. The ecology formed through the network enterprise has resulted in an often convoluted and complex world wherein designers are forced to consider the relevance and meaning of this new context. The role of technology and that of space are thus intertwined in the relation between the network and the individual workplace. This paper explores a way to inform the interior design process for contemporary workplace environments. It reports on both theoretical and practical outcomes through an Australia-wide case study of three collaborating, yet independent, business entities. It further suggests a link between workplace design and successful business innovation being realized between partnering organizations in Great Britain. The evidence presented indicates that, for architects and interior designers, the scope of the problem has widened, the depth of knowledge required to provide solutions has increased, and the rules of engagement are required to change. The ontological and epistemological positions adopted in the study enabled the spatial dimensions to be examined from both within and beyond the confines of a traditional design-only viewpoint. Importantly, it highlights the significance of trans-disciplinary collaboration in dealing with the multiple layers and complexity of the contemporary social and business world, from both a research and a practice perspective.

Relevance:

10.00%

Publisher:

Abstract:

Particulate pollution has been widely recognised as an important risk factor to human health. In addition to increases in respiratory and cardiovascular morbidity associated with exposure to particulate matter (PM), the WHO estimates that urban PM causes 0.8 million premature deaths globally and that 1.5 million people die prematurely from exposure to indoor smoke generated by the combustion of solid fuels. Despite the availability of a huge body of research, the underlying toxicological mechanisms by which particles induce adverse health effects are not yet entirely understood. Oxidative stress caused by the generation of free radicals and related reactive oxygen species (ROS) at the sites of deposition has been proposed as a mechanism for many of the adverse health outcomes associated with exposure to PM. In addition to particle-induced generation of ROS in lung tissue cells, several recent studies have shown that particles may also contain ROS. As such, they present a direct cause of oxidative stress and related adverse health effects. Cellular responses to oxidative stress have been widely investigated using various cell exposure assays. However, for rapid screening of the oxidative potential of PM, less time-consuming and less expensive cell-free assays are needed. The main aim of this research project was to investigate the application of a novel profluorescent nitroxide probe, synthesised at QUT, as a rapid screening assay for assessing the oxidative potential of PM. Considering that this was the first time a profluorescent nitroxide probe was applied to investigating the oxidative stress potential of PM, the proof of concept regarding the detection of PM-derived ROS by such probes needed to be demonstrated and a sampling methodology needed to be developed. Sampling through an impinger containing a profluorescent nitroxide solution was chosen as the means of particle collection, as it allowed particles to react with the probe during sampling, thereby avoiding any chemical changes resulting from delays between sampling and analysis of the PM. Among several profluorescent nitroxide probes available at QUT, bis(phenylethynyl)anthracene-nitroxide (BPEAnit) was found to be the most suitable, mainly due to its relatively long excitation and emission wavelengths (λex = 430 nm; λem = 485 and 513 nm). These wavelengths are long enough to avoid overlap with the background fluorescence of light-absorbing compounds which may be present in PM (e.g. polycyclic aromatic hydrocarbons and their derivatives). Given that combustion, in general, is one of the major sources of ambient PM, this project aimed at gaining an insight into the oxidative stress potential of combustion-generated PM, namely cigarette smoke, diesel exhaust and wood smoke PM. During the course of this research project, it was demonstrated that the BPEAnit probe based assay is sufficiently sensitive and robust to be applied as a rapid screening test for PM-derived ROS detection. Considering that the same assay was applied to all three aerosol sources (i.e. cigarette smoke, diesel exhaust and wood smoke), the results presented in this thesis allow direct comparison of the oxidative potential measured for all three sources of PM. In summary, it was found that there was a substantial difference between the amounts of ROS per unit of PM mass (ROS concentration) for particles emitted by different combustion sources.
For example, particles from cigarette smoke were found to have up to 80 times less ROS per unit of mass than particles produced during logwood combustion. For both diesel and wood combustion, it was demonstrated that the type of fuel significantly affects the oxidative potential of the particles emitted. Similarly, the operating conditions of the combustion source were also found to affect the oxidative potential of particulate emissions. Moreover, this project has demonstrated a strong link between semivolatile (i.e. organic) species and ROS, and therefore clearly highlights the importance of semivolatile species in particle-induced toxicity.

Relevance:

10.00%

Publisher:

Abstract:

Impedance cardiography is an application of bioimpedance analysis primarily used in a research setting to determine cardiac output. It is a non-invasive technique that measures the change in the impedance of the thorax which is attributed to the ejection of a volume of blood from the heart. The cardiac output is calculated from the measured impedance using the parallel conductor theory and a constant value for the resistivity of blood. However, the resistivity of blood has been shown to be velocity dependent due to changes in the orientation of red blood cells induced by changing shear forces during flow. The overall goal of this thesis was to study the effect that flow deviations have on the electrical impedance of blood, both experimentally and theoretically, and to apply the results to a clinical setting. The resistivity of stationary blood is isotropic, as the red blood cells are randomly orientated due to Brownian motion. In the case of blood flowing through rigid tubes, the resistivity is anisotropic due to the biconcave discoidal shape and orientation of the cells. The generation of shear forces across the width of the tube during flow causes the cells to align with their minimal cross-sectional area facing the direction of flow, in order to minimise the shear stress experienced by the cells. This in turn results in a larger cross-sectional area of plasma and a reduction in the resistivity of the blood as the flow increases. Understanding the contribution of this effect to the thoracic impedance change is a vital step in achieving clinical acceptance of impedance cardiography. Published literature investigates the resistivity variations for constant blood flow. In this case, the shear forces are constant and the impedance remains constant during flow, at a magnitude less than that for stationary blood. The research presented in this thesis, however, investigates the variations in resistivity of blood during pulsatile flow through rigid tubes and the relationship between impedance, velocity and acceleration. Using rigid tubes isolates the impedance change to variations associated with changes in cell orientation only. The implications of red blood cell orientation changes for clinical impedance cardiography were also explored. This was achieved through measurement and analysis of the experimental impedance of pulsatile blood flowing through rigid tubes in a mock circulatory system. A novel theoretical model including cell orientation dynamics was developed for the impedance of pulsatile blood through rigid tubes. The impedance of flowing blood was calculated theoretically using analytical methods for flow through straight tubes and the numerical Lattice Boltzmann method for flow through complex geometries such as aortic valve stenosis. The result of the analytical theoretical model was compared to the experimental impedance measurements through rigid tubes. The impedance calculated for flow through a stenosis using the Lattice Boltzmann method provides results for comparison with impedance cardiography measurements collected as part of a pilot clinical trial to assess the suitability of bioimpedance techniques for assessing the presence of aortic stenosis. The experimental and theoretical impedance of blood was shown to inversely follow the blood velocity during pulsatile flow, with correlations of -0.72 and -0.74 respectively.
The results of both the experimental and theoretical investigations demonstrate that the acceleration of the blood is an important factor in determining the impedance, in addition to the velocity. During acceleration, the relationship between impedance and velocity is linear (r² = 0.98, experimental; r² = 0.94, theoretical). The relationship between impedance and velocity during the deceleration phase is characterised by a time decay constant, τ, ranging from 10 to 50 s. The high level of agreement between the experimental and theoretically modelled impedance demonstrates the accuracy of the model developed here. An increase in the haematocrit of the blood resulted in an increase in the magnitude of the impedance change due to changes in the orientation of red blood cells. The time decay constant was shown to decrease linearly with haematocrit for both experimental and theoretical results, although the slope of this decrease was larger in the experimental case. The radius of the tube influences the experimental and theoretical impedance for the same flow velocity. However, when the velocity was divided by the radius of the tube (labelled the reduced average velocity), the impedance response was the same for two experimental tubes with equivalent reduced average velocity but different radii. The temperature of the blood was also shown to affect the impedance, with the impedance decreasing as the temperature increased. These results are the first published for the impedance of pulsatile blood. The experimental impedance change measured orthogonal to the direction of flow is in the opposite direction to that measured in the direction of flow. These results indicate that the impedance of blood flowing through rigid cylindrical tubes is axisymmetric along the radius. This had not previously been verified experimentally. Time-frequency analysis of the experimental results demonstrated that the measured impedance contains the same frequency components, occurring at the same time points in the cycle, as the velocity signal. This suggests that the impedance contains many of the fluctuations of the velocity signal. Application of a theoretical steady flow model to pulsatile flow presented here verified that the steady flow model is not adequate for calculating the impedance of pulsatile blood flow. The success of the new theoretical model over the steady flow model demonstrates that the velocity profile is important in determining the impedance of pulsatile blood. The clinical application of the impedance of blood flow through a stenosis was theoretically modelled using the Lattice Boltzmann method (LBM) for fluid flow through complex geometries. The impedance of blood exiting a narrow orifice was calculated for varying degrees of stenosis. Clinical impedance cardiography measurements were also recorded for both aortic valvular stenosis patients (n = 4) and control subjects (n = 4) with structurally normal hearts. This pilot trial was used to corroborate the results of the LBM. Results from both investigations showed that the decay time constant for impedance has potential in the assessment of aortic valve stenosis. In the theoretically modelled case (LBM results), the decay time constant increased with an increase in the degree of stenosis. The clinical results also showed a statistically significant difference in the time decay constant between control and test subjects (P = 0.03).
The time decay constant calculated for test subjects (τ = 180–250 s) is consistently larger than that determined for control subjects (τ = 50–130 s). This difference is thought to be due to differences in the orientation response of the cells as blood flows through the stenosis. Such a non-invasive technique using the time decay constant for screening of aortic stenosis provides additional information to that currently given by impedance cardiography techniques and improves the value of the device to practitioners. However, the results still need to be verified in a larger study. While impedance cardiography has not been widely adopted clinically, it is research such as this that will enable future acceptance of the method.
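
For orientation, the parallel-conductor calculation mentioned at the outset is usually quoted in the Kubicek form, and the deceleration-phase behaviour described above is consistent with a first-order decay. Both expressions below are standard forms from the impedance-cardiography literature, given here as context rather than as the thesis's own equations:

```latex
% Kubicek parallel-conductor estimate of stroke volume:
% rho = blood resistivity, L = electrode spacing,
% Z_0 = baseline thoracic impedance, T = ventricular ejection time.
\begin{equation}
  SV = \rho \, \frac{L^{2}}{Z_{0}^{2}} \, T \left(\frac{dZ}{dt}\right)_{\max}
\end{equation}
% First-order decay with time constant tau for the deceleration phase:
\begin{equation}
  Z(t) = Z_{\infty} + \left(Z_{0} - Z_{\infty}\right) e^{-t/\tau}
\end{equation}
```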

Relevance:

10.00%

Publisher:

Abstract:

Purpose. The objective of this study was to explore the discriminative capacity of non-contact corneal esthesiometry (NCCE) when compared with the neuropathy disability score (NDS), a validated, standard method of diagnosing clinically significant diabetic neuropathy. Methods. Eighty-one participants with type 2 diabetes, no history of ocular disease, trauma or surgery, and no history of systemic disease that may affect the cornea were enrolled. Participants were ineligible if there was a history of neuropathy due to a non-diabetic cause or a current diabetic foot ulcer or infection. The corneal sensitivity threshold was measured on the eye on the side of the dominant hand, at a distance of 10 mm from the center of the cornea, using a stimulus duration of 0.9 s. The NDS was measured, producing a score ranging from 0 to 10. To determine the optimal cutoff point of corneal sensitivity that identified the presence of neuropathy (diagnosed by NDS), the Youden index and "closest-to-(0,1)" criteria were used. Results. The receiver operating characteristic curve for NCCE for the presence of neuropathy (NDS ≥3) had an area under the curve of 0.73 (p = 0.001) and, for the presence of moderate neuropathy (NDS ≥6), an area of 0.71 (p = 0.003). Using the Youden index, for NDS ≥3, the sensitivity of NCCE was 70% and the specificity 75%, and a corneal sensitivity threshold of 0.66 mbar or higher indicated the presence of neuropathy. When NDS ≥6 (indicating risk of foot ulceration) was applied, the sensitivity was 52% with a specificity of 85%. Conclusions. NCCE is a sensitive test for the diagnosis of minimal and more advanced diabetic neuropathy and may serve as a useful surrogate marker for diabetic and perhaps other neuropathies.
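
For reference, the two cutoff-selection criteria named in the Methods are standard and compact enough to state (textbook definitions, not the paper's own derivation): the Youden index selects the threshold maximising J, while "closest-to-(0,1)" minimises the ROC-space distance to the ideal corner (false-positive rate 0, sensitivity 1):

```latex
\begin{equation}
  J(t) = \mathrm{Se}(t) + \mathrm{Sp}(t) - 1
\end{equation}
\begin{equation}
  d(t) = \sqrt{\left(1 - \mathrm{Se}(t)\right)^{2}
             + \left(1 - \mathrm{Sp}(t)\right)^{2}}
\end{equation}
```

At the reported operating point (Se = 70%, Sp = 75% for NDS ≥3), J = 0.70 + 0.75 - 1 = 0.45.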