970 results for Offset printing


Relevance: 10.00%

Publisher:

Abstract:

This paper investigates the impact of carrier frequency offset (CFO) on single-carrier wireless communication systems with frequency-domain equalization (SC-FDE). We show that CFO in SC-FDE systems causes irrecoverable channel estimation error, which leads to inter-symbol interference (ISI). The impacts of CFO on SC-FDE and on OFDM are compared in the presence of channel estimation errors. Closed-form expressions for the signal-to-interference-and-noise ratio (SINR) are derived for both systems and verified by simulation results. We find that when channel estimation errors are considered, SC-FDE is similarly sensitive, or even more sensitive, to CFO than OFDM. In particular, in SC-FDE systems, CFO degrades performance mainly by degrading the channel estimate. Both analytical and simulation results highlight the importance of accurate CFO estimation in SC-FDE systems.
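The way an uncompensated CFO erodes SINR through an SC-FDE chain can be illustrated with a short numerical sketch. Everything below (block length, CFO value, flat channel with perfect channel knowledge) is an illustrative assumption, not the paper's setup, which additionally models the channel estimation errors:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256      # FFT block length (assumed)
eps = 0.05   # CFO normalised to the subcarrier spacing (assumed)

# Random unit-energy QPSK block
x = np.exp(1j * (np.pi / 4 + np.pi / 2 * rng.integers(0, 4, N)))

# CFO appears at the receiver as a progressive phase ramp in the time domain
n = np.arange(N)
r = x * np.exp(2j * np.pi * eps * n / N)

# SC-FDE receiver: FFT -> one-tap equaliser -> IFFT.  With a flat, perfectly
# known channel the equaliser is the identity, so any residual error is CFO.
x_hat = np.fft.ifft(np.fft.fft(r))

# Split the output into a useful (complex-scaled) signal term and interference
g = np.vdot(x, x_hat) / np.vdot(x, x)   # complex gain of the useful part
interference = x_hat - g * x
sinr_db = 10 * np.log10((np.abs(g) ** 2 * np.vdot(x, x).real)
                        / np.vdot(interference, interference).real)
print(f"SINR at eps = {eps}: {sinr_db:.1f} dB")
```

Increasing `eps` shrinks the useful gain `|g|` and grows the interference term, which is the SINR degradation mechanism the closed-form expressions capture.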

Relevance: 10.00%

Publisher:

Abstract:

Introduction: Ovine models are widely used in orthopaedic research. To better understand the impact of orthopaedic procedures, computer simulations are necessary. 3D finite element (FE) models of bones allow implant designs to be investigated mechanically, thereby reducing the amount of mechanical testing required.
Hypothesis: We present the development and validation of an ovine tibia FE model for use in the analysis of tibial fracture fixation plates.
Materials & Methods: Mechanical testing of the tibia consisted of an offset three-point bend test with three repetitions of loading to 350 N and return to 50 N. Tri-axial stacked strain gauges were applied to the anterior and posterior surfaces of the bone, and two rigid bodies, consisting of eight infrared active markers, were attached to the ends of the tibia. Positional measurements were taken with a FARO arm 3D digitiser. The FE model was constructed with both geometry and material properties derived from CT images of the bone. The elasticity-density relationship used for material property determination was validated separately using mechanical testing. The model was then transformed into the same coordinate system as the in vitro mechanical test and the loads applied.
Results: Comparison between the mechanical test and the FE model showed good correlation in surface strains (difference: anterior 2.3%, posterior 3.2%).
Discussion & Conclusion: This approach provides a simple method for generating subject-specific FE models from CT scans. Using the CT data set for both the geometry and the material properties ensures a more accurate representation of the specific bone, which is reflected in the similarity of the surface strain results.
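The elasticity-density mapping mentioned in the methods is commonly implemented as a two-step lookup from CT Hounsfield units. The sketch below is a generic illustration of that pattern; the calibration constants and power-law coefficients are assumptions taken from the general bone FE literature, not the calibration validated in this study:

```python
import numpy as np

def hu_to_density(hu, slope=0.0008, intercept=0.1):
    """Apparent density in g/cm^3 from Hounsfield units via a linear phantom
    calibration (assumed constants)."""
    return slope * hu + intercept

def density_to_modulus(rho, a=6850.0, b=1.49):
    """Elastic modulus in MPa via a power law E = a * rho^b (illustrative
    coefficients, not the study's validated relationship)."""
    return a * np.power(rho, b)

hu = np.array([200.0, 800.0, 1400.0])   # roughly trabecular-to-cortical range
rho = hu_to_density(hu)
E = density_to_modulus(rho)
for h, r, e in zip(hu, rho, E):
    print(f"HU {h:6.0f} -> rho {r:.2f} g/cm^3 -> E {e:7.0f} MPa")
```

Applying such a mapping voxel-by-voxel is what lets a single CT data set supply both the geometry and the inhomogeneous material properties of the model.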

Relevance: 10.00%

Publisher:

Abstract:

Purpose of review: To examine the relationship between energy intake, appetite control and exercise, with particular reference to longer term exercise studies. This approach is necessary when exploring the benefits of exercise for weight control, as changes in body weight and energy intake are variable and reflect diversity in weight loss. Recent findings: Recent evidence indicates that longer term exercise is characterized by a highly variable response in eating behaviour. Individuals display susceptibility or resistance to exercise-induced weight loss, with changes in energy intake playing a key role in determining the degree of weight loss achieved. Marked differences in hunger and energy intake exist between those who are capable of tolerating periods of exercise-induced energy deficit, and those who are not. Exercise-induced weight loss can increase the orexigenic drive in the fasted state, but for some this is offset by improved postprandial satiety signalling. Summary: The biological and behavioural responses to acute and long-term exercise are highly variable, and these responses interact to determine the propensity for weight change. For some people, long-term exercise stimulates compensatory increases in energy intake that attenuate weight loss. However, favourable changes in body composition and health markers still exist in the absence of weight loss. The physiological mechanisms that confer susceptibility to compensatory overconsumption still need to be determined.

Relevance: 10.00%

Publisher:

Abstract:

Speaker verification is the process of verifying the identity of a person by analysing their speech. There are several important applications for automatic speaker verification (ASV) technology, including suspect identification, tracking terrorists and detecting a person's presence at a remote location in the surveillance domain, as well as person authentication for phone banking and credit card transactions in the private sector. Telephones and telephony networks provide a natural medium for these applications. The aim of this work is to improve the usefulness of ASV technology for practical applications in the presence of adverse conditions. In a telephony environment, background noise, handset mismatch, channel distortions, room acoustics and restrictions on the available testing and training data are common sources of errors for ASV systems. Two research themes were pursued to overcome these adverse conditions: modelling mismatch and modelling uncertainty. To address the performance degradation incurred through mismatched conditions, it was proposed to model the mismatch directly. Feature mapping was evaluated for combating handset mismatch, and was extended through the use of a blind clustering algorithm to remove the need for accurate handset labels for the training data. Mismatch modelling was then generalised by explicitly modelling the session conditions as a constrained offset of the speaker model means. This session variability modelling approach enabled the modelling of arbitrary sources of mismatch, including handset type, and halved the error rates in many cases. Methods to model the uncertainty in speaker model estimates and verification scores were developed to address the difficulties of limited training and testing data. The Bayes factor was introduced to account for the uncertainty of the speaker model estimates in testing, by applying Bayesian theory to the verification criterion, with improved performance in matched conditions.
Modelling the uncertainty in the verification score itself met with significant success. Estimating a confidence interval for the "true" verification score enabled an order-of-magnitude reduction in the average quantity of speech required to make a confident verification decision against a threshold. The confidence measures developed in this work may also have significant applications for forensic speaker verification tasks.
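The idea of stopping early once a confidence interval for the verification score clears the decision threshold can be sketched as follows. The normal-approximation interval, the frame-score model and the example scores are illustrative assumptions, not the estimator developed in the thesis:

```python
import math

def early_decision(frame_scores, threshold=0.0, z=1.96):
    """Accept or reject as soon as a normal-approximation confidence interval
    for the mean frame score excludes `threshold`.  A sketch of
    confidence-based early stopping, not the thesis' method."""
    total = total_sq = 0.0
    n = 0
    for s in frame_scores:
        n += 1
        total += s
        total_sq += s * s
        if n < 2:
            continue
        mean = total / n
        var = max(total_sq / n - mean * mean, 1e-12)
        half = z * math.sqrt(var / n)   # CI half-width for the mean score
        if mean - half > threshold:
            return "accept", n          # confidently above the threshold
        if mean + half < threshold:
            return "reject", n          # confidently below the threshold
    return "undecided", n               # ran out of speech first

# A strongly matching trial needs only a few frames to decide
print(early_decision([1.1, 0.9, 1.2, 1.0, 0.8] * 20))
```

The payoff is exactly the effect described above: clear-cut trials terminate after a small fraction of the available speech, while only borderline trials consume the full utterance.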

Relevance: 10.00%

Publisher:

Abstract:

Nitrous oxide (N2O) is a potent agricultural greenhouse gas (GHG). More than 50% of the global anthropogenic N2O flux is attributable to emissions from soil, primarily due to large fertilizer nitrogen (N) applications to corn and other non-leguminous crops. Quantifying the trade-offs between N2O emissions, fertilizer N rate and crop yield is an essential requirement for informing management strategies that aim to reduce the agricultural sector's GHG burden without compromising productivity and producer livelihood. There is currently great interest in developing and implementing agricultural GHG reduction offset projects for inclusion within carbon offset markets. Nitrous oxide, with a global warming potential (GWP) of 298, is a major target for these endeavours because of the high payback associated with preventing its emission. In this paper we use established quantitative relationships between fertilizer N rate and N2O emissions, along with a recently developed approach for determining economically profitable N rates for optimized crop yield, to propose a simple, transparent and robust N2O emission reduction protocol (NERP) for generating agricultural GHG emission reduction credits. This NERP has the advantage of providing an economic and environmental incentive for producers and other stakeholders, necessary requirements for the implementation of agricultural offset projects.

Relevance: 10.00%

Publisher:

Abstract:

Nitrous oxide (N2O) is a major greenhouse gas (GHG) product of intensive agriculture. Fertilizer nitrogen (N) rate is the best single predictor of N2O emissions in row-crop agriculture in the US Midwest. We use this relationship to propose a transparent, scientifically robust protocol that can be used by developers of agricultural offset projects to generate fungible GHG emission reduction credits for the emerging US carbon cap-and-trade market. By coupling predicted N2O flux with the recently developed maximum return to N (MRTN) approach for determining economically profitable N input rates for optimized crop yield, we provide the basis for incentivizing N2O reductions without affecting yields. The protocol, if widely adopted, could reduce N2O emissions from fertilized row-crop agriculture by more than 50%. Although other management and environmental factors can influence N2O emissions, fertilizer N rate can be viewed as a single unambiguous proxy: a transparent, tangible and readily manageable commodity. Our protocol addresses baseline establishment, additionality, permanence, variability and leakage, and provides producers and other stakeholders with the economic and environmental incentives necessary for the adoption of agricultural N2O reduction offset projects.
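The coupling of an N-rate-to-flux relationship with a reduced (e.g. MRTN) rate reduces to a small credit calculation, sketched below. The exponential flux response and its coefficients are placeholders for illustration only, not the calibrated relationship the protocol uses; the GWP of 298 and the 44/28 N2O-to-N mass ratio are standard conversions:

```python
import math

GWP_N2O = 298.0   # CO2-equivalence factor quoted in the text

def n2o_flux(n_rate, a=0.5, b=0.007):
    """Annual N2O-N flux (kg N2O-N/ha/yr) as an exponential function of
    fertilizer N rate (kg N/ha).  Coefficients are illustrative only."""
    return a * math.exp(b * n_rate)

def credit_co2e(baseline_n, reduced_n, area_ha=1.0):
    """CO2-equivalent credit (t CO2e) earned by lowering the N rate."""
    delta_n2o_n = n2o_flux(baseline_n) - n2o_flux(reduced_n)  # kg N2O-N/ha
    delta_n2o = delta_n2o_n * 44.0 / 28.0                     # kg N2O
    return delta_n2o * GWP_N2O * area_ha / 1000.0             # t CO2e

print(f"{credit_co2e(200, 150):.2f} t CO2e per hectare")
```

Because the flux response is convex in N rate, the first kilograms of N-rate reduction below a high baseline earn the largest per-unit credits, which is what makes the fertilizer N rate such an attractive single proxy.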

Relevance: 10.00%

Publisher:

Abstract:

This work is focussed on developing a commissioning procedure through which a Monte Carlo model that uses BEAMnrc's standard VARMLC component module can be adapted to match a specific BrainLAB m3 micro-multileaf collimator (μMLC). A set of measurements is recommended for use as a reference against which the model can be tested and optimised. These include radiochromic film measurements of dose from small and offset fields, as well as measurements of μMLC transmission and interleaf leakage. Simulations and measurements to obtain μMLC scatter factors are shown to be insensitive to the relevant model parameters and are therefore not recommended, unless the output of the linear accelerator model is in doubt. Ultimately, this note provides detailed instructions for those intending to optimise a VARMLC model to match the dose delivered by their local BrainLAB m3 μMLC device.

Relevance: 10.00%

Publisher:

Abstract:

Carbon sequestration in agricultural, forest, and grassland soils has been promoted as a means by which substantial amounts of CO2 may be removed from the atmosphere, but few studies have evaluated the associated impacts on changes in soil N or net global warming potential (GWP). The purpose of this research was to (1) review the literature to examine how changes in grassland management that affect soil C also impact soil N, (2) assess the impact of different types of grassland management on changes in soil N and rates of change, and (3) evaluate changes in N2O fluxes from differently managed grassland ecosystems to assess net impacts on GWP. Soil C and N stocks either both increased or both decreased in most studies. Soil C and N sequestration were tightly linked, resulting in little change in C:N ratios with changes in management. Within grazing treatments, N2O made a minor contribution to GWP (0.1-4%), but increases in N2O fluxes offset significant portions of the C sequestration gains due to fertilization (10-125%) and conversion (average 27%). Results from this work demonstrate that even when improved management practices result in considerable rates of C and N sequestration, changes in N2O fluxes can offset a substantial portion of the gains from C sequestration. Even in cases where C sequestration rates are not entirely offset by increases in N2O fluxes, small increases in N2O fluxes can substantially reduce C sequestration benefits. Conversely, reducing N2O fluxes in grassland soils through changes in management represents an opportunity to reduce the contribution of grasslands to net greenhouse gas forcing.
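The offset accounting described above reduces to a simple CO2-equivalence calculation, sketched here with illustrative numbers (the GWP of 298 is the value quoted in these abstracts; the example fluxes are not data from the study):

```python
GWP_N2O = 298.0          # 100-year CO2-equivalence factor for N2O
C_TO_CO2 = 44.0 / 12.0   # mass conversion from C to CO2

def net_gwp(c_seq_kg_ha_yr, n2o_kg_ha_yr):
    """Net CO2-equivalent balance in kg CO2e/ha/yr: soil C sequestration
    counted as a sink, the N2O flux as a source.  Returns the net balance and
    the percentage of the sink offset by N2O."""
    sink = c_seq_kg_ha_yr * C_TO_CO2
    source = n2o_kg_ha_yr * GWP_N2O
    return sink - source, 100.0 * source / sink

net, offset_pct = net_gwp(500.0, 1.0)   # illustrative fluxes
print(f"net sink {net:.0f} kg CO2e/ha/yr; N2O offsets {offset_pct:.0f}% of it")
```

The high GWP of N2O is why even a flux of a kilogram or two per hectare per year can cancel a meaningful fraction of a substantial C sequestration rate.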

Relevance: 10.00%

Publisher:

Abstract:

The potential to sequester atmospheric carbon in agricultural and forest soils to offset greenhouse gas emissions has generated interest in measuring changes in soil carbon resulting from changes in land management. However, the inherent spatial variability of soil carbon limits the precision with which changes in soil carbon can be measured and, hence, the ability to detect them. We analyzed the variability of soil carbon by intensively sampling sites under different land management as a step toward developing efficient soil sampling designs. The sites were tilled cropland and a mixed deciduous forest in Tennessee, and old-growth and second-growth coniferous forest in western Washington, USA. Six soil cores within each of three microplots were taken as an initial sample, and an additional six cores were taken to simulate resampling. Soil C variability was greater in Washington than in Tennessee, and greater in less disturbed than in more disturbed sites. Using this protocol, our data suggest that differences on the order of 2.0 Mg C ha^-1 could be detected by collecting and analyzing cores from at least five (tilled) or two (forest) microplots in Tennessee. Greater spatial variability at the forested sites in Washington increased the minimum detectable difference, but these systems, consisting of low-C-content sandy soil with irregularly distributed pockets of organic C in buried logs, are likely to rank among the most spatially heterogeneous of systems. Our results clearly indicate that consistent intra-microplot differences at all sites will enable the detection of much more modest changes if the same microplots are resampled.
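The link between the number of microplots, the variability of the paired differences, and the smallest detectable change can be sketched with a standard power calculation. The normal approximation to the paired test and the example numbers are assumptions for illustration, not the study's analysis:

```python
import math
from statistics import NormalDist

def min_detectable_difference(sd_of_change, n_microplots, alpha=0.05, power=0.80):
    """Smallest paired change in soil C (Mg C/ha) detectable when the same
    microplots are resampled, using a normal approximation to the paired
    t-test.  `sd_of_change` is the SD of the per-microplot differences."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_b = NormalDist().inv_cdf(power)           # target power
    return (z_a + z_b) * sd_of_change / math.sqrt(n_microplots)

mdd5 = min_detectable_difference(sd_of_change=1.5, n_microplots=5)
mdd2 = min_detectable_difference(sd_of_change=1.5, n_microplots=2)
print(f"5 microplots: {mdd5:.2f} Mg C/ha; 2 microplots: {mdd2:.2f} Mg C/ha")
```

This is why resampling the *same* microplots helps so much: pairing removes the between-microplot variance, shrinking `sd_of_change` and hence the detectable difference for a fixed number of microplots.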

Relevance: 10.00%

Publisher:

Abstract:

There is considerable public, political and professional debate about the need for additional hospital beds in Australia. However, there is no clarity regarding the definition, meaning and significance of hospital bed counts. Relative to population, bed availability in Australia has declined by 14.6% over the past 15 years (22.9% for public hospital beds). This decline is partly offset by reductions in length of stay and changes to models of care; however, the net effect is increased bed occupancy, which has in turn resulted in system-wide congestion. Future bed capability needs to be better planned to meet growing demand while continuing the trend toward more efficient use. Future planning should be based in part on weighted bed capability matched to need.

Relevance: 10.00%

Publisher:

Abstract:

In November 2009 the researcher embarked on a project aimed at reducing the amount of paper used by Queensland University of Technology (QUT) staff in their daily workplace activities. The key goal was to communicate to staff that excessive printing has a tangible and negative effect on their workplace and local environment. The research objective was to better understand what motivates staff towards more ecologically sustainable printing practices while still meeting the demands of their jobs. The current study builds on previous research which found that a single interface does not address the needs of all users when creating persuasive Human-Computer Interaction (HCI) interventions targeting resource consumption. In response, the current study created and trialled software that communicates individual paper consumption in precise metrics. Based on preliminary research data, different metric sets have been defined to address the different motivations and beliefs of user archetypes, using descriptive and injunctive normative information.

Relevance: 10.00%

Publisher:

Abstract:

Recently published studies have not only demonstrated that laser printers are often significant sources of ultrafine particles, but have also shed light on particle formation mechanisms. While the role of fuser roller temperature as a factor affecting particle formation rate has been postulated, its impact has never been quantified. To address this gap in knowledge, this study measured emissions from 30 laser printers in a chamber using a standardized printing sequence, while also monitoring fuser roller temperature. Based on a simplified mass balance equation, the average emission rates of particle number, PM2.5 and O3 were calculated. The results showed that almost all printers were high particle number emitters (i.e. > 1.01×10^10 particles/min); colour printing generated more PM2.5 than monochrome printing; and all printers generated significant amounts of O3. Particle number emissions varied significantly during printing and followed the cycle of fuser roller temperature variation, which points to temperature being the strongest factor controlling emissions. For two sub-groups of printers using the same technology (heating lamps), systematic positive correlations, in the form of a power law, were found between average particle number emission rate and average roller temperature. Other factors, such as fuser material and structure, are also thought to play a role, since no such correlation was found for the remaining two sub-groups of printers using heating lamps, or for the printers using heating strips. In addition, O3 and total PM2.5 were not found to be statistically correlated with fuser temperature.
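A simplified well-mixed-chamber mass balance of the kind mentioned above (dC/dt = S/V - kC) lets the source strength S be recovered from a measured concentration trace. The sketch below uses a synthetic trace and assumed chamber volume and loss rate, not the study's measured data:

```python
import numpy as np

def emission_rate(times_s, conc_per_cm3, chamber_vol_m3, loss_rate_per_s):
    """Average particle emission rate (particles/min) from a well-mixed
    chamber via S = V * (dC/dt + k*C).  `loss_rate_per_s` lumps air exchange
    and deposition into a single first-order constant (an assumption)."""
    t = np.asarray(times_s, dtype=float)
    c = np.asarray(conc_per_cm3, dtype=float) * 1e6   # -> particles per m^3
    dcdt = np.gradient(c, t)                          # numerical derivative
    s = chamber_vol_m3 * (dcdt + loss_rate_per_s * c) # particles per second
    return float(np.mean(s) * 60.0)

t = np.arange(0.0, 601.0)                     # a 10-minute print job, 1 s steps
c = 1.0e5 * (1.0 - np.exp(-0.01 * t))         # synthetic rise toward 1e5 cm^-3
rate = emission_rate(t, c, chamber_vol_m3=1.0, loss_rate_per_s=0.01)
print(f"{rate:.2e} particles/min")
```

With these illustrative numbers the recovered rate is around 6×10^10 particles/min, i.e. above the 1.01×10^10 particles/min "high emitter" level quoted in the abstract.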

Relevance: 10.00%

Publisher:

Abstract:

The authors currently engage in two projects to improve human-computer interaction (HCI) designs that can help conserve resources. The projects explore motivation and persuasion strategies relevant to ubiquitous computing systems that bring real-time consumption data into the homes and hands of residents in Brisbane, Australia. The first project seeks to increase understanding among university staff of the tangible and negative effects that excessive printing has on the workplace and local environment. The second project seeks to shift attitudes toward domestic energy conservation through software and hardware that monitor real-time, in situ electricity consumption in homes across Queensland. The insights drawn from these projects will help develop resource consumption user archetypes, providing a framework linking people to differing interface design requirements.

Relevance: 10.00%

Publisher:

Abstract:

Stereolithography is a solid freeform fabrication (SFF) technique that was introduced in the late 1980s. Although many other techniques have been developed since then, stereolithography remains one of the most powerful and versatile of all SFF techniques. It offers the highest fabrication accuracy, and an increasing number of processable materials is becoming available. In this paper we discuss the characteristic features of the stereolithography technique and compare it to other SFF techniques. The biomedical applications of stereolithography are reviewed, as are the biodegradable resin materials that have been developed for use with stereolithography. Finally, an overview is given of the application of stereolithography in preparing porous structures for tissue engineering.

Relevance: 10.00%

Publisher:

Abstract:

The technologies employed for the preparation of conventional tissue engineering scaffolds restrict the materials choice and the extent to which the architecture can be designed. Here we show the versatility of stereolithography with respect to materials and freedom of design. Porous scaffolds are designed with computer software and built with either a poly(d,l-lactide)-based resin or a poly(d,l-lactide-co-ε-caprolactone)-based resin. Characterisation of the scaffolds by micro-computed tomography shows excellent reproduction of the designs. The mechanical properties are evaluated in compression, and show good agreement with finite element predictions. The mechanical properties of scaffolds can be controlled by the combination of material and scaffold pore architecture. The presented technology and materials enable an accurate preparation of tissue engineering scaffolds with a large freedom of design, and properties ranging from rigid and strong to highly flexible and elastic.