495 results for ENHANCEMENTS
Abstract:
Recent optimization of NMR spectroscopy has focused on new hardware, such as novel probes and higher field strengths. Only recently has the potential to enhance the sensitivity of NMR through data acquisition strategies been investigated. This thesis has focused on enhancing the signal-to-noise ratio (SNR) of NMR using non-uniform sampling (NUS). After first establishing the concept and exact theory of compounding sensitivity enhancements in multiple non-uniformly sampled indirect dimensions, a new result was derived: NUS enhances both SNR and resolution at any given signal evolution time. In contrast, uniform sampling alternately optimizes SNR (t < 1.26T2) or resolution (t ~ 3T2), each at the expense of the other. Experiments were designed and conducted on a plant natural product to explore this behavior of NUS, in which SNR and resolution continue to improve as acquisition time increases. Absolute sensitivity improvements of 1.5 and 1.9 are possible in each indirect dimension for matched and 2x-biased exponentially decaying sampling densities, respectively, at an acquisition time of 3T2. Recommendations for breaking into the linear regime of maximum entropy (MaxEnt) reconstruction are proposed. Furthermore, examination of a novel sinusoidal sampling density resulted in improved line shapes in MaxEnt reconstructions of NUS data and enhancement comparable to that of a matched exponential sampling density. The Absolute Sample Sensitivity derived and demonstrated here for NUS holds great promise for expanding the adoption of non-uniform sampling.
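To make the sampling densities concrete, the following minimal sketch draws a one-dimensional NUS schedule from an exponentially decaying density. It is a generic illustration, not the thesis's code, and reading "2x biased" as doubling the exponential decay rate is an assumption made here for illustration.

```python
import numpy as np

def nus_schedule(n_total, n_keep, t2, dwell, bias=1.0, seed=0):
    """Draw a NUS schedule for one indirect dimension.

    Grid points are kept with probability proportional to exp(-bias*t/t2),
    so early, high-signal evolution times are sampled most densely.
    bias=1.0 is a 'matched' exponential density; bias=2.0 is one reading
    of the '2x biased' density (an assumption, for illustration only).
    """
    rng = np.random.default_rng(seed)
    t = np.arange(n_total) * dwell          # uniform grid of evolution times
    density = np.exp(-bias * t / t2)        # relative sampling density
    density /= density.sum()
    kept = rng.choice(n_total, size=n_keep, replace=False, p=density)
    return np.sort(kept)

# Example: keep 64 of 256 increments, T2 = 20 ms, 100 us dwell, matched density
print(nus_schedule(256, 64, t2=0.020, dwell=100e-6, bias=1.0))
```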
Abstract:
Biodegradable polymer/clay nanocomposites were prepared with pristine and organically modified montmorillonite in polylactic acid (PLA) and polycaprolactone (PCL) polymer matrices. Nanocomposites were fabricated using extrusion and solid-state shear pulverization (SSSP) to compare the effects of melt-state and solid-state processing on the morphology of the final nanocomposite. Various material properties of the prepared nanocomposites were characterized to evaluate the property enhancements contributed by the different clays and processing methods.
Performance Tuning Non-Uniform Sampling for Sensitivity Enhancement of Signal-Limited Biological NMR
Abstract:
Non-uniform sampling (NUS) has been established as a route to obtaining true sensitivity enhancements when recording indirect dimensions of decaying signals in the same total experimental time as traditional uniform incrementation of the indirect evolution period. Theory and experiments have shown that NUS can yield up to two-fold improvements in the intrinsic signal-to-noise ratio (SNR) of each dimension, while even conservative protocols can yield 20-40% improvements in the intrinsic SNR of NMR data. Applications of biological NMR that can benefit from these improvements are emerging, and in this work we develop some practical aspects of applying NUS nD-NMR to studies that approach the traditional detection limit of nD-NMR spectroscopy. Conditions for obtaining high NUS sensitivity enhancements are considered here in the context of enabling 1H,15N-HSQC experiments on natural-abundance protein samples and 1H,13C-HMBC experiments on a challenging natural product. Through systematic studies we arrive at more precise guidelines for weighing sensitivity enhancements against relaxed line shape constraints, and report an alternative sampling density based on a quarter-wave sinusoidal distribution that returns the highest fidelity we have seen to date in line shapes obtained by maximum entropy processing of non-uniformly sampled data.
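As an illustration of the quarter-wave sinusoidal distribution named above, the sketch below assumes the density follows the first quarter of a cosine wave, densest at t = 0 and falling smoothly to zero at the end of the acquisition window; the authors' exact definition may differ.

```python
import numpy as np

def quarter_wave_density(n_total):
    """One plausible reading of a quarter-wave sinusoidal sampling density:
    cos(pi*t/(2*T)) across the acquisition window (an assumption)."""
    t = np.arange(n_total)
    density = np.cos(np.pi * t / (2 * (n_total - 1)))
    return density / density.sum()

def draw_schedule(n_total, n_keep, seed=0):
    rng = np.random.default_rng(seed)
    kept = rng.choice(n_total, size=n_keep, replace=False,
                      p=quarter_wave_density(n_total))
    return np.sort(kept)

print(draw_schedule(256, 64))   # e.g. keep 64 of 256 increments
```

Compared with a matched exponential, this density falls off more gently at early evolution times, which may be one reason for the line shape fidelity reported above.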
Abstract:
Nitrogen oxides play a crucial role in the budget of tropospheric ozone (O3) and the formation of the hydroxyl radical. Anthropogenic activities and boreal wildfires are large sources of emissions to the atmosphere. However, the influence of the transport of these emissions on nitrogen oxide and O3 levels at hemispheric scales is not well understood, in particular due to a lack of nitrogen oxide measurements in remote regions. To address these deficiencies, measurements of NO, NO2, and NOy (total reactive nitrogen oxides) were made in the lower free troposphere (FT) over the central North Atlantic region (Pico Mountain station, 38°N 28°W, 2.3 km asl) from July 2002 to August 2005. These measurements reveal a well-defined seasonal cycle of nitrogen oxides (NOx = NO + NO2, and NOy) in the background central North Atlantic lower FT, with higher mixing ratios during summertime. Observed NOx and NOy levels are consistent with long-range transport of emissions, but with significant removal en route to the measurement site. Reactive nitrogen largely exists in the form of PAN and HNO3 (~80-90% of NOy) year-round. A shift in the composition of NOy from dominance of PAN to dominance of HNO3 occurs from winter-spring to summer-fall, as a result of changes in temperature and photochemistry over the region. Analysis of the long-range transport of boreal wildfire emissions provides evidence of the very large-scale impacts of boreal wildfires on the tropospheric NOx and O3 budgets. Boreal wildfire emissions are responsible for significant shifts in the nitrogen oxide distributions toward higher levels during the summer, with medians of NOy (117-175 pptv) and NOx (9-30 pptv) greater in the presence of boreal wildfire emissions. Extreme levels of NOx (up to 150 pptv) and NOy (up to 1100 pptv) observed in boreal wildfire plumes suggest that decomposition of PAN to NOx is a significant source of NOx, and imply that O3 formation occurs during transport. Ozone levels are also significantly enhanced in boreal wildfire plumes. However, O3 shows complex behavior in the plumes, varying from significant to lower O3 production to O3 destruction. Long-range transport of anthropogenic emissions from North America also has a significant influence on the regional NOx and O3 budgets. Transport of pollution from North America causes significant enhancements in nitrogen oxides year-round. Enhancements of CO, NOy, and NOx indicate that, consistent with previous studies, more than 95% of the NOx emitted over the U.S. is removed before and during export out of the U.S. boundary layer; however, about 30% of the NOx exported out of the U.S. boundary layer remains in the air masses. Since the lifetime of NOx is shorter than the transport timescale, PAN decomposition and potentially photolysis of HNO3 provide a supply of NOx over the central North Atlantic lower FT. Observed ΔO3/ΔNOy ratios and the large NOy levels remaining in North American plumes suggest potential O3 formation well downwind of North America. Finally, a comparison of the nitrogen oxide measurements with results from the global chemical transport (GCT) model GEOS-Chem identifies differences between the observations and the model. GEOS-Chem reproduces the seasonal variation of nitrogen oxides over the central North Atlantic lower FT, but does not capture the magnitude of the cycles. Improvements in our understanding of nitrogen oxide chemistry in the remote FT and of emission sources are necessary for current GCT models to adequately estimate the impacts of emissions on tropospheric NOx and the resulting impacts on the O3 budget.
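The export-efficiency figures above come from comparing enhancement ratios in aged plumes with emission ratios; the sketch below shows the standard form of that arithmetic with hypothetical values, not the thesis's data.

```python
# Illustrative only: inferring the surviving fraction of NOx from enhancement
# ratios, using CO as a quasi-conserved long-lived tracer. Both numbers below
# are hypothetical placeholders, not values from the study.

emission_ratio_nox_co = 0.15   # hypothetical NOx/CO molar emission ratio (U.S.)
observed_dnoy_dco = 0.006      # hypothetical observed dNOy/dCO in aged plumes

surviving = observed_dnoy_dco / emission_ratio_nox_co
print(f"NOx remaining (as NOy): {surviving:.1%}")             # ~4%
print(f"removed before/during export: {1 - surviving:.1%}")   # ~96%
```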
Abstract:
Developers rely on the mechanisms provided by their IDE to browse and navigate a large software system. These mechanisms are usually based purely on a system's static source code. The static perspective, however, is not enough to understand an object-oriented program's behavior, in particular if it is implemented in a dynamic language. We propose to enhance IDEs with a program's runtime information (e.g., message sends and type information) to support program comprehension through precise navigation and informative browsing. To precisely specify the type and amount of runtime data to gather about a system under development, dynamically and on demand, we adopt a technique known as partial behavioral reflection. We implemented navigation and browsing enhancements that exploit this runtime information in an IDE prototype called Hermion. We present a preliminary validation of our experimental enhanced IDE by asking developers to assess its usefulness for understanding an unfamiliar software system.
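Hermion targets Smalltalk, but the underlying idea of partial behavioral reflection, selectively instrumenting just the methods of interest to record message sends and observed types, can be sketched in any dynamic language. The Python analogue below is an illustration of the concept, not Hermion's implementation.

```python
import functools
from collections import defaultdict

# message selector -> set of observed (receiver type, argument types)
observed = defaultdict(set)

def record_sends(cls, method_names):
    """Instrument only the named methods of cls: a crude stand-in for the
    'partial' in partial behavioral reflection, choosing what to reflect
    on dynamically and on demand rather than tracing the whole program."""
    for name in method_names:
        original = getattr(cls, name)

        @functools.wraps(original)
        def wrapper(self, *args, _orig=original, _name=name, **kwargs):
            observed[f"{cls.__name__}>>{_name}"].add(
                (type(self).__name__, tuple(type(a).__name__ for a in args)))
            return _orig(self, *args, **kwargs)

        setattr(cls, name, wrapper)

class Account:
    def __init__(self):
        self.balance = 0
    def deposit(self, amount):
        self.balance += amount

record_sends(Account, ["deposit"])   # gather runtime data for one method only
Account().deposit(100)
print(dict(observed))                # an IDE could display this beside the code
```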
Abstract:
This paper proposes an extension to the television-watching paradigm that permits an end-user to enrich broadcast content. Examples of this enriched content are: virtual edits that allow the order of presentation within the content to be changed or the content to be subsetted; conditional text, graphic, or video objects that can be placed to appear within content and be triggered by viewer interaction; and additional navigation links that can be added to structure how other users view the base content object. The enriched content can be viewed directly within the context of the TV viewing experience. It may also be shared with other users within a distributed peer group. Our architecture is based on a model that allows the original content to remain unaltered and that respects DRM restrictions on content reuse. The fundamental approach we use is to define an intermediate content enhancement layer based on the W3C's SMIL language. Using a pen-based enhancement interface, end-users can manipulate content that is saved in a home PDR setting. This paper describes our architecture and provides several examples of how our system handles content enhancement. We also describe a reference implementation for creating and viewing enhancements.
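To illustrate the enhancement-layer approach, the sketch below generates a small SMIL-style overlay document that references the unaltered recording and adds user content on top of it. Element and attribute names follow generic SMIL conventions; the paper's actual schema, URIs, and file layout are assumptions.

```python
import xml.etree.ElementTree as ET

# The base recording stays untouched on the PDR; the enhancement layer is a
# separate document that merely points at it, which also respects DRM limits.
smil = ET.Element("smil")
body = ET.SubElement(smil, "body")
par = ET.SubElement(body, "par")   # base video and overlays play in parallel

ET.SubElement(par, "video", src="pdr://recordings/show-1234.mpg")

# A user-added text overlay, visible only during a chosen interval.
ET.SubElement(par, "text", src="my-note.txt",
              begin="00:05:10", end="00:05:30", region="overlay")

# A navigation link added to structure how others view the base content.
ET.SubElement(par, "a", href="#scene3")

ET.dump(smil)  # a compliant player would merge this layer at playback time
```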
Abstract:
The use of additive manufacturing processes has risen sharply in recent years. Technical advances in the machines are making these manufacturing processes increasingly attractive for industrial applications. A study at the Fraunhofer-Institut für Materialfluss und Logistik IML analyzed the potential applications of additive manufacturing processes in the field of autonomous shelf-operating vehicles. The focus of the study is the adaptation of a novel conveyor vehicle for use in racking systems. The analysis highlights the special characteristics of additive manufacturing and compares the production process with conventional methods.
Abstract:
Previous studies have highlighted the severity of the detrimental effects on life on Earth after an assumed regionally limited nuclear war. These effects are caused by climatic, chemical, and radiative changes persisting for up to one decade. However, so far only a very limited number of climate model simulations have been performed, raising the question of how realistic previous computations have been. This study uses the coupled chemistry climate model (CCM) SOCOL, which belongs to a different family of CCMs than those previously used, to investigate the consequences of such a hypothetical nuclear conflict. In accordance with previous studies, the present work assumes a scenario of a nuclear conflict between India and Pakistan, each deploying 50 warheads with an individual yield of 15 kt ("Hiroshima size") against major population centers, resulting in the emission of tiny soot particles generated in the firestorms expected in the aftermath of the detonations. Substantial uncertainties in the calculation of likely soot emissions, particularly concerning assumptions about target fuel loading and the targeting of weapons, have been addressed by simulating several scenarios, with soot emissions ranging from 1 to 12 Tg. The particles' high absorptivity with respect to solar radiation leads to rapid self-lofting of the soot into the stratosphere and mesosphere within a few days after emission, where it remains for several years. Consequently, the model suggests that Earth's surface temperatures drop by several degrees Celsius due to the shielding of solar irradiance by the soot, indicating a major global cooling. In addition, there is a substantial reduction of precipitation lasting 5 to 10 years after the conflict, depending on the magnitude of the initial soot release. Extreme cold spells associated with an increase in sea ice formation are found during Northern Hemisphere winter, exposing the continental land masses of North America and Eurasia to a cooling of several degrees. In the stratosphere, the strong heating leads to an acceleration of catalytic ozone loss and, consequently, to enhancements of UV radiation at the ground. In contrast to the surface temperature and precipitation changes, which show a linear dependence on the soot burden, there is a saturation effect with respect to stratospheric ozone chemistry: soot emissions of 5 Tg lead to an ozone column reduction of almost 50% in northern high latitudes, while emitting 12 Tg increases ozone loss by only a further 10%. In summary, this study, though using a different chemistry climate model, corroborates previous investigations with respect to the atmospheric impacts. In addition to these persistent effects, the present study draws attention to episodic cold phases, which would likely add to the severity of human harm worldwide. The best insurance against such a catastrophic development would be the delegitimization of nuclear weapons.
Abstract:
OBJECTIVE Standard stroke CT protocols start with non-enhanced CT (NECT), followed by perfusion CT (PCT), and end with CT angiography (CTA). We aimed to evaluate the influence of the sequence of PCT and CTA on quantitative perfusion parameters, venous contrast enhancement, and examination time, in order to save critical time in the therapeutic window in stroke patients. METHODS AND MATERIALS Stroke CT data sets of 85 patients, 47 with CTA before PCT (group A) and 38 with CTA after PCT (group B), were retrospectively analyzed by two experienced neuroradiologists. Parameter maps of cerebral blood flow, cerebral blood volume, time to peak, and mean transit time, as well as arterial and venous contrast enhancements, were compared. RESULTS Both readers rated contrast of brain-supplying arteries as equal in both groups (p=0.55 intracranial; p=0.73 extracranial). Quantitative perfusion parameters did not differ significantly between the groups (all p>0.18), while the extent of venous superimposition of the ICA was rated higher in group B (p=0.04). The time to complete the diagnostic CT examination was significantly shorter for group A (p<0.01). CONCLUSION Performing CTA directly after NECT has no significant effect on PCT parameters and avoids venous preloading in CTA, while examination times are significantly shorter.
Abstract:
Desertification research conventionally focuses on the problem – that is, degradation – while neglecting the appraisal of successful conservation practices. Based on the premise that Sustainable Land Management (SLM) experiences are not sufficiently or comprehensively documented, evaluated, and shared, the World Overview of Conservation Approaches and Technologies (WOCAT) initiative (www.wocat.net), in collaboration with FAO’s Land Degradation Assessment in Drylands (LADA) project (www.fao.org/nr/lada/) and the EU’s DESIRE project (http://www.desire-project.eu/), has developed standardised tools and methods for compiling and evaluating the biophysical and socio-economic knowledge available about SLM. The tools allow SLM specialists to share their knowledge and assess the impact of SLM at the local, national, and global levels. As a whole, the WOCAT–LADA–DESIRE methodology comprises tools for documenting, self-evaluating, and assessing the impact of SLM practices, as well as for knowledge sharing and decision support in the field, at the planning level, and in scaling up identified good practices. SLM depends on flexibility and responsiveness to changing complex ecological and socioeconomic causes of land degradation. The WOCAT tools are designed to reflect and capture this capacity of SLM. In order to take account of new challenges and meet emerging needs of WOCAT users, the tools are constantly further developed and adapted. Recent enhancements include tools for improved data analysis (impact and cost/benefit), cross-scale mapping, climate change adaptation and disaster risk management, and easier reporting on SLM best practices to UNCCD and other national and international partners. Moreover, WOCAT has begun to give land users a voice by backing conventional documentation with video clips straight from the field. To promote the scaling up of SLM, WOCAT works with key institutions and partners at the local and national level, for example advisory services and implementation projects. Keywords: Sustainable Land Management (SLM), knowledge management, decision-making, WOCAT–LADA–DESIRE methodology.
Abstract:
Even though RFID technology is currently gaining importance mainly in logistics, usage areas beyond the supply chain, such as shopping or after-sales enhancements, are envisioned. Yet as RFID hits the street, it is questioned whether it may undermine one's privacy while providing few customer benefits. Meeting this criticism, this paper investigates RFID-enabled information services and the drivers of their usefulness for consumers. The article claims that the more risk one associates with a product, the more benefit one perceives from RFID-enabled information services. We show empirically that the nature of product risk provides a useful framework for deciding which types of RFID information services a marketer should offer to create perceptions of RFID usefulness and increase technology acceptance.
Abstract:
The MDAH pencil-beam algorithm developed by Hogstrom et al (1981) has been widely used in clinics for electron beam dose calculations in radiotherapy treatment planning. The primary objective of this research was to address several deficiencies of that algorithm and to develop an enhanced version. Two enhancements have been incorporated into the pencil-beam algorithm: one models fluence rather than planar fluence, and the other models the bremsstrahlung dose using measured beam data. The resulting calculated dose distributions were compared with measured dose distributions for several test phantoms. From these results it is concluded (1) that the fluence-based algorithm is more accurate for dose calculation in an inhomogeneous slab phantom, and (2) that the fluence-based calculation provides only a limited improvement to the accuracy of the calculated dose in the region just downstream of the lateral edge of an inhomogeneity. The source of the latter inaccuracy is believed to lie primarily in the assumptions made in the pencil beam's modeling of the complex phantom or patient geometry. A pencil-beam redefinition model was then developed for the calculation of electron beam dose distributions in three dimensions. The primary aim of this redefinition model was to solve the dosimetry problem presented by deep inhomogeneities, which was the major deficiency of the enhanced version of the MDAH pencil-beam algorithm. The pencil-beam redefinition model is based on the theory of electron transport, redefining the pencil beams at each layer of the medium. The unique approach of this model is that all the physical parameters of a given pencil beam are characterized for multiple energy bins. The calculated dose distributions were compared with measured dose distributions for a homogeneous water phantom and for phantoms with deep inhomogeneities. From these results it is concluded that the redefinition algorithm is superior to the conventional, fluence-based, pencil-beam algorithm, especially in predicting the dose distribution downstream of a local inhomogeneity. The accuracy of this algorithm appears sufficient for clinical use, and the algorithm is structured for future expansion of the physical model if required for site-specific treatment planning problems.
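As a schematic of what a pencil-beam algorithm computes, the toy sketch below superposes Gaussian lateral profiles about each pencil's axis at a single depth. This is a textbook-style illustration, not the MDAH or redefinition code: real algorithms derive the depth-dependent spread from multiple-scattering theory and add the fluence and bremsstrahlung modeling discussed above.

```python
import numpy as np

def pencil_beam_dose(x_grid, pencil_positions, sigma, cax_dose):
    """Toy 1-D pencil-beam superposition at one depth: each pencil deposits
    a Gaussian lateral profile of width sigma, scaled by the central-axis
    dose at that depth. Heterogeneity handling is deliberately omitted."""
    dose = np.zeros_like(x_grid)
    for x0 in pencil_positions:
        dose += cax_dose * np.exp(-((x_grid - x0) ** 2) / (2 * sigma ** 2))
    return dose / (sigma * np.sqrt(2 * np.pi))

# Example (all numbers illustrative): a 4 cm field of pencils every 1 mm,
# at a depth where the lateral spread has grown to sigma = 0.5 cm.
x = np.linspace(-5, 5, 201)
pencils = np.arange(-2.0, 2.0, 0.1)
profile = pencil_beam_dose(x, pencils, sigma=0.5, cax_dose=1.0)
print(f"center: {profile[100]:.3f}  field edge: {profile[140]:.3f}")
```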
Abstract:
The evolution of pharmaceutical care is identified through a complete review of the literature published in the American Journal of Health-System Pharmacy, the sole comprehensive publication of institutional pharmacy practice. The evolution is categorized according to characteristics of structure (organizational structure, the role of the pharmacist), process (drug delivery systems, formulary management, acquiring drug products, methods to impact drug therapy decisions), and outcomes (cost of drug delivery, cost of drug acquisition and use, improved safety, improved health outcomes) recorded from the 1950s through the 1990s. While significant progress has been made in implementing basic drug distribution systems, levels of pharmacy involvement in direct patient care are still limited. A new practice framework suggests enhanced direct patient care involvement through increases in the efficiency and effectiveness of traditional pharmacy services. Recommendations advance internal and external organizational structure relationships that position pharmacists to fully use their unique skills and knowledge to impact drug therapy decisions and outcomes. Specific strategies facilitate expansion of the breadth and scope of each process component in order to deepen the integration of pharmacy and pharmaceutical care within the broad healthcare environment. Economic evaluation methods formally assess the impact of both operational and clinical interventions. Outcome measurements include specific recommendations and methods to increase the efficiency of drug acquisition, emphasizing pharmacists' roles in influencing physician prescribing decisions. Effectiveness measures include those that improve the safety of drug distribution systems, decrease the potential for adverse drug therapy events, and demonstrate that pharmaceutical care can contribute significantly to improvement in overall health status. The implementation of the new framework is modeled on a case study at the M.D. Anderson Cancer Center, where several new drug distribution methods facilitated the redeployment of personnel from distributive functions to direct patient care activities, with significant personnel and drug cost reductions. A cost-benefit analysis illustrates that the framework's process enhancements produced a benefit-to-cost ratio of 7.9. In addition, measures of effectiveness demonstrated significant levels of safety and enhanced drug therapy outcomes.
Abstract:
The aim of this work was to clarify the mechanism at play in field-enhanced sample injection coupled to sweeping and micellar EKC (FESI-Sweep-MEKC), using two acidic high-conductivity buffers (HCBs), phosphoric acid or sodium phosphate buffer, with a view to maximizing sensitivity enhancements. Using cationic model compounds in acidic media, a chemometric approach and simulations with SIMUL5 were implemented. Experimental design first enabled identification of the significant factors and their potential interactions. Simulation demonstrated the formation of moving boundaries during sample injection, which originate at the initial sample/HCB and HCB/BGE discontinuities and gradually change the compositions of the HCB and BGE. With sodium phosphate buffer, the HCB conductivity increased during the injection, leading to more efficient preconcentration by stacking (about 1.6 times) than with phosphoric acid alone, for which the conductivity decreased during injection. For the same injection time at constant voltage, however, a lower amount of analytes was injected with sodium phosphate buffer than with phosphoric acid. Consequently, sensitivity enhancements were lower for the whole FESI-Sweep-MEKC process. This is why, in order to maximize sensitivity enhancements, it is proposed to work with sodium phosphate buffer as the HCB and to use constant current during sample injection.
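For intuition about why the HCB conductivity matters, the first-order picture of field-enhanced injection is that the local field, and hence the analyte velocity, scales inversely with local conductivity, so the attainable stacking factor is roughly the conductivity ratio across the boundary. The sketch below encodes only that textbook approximation; it ignores the moving boundaries that the SIMUL5 simulations resolve, and the values are hypothetical.

```python
def stacking_factor(kappa_hcb, kappa_sample):
    """Crude preconcentration estimate: field ratio ~ conductivity ratio
    between the high-conductivity buffer zone and the sample zone."""
    return kappa_hcb / kappa_sample

# Hypothetical conductivities (S/m): if the HCB conductivity rises during
# injection (as observed with sodium phosphate), stacking becomes more
# efficient, here by the ~1.6x factor mentioned above.
print(stacking_factor(kappa_hcb=1.0, kappa_sample=0.05))  # 20x
print(stacking_factor(kappa_hcb=1.6, kappa_sample=0.05))  # 32x
```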
Abstract:
Radiocarbon (14C) analysis is a unique tool for distinguishing fossil from nonfossil sources of carbonaceous aerosols. We present 14C measurements of organic carbon (OC) and total carbon (TC) on highly time-resolved filters (3–4 h resolution, whereas 12 h or longer is typical of previous reports) from 7 days of the California Research at the Nexus of Air Quality and Climate Change (CalNex) 2010 campaign in Pasadena. Average nonfossil contributions of 58% ± 15% and 51% ± 15% were found for OC and TC, respectively. The results indicate that nonfossil carbon is a major constituent of the background aerosol, evidenced by its nearly constant concentration (2–3 μgC m⁻³). Cooking is estimated to contribute at least 25% of nonfossil OC, underlining the importance of urban nonfossil OC sources. In contrast, fossil OC concentrations have prominent and consistent diurnal profiles, with significant afternoon enhancements (~3 μgC m⁻³) following the arrival of the western Los Angeles (LA) basin plume with the sea breeze. A corresponding increase in semivolatile oxygenated OC and in organic vehicular emission markers and their photochemical reaction products occurs. This suggests that the increasing OC is mostly fresh anthropogenic secondary OC (SOC) formed in the western LA basin plume, mainly from fossil precursors. We note that in several European cities where the diesel passenger-car fraction is higher, SOC is 20% less fossil despite 2–3 times higher elemental carbon concentrations, suggesting that SOC formation from gasoline emissions most likely dominates over diesel in the LA basin. This would have significant implications for our understanding of the on-road vehicle contribution to ambient aerosols and merits further study.
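The apportionment behind such percentages is simple arithmetic on the measured fraction of modern carbon, as sketched below; the reference fraction modern for purely nonfossil carbon depends on the source mix, and all numbers here are hypothetical.

```python
# Fossil carbon contains no 14C, so a sample's measured fraction modern (fM)
# scales linearly with its nonfossil share. fm_nonfossil is the reference fM
# of purely nonfossil carbon (slightly above 1 due to nuclear-weapons-test
# 14C); its exact value is source-mix dependent. Numbers are hypothetical.

def nonfossil_fraction(fm_sample, fm_nonfossil=1.07):
    """Fraction of carbon from nonfossil sources given measured fM."""
    return fm_sample / fm_nonfossil

print(f"nonfossil OC: {nonfossil_fraction(0.62):.0%}")   # ~58%
```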