Abstract:
Seven new and three known bisresorcinols were isolated from the stems of Grevillea glauca. The new compounds were identified on the basis of spectroscopic data as (Z)-6,7-didehydroglaucone A (1), glaucones A and B (2 and 3, resp.), 2-(3-hydroxyisopentyl)bisnorstriatol (4), 2-(3-methylbut-2-en-1-yl)bisnorstriatol (5), 2'-methylgrebustol A (6), and glaucane (7). The known compounds are grevirobstol A (=5,5'-[(6Z,9Z)-hexadeca-6,9-diene-1,16-diyl]bisresorcinol; 8), 5,5'-[(8Z)-hexadec-8-ene-1,16-diyl]bisresorcinol (9), and 2-methyl-5,5'-[(8Z)-hexadec-8-ene-1,16-diyl]bisresorcinol (10).
Abstract:
This study used automated data processing techniques to calculate a set of novel treatment plan accuracy metrics and to investigate their usefulness as predictors of quality assurance (QA) success and failure. 151 beams from 23 prostate and cranial IMRT treatment plans were used in this study. These plans had been evaluated before treatment using measurements with a diode array system. The TADA software suite was adapted to allow automatic batch calculation of several proposed plan accuracy metrics, including mean field area, small-aperture, off-axis and closed-leaf factors. All of these results were compared with the gamma pass rates from the QA measurements, and correlations were investigated. The mean field area factor provided a threshold field size (5 cm2, equivalent to a 2.2 x 2.2 cm square field) below which all beams failed the QA tests. The small-aperture score provided a useful predictor of plan failure when averaged over all beams, despite being only weakly correlated with gamma pass rates for individual beams. By contrast, the closed-leaf and off-axis factors provided information about the geometric arrangement of the beam segments but were not useful for distinguishing between plans that passed and failed QA. This study has provided some simple tests of plan accuracy, which may help minimise the time spent on QA assessments of treatments that are unlikely to pass.
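A minimal sketch of how a screening metric like the mean field area factor could be computed and thresholded. The segment representation, function names and the MU-weighting below are illustrative assumptions, not the actual TADA implementation; only the 5 cm2 threshold comes from the study.

```python
# Hypothetical sketch of the mean-field-area screening test described above.
# Each beam is assumed to be a list of segment aperture areas (cm^2) with
# per-segment monitor units (MU); these names and the MU-weighting are
# illustrative, not taken from the TADA software suite.

def mean_field_area(segment_areas_cm2, segment_mu):
    """MU-weighted mean aperture area (cm^2) over a beam's segments."""
    total_mu = sum(segment_mu)
    return sum(a * mu for a, mu in zip(segment_areas_cm2, segment_mu)) / total_mu

def flag_likely_qa_failures(beams, threshold_cm2=5.0):
    """Return beams whose mean field area falls below the threshold
    (equivalent to roughly a 2.2 x 2.2 cm square field)."""
    return [name for name, (areas, mus) in beams.items()
            if mean_field_area(areas, mus) < threshold_cm2]

beams = {
    "beam1": ([3.0, 4.0], [50.0, 50.0]),   # mean 3.5 cm^2 -> flagged
    "beam2": ([8.0, 12.0], [40.0, 60.0]),  # mean 10.4 cm^2 -> passes
}
print(flag_likely_qa_failures(beams))  # -> ['beam1']
```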
Abstract:
This study investigates the variation of photon field penumbra shape with initial electron beam diameter, for very narrow beams. A Varian Millennium MLC (Varian Medical Systems, Palo Alto, USA) and a Brainlab m3 microMLC (Brainlab AG, Feldkirchen, Germany) were used with one Varian iX linear accelerator to produce fields that were (nominally) 0.20 cm across. Dose profiles for these fields were measured using radiochromic film and compared with the results of simulations completed using BEAMnrc and DOSXYZnrc, where the initial electron beam was set to FWHM = 0.02, 0.10, 0.12, 0.15, 0.20 and 0.50 cm. Increasing the electron-beam FWHM produced increasing occlusion of the photon source by the closely spaced collimator leaves, resulting in a broadening of the simulated profile widths from 0.26 to 0.64 cm for the MLC and from 0.12 to 0.43 cm for the microMLC. Comparison with measurement data suggested that the electron spot size in the clinical linear accelerator was between FWHM = 0.10 and 0.15 cm, encompassing the result of our previous output-factor based work, which identified a FWHM of 0.12 cm. Investigation of narrow-beam penumbra variation has been found to be a useful procedure, with results varying noticeably with linear accelerator spot size, allowing FWHM estimates obtained using other methods to be verified.
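The blurring mechanism described above can be pictured with a toy model: an ideal 0.20 cm slit convolved with a Gaussian source of varying FWHM, whose output width grows with spot size. This is a sketch only, not the BEAMnrc/DOSXYZnrc transport model; it ignores source occlusion, scatter and geometric magnification, and its widths are not expected to match the simulated values quoted above.

```python
import numpy as np

# Toy model: profile of an ideal 0.20 cm slit blurred by a Gaussian source.
# Illustrates qualitatively why penumbra width grows with electron spot size.

def profile_fwhm(slit_width_cm, source_fwhm_cm, n=4001, span=1.0):
    """FWHM of an ideal slit profile blurred by a Gaussian source (toy model)."""
    x = np.linspace(-span, span, n)
    slit = (np.abs(x) <= slit_width_cm / 2).astype(float)
    sigma = source_fwhm_cm / 2.355          # Gaussian FWHM = 2.355 * sigma
    blur = np.exp(-0.5 * (x / sigma) ** 2)
    prof = np.convolve(slit, blur, mode="same")
    prof /= prof.max()
    half = x[prof >= 0.5]                   # points at or above half maximum
    return half[-1] - half[0]

# Profile width grows monotonically with spot size:
for fwhm in (0.02, 0.10, 0.15, 0.50):
    print(f"source FWHM {fwhm:.2f} cm -> profile FWHM {profile_fwhm(0.20, fwhm):.2f} cm")
```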
Abstract:
To obtain accurate Monte Carlo simulations of small radiation fields, it is important to model the initial source parameters (electron energy and spot size) accurately. However, recent studies have shown that small-field dosimetry correction factors are insensitive to these parameters. The aim of this work is to extend this concept and test whether these parameters affect dose perturbations in general, which is important for detector design and for calculating perturbation correction factors. The EGSnrc C++ user code cavity was used for all simulations. Varying amounts of air between 0 and 2 mm were deliberately introduced upstream of a diode, and the dose perturbation caused by the air was quantified. These simulations were then repeated using a range of initial electron energies (5.5 to 7.0 MeV) and electron spot sizes (0.7 to 2.2 FWHM). The resultant dose perturbations were large: for example, 2 mm of air caused a dose reduction of up to 31% when simulated with a 6 mm field size. However, these values did not vary by more than 2% when simulated across the full range of source parameters tested. If a detector is modified by the introduction of air, one can therefore be confident that its response will be the same across all similar linear accelerators, and Monte Carlo modelling of each individual machine is not required.
Abstract:
Reliability of carrier phase ambiguity resolution (AR) of an integer least-squares (ILS) problem depends on the ambiguity success rate (ASR), which in practice can be well approximated by the success probability of integer bootstrapping solutions. With the current GPS constellation, a sufficiently high ASR for the geometry-based model is achievable only for a certain percentage of the time. As a result, high reliability of AR cannot be assured by a single constellation. With a dual-constellation system (DCS), for example GPS and BeiDou, which provides more satellites in view, users can expect significant performance benefits such as improved AR reliability and high-precision positioning solutions. Simply using all the satellites in view for AR and positioning is a straightforward solution, but does not necessarily deliver the hoped-for reliability. This paper presents an alternative approach that, instead of using all the satellites, selects a subset of the visible satellites to achieve higher reliability of the AR solutions in a multi-GNSS environment. Traditionally, satellite selection algorithms are mostly based on the position dilution of precision (PDOP), in order to meet accuracy requirements. In this contribution, reliability criteria are introduced for GNSS satellite selection, and a novel satellite selection algorithm for reliable ambiguity resolution (SARA) is developed. The SARA algorithm allows receivers to select a subset of satellites that achieves a high ASR, such as above 0.99.
Numerical results from simulated dual-constellation cases show that with the SARA procedure, the percentage of ASR values in excess of 0.99 and the percentage of ratio-test values passing the threshold of 3 are both higher than when all satellites in view are used directly. In the dual-constellation case, the percentages of ASR values (>0.99) and ratio-test values (>3) reach 98.0% and 98.5% respectively, compared with 18.1% and 25.0% without the satellite selection process. It is also worth noting that SARA is simple to implement and computationally cheap, so it can be applied in most real-time data processing applications.
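The bootstrapped success probability that approximates the ILS ASR, together with a subset search, can be sketched as follows. The success-rate formula P_s = prod_i (2*Phi(1/(2*sigma_i)) - 1) over the conditional standard deviations of the decorrelated ambiguities is standard for integer bootstrapping; the greedy selection rule, the satellite names and the sigma values below are illustrative assumptions, since the abstract does not give SARA's internal details (and in reality removing a satellite also changes the covariance of the remaining ambiguities).

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bootstrap_asr(cond_stds):
    """Integer-bootstrapping success rate from conditional std. deviations."""
    p = 1.0
    for s in cond_stds:
        p *= 2.0 * phi(1.0 / (2.0 * s)) - 1.0
    return p

def greedy_select(sat_stds, target=0.99, min_sats=4):
    """Illustrative stand-in for SARA: drop the least-certain ambiguity
    until the bootstrapped ASR of the remainder exceeds the target."""
    subset = dict(sat_stds)
    while len(subset) > min_sats and bootstrap_asr(subset.values()) < target:
        del subset[max(subset, key=subset.get)]
    return subset

# Hypothetical conditional std. deviations (cycles) for five satellites:
stds = {"G01": 0.05, "G07": 0.08, "G12": 0.30, "C03": 0.06, "C08": 0.07}
chosen = greedy_select(stds)
print(sorted(chosen))                        # G12's noisy ambiguity is dropped
print(bootstrap_asr(chosen.values()) > 0.99)
```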
Abstract:
A new wave energy flow (WEF) map concept was proposed in this work. Based on this concept, an improved technique incorporating the laser scanning method and Betti's reciprocal theorem was developed to evaluate the shape and size of damage and to visualize wave propagation. In this technique, a simple signal processing algorithm constructs the WEF map as waves propagate through an inspection region, and multiple lead zirconate titanate (PZT) sensors are employed to improve inspection reliability. Various types of damage in aluminum and carbon-fiber-reinforced plastic laminated plates were evaluated experimentally and numerically to validate the technique. The results show that it can effectively evaluate the shape and size of damage from the wave field variations around the damage in the WEF map.
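One simple way to picture the WEF-map construction: reduce the recorded time signal at each laser-scan point to a scalar wave energy, so that damage appears as a local anomaly in the resulting 2D map. The reduction used here (time-integrated squared amplitude), the array shapes and the synthetic signal are illustrative assumptions; the paper's actual signal processing algorithm is not reproduced.

```python
import numpy as np

def wef_map(signals, dt):
    """signals: array (ny, nx, nt) of wave signals at each scan point.
    Returns an (ny, nx) map of wave energy per point."""
    return np.sum(signals ** 2, axis=-1) * dt

# Synthetic example: uniform wave field with one low-energy "shadow" cell.
t = np.arange(0, 1e-4, 1e-7)                 # 100 us at 10 MHz sampling
sig = np.sin(2 * np.pi * 2e5 * t)            # 200 kHz tone stand-in
field = np.tile(sig, (8, 8, 1))              # 8 x 8 scan grid
field[4, 4] *= 0.2                           # damage attenuates the wave here
emap = wef_map(field, dt=1e-7)
iy, ix = np.unravel_index(np.argmin(emap), emap.shape)
print(int(iy), int(ix))                      # prints: 4 4 (the anomalous cell)
```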
Abstract:
Polymer biomaterials have been widely used for bone replacement/regeneration because of their unique mechanical properties and workability, but their inherently low bioactivity means they lack osseointegration with host bone tissue. For this reason, bioactive inorganic particles are commonly incorporated into polymer matrices to improve bioactivity; however, mixing inorganic particles with polymers often results in an inhomogeneous particle distribution in the polymer matrix and limited bioactivity. This study applies the pulsed laser deposition (PLD) technique to prepare uniform akermanite (Ca2MgSi2O7, AKT) glass nanocoatings on the surfaces of two polymers, non-degradable polysulfone (PSU) and degradable polylactic acid (PDLLA), in order to improve their surface osteogenic and angiogenic activity. The results show that a uniform nanolayer composed of amorphous AKT particles (~30 nm), approximately 130 nm thick, forms on the surfaces of both PSU and PDLLA films with the PLD technique. The prepared AKT-PSU and AKT-PDLLA films significantly improved surface roughness, hydrophilicity, hardness and apatite mineralization compared with pure PSU and PDLLA, respectively. The AKT nanocoatings distinctly enhanced the alkaline phosphatase (ALP) activity and bone-related gene expression (ALP, OCN, OPN and Col I) of bone-forming cells on both PSU and PDLLA films. Furthermore, the AKT nanocoatings on both polymers improved the attachment, proliferation, VEGF secretion and expression of proangiogenic factors and their receptors in human umbilical vein endothelial cells (HUVECs). These results suggest that PLD-prepared bioceramic nanocoatings are very useful for enhancing the physicochemical, osteogenic and angiogenic properties of both degradable and non-degradable polymers for application in bone replacement/regeneration.
Abstract:
An important responsibility of the Environment Protection Authority, Victoria, is to set objectives for levels of environmental contaminants. To support the development of environmental objectives for water quality, a need has been identified to understand the dual impacts of the concentration and duration of a contaminant on biota in freshwater streams. For suspended solids contamination, information reported in the Newcombe and Jensen [North American Journal of Fisheries Management, 16(4):693--727, 1996] study of freshwater fish and the daily suspended solids data from the United States Geological Survey stream monitoring network are utilised. The study group was asked to examine the utility of both the Newcombe and Jensen study and the USA data, as well as to formulate a procedure, for use by the Environment Protection Authority Victoria, that takes the concentration and duration of harmful episodes into account when assessing water quality. The extent to which the impact of a toxic event on fish health could be modelled deterministically was also considered. It was found that concentration and exposure duration were the main compounding factors in the severity of the effects of suspended solids on freshwater fish. A protocol for assessing the cumulative effect on fish health and a simple deterministic model, based on the biology of gill harm and recovery, were proposed.
References
D. W. T. Au, C. A. Pollino, R. S. S. Wu, P. K. S. Shin, S. T. F. Lau, and J. Y. M. Tang. Chronic effects of suspended solids on gill structure, osmoregulation, growth, and triiodothyronine in juvenile green grouper Epinephelus coioides. Marine Ecology Progress Series, 266:255--264, 2004.
J. C. Bezdek, S. K. Chuah, and D. Leep. Generalized k-nearest neighbor rules. Fuzzy Sets and Systems, 18:237--256, 1986.
E. T. Champagne, K. L. Bett-Garber, A. M. McClung, and C. Bergman. Sensory characteristics of diverse rice cultivars as influenced by genetic and environmental factors. Cereal Chem., 81:237--243, 2004.
S. G. Cheung and P. K. S. Shin. Size effects of suspended particles on gill damage in green-lipped mussel Perna viridis. Marine Pollution Bulletin, 51(8--12):801--810, 2005.
D. H. Evans. The fish gill: site of action and model for toxic effects of environmental pollutants. Environmental Health Perspectives, 71:44--58, 1987.
G. C. Grigg. The failure of oxygen transport in a fish at low levels of ambient oxygen. Comp. Biochem. Physiol., 29:1253--1257, 1969.
G. Holmes, A. Donkin, and I. H. Witten. Weka: A machine learning workbench. In Proceedings of the Second Australia and New Zealand Conference on Intelligent Information Systems, volume 24, pages 357--361, Brisbane, Australia, 1994. IEEE Computer Society.
D. D. Macdonald and C. P. Newcombe. Utility of the stress index for predicting suspended sediment effects: response to comments. North American Journal of Fisheries Management, 13:873--876, 1993.
C. P. Newcombe. Suspended sediment in aquatic ecosystems: ill effects as a function of concentration and duration of exposure. Technical report, British Columbia Ministry of Environment, Lands and Parks, Habitat Protection Branch, Victoria, 1994.
C. P. Newcombe and J. O. T. Jensen. Channel suspended sediment and fisheries: a synthesis for quantitative assessment of risk and impact. North American Journal of Fisheries Management, 16(4):693--727, 1996.
C. P. Newcombe and D. D. Macdonald. Effects of suspended sediments on aquatic ecosystems. North American Journal of Fisheries Management, 11(1):72--82, 1991.
K. Schmidt-Nielsen. Scaling: Why is Animal Size so Important? Cambridge University Press, NY, 1984.
J. S. Schwartz, A. Simon, and L. Klimetz. Use of fish functional traits to associate in-stream suspended sediment transport metrics with biological impairment. Environmental Monitoring and Assessment, 179(1--4):347--369, 2011.
E. Al Shaw and J. S. Richardson. Direct and indirect effects of sediment pulse duration on stream invertebrate assemblages and rainbow trout (Oncorhynchus mykiss) growth and survival. Canadian Journal of Fisheries and Aquatic Sciences, 58:2213--2221, 2001.
P. Tiwari and H. Hasegawa. Demand for housing in Tokyo: A discrete choice analysis. Regional Studies, 38:27--42, 2004.
Y. Tramblay, A. Saint-Hilaire, T. B. M. J. Ouarda, F. Moatar, and B. Hecht. Estimation of local extreme suspended sediment concentrations in California rivers. Science of the Total Environment, 408:4221--
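The concentration-duration dose-response idea underlying the Newcombe and Jensen framework can be sketched as a severity-of-effect score of the form SEV = a + b*ln(duration) + c*ln(concentration). The coefficients used below are placeholders for illustration only, not the fitted values tabulated in the 1996 paper (which differ by fish group and life stage).

```python
import math

# Illustrative severity-of-ill-effect model in the Newcombe-Jensen form:
# severity grows with the logarithms of both exposure duration and
# suspended solids concentration. Coefficients a, b, c are placeholders.

def severity(duration_h, conc_mg_per_l, a=1.0, b=0.6, c=0.7):
    return a + b * math.log(duration_h) + c * math.log(conc_mg_per_l)

# A long, dilute episode and a short, concentrated one can score similarly,
# which is why both concentration and duration must enter the assessment:
print(round(severity(96, 50), 2))    # 96 h at 50 mg/L
print(round(severity(6, 500), 2))    # 6 h at 500 mg/L
```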
Abstract:
Periodontal disease is characterized by the destruction of the tissues that attach the tooth to the alveolar bone. Various methods for regenerative periodontal therapy including the use of barrier membranes, bone replacement grafts, and growth factor delivery have been investigated; however, true regeneration of periodontal tissue is still a significant challenge to scientists and clinicians. The focus on periodontal tissue engineering has shifted from attempting to recreate tissue replacements/constructs to the development of biomaterials that incorporate and release regulatory signals to achieve in situ periodontal regeneration. The release of ions and molecular cues from biomaterials may help to unlock latent regenerative potential in the body by regulating cell proliferation and differentiation towards different lineages (e.g. osteoblasts and cementoblasts). Silicate-based bioactive materials, including bioactive silicate glasses and ceramics, have become the materials of choice for periodontal regeneration, due to their favourable osteoconductivity and bioactivity. This article will focus on the most recent advances in the in vitro and in vivo biological application of silicate-based ceramics, specifically as it relates to periodontal tissue engineering.
Abstract:
Nuclei and electrons in condensed matter and/or molecules are usually entangled, owing to the prevailing (mainly electromagnetic) interactions. However, the "environment" of a microscopic scattering system (e.g. a proton) causes ultrafast decoherence, thus making atomic and/or nuclear entanglement effects not directly accessible to experiments. However, our neutron Compton scattering experiments on protons (H atoms) in condensed systems and molecules have a characteristic collisional time of about 100--1000 attoseconds. The quantum dynamics of an atom in this ultrashort, but finite, time window is governed by non-unitary time evolution due to the aforementioned decoherence. Unexpectedly, recent theoretical investigations have shown that decoherence can also have the following energetic consequences. Disentangling two subsystems A and B of a quantum system AB is tantamount to erasure of the quantum phase relations between A and B. This erasure is widely believed to be an innocuous process which, e.g., does not affect the energies of A and B. However, two independent groups recently proved that disentangling two systems within a sufficiently short time interval causes an increase of their energies. This is also derivable from the simplest Lindblad-type master equation for one particle subject to pure decoherence. Our neutron-proton scattering experiments with H2 molecules provide the first experimental evidence of this effect. Our results reveal that the neutron-proton collision, leading to the cleavage of the H-H bond on the attosecond timescale, is accompanied by a larger energy transfer (by about 2--3%) than conventional theory predicts. Preliminary results from current investigations show qualitatively the same effect in neutron-deuteron Compton scattering from D2 molecules.
We interpret the experimental findings by treating the neutron-proton (or neutron-deuteron) collisional system as an entangled open quantum system being subject to fast decoherence caused by its "environment" (i.e., two electrons plus second nucleus of H2 or D2). The presented results seem to be of generic nature, and may have considerable consequences for various processes in condensed matter and molecules, e.g. in elementary chemical reactions.
Abstract:
In this paper we describe the design of DNA Jewellery, which is a wearable tangible data representation of personal DNA profile data. An iterative design process was followed to develop a 3D form-language that could be mapped to standard DNA profile data, with the aim of retaining readability of data while also producing an aesthetically pleasing and unique result in the area of personalized design. The work explores design issues with the production of data tangibles, contributes to a growing body of research exploring tangible representations of data and highlights the importance of approaches that move between technology, art and design.
Abstract:
Global climate change is one of the most significant environmental issues facing human development. A central issue for the building and construction industry in addressing global climate change is the development of a credible and meaningful way to measure greenhouse gas (GHG) emissions. While Publicly Available Specification (PAS) 2050, the first international GHG standard, has proven successful in standardizing the quantification process, its contribution to the management of carbon labels for construction materials is limited. With the recent publication of ISO 14067 (Greenhouse gases -- Carbon footprint of products -- Requirements and guidelines for quantification and communication) in May 2013, it is necessary for the building and construction industry to understand the past, present and future of carbon labelling practices for construction materials. A systematic review shows that international GHG standards have been evolving to provide additional guidance on communication and comparison, as well as less flexibility in the use of carbon labels. At the same time, carbon labelling schemes have been evolving towards standardization and benchmarking. In addition, future actions are needed in raising consumer awareness, providing benchmarks, ensuring standardization and developing simulation technologies in order for carbon labelling schemes for construction materials to provide credible, accurate and transparent information on GHG emissions.
Abstract:
The design-build (DB) system is regarded as an effective means of delivering sustainable buildings, and specifying clear sustainability requirements to potential contractors is of great importance to project success. This research investigates the current state of practice in defining sustainability requirements within the public sector of the U.S. construction market, using a robust content analysis of 49 DB requests for proposals (RFPs). The results reveal that owners predominantly communicate their desired level of sustainability through the LEED certification system. The sustainability requirement has become an important dimension in the best-value evaluation of DB contractors, with importance weightings of up to 25%. Additionally, owners of larger projects, and those who provide less design information in their RFPs, generally allocate significantly higher importance weightings to sustainability requirements. The primary contribution of this study to the construction industry is to reveal current trends in DB procurement for green projects. The findings also provide owners, architects, engineers, and constructors with an effective means of communicating sustainability objectives in solicitation documents.
Abstract:
Scaffolds are porous biocompatible materials with suitable microarchitectures that are designed to allow for cell adhesion, growth and proliferation. They are used in combination with cells in regenerative medicine to promote tissue regeneration by means of a controlled deposition of natural extracellular matrix by the hosted cells therein. This healing process is in many cases accompanied by scaffold degradation up to its total disappearance when the scaffold is made of a biodegradable material. This work presents a computational model that simulates the degradation of scaffolds. The model works with three-dimensional microstructures, which have been previously discretised into small cubic homogeneous elements, called voxels. The model simulates the evolution of the degradation of the scaffold using a Monte Carlo algorithm, which takes into account the curvature of the surface of the fibres. The simulation results obtained in this study are in good agreement with empirical degradation measurements performed by mass loss on scaffolds after exposure to an etching alkaline solution.
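A minimal sketch of a curvature-sensitive Monte Carlo voxel degradation step in the spirit of the model above, assuming local curvature is proxied by the number of empty face-neighbours of a voxel; the paper's actual curvature measure, removal probabilities and fibre geometry are not reproduced.

```python
import random

# Illustrative Monte Carlo degradation of a voxelised scaffold: the scaffold
# is a set of occupied voxel coordinates, and at each step a surface voxel is
# removed with a probability that grows with its exposure (empty neighbours),
# so highly curved regions such as corners erode fastest on average.

NEIGH = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def empty_neighbours(grid, v):
    x, y, z = v
    return sum((x + dx, y + dy, z + dz) not in grid for dx, dy, dz in NEIGH)

def degrade(grid, steps, rate=0.1, rng=random.Random(0)):
    """grid: set of occupied voxel coordinates; modified in place."""
    for _ in range(steps):
        surface = [v for v in grid if empty_neighbours(grid, v) > 0]
        for v in surface:
            # more exposed faces -> higher local curvature -> faster removal
            if rng.random() < rate * empty_neighbours(grid, v) / 6.0:
                grid.discard(v)
    return grid

# 6x6x6 solid cube: partial erosion after 20 Monte Carlo steps.
cube = {(x, y, z) for x in range(6) for y in range(6) for z in range(6)}
remaining = degrade(cube, steps=20)
print(len(remaining))
```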
Abstract:
Validation is an important issue in the development and application of Bayesian Belief Network (BBN) models, especially when the outcome of the model cannot be directly observed. Despite this, few frameworks for validating BBNs have been proposed, and fewer have been applied to substantive real-world problems. In this paper we adopt the approach of Pitchforth and Mengersen (2013), which comprises nine validation tests that each focus on the structure, discretisation, parameterisation and behaviour of the BBNs included in the case study. We describe the process and results of implementing this validation framework on a model of a real airport terminal system, with particular reference to its effectiveness in producing a valid model that can be used and understood by operational decision makers. In applying the framework we demonstrate the overall validity of the Inbound Passenger Facilitation Model as well as the effectiveness of the validation framework itself.