45 results for Entropy of a sampling design
Abstract:
An environment has been created for the optimisation of aerofoil profiles with the inclusion of small surface features. For TS-wave-dominated flows, the paper examines the consequences of adding a depression to the aerodynamic optimisation of an NLF aerofoil, and describes the geometry definition fidelity and the optimisation algorithm employed in the development process. The variables that define the depression have been fixed for this optimisation investigation; however, a preliminary study is presented demonstrating the sensitivity of the flow to the depression characteristics. Solutions to the optimisation problem are then presented using both gradient-based and genetic-algorithm techniques. It is concluded that, owing to the nature of the response surface generated when small surface perturbations are represented accurately, a global optimisation method is required for this type of aerofoil optimisation task. When dealing with surface features, changes in transition onset are likely to be non-linear, so a robust optimisation algorithm is critical, suggesting that gradient-based methods alone are not suited to this framework.
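A small illustration of the algorithmic point above (not the aerofoil problem itself): on a multimodal response surface a gradient-based optimiser can stop in a local minimum, whereas a global, population-based method keeps searching. The sketch below assumes NumPy and SciPy; the toy objective, starting point and bounds are arbitrary choices for illustration.

```python
# Toy demonstration, assuming NumPy and SciPy: gradient-based vs. global search
# on a multimodal 1-D objective standing in for a non-smooth response surface.
import numpy as np
from scipy.optimize import minimize, differential_evolution

def response(x):
    # Quadratic "design objective" with an oscillatory term that creates
    # many local basins around the global one.
    x = np.atleast_1d(x)[0]
    return (x + 0.5) ** 2 + 0.3 * np.sin(12 * x)

grad_result = minimize(response, x0=1.5, method="BFGS")                 # local method
global_result = differential_evolution(response, bounds=[(-2, 2)], seed=1)

print("gradient-based:", grad_result.x, response(grad_result.x))
print("global (DE)   :", global_result.x, response(global_result.x))
```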
Abstract:
Wavelet entropy assesses the degree of order or disorder in signals and presents this complex information in a simple metric. Relative wavelet entropy assesses the similarity between the spectral distributions of two signals, again in a simple metric. Wavelet entropy is therefore potentially a very attractive tool for waveform analysis. The ability of this method to track the effects of pharmacologic modulation of vascular function on Doppler blood velocity waveforms was assessed. Waveforms were captured from ophthalmic arteries of 10 healthy subjects at baseline, after the administration of glyceryl trinitrate (GTN) and after two doses of N(G)-nitro-L-arginine-methyl ester (L-NAME) to produce vasodilation and vasoconstriction, respectively. Wavelet entropy had a tendency to decrease from baseline in response to GTN, but significantly increased after the administration of L-NAME (mean: 1.60 ± 0.07 after 0.25 mg/kg and 1.72 ± 0.13 after 0.5 mg/kg vs. 1.50 ± 0.10 at baseline, p < 0.05). Relative wavelet entropy showed that the spectral distributions for the two increasing doses of L-NAME remained comparable to baseline (0.07 ± 0.04 and 0.08 ± 0.03, respectively), whereas GTN produced the most dissimilar spectral distribution compared with baseline (0.17 ± 0.08, p = 0.002). Wavelet entropy can detect subtle changes in Doppler blood velocity waveform structure in response to nitric-oxide-mediated changes in arteriolar smooth muscle tone.
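A minimal sketch of the two metrics described above, assuming the PyWavelets package; the wavelet ('db4'), decomposition level and synthetic test signals are illustrative choices, not the authors' processing pipeline.

```python
# Wavelet entropy and relative wavelet entropy from relative wavelet energies.
import numpy as np
import pywt

def relative_wavelet_energies(signal, wavelet="db4", level=5):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    return energies / energies.sum()

def wavelet_entropy(p):
    # Shannon entropy of the relative wavelet energy distribution.
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def relative_wavelet_entropy(p, q, eps=1e-12):
    # Divergence between two energy distributions; larger values indicate
    # more dissimilar spectral distributions.
    p, q = p + eps, q + eps
    return np.sum(p * np.log(p / q))

# Hypothetical usage with two synthetic waveforms standing in for baseline
# and post-drug Doppler velocity traces.
t = np.linspace(0, 1, 1024)
baseline = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
post_drug = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)
p, q = relative_wavelet_energies(baseline), relative_wavelet_energies(post_drug)
print(wavelet_entropy(p), wavelet_entropy(q), relative_wavelet_entropy(q, p))
```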
Abstract:
The purpose of this paper is to explore the current design decision-making processes of selected foreign international non-governmental organisations (INGOs) operating in the field of housing and post-disaster housing design and delivery in developing countries. The study forms part of a wider ongoing study relating to decision making for affordable and sustainable housing in developing countries. The paper highlights the main challenges and opportunities in relation to the design and delivery of low-cost sustainable housing in developing countries as identified in the current literature on the subject. Interviews and case studies with INGOs highlight the specific challenges faced by foreign INGOs operating in a developing country. The preliminary results of this research provide a concise insight into the design decision-making processes of leading foreign INGOs operating in developing countries and will be beneficial to policy makers, NGOs, government bodies and community organisations in practice, as they offer unique evidence-based insights into international bodies' housing design decision-making processes.
Abstract:
The results of a study aimed at determining the most important experimental parameters for automated, quantitative analysis of solid dosage form pharmaceuticals (seized and model 'ecstasy' tablets) are reported. Data obtained with a macro-Raman spectrometer were complemented by micro-Raman measurements, which gave information on particle size and provided excellent data for developing statistical models of the sampling errors associated with collecting data as a series of grid points on the tablets' surface. Spectra recorded at single points on the surface of seized MDMA-caffeine-lactose tablets with a Raman microscope (λ_ex = 785 nm, 3 µm diameter spot) were typically dominated by one or other of the three components, consistent with Raman mapping data which showed the drug and caffeine microcrystals were ca 40 µm in diameter. Spectra collected with a microscope from eight points on a 200 µm grid were combined and in the resultant spectra the average value of the Raman band intensity ratio used to quantify the MDMA:caffeine ratio, μ_r, was 1.19 with an unacceptably high standard deviation, σ_r, of 1.20. In contrast, with a conventional macro-Raman system (150 µm spot diameter), combined eight grid point data gave μ_r = 1.47 with σ_r = 0.16. A simple statistical model which could be used to predict σ_r under the various conditions used was developed. The model showed that the decrease in σ_r on moving to a 150 µm spot was too large to be due entirely to the increased spot diameter but was consistent with the increased sampling volume that arose from a combination of the larger spot size and depth of focus in the macroscopic system. With the macro-Raman system, combining 64 grid points (0.5 mm spacing and 1-2 s accumulation per point) to give a single averaged spectrum for a tablet was found to be a practical balance between minimizing sampling errors and keeping overhead times at an acceptable level. The effectiveness of this sampling strategy was also tested by quantitative analysis of a set of model ecstasy tablets prepared from MDEA-sorbitol (0-30% by mass MDEA). A simple univariate calibration model of averaged 64 point data had R² = 0.998 and an r.m.s. standard error of prediction of 1.1% whereas data obtained by sampling just four points on the same tablet showed deviations from the calibration of up to 5%.
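A minimal sketch of the grid-averaging statistics discussed above: if single-spot measurements of the band-intensity ratio scatter with standard deviation σ_1, the ratio from an N-point average scatters roughly as σ_1/√N (the paper's model additionally accounts for sampling volume, which this sketch does not). All numbers below are hypothetical, not the paper's data.

```python
# Monte Carlo illustration of how averaging N grid points reduces sampling error.
import numpy as np

rng = np.random.default_rng(0)
sigma_single = 1.2          # assumed single-spot scatter of the intensity ratio
true_ratio = 1.5            # assumed true MDMA:caffeine band-intensity ratio

for n_points in (1, 8, 64):
    # Simulate many tablets, each measured at n_points grid positions and averaged.
    tablets = rng.normal(true_ratio, sigma_single, size=(10_000, n_points)).mean(axis=1)
    print(f"{n_points:3d} points: std of averaged ratio = {tablets.std():.3f} "
          f"(theory ~ {sigma_single / np.sqrt(n_points):.3f})")
```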
Abstract:
In this paper, by investigating the influence of source/drain extension region engineering (also known as gate-source/drain underlap) in nanoscale planar double gate (DG) SOI MOSFETs, we offer new insights into the design of future nanoscale gate-underlap DG devices to achieve ITRS projections for high performance (HP), low standby power (LSTP) and low operating power (LOP) logic technologies. The impact of a high-κ gate dielectric and silicon film thickness, together with parameters associated with the lateral source/drain doping profile, is investigated in detail. The results show that spacer width along with lateral straggle can not only effectively control short-channel effects, thus presenting low off-current in a gate underlap device, but can also be optimized to achieve lower intrinsic delay and higher on-off current ratio (I_on/I_off). Based on the investigation of on-current (I_on), off-current (I_off), I_on/I_off, intrinsic delay (τ), energy delay product and static power dissipation, we present design guidelines to select key device parameters to achieve ITRS projections. Using nominal gate lengths for different technologies, as recommended from ITRS specifications, optimally designed gate-underlap DG MOSFETs with a spacer-to-straggle (s/σ) ratio of 2.3 for HP/LOP and 3.2 for LSTP logic technologies will meet ITRS projections. However, a relatively narrow range of lateral straggle, between 7 and 8 nm, is recommended. A sensitivity analysis of intrinsic delay, on-current and off-current to important parameters allows a comparative analysis of the various design options and shows that gate workfunction appears to be the most crucial parameter in the design of DG devices for all three technologies. The impact of back gate misalignment on I_on, I_off and τ is also investigated for optimized underlap devices.
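For reference, the standard figures of merit named above can be evaluated as in the hedged sketch below; the device values are placeholders, not results from the paper or the ITRS tables.

```python
# Hedged sketch: common transistor figures of merit for one hypothetical operating point.
C_gate = 60e-18      # gate capacitance [F] (assumed)
V_dd   = 0.9         # supply voltage [V] (assumed)
I_on   = 1.2e-3      # on-current [A] (assumed)
I_off  = 10e-9       # off-current [A] (assumed)

tau      = C_gate * V_dd / I_on            # intrinsic delay, CV/I
energy   = C_gate * V_dd ** 2              # switching energy, CV^2
edp      = energy * tau                    # energy-delay product
p_static = V_dd * I_off                    # static power dissipation
print(f"I_on/I_off = {I_on / I_off:.1e}, tau = {tau:.2e} s, "
      f"EDP = {edp:.2e} J*s, P_static = {p_static:.2e} W")
```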
Abstract:
A problem with use of the geostatistical Kriging error for optimal sampling design is that the design does not adapt locally to the character of spatial variation. This is because a stationary variogram or covariance function is a parameter of the geostatistical model. The objective of this paper was to investigate the utility of non-stationary geostatistics for optimal sampling design. First, a contour data set of Wiltshire was split into 25 equal sub-regions and a local variogram was predicted for each. These variograms were fitted with models and the coefficients used in Kriging to select optimal sample spacings for each sub-region. Large differences existed between the designs for the whole region (based on the global variogram) and for the sub-regions (based on the local variograms). Second, a segmentation approach was used to divide a digital terrain model into separate segments. Segment-based variograms were predicted and fitted with models. Optimal sample spacings were then determined for the whole region and for the sub-regions. It was demonstrated that the global design was inadequate, grossly over-sampling some segments while under-sampling others.
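A hedged sketch of the spacing-selection logic described above: for a square sampling grid, compute the ordinary kriging variance at the worst-case point (a cell centre) from a fitted variogram and choose the largest spacing that keeps it below a tolerance. The spherical-model coefficients and spacings below are hypothetical, not the Wiltshire values; NumPy is assumed.

```python
# Ordinary kriging variance at a grid-cell centre, as a function of sample spacing.
import numpy as np

def spherical_gamma(h, nugget=0.1, sill=1.0, rng_a=500.0):
    # Spherical semivariogram model; gamma(0) = 0 by definition.
    h = np.asarray(h, dtype=float)
    g = np.where(h < rng_a,
                 nugget + sill * (1.5 * h / rng_a - 0.5 * (h / rng_a) ** 3),
                 nugget + sill)
    return np.where(h == 0.0, 0.0, g)

def ok_variance_at_centre(spacing, n=4, **vario):
    # n x n grid of sample points; prediction at the centre of the central cell.
    xs = (np.arange(n) - (n - 1) / 2) * spacing
    pts = np.array([(x, y) for x in xs for y in xs])
    target = np.array([0.0, 0.0])
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    g0 = spherical_gamma(np.linalg.norm(pts - target, axis=1), **vario)
    # Ordinary kriging system with a Lagrange multiplier for unbiasedness.
    A = np.ones((len(pts) + 1, len(pts) + 1))
    A[:-1, :-1] = spherical_gamma(d, **vario)
    A[-1, -1] = 0.0
    b = np.append(g0, 1.0)
    sol = np.linalg.solve(A, b)
    lam, mu = sol[:-1], sol[-1]
    return lam @ g0 + mu            # kriging (estimation) variance

for spacing in (100, 200, 400, 800):   # hypothetical candidate spacings [m]
    print(spacing, round(ok_variance_at_centre(spacing), 3))
```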
Abstract:
A rapid design methodology for biorthogonal wavelet transform cores has been developed based on a generic, scaleable architecture for wavelet filters. The architecture offers efficient hardware utilisation by combining the linear phase property of biorthogonal filters with decimation in a MAC-based implementation. The design has been captured in VHDL and parameterised in terms of wavelet type, data word length and coefficient word length. The control circuit is embedded within the cores and allows them to be cascaded without any interface glue logic for any desired level of decomposition. The design time to produce silicon layout of a biorthogonal wavelet system is typically less than a day. The silicon cores produced are comparable in area and performance to hand-crafted designs. The designs are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
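The hardware saving referred to above (linear phase plus decimation in a MAC datapath) can be illustrated with a behavioural software analogue: mirrored samples are pre-added so roughly half the multiplies are needed, and only every second output is computed. The sketch assumes NumPy; the taps are approximate CDF 9/7 analysis low-pass coefficients, used here only as a convenient symmetric example, and the code is not the VHDL core itself.

```python
# Folded (symmetric) decimate-by-2 FIR: pre-add mirrored samples, halve the MACs.
import numpy as np

def folded_decimating_fir(x, h):
    assert np.allclose(h, h[::-1]), "linear phase assumed (symmetric taps)"
    n, half = len(h), (len(h) + 1) // 2
    x = np.pad(x, (n - 1, n - 1))
    y = []
    for i in range(0, len(x) - n + 1, 2):               # step of 2 -> decimation
        window = x[i:i + n]
        folded = window[:half] + window[::-1][:half]    # pre-adder stage
        if n % 2:                                       # centre tap counted once
            folded[-1] = window[half - 1]
        y.append(np.dot(folded, h[:half]))              # half as many multiplies
    return np.array(y)

h = np.array([0.026749, -0.016864, -0.078223, 0.266864, 0.602949,
              0.266864, -0.078223, -0.016864, 0.026749])  # symmetric 9-tap example
x = np.random.randn(64)
ref = np.convolve(x, h)[::2]                             # full filter, then decimate
print(np.allclose(folded_decimating_fir(x, h), ref))     # behaviourally identical
```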
Abstract:
This paper examines relevant characteristics of the ‘contested city’ and the concept of ‘public space’ in that problematic context. It offers an appraisal of the historical and contemporary role of urban design in shaping social space and interrogates the feasibility of using urban design to facilitate more integrated cityscapes. It presents detailed case studies of two ‘contested cities’, Nicosia and Belfast, based on content analysis of policy and planning documents, extensive site analyses in both places, interviews and seminar discussions with policy makers, planners, community and civic leaders. The paper comprises four dimensions—conceptual, descriptive, analytical and prescriptive—and in its final section identifies core values and relevant policies for the potential achievement of shared space in contested cities.
Abstract:
In this paper we define the structural information content of graphs as their corresponding graph entropy. This definition is based on local vertex functionals obtained by calculating j-spheres via the algorithm of Dijkstra. We prove that the graph entropy and, hence, the local vertex functionals can be computed with polynomial time complexity, enabling the application of our measure to large graphs. We also present numerical results for the graph entropy of chemical graphs and discuss the resulting properties.
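A hedged illustration of the construction described above: shortest-path "spheres" around each vertex (via Dijkstra) yield a local vertex functional, the functionals are normalised into a probability distribution, and the graph entropy is its Shannon entropy. The exponential weighting α^j is an arbitrary illustrative choice, not the information functional defined in the paper; the networkx package is assumed.

```python
# Graph entropy from local vertex functionals built on shortest-path spheres.
import math
import networkx as nx

def graph_entropy(G, alpha=0.5):
    f = []
    for v in G.nodes:
        dist = nx.single_source_dijkstra_path_length(G, v)   # polynomial time
        # |S_j(v)|: number of vertices at exact distance j from v
        sphere_sizes = {}
        for u, d in dist.items():
            if u != v:
                sphere_sizes[d] = sphere_sizes.get(d, 0) + 1
        f.append(sum(alpha ** j * s for j, s in sphere_sizes.items()))
    total = sum(f)
    p = [fi / total for fi in f]
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Example: a small chemical-like graph (hexane-style carbon-skeleton path graph).
print(graph_entropy(nx.path_graph(6)))
```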
Abstract:
By means of the time-dependent density matrix renormalization group algorithm we study the zero-temperature dynamics of the von Neumann entropy of a block of spins in a Heisenberg chain after a sudden quench in the anisotropy parameter. In the absence of any disorder the block entropy increases linearly with time and then saturates. We analyse the velocity of propagation of the entanglement as a function of the initial and final anisotropies and compare our results, wherever possible, with those obtained by means of conformal field theory. In the disordered case we find a slower (logarithmic) evolution, which may signal the onset of entanglement localization.
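As context for the comparison with conformal field theory mentioned above, the standard quasiparticle/CFT expectation for a block of length ℓ after a global quench can be stated schematically as follows (prefactors and crossover details omitted; this is background, not reproduced from the paper):

```latex
% Schematic entanglement-entropy growth after a global quench (quasiparticle picture):
% linear growth at a rate set by the entanglement velocity v, then saturation
% controlled by the block length \ell.
S_A(t) \;\propto\;
\begin{cases}
  v\,t,   & t \lesssim \ell/(2v),\\[2pt]
  \ell/2, & t \gtrsim \ell/(2v).
\end{cases}
```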
Abstract:
A systematic computational fluid dynamics (CFD) approach has been applied to design the geometry of the channels of a three-dimensional (thick-walled) screen comprising upstream and downstream sets of elongated channels positioned at an angle of 90 degrees with respect to each other. Such a geometry of the thick-walled screen can effectively drop the ratio of the maximum flow velocity to mean flow velocity below 1.005 in a downstream microstructured reactor at low Reynolds numbers. In this approach the problem of flow equalization reduces to that of flow equalization in the first and second downstream channels of the thick-walled screen. In turn, this requires flow equalization in the corresponding cross-sections of the upstream channels. The validity of the proposed design method was assessed through a case study, and the effect of different design parameters on the flow non-uniformity in the downstream channels was established. A design equation is proposed to calculate the optimum values of the screen parameters. The CFD results on flow distribution were experimentally validated by Laser Doppler Anemometry measurements in the range of Reynolds numbers from 6 to 113. The measured flow non-uniformity in the separate reactor channels was below 2%.
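The uniformity criteria quoted above can be evaluated as in the tiny sketch below; the per-channel velocities are hypothetical, and the spread-based non-uniformity measure is one common definition, which may differ from the paper's exact metric.

```python
# Flow-uniformity metrics for a set of per-channel velocities (hypothetical values).
import numpy as np

u = np.array([0.990, 1.010, 1.000, 1.003, 0.997, 1.002])   # channel velocities [m/s] (assumed)
print("max/mean velocity ratio:", u.max() / u.mean())
print("non-uniformity [%]     :", 100 * (u.max() - u.min()) / u.mean())
```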
Abstract:
OBJECTIVES: To assess the variation in practice of Barrett's esophagus (BE) management in comparison with accepted international guidelines before and after the introduction of a large BE randomized controlled trial (RCT) with protocols including those of tissue sampling.
DESIGN: A validated anonymized questionnaire was sent to 401 senior attending gastroenterologists asking for details of their current management of BE, especially histological sampling. Of the 228 respondents, 57 individuals (each from a different center) were in the first group to enter the ASPirin Esomeprazole (BE) Chemoprevention Trial (AspECT), and we assessed change in practice in these centers.
RESULTS: Ninety percent of specialists did not take adequate biopsies for histological diagnosis. Furthermore, 74% would consider aggressive surgical resection for prevalent cases of high-grade dysplasia in BE as their first-line choice despite the associated perioperative mortality. Ninety-two percent attribute their lack of adherence to guidelines to the need for stronger evidence for surveillance and medical interventions. Effect of the AspECT trial: clinicians in centers where the AspECT trial has started have improved adherence to ACG guidelines compared with their previous practice (P < 0.05). BE patients now receive 18.8% more biopsies compared with previous practice, and 37.7% more if the patient is entered into the AspECT trial (P < 0.01).
CONCLUSIONS: This large study indicates both wide variation in practice and poor compliance with guidelines. Because optimal histology is arguably the most important facet of BE management, the improvement in practice in centers taking part in the AspECT trial indicates an additional value of large international RCTs.
Abstract:
A rapid design methodology for biorthogonal wavelet transform cores has been developed. This methodology is based on a generic, scaleable architecture for the wavelet filters. The architecture offers efficient hardware utilization by combining the linear phase property of biorthogonal filters with decimation in a MAC based implementation. The design has been captured in VHDL and parameterized in terms of wavelet type, data word length and coefficient word length. The control circuit is embedded within the cores and allows them to be cascaded without any interface glue logic for any desired level of decomposition. The design time to produce silicon layout of a biorthogonal wavelet based system is typically less than a day. The resulting silicon cores produced are comparable in area and performance to hand-crafted designs. The designs are portable across a range of foundries and are also applicable to FPGA and PLD implementations.
Abstract:
The increasing need to understand complex products and systems with long life spans presents a significant challenge to designers, who increasingly require a broader understanding of the operational aspects of the system. This demands an evolution in current design practice, as designers are often constrained to provide a subsystem solution without full knowledge of the global system operation. Recently there has been a push to consider value-centric approaches, which should facilitate better or more rapid convergence to design solutions with predictable completion schedules. Value Driven Design is one such approach, in which value is used as the system top-level objective function. This provides a broader view of the system and enables all sub-systems and components to be designed with a view to their effect on project value. It also has the capacity to include value expressions for more qualitative aspects, such as environmental impact. However, application of the method to date has been restricted to comparing value in a programme whose lifespan is fixed and known a priori. This paper takes a novel view of value driven design through the surplus value objective function, and shows how it can be used to identify key sensitivities to guide designers in design trade-off decisions. By considering a new time-based approach it can be used to identify the optimum programme lifespan and hence allow trade-offs over the whole product life.
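A hedged sketch of the time-based idea described above: treat programme value as a discounted net cash flow and sweep the programme life span to locate the optimum. This is a generic net-present-value stand-in, not the paper's surplus value model, and all cash-flow figures are hypothetical.

```python
# Generic discounted-cash-flow stand-in, assuming NumPy; all figures are hypothetical.
import numpy as np

def programme_value(lifespan_years, annual_revenue=12.0, base_cost=6.0,
                    cost_growth=0.12, development_cost=30.0, discount_rate=0.08):
    years = np.arange(1, lifespan_years + 1)
    # Operating cost assumed to grow with programme age (maintenance, obsolescence),
    # so that an interior optimum life span exists.
    net = annual_revenue - base_cost * (1 + cost_growth) ** years
    return np.sum(net / (1 + discount_rate) ** years) - development_cost

lifespans = range(1, 31)
values = [programme_value(n) for n in lifespans]
best_value, best_life = max(zip(values, lifespans))
print(f"optimum programme life span ~ {best_life} years, value ~ {best_value:.1f}")
```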
Abstract:
In this work, the use of a compliant web design for improved damage tolerance in stiffener run-outs is investigated. Firstly, a numerical study that incorporates the possibility of debonding and delamination (using VCCT) is used to select a favourable compliant run-out configuration. Then, three different configurations are compared to establish the merits of the compliant design: a baseline configuration, a configuration with optimised tapering and the selected compliant configuration. The performance of these configurations, in terms of strength and damage tolerance, was compared numerically using a parametric finite element analysis. The energy release rates for debonding and delamination, for different crack lengths across the specimen width, were used for this comparison. The three configurations were subsequently manufactured and tested. In order to monitor the failure process, acoustic emission (AE) equipment was used and proved valuable in the detection and analysis of failure. The predicted failure loads, based on the energy release rates, showed good agreement with the experiments, particularly when the distribution of energy release rate across the width of the specimen was taken into account. As predicted numerically, the compliant configuration failed by debonding and showed improved damage tolerance compared to the baseline and tapered stiffener run-outs.
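The failure-load prediction logic referred to above can be sketched as follows: in a linear analysis the energy release rate scales with the square of the applied load, so a VCCT value computed at a reference load can be extrapolated to the load at which the critical value G_c is reached. The loads, energy release rates and G_c below are hypothetical, and the width-averaged value is only a crude stand-in for the width-distribution argument made in the paper.

```python
# Extrapolating a failure load from energy release rates (G ~ P^2 in a linear analysis).
import numpy as np

P_ref = 50.0                                                  # FE reference load [kN] (assumed)
G_across_width = np.array([120., 180., 260., 180., 120.])     # G at P_ref [J/m^2] (assumed)
G_c = 450.0                                                   # critical ERR [J/m^2] (assumed)

P_fail_point = P_ref * np.sqrt(G_c / G_across_width.max())    # worst single point across width
P_fail_avg   = P_ref * np.sqrt(G_c / G_across_width.mean())   # width-averaged estimate
print(f"predicted failure load: {P_fail_point:.1f} kN (peak G) "
      f"to {P_fail_avg:.1f} kN (width-averaged G)")
```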