948 results for Successive Overrelaxation method with 1 parameter


Relevance: 100.00%

Abstract:

A new approach for controlling the size of particles fabricated using the Electrohydrodynamic Atomization (EHDA) method is being developed. In short, the EHDA process produces solution droplets in a controlled manner, and as the solvent evaporates from the surface of the droplets, polymeric particles are formed. By varying the applied voltage, the size of the droplets can be changed, and consequently the size of the particles can also be controlled. By using both a nozzle electrode and a ring electrode placed axisymmetrically and slightly above the nozzle electrode, we are able to produce a Single Taylor Cone Single Jet over a wide range of voltages, in contrast to using a single nozzle electrode alone, where the range of permissible voltages for creating the Single Taylor Cone Single Jet is usually very small. Phase Doppler Particle Analyzer (PDPA) test results have shown that droplet size increases with increasing applied voltage. This trend is predicted by the electrohydrodynamic theory of the Single Taylor Cone Single Jet based on a perfect dielectric fluid model. Particles fabricated using different voltages do not show much change in particle size, which may be attributed to the solvent evaporation process. Nevertheless, these preliminary results show that this method has the potential to provide fine control of particle size using a relatively simple setup, with trends predictable by existing theories.

Relevance: 100.00%

Abstract:

Sixty-one animals with different halothane genotypes (homozygous halothane positive, n=34; homozygous halothane negative, n=27) were fed three diets (control group, with no supplement; magnesium (Mg) group, with 1.28 g MgCO3/kg; and tryptophan (Trp) group, with 5 g L-Trp/kg) during the last 5 days before slaughter. Animals were subjected to minimal-stress ante mortem conditions. Pig behaviour was recorded at the experimental farm, in the raceway to the CO2 stunning system and during the stunning period. Corneal reflexes were recorded after stunning as well. There were no differences in feed intake among diets (p>0.05) during the 5 days of treatment. The halothane positive (nn) group had lower intake than the halothane negative (NN) group (p<0.01). The behaviour of the pigs in the raceway did not differ (p>0.05) among treatments or halothane genotypes. A significant (p<0.001) diet*halothane interaction was found in the time until the first retreat attempt during exposure to the CO2 system. In the nn group, the first retreat attempt occurred later in the Mg group (p<0.05) than in the control group. Moreover, within the Mg group, the nn pigs made their first retreat attempt later (p<0.05) than the NN pigs. Thus, Mg supplementation could have a positive effect on the welfare of nn pigs. A lower proportion of nn animals than NN animals showed corneal reflexes after stunning, indicating a higher effectiveness of the stunning method in nn pigs. Neither Mg nor Trp affected carcass quality or meat quality parameters, although significant differences were found between genotypes.

Relevance: 100.00%

Abstract:

The characteristics of service independence and flexibility in ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in bufferless environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for connection admission control (CAC) in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, preserving the network performance guarantees. Several experiments have been carried out and investigated to explain the deviation between the proposed method and simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments probing the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay is a limit that cannot be dismissed for long-distance and interactive communications, so small buffers must be used to minimise delay. Under these premises, the convolution approach is the most accurate method for bandwidth allocation, giving sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. To overcome these drawbacks, a new method of evaluation is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each class j of traffic (CLRj), and an expression for evaluating CLRj is presented. We conclude that, by combining the ECA method with cut-off mechanisms, utilisation of ECA in real-time CAC environments as a single-level scheme is always possible.
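
As a rough illustration of the class-based convolution idea (not the paper's ECA implementation, which adds cut-off mechanisms and per-class CLRj expressions), the sketch below builds a per-class load distribution from on/off sources, multi-convolves the classes into a global state distribution, and reads off a bufferless CLR estimate. The class parameters and link capacity are invented for illustration.

```python
# Sketch of the convolution approach to CAC: on/off sources grouped
# into classes; per-class load pmf from the binomial (a special case
# of the multinomial mentioned above); global pmf by convolution.
import numpy as np
from scipy.stats import binom

# (number of sources, on-probability, peak rate in bandwidth units)
classes = [(40, 0.1, 2), (15, 0.3, 5)]     # illustrative values
capacity = 60                               # link capacity, same units

load = np.array([1.0])                      # pmf of aggregate load
for n, p, rate in classes:
    pmf_sources = binom.pmf(np.arange(n + 1), n, p)  # #active sources
    pmf_class = np.zeros(n * rate + 1)
    pmf_class[::rate] = pmf_sources         # k active -> k*rate units
    load = np.convolve(load, pmf_class)     # multi-convolution step

levels = np.arange(load.size)
mean_load = (levels * load).sum()
excess = np.clip(levels - capacity, 0, None)
clr = (excess * load).sum() / mean_load     # bufferless CLR estimate
print(f"estimated CLR = {clr:.2e}")
```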

Relevance: 100.00%

Abstract:

Stable isotopic characterization of chlorine in chlorinated aliphatic pollution is potentially very valuable for risk assessment and for monitoring remediation or natural attenuation. The approach has been underused because of the complexity of the analysis and the time it takes. We have developed a new method that eliminates sample preparation. Gas chromatography produces individually eluted sample peaks for analysis. The He carrier gas is mixed with Ar and introduced directly into the torch of a multicollector ICPMS. The MC-ICPMS is run at a high mass resolution of >=10 000 to eliminate the interference of ArH with Cl at mass 37. The standardization approach is similar to that of continuous-flow stable isotope analysis, in which sample and reference materials are measured successively. We have measured PCE relative to a laboratory TCE standard mixed with the sample. Solvent samples of 200 nmol to 1.3 μmol (24–165 μg of Cl) were measured. The PCE gave the same value relative to the TCE as measured by the conventional method, with a precision of 0.12‰ (2 × standard error) but poorer precision for the smaller samples.
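
For context, delta notation referenced against a bracketing laboratory standard works as in the sketch below; the ratio values are invented, and the per-mil scaling is the standard convention rather than anything specific to this paper.

```python
# Minimal sketch of the sample-vs-reference standardization implied
# above: the sample (PCE) isotope ratio is referenced to the
# laboratory standard (TCE). Ratio values are made up.
r_sample = 0.31978    # measured 37Cl/35Cl of the PCE sample peak
r_standard = 0.31963  # measured 37Cl/35Cl of the TCE reference

delta37cl = (r_sample / r_standard - 1.0) * 1000.0  # in per mil
print(f"d37Cl = {delta37cl:+.2f} per mil vs. laboratory TCE")
```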

Relevance: 100.00%

Abstract:

Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimized subject to a simple positivity constraint at each forward stage. The model parameter estimation at each forward stage is simply the solution of the jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss-Newton method with the model parameter estimate fixed. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
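
The sketch below illustrates the flavour of such a forward-constrained selection, greedily adding nonnegatively weighted Gaussian kernels to match a Parzen window target; it omits the paper's LOO test score and Gauss-Newton width updates, and the kernel width, stopping rule and data are illustrative.

```python
# Simplified forward selection against a Parzen window (PW) target:
# each stage fits one nonnegative weight and keeps the kernel that
# most reduces the squared residual.
import numpy as np

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(1, 1.0, 100)])
n, h = x.size, 0.4                       # sample size, kernel width

def gauss(u, c, h):
    return np.exp(-0.5 * ((u - c) / h) ** 2) / (h * np.sqrt(2 * np.pi))

target = gauss(x[:, None], x[None, :], h).mean(axis=1)  # PW at x_i
Phi = gauss(x[:, None], x[None, :], h)   # column j: kernel centred at x_j

residual = target.copy()
weights = np.zeros(n)
for stage in range(10):                  # at most 10 kernels
    # single-parameter least-squares weight per candidate, clipped >= 0
    w = np.clip(Phi.T @ residual / (Phi ** 2).sum(axis=0), 0, None)
    # reduction in squared residual if candidate j is added with weight w_j
    gain = 2 * w * (Phi.T @ residual) - w ** 2 * (Phi ** 2).sum(axis=0)
    j = int(np.argmax(gain))
    if gain[j] <= 1e-12:
        break
    weights[j] += w[j]
    residual -= w[j] * Phi[:, j]

print(f"{np.count_nonzero(weights)} kernels kept out of {n} candidates")
```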

Relevance: 100.00%

Abstract:

We consider scattering of a time harmonic incident plane wave by a convex polygon with piecewise constant impedance boundary conditions. Standard finite or boundary element methods require the number of degrees of freedom to grow at least linearly with respect to the frequency of the incident wave in order to maintain accuracy. Extending earlier work by Chandler-Wilde and Langdon for the sound soft problem, we propose a novel Galerkin boundary element method, with the approximation space consisting of the products of plane waves with piecewise polynomials supported on a graded mesh with smaller elements closer to the corners of the polygon. Theoretical analysis and numerical results suggest that the number of degrees of freedom required to achieve a prescribed level of accuracy grows only logarithmically with respect to the frequency of the incident wave.
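
As a small illustration of the graded-mesh ingredient, the snippet below places element endpoints on one side of the polygon so that elements shrink towards a corner; the side length, grading exponent and element count are arbitrary, not the paper's choices.

```python
# Mesh on [0, L] graded towards the corner at x = 0: endpoints (i/n)^q.
import numpy as np

L, n, q = 1.0, 8, 3          # side length, elements, grading strength
endpoints = L * (np.arange(n + 1) / n) ** q
print(np.round(endpoints, 4))           # endpoints cluster near x = 0
print(np.round(np.diff(endpoints), 4))  # smaller elements at the corner
```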

Relevance: 100.00%

Abstract:

In this paper we consider boundary integral methods applied to boundary value problems for the positive definite Helmholtz-type problem −ΔU + α²U = 0 in a bounded or unbounded domain, with the parameter α real and possibly large. Applications arise in the implementation of space-time boundary integral methods for the heat equation, where α is proportional to 1/√(δt), and δt is the time step. The corresponding layer potentials arising from this problem depend nonlinearly on the parameter α and have kernels which become highly peaked as α → ∞, causing standard discretization schemes to fail. We propose a new collocation method with a robust convergence rate as α → ∞. Numerical experiments on a model problem verify the theoretical results.
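
To see the peaking behaviour concretely, the snippet below evaluates the 2D fundamental solution K0(αr)/(2π) of the modified Helmholtz operator for growing α; the α values are arbitrary.

```python
# The kernel K0(alpha*r)/(2*pi) concentrates near r = 0 as alpha grows.
import numpy as np
from scipy.special import k0

r = np.array([0.01, 0.1, 0.5, 1.0])
for alpha in (1.0, 10.0, 100.0):
    print(alpha, np.round(k0(alpha * r) / (2 * np.pi), 6))
```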

Relevance: 100.00%

Abstract:

Currently there are few observations of the urban wind field at heights other than rooftop level. Remote sensing instruments such as Doppler lidars provide wind speed data at many heights, which would be useful in determining wind loadings of tall buildings, and predicting local air quality. Studies comparing remote sensing with traditional anemometers carried out in flat, homogeneous terrain often use scan patterns which take several minutes. In an urban context the flow changes quickly in space and time, so faster scans are required to ensure little change in the flow over the scan period. We compare 3993 h of wind speed data collected using a three-beam Doppler lidar wind profiling method with data from a sonic anemometer (190 m). Both instruments are located in central London, UK; a highly built-up area. Based on wind profile measurements every 2 min, the uncertainty in the hourly mean wind speed due to the sampling frequency is 0.05–0.11 m s⁻¹. The lidar tended to overestimate the wind speed by ≈0.5 m s⁻¹ for wind speeds below 20 m s⁻¹. Accuracy may be improved by increasing the scanning frequency of the lidar. This method is considered suitable for use in urban areas.
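
For context, a wind vector can be recovered from three radial (line-of-sight) velocities via the beam geometry, as in the sketch below; the beam angles and measured values are invented and need not match the scan configuration used in the paper.

```python
# Generic three-beam retrieval: each beam measures the projection
# v_r = u*cos(el)*sin(az) + v*cos(el)*cos(az) + w*sin(el),
# so three beams give a 3x3 linear system for (u, v, w).
import numpy as np

beams = np.deg2rad([(90.0, 0.0), (75.0, 0.0), (75.0, 90.0)])  # (el, az)
vr = np.array([0.1, 2.8, 1.5])   # radial velocities, m/s (illustrative)

A = np.array([[np.cos(el) * np.sin(az), np.cos(el) * np.cos(az),
               np.sin(el)] for el, az in beams])
u, v, w = np.linalg.solve(A, vr)
print(f"u={u:.2f} v={v:.2f} w={w:.2f} m/s, "
      f"horizontal speed={np.hypot(u, v):.2f} m/s")
```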

Relevance: 100.00%

Abstract:

An extensive off-line evaluation of the Noah/Single Layer Urban Canopy Model (Noah/SLUCM) urban land-surface model is presented using data from 15 sites to assess (1) the ability of the scheme to reproduce the surface energy balance observed in a range of urban environments, including seasonal changes, and (2) the impact of increasing complexity of input parameter information. Model performance is found to be most dependent on representation of vegetated surface area cover; refinement of other parameter values leads to smaller improvements. Model biases in net all-wave radiation and trade-offs between turbulent heat fluxes are highlighted using an optimization algorithm. Here we use the Urban Zones to characterize Energy partitioning (UZE) as the basis to assign default SLUCM parameter values. A methodology (FRAISE) to assign sites (or areas) to one of these categories based on surface characteristics is evaluated. Using three urban sites from the Basel Urban Boundary Layer Experiment (BUBBLE) dataset, an independent evaluation of the model performance with the parameter values representative of each class is performed. The scheme copes well with both seasonal changes in the surface characteristics and intra-urban heterogeneities in energy flux partitioning, with RMSE performance comparable to similar state-of-the-art models for all fluxes, sites and seasons. The potential of the methodology for high-resolution atmospheric modelling application using the Weather Research and Forecasting (WRF) model is highlighted. This analysis supports the recommendations that (1) three classes are appropriate to characterize the urban environment, and (2) the parameter values identified should be adopted as default values in WRF.

Relevance: 100.00%

Abstract:

Semi-analytical expressions for the momentum flux associated with orographic internal gravity waves, and closed analytical expressions for its divergence, are derived for inviscid, stationary, hydrostatic, directionally sheared flow over mountains with an elliptical horizontal cross-section. These calculations, obtained using linear theory in conjunction with a third-order WKB approximation, are valid for relatively slowly varying, but otherwise generic, wind profiles, and are given in a form that is straightforward to implement in drag parametrization schemes. When normalized by the surface drag in the absence of shear, a quantity that is calculated routinely in existing drag parametrizations, the momentum flux becomes independent of the detailed shape of the orography. Unlike linear theory in the Ri → ∞ limit, the present calculations account for shear-induced amplification or reduction of the surface drag, and partial absorption of the wave momentum flux at critical levels. Profiles of the normalized momentum fluxes obtained using this model and a linear numerical model without the WKB approximation are evaluated and compared for two idealized wind profiles with directional shear, for different Richardson numbers (Ri). Agreement is found to be excellent for the first wind profile (where one of the wind components varies linearly) down to Ri = 0.5, while it is less satisfactory, though still a large improvement relative to the Ri → ∞ limit, for the second wind profile (where the wind turns with height at a constant rate while keeping a constant magnitude). These results are complementary, in the Ri > O(1) parameter range, to Broad's generalization of the Eliassen–Palm theorem to 3D flow. They should contribute to improved drag parametrizations used in global weather and climate prediction models.
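
For reference, the Richardson number quoted above is the standard gradient form for a background wind U(z) and buoyancy frequency N (the paper may use an equivalent variant):

```latex
% Gradient Richardson number for a sheared background flow; the
% Ri -> infinity limit corresponds to the unsheared linear theory
% mentioned in the abstract.
\[
  \mathrm{Ri} \;=\; \frac{N^{2}}{\left|\,\mathrm{d}\mathbf{U}/\mathrm{d}z\,\right|^{2}}
\]
```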

Relevance: 100.00%

Abstract:

Objective cyclone tracking applied to a 30-yr reanalysis dataset shows that cyclone development in the summer and autumn seasons is active in the tropics and extratropics and inactive in the subtropics. To understand this geographically bimodal distribution of cyclone development associated with tropical and extratropical cyclones quantitatively, the direct relationship between cyclone types and their environments is assessed by using a parameter space of environmental variables [environmental parameter space (EPS)]. The number of cyclones is analyzed in terms of two different factors: the environmental conditions favorable for cyclone development and the area size that satisfies the favorable condition. The EPS analysis is mainly conducted for two representative environmental parameters that are commonly used for cyclone analysis: potential intensity for tropical cyclones and baroclinicity for extratropical cyclones. The geographically bimodal distribution is attributed to the high sensitivity of the cyclone development to the change in the environmental fields from tropics to extratropics. In addition, the bimodal distribution is partly attributed to the rapid change in the environmental fields from tropics to extratropics. The EPS analysis also shows that other environmental parameters, including relative humidity and vertical velocity, may enhance the contrast between the tropics (extratropics) and subtropics, whereas they are not essential for determining cyclone types. The relationship between cyclones and their environments is found to be similar between the hemispheres in the EPS, although the geographical distribution, particularly the longitudinal uniformity, is markedly different between the hemispheres.
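
A minimal sketch of the EPS bookkeeping: cyclone counts binned over two environmental axes, here synthetic stand-ins for potential intensity and baroclinicity rather than the paper's reanalysis fields.

```python
# Bin cyclone developments over a 2D environmental parameter space.
import numpy as np

rng = np.random.default_rng(1)
# synthetic "potential intensity" (m/s) and "baroclinicity" (1/day)
pi = np.concatenate([rng.normal(70, 10, 500), rng.normal(20, 8, 500)])
bc = np.concatenate([rng.normal(0.2, 0.1, 500), rng.normal(0.9, 0.2, 500)])

counts, pi_edges, bc_edges = np.histogram2d(
    pi, bc, bins=(8, 8), range=((0, 100), (0, 1.5)))
# each cell: cyclone count under that combination of environments;
# dividing by the area satisfying the condition would give the
# development "efficiency" discussed above
print(counts.astype(int))
```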

Relevance: 100.00%

Abstract:

Seamless phase II/III clinical trials are conducted in two stages, with treatment selection at the first stage. In the first stage, patients are randomized to a control or one of k > 1 experimental treatments. At the end of this stage, interim data are analysed and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, data may be available at the interim analysis on some early outcome for a larger number of patients than have the primary endpoint available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one or the other method depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches and uses data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases, the performance of the new method is either similar to or better than that of either of the two previously proposed methods.
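
The sketch below illustrates the underlying selection problem with a toy Monte Carlo comparison of an early-endpoint rule against a final-endpoint rule at the interim look; the effect sizes, correlation and sample sizes are invented, and the paper's adaptive choice between rules is not implemented.

```python
# Toy comparison: pick the best of k arms using (a) the early outcome
# on many patients or (b) the final outcome on few patients.
import numpy as np

rng = np.random.default_rng(2)
k, n_early, n_final, rho, sims = 3, 120, 40, 0.6, 20_000
effects = np.array([0.0, 0.1, 0.3])      # true means; arm 2 is best

correct = np.zeros(2)
for _ in range(sims):
    final = rng.normal(effects, 1.0, (n_early, k))
    # early outcome correlated rho with the final endpoint
    early = rho * final + np.sqrt(1 - rho**2) * rng.normal(size=final.shape)
    pick_early = np.argmax(early.mean(axis=0))          # all patients
    pick_final = np.argmax(final[:n_final].mean(axis=0))  # subset only
    correct += (pick_early == 2, pick_final == 2)

print("P(correct) early-endpoint rule:", correct[0] / sims)
print("P(correct) final-endpoint rule:", correct[1] / sims)
```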

Relevance: 100.00%

Abstract:

We establish a general framework for a class of multidimensional stochastic processes over [0,1] under which with probability one, the signature (the collection of iterated path integrals in the sense of rough paths) is well-defined and determines the sample paths of the process up to reparametrization. In particular, by using the Malliavin calculus we show that our method applies to a class of Gaussian processes including fractional Brownian motion with Hurst parameter H>1/4, the Ornstein–Uhlenbeck process and the Brownian bridge.
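
For reference, the signature mentioned above is the following collection of iterated integrals, written in standard rough-path notation for a path X on [0,1]:

```latex
% Signature of a path X : [0,1] -> R^d as the sequence of iterated
% path integrals (standard definition, not specific to this paper).
\[
  S(X)
  = \Bigl( 1,\; \int_{0<t<1} \mathrm{d}X_t,\;
    \int_{0<t_1<t_2<1} \mathrm{d}X_{t_1} \otimes \mathrm{d}X_{t_2},\;
    \dots \Bigr)
\]
```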

Relevance: 100.00%

Abstract:

This work is an assessment of the frequency of extreme values (EVs) of daily rainfall in the city of São Paulo, Brazil, over the period 1933-2005, based on the peaks-over-threshold (POT) and Generalized Pareto Distribution (GPD) approach. Usually, a GPD model is fitted to a sample of POT values selected with a constant threshold. However, in this work we use time-dependent thresholds, composed of relatively large p-quantiles (for example, p of 0.97) of daily rainfall amounts computed from all available data. Samples of POT values were extracted with several values of p. Four different GPD models (GPD-1, GPD-2, GPD-3, and GPD-4) were fitted to each one of these samples by the maximum likelihood (ML) method. The shape parameter was assumed constant for the four models, but time-varying covariates were incorporated into the scale parameter of GPD-2, GPD-3, and GPD-4, describing an annual cycle in GPD-2, a linear trend in GPD-3, and both annual cycle and linear trend in GPD-4. GPD-1, with constant scale and shape parameters, is the simplest model. For identification of the best model among the four we used the rescaled Akaike Information Criterion (AIC) with second-order bias correction. This criterion isolates GPD-3 as the best model, i.e. the one with a positive linear trend in the scale parameter. The slope of this trend is significant compared to the null hypothesis of no trend, at about the 98% confidence level. The non-parametric Mann-Kendall test also showed the presence of a positive trend in the annual frequency of excesses over high thresholds, with a p-value of virtually zero. Therefore, there is strong evidence that high quantiles of daily rainfall in the city of São Paulo have been increasing in magnitude and frequency over time. For example, the 0.99 quantile of daily rainfall amount has increased by about 40 mm between 1933 and 2005.
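
A minimal sketch of the POT/GPD fitting step, using a constant threshold and synthetic data; the paper's time-dependent thresholds and covariate-dependent scale parameter are not reproduced here.

```python
# Fit a Generalized Pareto Distribution to exceedances over a high
# quantile threshold by maximum likelihood.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)
rain = rng.gamma(shape=0.4, scale=12.0, size=26000)  # synthetic daily mm

u = np.quantile(rain, 0.97)          # constant 0.97-quantile threshold
exceed = rain[rain > u] - u          # POT sample
shape, _, scale = genpareto.fit(exceed, floc=0.0)  # ML fit, location 0
print(f"threshold u = {u:.1f} mm, shape = {shape:.3f}, "
      f"scale = {scale:.2f} mm")
```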

Relevance: 100.00%

Abstract:

A new approach for solving the optimal power flow (OPF) problem is established by combining the reduced gradient method and the augmented Lagrangian method with barriers, and exploring specific characteristics of the relations between the variables of the OPF problem. Computer simulations on IEEE 14-bus and IEEE 30-bus test systems illustrate the method.
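
As a generic illustration of the augmented-Lagrangian ingredient (the actual method also uses the reduced gradient and barrier terms, and OPF constraints are far richer), the sketch below solves a toy equality-constrained minimization.

```python
# Augmented Lagrangian iteration for: minimize f(x) subject to h(x)=0,
# with a plain gradient-descent inner loop and multiplier updates.
import numpy as np

f = lambda x: (x[0] - 2) ** 2 + (x[1] - 1) ** 2
h = lambda x: x[0] + x[1] - 1.0          # single equality constraint
grad_f = lambda x: np.array([2 * (x[0] - 2), 2 * (x[1] - 1)])
grad_h = lambda x: np.array([1.0, 1.0])

x, lam, mu = np.zeros(2), 0.0, 10.0      # primal, multiplier, penalty
for outer in range(20):
    for inner in range(500):             # gradient steps on L_A
        g = grad_f(x) + (lam + mu * h(x)) * grad_h(x)
        x -= 0.01 * g
    lam += mu * h(x)                     # multiplier update
print(f"x = {np.round(x, 4)}, constraint residual = {h(x):.2e}")
```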