83 results for "Zero sets of bivariate polynomials"


Relevance: 100.00%

Abstract:

The design of hot-rolled steel portal frames can be sensitive to serviceability deflection limits. In such cases, in order to reduce frame deflections, practitioners increase the size of the eaves haunch and/or the sizes of the steel sections used for the column and rafter members of the frame. This paper investigates the effect of such deflection limits using a real-coded niching genetic algorithm (RC-NGA) that optimizes frame weight, taking into account both ultimate and serviceability limit states. The results show that the proposed GA is efficient and reliable. Two different sets of serviceability deflection limits are then considered: deflection limits recommended by the Steel Construction Institute (SCI), which are based on control of differential deflections, and other deflection limits based on suggestions from industry. Parametric studies are carried out on frames with spans ranging from 15 m to 50 m and column heights from 5 m to 10 m. It is demonstrated that for a 50 m span frame, use of the SCI-recommended deflection limits can lead to frames that are around twice as heavy as designs unconstrained by these limits.
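The optimizer described above can be caricatured in a few lines. Below is a minimal real-coded GA sketch: the objective function, the deflection surrogate, the variable bounds, and every constant are invented for illustration, and the niching mechanism of the actual RC-NGA is omitted; only the penalty-based handling of a serviceability limit reflects the idea in the abstract.

```python
import random

def penalised_weight(x, span=15.0, deflection_limit=0.05):
    """Toy frame-weight objective with a penalty once a crude
    deflection surrogate exceeds the serviceability limit.
    All formulas and constants are illustrative, not from the paper."""
    section_area, haunch_size = x
    weight = span * section_area + 2.0 * haunch_size
    deflection = span ** 3 / (2000.0 * (section_area + haunch_size))
    return weight + 1e3 * max(0.0, deflection - deflection_limit)

def real_coded_ga(fitness, bounds, pop_size=30, generations=200, seed=1):
    """Elitist real-coded GA: blend crossover plus Gaussian mutation."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]          # keep the better half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = []
            for (x, y), (lo, hi) in zip(zip(a, b), bounds):
                v = rng.uniform(min(x, y), max(x, y)) + rng.gauss(0.0, 0.01 * (hi - lo))
                child.append(min(max(v, lo), hi))  # clip back into bounds
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# decision variables: (section area, haunch size), hypothetical bounds
best = real_coded_ga(penalised_weight, [(1.0, 50.0), (0.1, 10.0)])
```

With the penalty active, the search trades off section size against haunch size much as a designer would when chasing a deflection limit.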

Relevance: 100.00%

Abstract:

Large data sets of radiocarbon dates are becoming a more common feature of archaeological research. The sheer numbers of radiocarbon dates produced, however, raise issues of representation and interpretation. This paper presents a methodology that both reduces the visible impact of dating fluctuations and takes into consideration the influence of the underlying radiocarbon calibration curve. By doing so, it may be possible to distinguish between periods of human activity in early medieval Ireland and the statistical tails produced by radiocarbon calibration.
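One common way to aggregate large radiocarbon data sets is a summed probability distribution (SPD): each calibrated date contributes a unit of probability mass spread over calendar years, and the masses are added. The sketch below shows only that summation step, with invented toy distributions; real calibration against a curve such as IntCal requires dedicated software.

```python
def summed_probability(date_dists, years):
    """Sum per-date probability distributions over calendar years,
    normalising each date to unit mass first."""
    spd = {y: 0.0 for y in years}
    for dist in date_dists:
        total = sum(dist.values())
        for year, p in dist.items():
            spd[year] += p / total
    return spd

# toy calibrated distributions (calendar year -> probability)
dates = [
    {700: 0.2, 710: 0.5, 720: 0.3},
    {710: 0.6, 720: 0.4},
]
spd = summed_probability(dates, years=range(700, 730, 10))
```

Peaks in such a curve may reflect either activity or calibration-curve structure, which is exactly the ambiguity the paper's methodology addresses.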

Relevance: 100.00%

Abstract:

This paper evaluates the potential of gabions as roadside safety barriers. Gabions have the capacity to blend into the natural landscape, suggesting that they could be used as a safety barrier for low-volume roads in scenic environments. In fact, gabions have already been used for this purpose in Nepal, but the impact response was not evaluated. This paper reports on numerical and experimental investigations performed on a new gabion barrier prototype. To assess its potential use as a roadside barrier, the optimal gabion unit size and mass were investigated using multibody analysis, and four sets of 1:4 scaled crash tests were carried out to study the local vehicle-barrier interaction. The barrier prototype was then finalised and subjected to a TB31 crash test according to the European EN1317 standard for N1 safety barriers. The test resulted in a failure due to the rollover of the vehicle and tearing of the gabion mesh, yielding a large working width. It was found that although the system potentially has the necessary mass to contain a vehicle, the barrier front face does not have the necessary stiffness and strength to contain the gabion stone filling and hence redirect the vehicle. In the EN1317 test, the gabion barrier acted as a ramp for the impacting vehicle, causing rollover.

Relevance: 100.00%

Abstract:

As a comparatively new PKM with over-constrained kinematic chains, the Exechon has attracted extensive attention from the research community. In contrast to its kinematics, which is well understood, analysis of the stiffness characteristics of the Exechon remains a challenge due to its structural complexity. In order to achieve a thorough understanding of the stiffness characteristics of the Exechon PKM, this paper proposes an analytical kinetostatic model based on the substructure synthesis technique. The whole PKM system is decomposed into a moving platform subsystem, three limb subsystems and a fixed base subsystem, which are connected to each other sequentially through the corresponding joints. Each limb body is modeled as a spatial beam with a uniform cross-section constrained by two sets of lumped springs. The equilibrium equation of each individual limb assemblage is derived through a finite element formulation and combined with that of the moving platform, derived with the Newtonian method, to construct the governing kinetostatic equations of the system after introducing the deformation compatibility conditions between the moving platform and the limbs. By extracting the 6 x 6 block matrix from the inverse of the governing compliance matrix, the stiffness of the moving platform is formulated. The stiffness of the Exechon PKM at a typical configuration, as well as throughout the workspace, is computed efficiently with a piece-by-piece partition algorithm. The numerical simulations reveal a strong position-dependency of the PKM's stiffness, which is symmetric about a work plane due to structural features. Finally, the effects of design variables such as structural, dimensional and stiffness parameters on system rigidity are investigated, with the purpose of providing useful information for the structural optimization and performance enhancement of the Exechon PKM.
It is worth mentioning that the proposed stiffness-modeling methodology can also be applied to other over-constrained PKMs and can evaluate global rigidity over the workspace efficiently with minor revisions.
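The substructure idea, with limb compliances combined in series along each chain and limb stiffnesses acting in parallel on the platform, can be illustrated with a deliberately scalar caricature of the 6 x 6 matrix formulation. All stiffness values below are hypothetical.

```python
def series_compliance(stiffnesses):
    """Springs in series: compliances (1/k) add."""
    return sum(1.0 / k for k in stiffnesses)

def parallel_stiffness(stiffnesses):
    """Limbs acting in parallel on the platform: stiffnesses add."""
    return sum(stiffnesses)

# each limb: a beam segment in series with its lumped joint springs
k_beam, k_joint = 200.0, 300.0           # hypothetical values
limb_k = 1.0 / series_compliance([k_beam, k_joint])
platform_k = parallel_stiffness([limb_k] * 3)   # three identical limbs
```

The real model replaces each scalar k with a 6 x 6 stiffness matrix and each series/parallel step with matrix assembly under compatibility conditions, but the bookkeeping pattern is the same.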

Relevance: 100.00%

Abstract:

Mathematical modelling has become an essential tool in the design of modern catalytic systems. Emissions legislation is becoming increasingly stringent, and so mathematical models of aftertreatment systems must become more accurate in order to provide confidence that a catalyst will convert pollutants over the required range of conditions. 
Automotive catalytic converter models contain several sub-models that represent processes such as mass and heat transfer, and the rates at which the reactions proceed on the surface of the precious metal. Of these sub-models, the prediction of the surface reaction rates is by far the most challenging due to the complexity of the reaction system and the large number of gas species involved. The reaction rate sub-model uses global reaction kinetics to describe the surface reaction rate of the gas species and is based on the Langmuir-Hinshelwood equation further developed by Voltz et al. [1]. The reactions can be modelled using the pre-exponential factors and activation energies of the Arrhenius equations, together with the inhibition terms.
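The Arrhenius and inhibition structure mentioned above can be sketched as follows. The functional form of the inhibition term is simplified here (a full Voltz-type model also makes the inhibition constants temperature-dependent), and every numeric value is a placeholder rather than a calibrated parameter.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Rate constant k = A * exp(-Ea / (R * T))."""
    return A * math.exp(-Ea / (R * T))

def co_oxidation_rate(T, c_co, c_o2, A=1.0e13, Ea=9.0e4, K=65.5):
    """Simplified Langmuir-Hinshelwood style global rate:
    r = k * [CO] * [O2] / G, with a squared inhibition term G.
    Parameter values are placeholders, not fitted kinetics."""
    k = arrhenius(A, Ea, T)
    G = (1.0 + K * c_co) ** 2
    return k * c_co * c_o2 / G
```

Sweeping T through such a rate expression is what produces the light-off curves discussed below: conversion rises sharply once k outgrows the inhibition term.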
The reaction kinetic parameters of aftertreatment models are found from experimental data, where a measured light-off curve is compared against a predicted curve produced by a mathematical model. The kinetic parameters are usually tuned manually to minimize the error between the measured and predicted data. This process is typically long, laborious and prone to misinterpretation, owing to the large number of parameters and the risk of multiple sets of parameters giving acceptable fits. Moreover, the number of coefficients increases greatly with the number of reactions. Therefore, with the growing number of reactions, the task of manually tuning the coefficients is becoming increasingly challenging.
In the presented work, the authors have developed and implemented a multi-objective genetic algorithm to automatically optimize reaction parameters in AxiSuite® [2], a commercial aftertreatment model. The genetic algorithm was developed and expanded from the code presented by Michalewicz et al. [3] and was linked to AxiSuite using the Simulink add-on for Matlab.
The default kinetic values stored within the AxiSuite model were used to generate a series of light-off curves under rich conditions for a number of gas species, including CO, NO, C3H8 and C3H6. These light-off curves were used to generate an objective function. 
This objective function was used to generate a measure of fit for the kinetic parameters. The multi-objective genetic algorithm was subsequently used to search between specified limits to attempt to match the objective function. In total the pre-exponential factors and activation energies of ten reactions were simultaneously optimized. 
The results reported here demonstrate that, given accurate experimental data, the optimization algorithm is successful and robust in defining the correct kinetic parameters of a global kinetic model describing aftertreatment processes.
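A scalarised stand-in for the fitness evaluation described above might sum squared errors between measured and model-predicted conversion curves across species; the actual work uses a genuinely multi-objective GA, so this single-number version is only a sketch, and the toy curves are invented.

```python
def lightoff_sse(measured, predicted):
    """Sum of squared errors between measured and predicted conversion
    curves, accumulated over all gas species."""
    sse = 0.0
    for species, curve in measured.items():
        sse += sum((m - p) ** 2 for m, p in zip(curve, predicted[species]))
    return sse

# toy conversion-vs-temperature curves for one species
measured = {"CO": [0.0, 0.5, 1.0]}
predicted = {"CO": [0.0, 0.4, 1.0]}
error = lightoff_sse(measured, predicted)
```

A GA then searches the kinetic parameters (pre-exponential factors and activation energies) between specified limits to drive this error down.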

Relevance: 100.00%

Abstract:

Microbial interactions depend on a range of biotic and environmental variables, and are both dynamic and unpredictable. For some purposes, and under defined conditions, it is nevertheless imperative to evaluate the inhibitory efficacy of microbes, such as those with potential as biocontrol agents. We selected six phylogenetically diverse microbes to determine their ability to inhibit the ascomycete Fusarium coeruleum, a soil-dwelling pathogen of potato tubers that causes the storage disease dry rot. Interaction assays, in which colony development was quantified (for both the fungal pathogen and the potential control agents), were therefore carried out on solid media. The key parameters that contributed to, and were indicative of, inhibitory efficacy were identified as: fungal growth rates (i) prior to contact with the biocontrol agent and (ii) if/once contact with the biocontrol agent was established (i.e. in the zone of mixed culture), and (iii) the ultimate distance traveled by the fungal mycelium. It was clear that there was no correlation between zones of fungal inhibition and the overall reduction in the extent of fungal colony development. An inhibition coefficient was devised which incorporated the potential contributions of distal inhibition of fungal growth rate, prevention of mycelium development in the vicinity of the biocontrol agent, and ability to inhibit plant-pathogen growth rate in the zone of mixed culture (in a ratio of 2:2:1). The values derived were 84.2 for Bacillus subtilis (QST 713), 74.0 for Bacillus sp. (JC12GB42), 30.7 for Pichia anomala (J121), 19.3 for Pantoea agglomerans (JC12GB34), 13.9 for Pantoea sp. (S09:T:12), and 21.9 (indicating a promotion of fungal growth) for bacterial strain JC12GB54. This inhibition coefficient, with a theoretical maximum of 100, was consistent with the extent of F. coeruleum colony development (i.e. area, in cm²) and with assays of these biocontrol agents carried out previously against Fusarium spp. and other fungi. These findings are discussed in relation to the dynamics and inherent complexity of natural ecosystems, and the need to adapt models for use under specific sets of conditions.
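The 2:2:1 weighting of the three components can be written down directly. How each component is scored is not specified here, so the function below simply assumes each arrives pre-scored on a 0-100 scale; that scaling is an assumption of this sketch.

```python
def inhibition_coefficient(distal, vicinity, mixed):
    """Combine the three inhibition components (distal growth-rate
    inhibition, prevention of mycelium development near the agent,
    growth-rate inhibition in the mixed-culture zone) in a 2:2:1
    ratio. Each component is assumed scored on 0-100, giving a
    theoretical maximum of 100."""
    return (2.0 * distal + 2.0 * vicinity + 1.0 * mixed) / 5.0
```

With this form, an agent that fully suppresses growth at a distance and near the colony, but not in the mixed zone, still scores 80, reflecting the deliberate down-weighting of the mixed-culture term.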

Relevance: 100.00%

Abstract:

In this study, 39 sets of hard turning (HT) experimental trials were performed on a Mori-Seiki SL-25Y (4-axis) computer numerical controlled (CNC) lathe to study the effect of cutting parameters on the machined surface roughness. In all the trials, an AISI 4340 steel workpiece (hardened up to 69 HRC) was machined with a commercially available CBN insert (Warren Tooling Limited, UK) under dry conditions. The surface topography of the machined samples was examined using a white light interferometer, and the measurements were reconfirmed using a Form Talysurf. The machining outcome was used as an input to develop various regression models to predict the average machined surface roughness on this material. Three regression models (multiple regression, random forest, and quantile regression) were applied to the experimental outcomes. To the best of the authors' knowledge, this paper is the first to apply random forest or quantile regression techniques to the machining domain. The performance of these models was compared to ascertain how feed, depth of cut, and spindle speed affect surface roughness, and finally to obtain a mathematical equation correlating these variables. It was concluded that the random forest regression model is a superior choice to multiple regression for prediction of surface roughness during machining of AISI 4340 steel (69 HRC).
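Of the three models compared, the multiple-regression baseline is the simplest to reproduce: a least-squares fit of roughness against feed, depth of cut, and spindle speed needs only the normal equations. The data below are synthetic, not the paper's 39 trials, and the random forest and quantile models would require a dedicated library.

```python
def fit_linear(X, y):
    """Ordinary least squares for y = b0 + b1*feed + b2*doc + b3*speed,
    via the normal equations (X^T X) b = X^T y, solved by Gaussian
    elimination with partial pivoting."""
    rows = [[1.0] + list(x) for x in X]
    p = len(rows[0])
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(p)]
    for col in range(p):                      # forward elimination
        piv = max(range(col, p), key=lambda r: abs(XtX[r][col]))
        XtX[col], XtX[piv] = XtX[piv], XtX[col]
        Xty[col], Xty[piv] = Xty[piv], Xty[col]
        for r in range(col + 1, p):
            f = XtX[r][col] / XtX[col][col]
            for c in range(col, p):
                XtX[r][c] -= f * XtX[col][c]
            Xty[r] -= f * Xty[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):            # back substitution
        beta[r] = (Xty[r] - sum(XtX[r][c] * beta[c]
                                for c in range(r + 1, p))) / XtX[r][r]
    return beta

# synthetic trials: (feed, depth of cut, spindle speed) -> roughness
X = [(0.10, 0.50, 1000), (0.20, 0.50, 1000), (0.10, 1.00, 1000),
     (0.10, 0.50, 2000), (0.30, 0.80, 1500)]
y = [2.0 + 3.0 * f + 0.5 * d - 0.001 * s for f, d, s in X]
beta = fit_linear(X, y)
```

Because the synthetic responses follow the linear model exactly, the fit recovers the generating coefficients, which is a useful sanity check before applying any such model to real roughness data.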

Relevance: 100.00%

Abstract:

There is increasing interest in how humans influence spatial patterns in biodiversity. One of the most frequently noted and marked of these patterns is the increase in species richness with area, the species-area relationship (SAR). SARs are used for a number of conservation purposes, including predicting extinction rates, setting conservation targets, and identifying biodiversity hotspots. Such applications can be improved by a detailed understanding of the factors promoting spatial variation in the slope of SARs, which is currently the subject of a vigorous debate. Moreover, very few studies have considered the anthropogenic influences on the slopes of SARs; this is particularly surprising given that in much of the world areas with high human population density are typically those with a high number of species, which generates conservation conflicts. Here we determine correlates of spatial variation in the slopes of species-area relationships, using the British avifauna as a case study. Whilst we focus on human population density, a widely used index of human activities, we also take into account (1) the rate of increase in habitat heterogeneity with increasing area, which is frequently proposed to drive SARs, (2) environmental energy availability, which may influence SARs by affecting species occupancy patterns, and (3) species richness. We consider environmental variables measured at both local (10 km x 10 km) and regional (290 km x 290 km) spatial grains, but find that the former consistently provides a better fit to the data. In our case study, the effect of species richness on the slope of SARs appears to be scale-dependent, being negative at local scales but positive at regional scales. In univariate tests, the slope of the SAR correlates negatively with human population density and environmental energy availability, and positively with the rate of increase in habitat heterogeneity.
We conducted two sets of multiple regression analyses, with and without species richness as a predictor. When species richness is included it exerts a dominant effect, but when it is excluded temperature has the dominant effect on the slope of the SAR, and the effects of other predictors are marginal.
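The slope discussed throughout is the exponent z of the power-law SAR, S = c * A**z, conventionally estimated by linear regression on log-transformed data. The sketch below fits z for invented data that follow the power law exactly.

```python
import math

def sar_slope(areas, richness):
    """Least-squares slope of log(S) against log(A), i.e. the exponent
    z in the power-law species-area relationship S = c * A**z."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(s) for s in richness]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

areas = [1.0, 10.0, 100.0, 1000.0]           # hypothetical areas
richness = [5.0 * a ** 0.25 for a in areas]  # exact power law, z = 0.25
z = sar_slope(areas, richness)
```

Correlating z values estimated this way across regions against population density, energy availability and heterogeneity is the kind of analysis the abstract describes.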

Relevance: 100.00%

Abstract:

The spatial distribution of a species can be characterized at many different spatial scales, from fine-scale measures of local population density to coarse-scale geographical-range structure. Previous studies have shown a degree of correlation in species' distribution patterns across narrow ranges of scales, making it possible to predict fine-scale properties from coarser-scale distributions. To test the limits of such extrapolation, we have compiled distributional information on 16 species of British plants, at scales ranging across six orders of magnitude in linear resolution (1 in to 100 km). As expected, the correlation between patterns at different spatial scales tends to degrade as the scales become more widely separated. There is, however, an abrupt breakdown in cross-scale correlations across intermediate (ca. 0.5 km) scales, suggesting that local and regional patterns are influenced by essentially non-overlapping sets of processes. The scaling discontinuity may also reflect characteristic scales of human land use in Britain, suggesting a novel method for analysing the 'footprint' of humanity on a landscape.
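Moving between the spatial scales compared above amounts to coarse-graining a presence/absence grid: a coarse cell is occupied if any of its constituent fine cells is. The toy function below performs that aggregation; it is only a cartoon of the scale transitions analysed in the paper.

```python
def coarse_grain(grid, factor):
    """Aggregate a square presence/absence grid by `factor`: a coarse
    cell is 1 if any fine cell within it is 1."""
    n = len(grid)
    return [[int(any(grid[i * factor + di][j * factor + dj]
                     for di in range(factor) for dj in range(factor)))
             for j in range(n // factor)]
            for i in range(n // factor)]

fine = [[1, 0, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 0]]
coarse = coarse_grain(fine, 2)
```

Repeating this aggregation across successive factors, and correlating occupancy patterns between resolutions, is how cross-scale correlation (and its breakdown at intermediate scales) can be measured.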

Relevance: 100.00%

Abstract:

Emission lines of Be-like ions are frequently observed in astrophysical plasmas, and many are useful for density and temperature diagnostics. However, accurate atomic data for energy levels, radiative rates (A-values) and effective electron excitation collision strengths ($\Upsilon$) are required for reliable plasma modelling. In general it is reasonably straightforward to calculate energy levels and A-values to a high level of accuracy. By contrast, considerable effort is required to calculate $\Upsilon$, and hence it is not always possible to assess the accuracy of available data. Recently, two independent $R$-matrix calculations with different approaches (DARC and ICFT) have appeared for a range of Be-like ions. Therefore, in this work we compare the two sets of $\Upsilon$, highlight the large discrepancies for a significant number of transitions and suggest possible reasons for these.
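A comparison of two $\Upsilon$ data sets of the kind described might flag transitions whose values disagree by more than some fractional threshold. The transition labels, values and threshold below are invented, not actual DARC or ICFT results.

```python
def flag_discrepancies(ups_a, ups_b, threshold=0.2):
    """Return {transition: relative difference} for transitions where
    two collision-strength data sets differ by more than `threshold`
    (as a fraction of the larger value)."""
    flagged = {}
    for transition in ups_a:
        a, b = ups_a[transition], ups_b[transition]
        rel = abs(a - b) / max(a, b)
        if rel > threshold:
            flagged[transition] = rel
    return flagged

# hypothetical effective collision strengths for three transitions
set_a = {"1-2": 0.50, "1-3": 1.20, "2-3": 0.08}
set_b = {"1-2": 0.52, "1-3": 0.60, "2-3": 0.09}
flagged = flag_discrepancies(set_a, set_b)
```

In a real comparison the flagging would be repeated at each electron temperature, since discrepancies between calculations are often temperature-dependent.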

Relevance: 100.00%

Abstract:

Two sets of issues in the area of law and religion have generated a large share of attention and controversy across a wide number of countries and jurisdictions in recent years. The first set of issues relates to the autonomy of churches and other religiously affiliated entities such as schools and social service organisations in their hiring and personnel decisions, involving the question of how far, if at all, such entities should be free from the influence and oversight of the state. The second set of issues involves the presence of religious symbols in the public sphere, such as in state schools or on public lands, involving the question of how far the state should be free from the influence of religion. Although these issues – freedom of religion from the state, and freedom of the state from religion – could be viewed as opposite sides of the same coin, they are almost always treated as separate lines of inquiry, and the implications of each for the other have not been the subject of much scrutiny. In this Introduction, we consider whether insights might be drawn from thinking about these issues both from a comparative law perspective and also from considering these two lines of cases together.

Relevance: 100.00%

Abstract:

The use of an acid violet 7 (AV7) smart ink to assess the activity of photocatalytic paint is demonstrated. A linear correlation is established between the change in oxidized dye concentration, as measured by diffuse reflectance, and the change in the green component of the RGB color values obtained using a portable hand-held scanner, suggesting that such tests can be monitored easily with an inexpensive piece of hand-held office equipment rather than an expensive lab-based instrument such as a diffuse reflectance UV/vis spectrophotometer. The bleaching of the AV7 follows first-order kinetics, at a rate that is linearly dependent upon the UVA irradiance (0.30–3.26 mW cm⁻²). A comparison of the relative rate of bleaching of the AV7 ink with the relative rate of removal of NOx, as determined using the ISO test (ISO 22197-1:2007), established a linear relationship between the two sets of results, and the relevance of this correlation is discussed briefly.
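First-order bleaching with a rate constant proportional to irradiance can be written down directly. The proportionality constant below is hypothetical, standing in for the slope that would be fitted to data over the reported 0.30–3.26 mW cm⁻² range.

```python
import math

def dye_concentration(c0, k, t):
    """First-order kinetics: C(t) = C0 * exp(-k * t)."""
    return c0 * math.exp(-k * t)

def rate_constant(irradiance, slope=0.5):
    """Bleaching rate constant assumed linear in UVA irradiance;
    `slope` is a hypothetical fitted coefficient, not from the paper."""
    return slope * irradiance
```

Choosing k = ln 2 makes t = 1 a half-life, which is a convenient check that the decay expression behaves as expected.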

Relevance: 100.00%

Abstract:

Pseudomonas aeruginosa causes chronic lung infections in people with cystic fibrosis (CF) and acute opportunistic infections in people without CF. Forty-two P. aeruginosa strains from a range of clinical and environmental sources were collated into a single reference strain panel to harmonise research on this diverse opportunistic pathogen. To facilitate further harmonised and comparable research on P. aeruginosa, we characterised the panel strains for growth rate, motility, virulence in the Galleria mellonella infection model, pyocyanin and alginate production, mucoid phenotype, lipopolysaccharide (LPS) pattern, biofilm formation, urease activity, and antimicrobial and phage susceptibilities. Phenotypic diversity across the P. aeruginosa panel was apparent for all phenotypes examined, in agreement with the marked variability seen in this species. However, except for growth rate, the phenotypic diversity among strains from CF versus non-CF sources was comparable. CF strains were less virulent in the G. mellonella model than non-CF strains (p=0.037). Transmissible CF strains generally lacked O antigen, produced less pyocyanin, and had low virulence in G. mellonella. Further, in the three sets of sequential CF strains, virulence, O-antigen expression and pyocyanin production were higher in the earlier isolate than in the isolate obtained later in infection. Overall, full phenotypic characterisation of this defined panel of P. aeruginosa strains increases our understanding of the virulence and pathogenesis of P. aeruginosa and may provide a valuable resource for the testing of novel therapies against this problematic pathogen.

Relevance: 100.00%

Abstract:

Some reasons for registering trials might be considered self-serving, such as satisfying the requirements of a journal in which the researchers wish to publish their eventual findings, or publicising the trial to boost recruitment. Registry entries also help others, including systematic reviewers, to know about ongoing or unpublished studies and contribute to reducing research waste by making it clear what studies are ongoing. Other sources of research waste include inconsistency in outcome measurement across trials in the same area, missing data on important outcomes from some trials, and selective reporting of outcomes. One way to reduce this waste is through the use of core outcome sets: standardised sets of outcomes for research in specific areas of health and social care. These do not restrict the outcomes that will be measured, but provide the minimum to include if a trial is to be of the most use to potential users. We propose that trial registries, such as ISRCTN, encourage researchers to note their use of a core outcome set in their entry. This will help people searching for trials and those worried about selective reporting in closed trials. Trial registries can facilitate these efforts to make new trials as useful as possible and reduce waste. The outcomes section in the entry could prompt the researcher to consider using a core outcome set and facilitate the specification of that core outcome set and its component outcomes through linking to the original core outcome set. In doing this, registries will contribute to the global effort to ensure that trials answer important uncertainties, can be brought together in systematic reviews, and better serve their ultimate aim of improving health and well-being through improving health and social care.

Relevance: 100.00%

Abstract:

Promoter hypermethylation is recognized as a hallmark of human cancer, in addition to conventional mechanisms of gene inactivation. As such, many new technologies have been developed over the past two decades to uncover novel targets of methylation and decipher complex epigenetic patterns. However, many of these are either labor-intensive or provide limited data, confined to oligonucleotide hybridization sequences or enzyme cleavage sites, and cannot be easily applied to screening large sets of sequences or samples. We present an application of denaturing high performance liquid chromatography (DHPLC), which relies on bisulfite modification of genomic DNA, for methylation screening. We validated DHPLC as a methylation screening tool using GSTP1, a well-known target of methylation in prostate cancer. We developed an in silico approach to identify potential targets of promoter hypermethylation in prostate cancer. Using DHPLC, we screened two of these targets, LGALS3 and SMAD4, for methylation. We show that DHPLC has an application as a fast, sensitive, quantitative and cost-effective method for screening novel targets or DNA samples for DNA methylation.