964 results for Probable Number Technique


Relevance:

20.00%

Publisher:

Abstract:

Herein, the mechanical properties of graphene, including the Young's modulus, fracture stress and fracture strain, have been investigated by molecular dynamics simulations. The simulation results show that the mechanical properties of graphene are sensitive to temperature changes but insensitive to the number of layers in multilayer graphene. Increasing temperature has a significant adverse effect on the mechanical properties of graphene, whereas the adverse effect of increasing the layer number is marginal. Isotope substitutions in graphene play a negligible role in modifying its mechanical properties.


A newly developed computational approach is proposed for the analysis of multiple-crack problems based on the eigen crack opening displacement (COD) boundary integral equations. The eigen COD refers to the opening of a crack in an infinite domain under fictitious tractions acting on the crack surface. With this concept, problems involving large numbers of cracks can be solved using the conventional displacement discontinuity boundary integral equations in an iterative fashion, with a small system matrix determining all the unknown CODs step by step. To handle the interactions among cracks, all cracks in the problem are divided into two groups according to their distance from the crack currently under consideration: an adjacent group, containing cracks at relatively small distances with strong effects on the current crack, and a far-field group, composed of the remaining cracks at relatively large distances. Correspondingly, the eigen COD of the current crack is computed in two parts: the first from the fictitious tractions of the adjacent cracks via the local Eshelby matrix derived from the traction boundary integral equations in discretized form, and the second from those of the far-field cracks, so that high computational efficiency is achieved. The numerical results of the proposed approach are compared with those of the dual boundary integral equations (D-BIE), the BIE with numerical Green's functions (NGF), and analytical solutions from the literature, verifying the effectiveness and efficiency of the approach. Numerical examples are provided for the stress intensity factors of cracks, numbering up to several thousand, in both finite and infinite plates.
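The adjacent/far-field splitting described above can be sketched, in spirit, as a block-iterative linear solve: each small block of unknowns is solved directly against its strong (adjacent) couplings, while the weak (far-field) couplings are lagged from the previous sweep. The sketch below is a generic block-Jacobi scheme on an abstract system A u = b, not the actual COD integral equations; the block size and the diagonally dominant test matrix are invented for illustration.

```python
import numpy as np

def near_far_sweeps(A, b, block=2, sweeps=60):
    """Block-iterative solve of A u = b: each small 'adjacent' block is
    solved directly, while 'far-field' couplings use values lagged from
    the previous sweep (a block-Jacobi scheme)."""
    n = len(b)
    u = np.zeros(n)
    for _ in range(sweeps):
        u_prev = u.copy()
        for s in range(0, n, block):
            idx = slice(s, min(s + block, n))
            # right-hand side with far-field contributions lagged
            rhs = b[idx] - A[idx, :] @ u_prev + A[idx, idx] @ u_prev[idx]
            u[idx] = np.linalg.solve(A[idx, idx], rhs)
    return u

# Diagonally dominant test system (stands in for weak far-field coupling):
n = 8
A = 4.0 * np.eye(n) + np.array([[1.0 / (1 + abs(i - j)) if i != j else 0.0
                                 for j in range(n)] for i in range(n)])
b = np.arange(1.0, n + 1.0)
u = near_far_sweeps(A, b)
```

Because the off-diagonal (far-field) couplings are weak relative to the block diagonal, the lagged sweeps converge to the direct solution.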


Introduction: Undergraduate students studying the Bachelor of Radiation Therapy at Queensland University of Technology (QUT) attend clinical placements at a number of department sites across Queensland. To ensure that the curriculum prepares students for the most common treatments and current techniques in use in these departments, a curriculum matching exercise was performed. Methods: A cross-sectional census was performed on a pre-determined "Snapshot" date in 2012. It was undertaken by the clinical education staff in each department, who used a standardized proforma to count the number of patients as well as prescription, equipment and technique data for a list of tumour site categories. This information was combined into aggregate, anonymized data. Results: All 12 Queensland radiation therapy clinical sites participated in the Snapshot data collection exercise to produce a comprehensive overview of clinical practice on the chosen day. A total of 59 different tumour sites were treated on the chosen day and, as expected, the most common treatment sites were prostate and breast, comprising 46% of patients treated. Data analysis also indicated that intensity-modulated radiotherapy (IMRT) use is relatively high, with 19.6% of patients receiving IMRT treatment on the chosen day. Both IMRT and image-guided radiotherapy (IGRT) indications matched recommendations from the evidence. Conclusion: The Snapshot method proved to be a feasible and efficient method of gathering useful data.


Several approaches have been introduced in the literature for active noise control (ANC) systems. Since the filtered-x least-mean-square (FxLMS) algorithm appears to be the best choice as a controller filter, researchers tend to improve the performance of ANC systems by enhancing and modifying this algorithm. As its first novelty, this paper proposes a new version of the FxLMS algorithm. In many ANC applications, an on-line secondary path modeling method using white noise as a training signal is required to ensure convergence of the system. As its second novelty, this paper proposes a new approach for on-line secondary path modeling based on a new variable-step-size (VSS) LMS algorithm in feedforward ANC systems. The proposed algorithm is designed so that noise injection is stopped at the optimum point, once the modeling accuracy is sufficient. If the secondary path changes suddenly during operation, the algorithm reactivates injection of the white noise to re-adjust the secondary path estimate. Comparative simulation results indicate the effectiveness of the proposed approach in reducing both narrow-band and broad-band noise. In addition, the proposed ANC system is robust against sudden changes of the secondary path model.
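For context, the core filtered-x LMS recursion that such work builds on can be sketched as follows. This is the textbook FxLMS loop only, not the proposed variant or its VSS secondary-path modeling; the path coefficients, filter length and step size are invented, and the secondary-path model is assumed to be exact.

```python
import numpy as np

def fxlms(x, d, s, s_hat, L=16, mu=0.005):
    """Filtered-x LMS: the controller output passes through the physical
    secondary path s before reaching the error sensor, so the reference
    is pre-filtered with the model s_hat for the weight update."""
    N = len(x)
    w = np.zeros(L)                    # adaptive controller weights
    y = np.zeros(N)                    # anti-noise output
    e = np.zeros(N)                    # residual at the error microphone
    xf = np.convolve(x, s_hat)[:N]     # filtered reference
    for n in range(L, N):
        y[n] = w @ x[n - L + 1:n + 1][::-1]
        ys = s @ y[n - len(s) + 1:n + 1][::-1]       # through secondary path
        e[n] = d[n] - ys
        w += mu * e[n] * xf[n - L + 1:n + 1][::-1]   # FxLMS update
    return e

rng = np.random.default_rng(0)
x = rng.standard_normal(4000)                    # reference noise
d = np.convolve(x, [0.8, 0.4, -0.2])[:len(x)]    # primary-path disturbance
s = np.array([1.0, 0.5])                         # secondary path
e = fxlms(x, d, s, s_hat=s.copy())               # perfect path model assumed
```

With a perfect secondary-path model the residual power decays well below the raw disturbance power within a few thousand samples.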


This paper presents a method for investigating ship emissions, the plume capture and analysis system (PCAS), and its application in measuring airborne pollutant emission factors (EFs) and particle size distributions. The current investigation was conducted in situ, aboard two dredgers (Amity: a cutter suction dredger; Brisbane: a hopper suction dredger), but the PCAS is also capable of performing such measurements remotely at a distant point within the plume. EFs were measured relative to the fuel consumption using the fuel-combustion-derived plume CO2. All plume measurements were corrected by subtracting background concentrations sampled regularly from upwind of the stacks. Each measurement typically took 6 minutes to complete, and 40 to 50 measurements were possible in one day. The relationship between the EFs and plume sample dilution was examined to determine the plume dilution range over which the technique could deliver consistent results when measuring EFs for particle number (PN), NOx, SO2 and PM2.5, within a targeted dilution factor range of 50-1000 suitable for remote sampling. The EFs for NOx, SO2 and PM2.5 were found to be independent of dilution for dilution factors within that range. The EF measurement for PN was corrected for coagulation losses by applying a time-dependent particle loss correction to the particle number concentration data. For the Amity, the EF ranges were PN: 2.2-9.6 × 10^15 (kg-fuel)^-1; NOx: 35-72 g(NO2)·(kg-fuel)^-1; SO2: 0.6-1.1 g(SO2)·(kg-fuel)^-1; and PM2.5: 0.7-6.1 g(PM2.5)·(kg-fuel)^-1. For the Brisbane, they were PN: 1.0-1.5 × 10^16 (kg-fuel)^-1; NOx: 3.4-8.0 g(NO2)·(kg-fuel)^-1; SO2: 1.3-1.7 g(SO2)·(kg-fuel)^-1; and PM2.5: 1.2-5.6 g(PM2.5)·(kg-fuel)^-1. The results are discussed in terms of the operating conditions of the vessels' engines. Particle number emission factors as a function of size, as well as the count median diameter (CMD) and geometric standard deviation of the size distributions, are provided.

The size distributions were found to be consistently unimodal in the range below 500 nm, and this mode was within the accumulation mode range for both vessels. The representative CMDs for the various activities performed by the dredgers ranged from 94-131 nm in the case of the Amity, and 58-80 nm for the Brisbane. A strong inverse relationship between CMD and EF(PN) was observed.
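The CO2-ratio emission-factor method described above can be sketched as follows. The fuel carbon mass fraction (0.865) and the example excess concentrations are assumptions for illustration, not values taken from the abstract, and all fuel carbon is assumed to be emitted as CO2.

```python
# Molar masses in g/mol; NOx is conventionally reported as NO2.
M_C, M_CO2, M_NO2 = 12.011, 44.009, 46.006

def emission_factor(dX_ppb, dCO2_ppm, M_X, carbon_fraction=0.865):
    """EF in g of species X per kg of fuel, from background-subtracted
    plume mole fractions (dX in ppb, dCO2 in ppm)."""
    mol_ratio = (dX_ppb * 1e-9) / (dCO2_ppm * 1e-6)      # mol X per mol CO2
    g_CO2_per_kg_fuel = carbon_fraction * (M_CO2 / M_C) * 1000.0
    return mol_ratio * (M_X / M_CO2) * g_CO2_per_kg_fuel

# e.g. a hypothetical 400 ppb NOx excess over a 30 ppm CO2 excess:
ef_nox = emission_factor(400.0, 30.0, M_NO2)
```

Because both excesses are measured in the same plume sample, the dilution factor cancels in the ratio, which is why EFs are independent of dilution within the working range.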


A satellite-based observation system can continuously or repeatedly generate a user state vector time series that may contain useful information. One typical example is the collection of International GNSS Service (IGS) station daily and weekly combined solutions. Another is the epoch-by-epoch kinematic position time series of a receiver derived by a GPS real-time kinematic (RTK) technique. Although some multivariate analysis techniques have been adopted to assess the noise characteristics of multivariate state time series, statistical testing has been limited to univariate time series. After a review of the hypothesis test statistics frequently used in univariate analysis of GNSS state time series, the paper presents a number of T-squared multivariate statistics for use in the analysis of multivariate GNSS state time series. These T-squared test statistics take into account the correlation between coordinate components, which is neglected in univariate analysis. Numerical analysis was conducted with the multi-year time series of an IGS station to demonstrate the results of the multivariate hypothesis testing in comparison with the univariate results. The results demonstrate that, in general, testing for multivariate mean shifts and outliers tends to reject fewer data samples than testing for univariate mean shifts and outliers at the same confidence level. It is noted that neither univariate nor multivariate data analysis methods are intended to replace physical analysis; instead, they should be treated as complementary statistical methods for a priori or a posteriori investigations. Subsequent physical analysis is necessary to refine and interpret the results.
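A minimal instance of the multivariate mean-shift test alluded to above is Hotelling's T-squared statistic, which accounts for the correlation between coordinate components through the sample covariance. The sketch below is illustrative only; the paper's exact family of T-squared statistics may differ, and the synthetic 3-D coordinate series is invented.

```python
import numpy as np
from scipy import stats

def hotelling_t2(X, mu0):
    """T^2 test of H0: E[X] = mu0, with p-value via the F distribution."""
    n, k = X.shape
    xbar = X.mean(axis=0)
    S = np.cov(X, rowvar=False)        # sample covariance: captures the
                                       # correlation between components
    diff = xbar - mu0
    t2 = n * diff @ np.linalg.solve(S, diff)
    f = t2 * (n - k) / (k * (n - 1))   # ~ F(k, n - k) under H0
    return t2, stats.f.sf(f, k, n - k)

rng = np.random.default_rng(1)
# synthetic 3-D coordinate residuals (metres), drawn under H0
X = rng.multivariate_normal(np.zeros(3), 1e-4 * np.eye(3), size=100)
t2, p = hotelling_t2(X, np.zeros(3))           # no shift: large p-value
t2s, ps = hotelling_t2(X + 0.05, np.zeros(3))  # strong mean shift: tiny p
```

A single T-squared decision replaces three separate univariate t-tests, which is one reason the multivariate test tends to reject fewer samples at the same confidence level.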


Production of a nanofibrous polyacrylonitrile/calcium carbonate (PAN/CaCO3) nanocomposite web was carried out through a solution electrospinning process. Pore-generating nanoparticles were leached from the PAN matrices in a hydrochloric acid bath with the purpose of producing an ultimate nanoporous structure. The possible interaction between CaCO3 nanoparticles and PAN functional groups was investigated. Atomic absorption spectroscopy was used to measure the amount of extracted CaCO3 nanoparticles. Morphological observation showed nanofibers of 270-720 nm in diameter containing nanopores of 50-130 nm. Monitoring the governing parameters statistically, it was found that the amount of extraction (ε) of CaCO3 increased as the web surface area (a) broadened, according to a simple scaling law (ε = 3.18 a^0.4). The leaching process was maximized in the presence of 5% v/v of acid in the extraction bath and 5 wt% of CaCO3 in the polymer solution. The combined effects of extraction time and temperature showed exponential growth, with a favorable extremum at 50°C for 72 h. The concentration of dimethylformamide, used as the solvent, had no significant impact on the extraction level.
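The quoted scaling law, ε = 3.18 a^0.4, written as a one-line helper (units as reported in the abstract; the fit is treated as a black box):

```python
def extraction_amount(surface_area):
    """Empirical fit epsilon = 3.18 * a**0.4 from the abstract above
    (units as reported there)."""
    return 3.18 * surface_area ** 0.4
```

The sub-linear exponent (0.4) means doubling the web surface area raises the extracted amount by only about 32%.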


Laminar two-dimensional natural convection boundary-layer flow of non-Newtonian fluids along an isothermal horizontal circular cylinder has been studied using a modified power-law viscosity model. In this model, there are no unrealistic limits of zero or infinite viscosity, so the boundary-layer equations can be solved numerically by a marching-order implicit finite difference method with a double-sweep technique. Numerical results are presented for shear-thinning as well as shear-thickening fluids in terms of the fluid velocity and temperature distributions, and for the shear stresses and rate of heat transfer in terms of the local skin friction and local Nusselt number, respectively.
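A modified power-law viscosity of the kind described, bounded so that it has no zero or infinite limits at extreme shear rates, might look as follows; the cutoff shear rates and parameter values are illustrative assumptions, not the paper's.

```python
def viscosity(gamma_dot, K=1.0, n=0.6, g_low=1e-3, g_high=1e3):
    """Apparent viscosity mu = K * gamma_dot**(n - 1), with the shear
    rate clamped to [g_low, g_high] so mu stays finite and nonzero.
    n < 1: shear-thinning; n > 1: shear-thickening."""
    g = min(max(gamma_dot, g_low), g_high)
    return K * g ** (n - 1)
```

With n = 0.6 the apparent viscosity falls with increasing shear rate, but the clamped limits keep it bounded at both ends, which is what makes the numerical marching scheme well behaved.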


Several approaches have been introduced in the literature for active noise control (ANC) systems. Since the FxLMS algorithm appears to be the best choice as a controller filter, researchers tend to improve the performance of ANC systems by enhancing and modifying this algorithm. This paper proposes a new version of the FxLMS algorithm. In many ANC applications, an online secondary path modelling method using white noise as a training signal is required to ensure convergence of the system. This paper also proposes a new approach for online secondary path modelling in feedforward ANC systems. The proposed algorithm stops injection of the white noise at the optimum point and reactivates it during operation, if needed, to maintain the performance of the system. The new version of the FxLMS algorithm, together with the avoidance of continuous white-noise injection, makes the system more desirable and improves the noise attenuation performance. Comparative simulation results indicate the effectiveness of the proposed approach.


It has not yet been established whether the spatial variation of particle number concentration (PNC) within a microscale environment can affect exposure estimation results. In general, the degree of spatial variation within microscale environments remains unclear, since previous studies have focused only on spatial variation within macroscale environments. The aims of this study were to determine the spatial variation of PNC within microscale school environments, in order to assess the importance of the number of monitoring sites in exposure estimation, to identify which parameters have the largest influence on spatial variation, and to quantify the relationship between those parameters and spatial variation. Air quality measurements were conducted for two consecutive weeks at each of 25 schools across Brisbane, Australia. PNC was measured at three sites within the grounds of each school, along with meteorological and several other air quality parameters. Traffic density was recorded for the busiest road adjacent to each school. Spatial variation at each school was quantified using the coefficient of variation (CV). The portion of the CV associated with instrument uncertainty was found to be 0.3; the CV was therefore corrected so that only non-instrument uncertainty was analysed in the data. The median corrected CV (CVc) ranged from 0 to 0.35 across the schools, with 12 schools found to exhibit spatial variation. The study determined the number of monitoring sites required at schools with spatial variability and tested the deviation in exposure estimation arising from using only a single site. Nine schools required two measurement sites and three schools required three sites. Overall, the deviation in exposure estimation from using only one monitoring site was as much as one order of magnitude.

The study also tested the association of spatial variation with wind speed/direction and traffic density, using partial correlation coefficients to identify sources of variation and non-parametric function estimation to quantify the level of variability. Traffic density and road-to-school wind direction were found to have a positive effect on CVc, and therefore also on spatial variation. Wind speed was found to reduce spatial variation once it exceeded a threshold of 1.5 m/s, while it had no effect below this threshold. The effect of traffic density on spatial variation increased until it reached 70 vehicles per five minutes, at which point it plateaued and did not increase further with increasing traffic density.
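One plausible reading of the instrument-uncertainty correction described above is a subtraction in quadrature, sketched below; the paper's exact formula is not given in the abstract, and the example site values are invented.

```python
import math

def corrected_cv(pnc_by_site, cv_instrument=0.3):
    """Spatial CV across monitoring sites, with the instrument-uncertainty
    portion removed in quadrature; floored at zero when the raw CV falls
    below the instrument level."""
    n = len(pnc_by_site)
    mean = sum(pnc_by_site) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in pnc_by_site) / (n - 1))
    cv = sd / mean
    return math.sqrt(max(cv ** 2 - cv_instrument ** 2, 0.0))

# Three-site example (particle number concentrations, arbitrary units):
cv_c = corrected_cv([8.0e3, 1.1e4, 1.5e4])
```

Identical readings at all sites yield CVc = 0, i.e. no spatial variation beyond instrument uncertainty.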


Currently, recommender systems (RS) are widely applied in many commercial e-commerce sites to help users deal with the information overload problem. Recommender systems provide personalized recommendations to users and thus help them make good decisions about which product to buy from the vast number of product choices available to them. Many current recommender systems are developed for simple and frequently purchased products, such as books and videos, using collaborative-filtering and content-based approaches. These approaches are not suitable for recommending luxurious and infrequently purchased products, as they rely on a large amount of ratings data that is not usually available for such products. This research aims to explore novel approaches for recommending infrequently purchased products by exploiting user-generated content such as user reviews and product click-stream data. From reviews on products given by previous users, association rules between product attributes are extracted using an association rule mining technique. Furthermore, user profiles are generated from product click-stream data using the proposed user profiling approach. Two recommendation approaches are proposed based on the knowledge extracted from these resources. The first formulates a new query from the initial query given by the target user, expanding it with the suitable association rules. In the second, a collaborative-filtering recommender system and search-based approaches are integrated within a hybrid system. In this hybrid system, user profiles are used to find the target user's neighbours, and the products subsequently viewed by them are then used to search for other relevant products. Experiments were conducted on a real-world dataset collected from an online car sales company in Australia to evaluate the effectiveness of the proposed recommendation approaches.

The experimental results show that user profiles generated from user click-stream data and association rules generated from user reviews can improve recommendation accuracy. The results also show that the proposed query expansion and the hybrid collaborative-filtering and search-based approaches perform better than the baseline approaches. Integrating the collaborative-filtering and search-based approaches has been challenging, as this strategy has not been widely explored so far, especially for recommending infrequently purchased products. This research therefore makes a theoretical contribution to the recommender system field through a new technique for combining collaborative-filtering and search-based approaches, contributes a new query expansion technique for recommending infrequently purchased products, and makes a practical contribution through the development of a prototype system for recommending cars.
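The first (query-expansion) approach can be illustrated with a toy sketch: the consequents of every mined association rule whose antecedent the query contains are added to the query. All attributes, values and rules below are hypothetical, invented purely for illustration.

```python
# Mined rules: antecedent attribute-value pair -> set of consequent pairs.
rules = {
    ("make", "toyota"): {("transmission", "automatic")},
    ("body", "suv"): {("drive", "4wd")},
}

def expand_query(query):
    """Add the consequents of every rule whose antecedent the query
    contains; unmatched rules leave the query unchanged."""
    expanded = set(query)
    for antecedent, consequents in rules.items():
        if antecedent in expanded:
            expanded |= consequents
    return expanded

q = expand_query({("make", "toyota"), ("price_band", "low")})
```

Here the rule on ("make", "toyota") fires and adds the transmission attribute, while the SUV rule does not match and is ignored.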


This paper presents a unified view of the relationship between (1) quantity and (2) price generating mechanisms in estimating individual prime construction costs/prices. A brief review of quantity generating techniques is provided, with particular emphasis on experientially based assumptive approaches, and this is compared with the level of pricing data available for the quantities generated, in terms of the reliability of the ensuing prime cost estimates. It is argued that there is a tradeoff between the reliability of quantity items and the reliability of rates. Thus it is shown that the level of quantity generation is optimised by maximising the joint reliability function of the quantity items and their associated rates. Some thoughts on how this joint reliability function can be evaluated and quantified follow. The application of these ideas is described within the overall strategy of the estimator's decision - "Which estimating technique shall I use for a given level of contract information?" - and a case is made for the computer generation of estimates by several methods, with an indication of the reliability of each estimate, the ultimate choice of estimate being left to the estimator concerned. Finally, the potential for the development of automatic estimating systems within this framework is examined.
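The tradeoff argued above - quantity reliability rising with the level of detail while rate reliability falls - can be sketched as maximising a joint reliability function over candidate levels. The functional forms below are invented purely for illustration; the paper does not specify them.

```python
def joint_reliability(level,
                      r_q=lambda x: x / (x + 1.0),           # quantities: rises
                      r_r=lambda x: 1.0 / (1.0 + 0.3 * x)):  # rates: falls
    """Joint reliability of an estimate at a given level of quantity
    generation (toy monotone functional forms)."""
    return r_q(level) * r_r(level)

levels = [0.5 * k for k in range(1, 21)]   # candidate levels of detail
best = max(levels, key=joint_reliability)  # an interior optimum, not an endpoint
```

Because one factor rises and the other falls, the product peaks at an intermediate level of detail rather than at either extreme, which is the paper's central point.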


Two approaches are described which aid the selection of the most appropriate procurement arrangements for a building project. The first is a multi-attribute technique based on the National Economic Development Office procurement path decision chart. A small study is described in which the utility factors involved were weighted by averaging the scores of five 'experts' for three hypothetical building projects. A concordance analysis is used to provide some evidence of any abnormal data sources; when applied to the study data, one of the experts was seen to be atypical. The second approach is by means of discriminant analysis, which was found to provide reasonably consistent predictions through three discriminant functions. The analysis also showed the quality criteria to have no significant impact on the decision process. Both approaches provided identical and intuitively correct answers in the study described. Some concluding remarks are made on the potential of discriminant analysis for future research and development in procurement selection techniques.
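The multi-attribute technique can be illustrated by scoring each procurement path as a weighted sum of utility factors, with the weights averaged across experts. All factor names, weights and utility scores below are hypothetical, not values from the study.

```python
# Averaged "expert" weights over the utility factors (sum to 1).
weights = {"speed": 0.25, "certainty_of_cost": 0.30,
           "complexity": 0.20, "risk_allocation": 0.25}

# Utility of each procurement path against each factor (0-100 scale).
paths = {
    "traditional":      {"speed": 40, "certainty_of_cost": 80,
                         "complexity": 60, "risk_allocation": 70},
    "design_and_build": {"speed": 80, "certainty_of_cost": 70,
                         "complexity": 40, "risk_allocation": 80},
    "management":       {"speed": 90, "certainty_of_cost": 40,
                         "complexity": 85, "risk_allocation": 45},
}

def score(path):
    """Weighted-sum multi-attribute utility of one procurement path."""
    return sum(weights[f] * u for f, u in paths[path].items())

best = max(paths, key=score)
```

The recommended path is simply the one with the highest weighted utility; concordance analysis would then check whether any single expert's weights pull the ranking away from the consensus.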


X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions: the isotropy index and the elongation index. For such high-resolution data sets, the typical sizes of the CT images are on the order of gigabytes to tens of gigabytes, so an extremely large number of calculations is required. To resolve this large-memory problem, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access architecture machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element of this first pioneering study.
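A minimal Hoshen-Kopelman-style cluster labelling pass, shown here on a 2-D binary pore map for brevity (the study labels 3-D micro-CT volumes), uses union-find to merge labels whenever a pore cell touches previously labelled neighbours:

```python
def label_clusters(grid):
    """Two-pass Hoshen-Kopelman-style labelling of connected pore
    clusters in a binary grid, using union-find for label merging."""
    rows, cols = len(grid), len(grid[0])
    parent = {}
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]   # path compression
            a = parent[a]
        return a
    labels = [[0] * cols for _ in range(rows)]
    next_label = 1
    for i in range(rows):
        for j in range(cols):
            if not grid[i][j]:
                continue
            up = labels[i - 1][j] if i > 0 else 0
            left = labels[i][j - 1] if j > 0 else 0
            if not up and not left:
                parent[next_label] = next_label   # start a new cluster
                labels[i][j] = next_label
                next_label += 1
            else:
                cands = [l for l in (up, left) if l]
                root = min(find(l) for l in cands)
                for l in cands:                   # merge touching clusters
                    parent[find(l)] = root
                labels[i][j] = root
    for i in range(rows):                         # second pass: flatten
        for j in range(cols):
            if labels[i][j]:
                labels[i][j] = find(labels[i][j])
    return labels

pores = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1],
         [1, 0, 0, 1]]
labs = label_clusters(pores)
n_clusters = len({l for row in labs for l in row if l})
```

A percolation analysis then asks whether any single label spans opposite faces of the volume; the single-pass raster order plus union-find is what keeps the memory footprint manageable on gigabyte-scale images.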


Background: Studies on the relationship between performance and the design of the throwing frame have been limited and therefore require further investigation. Objectives: The specific objectives were to provide benchmark information about the performance and whole-body positioning of male athletes in the F30s classes. Study Design: Descriptive analysis. Methods: A total of 48 attempts performed by 12 stationary discus throwers in the F33 and F34 classes during the seated discus throwing event of the 2002 International Paralympic Committee Athletics World Championships were analysed in this study. Whole-body positioning included the overall throwing posture (i.e. number of points of contact between the thrower and the frame, body position, throwing orientation and throwing side) and lower limb placements (i.e. seating arrangements, points of contact on both feet, and type of attachment of the legs and feet). Results: Three (25%), five (42%), one (8%) and three (25%) athletes used three, four, five and six points of contact, respectively. Seven (58%) athletes threw from a standing position and five (42%) from a seated position. A straddle, a stool or a chair was used by six (50%), four (33%) and two (17%) throwers, respectively. Conclusions: This study provides key information for a better understanding of the interaction between the throwing technique of elite seated throwers and their throwing frame.