951 results for error-location number


Relevance: 20.00%

Abstract:

Road collisions negatively affect the lives of hundreds of Canadians per year. Unfortunately, safety has typically been neglected in road management systems: a great deal of effort has been devoted to developing and implementing systems capable of achieving and sustaining good levels of asset condition, and it is only relatively recently that road safety has become an important objective. Managing a network of roads is not an easy task; it requires long-, medium- and short-term plans to maintain, rehabilitate and upgrade aging assets and to reduce and mitigate accident exposure, likelihood and severity. This thesis presents a basis for incorporating road safety into road management systems. Two case studies were developed: one limited by available data and another drawing on sufficient information. A long-term analysis was used to allocate improvements for the condition and safety of roads and bridges at the network level. It was confirmed that a safety index could be used to obtain a first-cut model, while the potential for improvement, defined as the difference between the observed and predicted number of accidents, was capable of capturing the degree of safety of individual segments. It was found that the completeness of the system resulted in savings because of the economies obtained from trade-off optimization. It was observed that safety improvements were allocated at the beginning of the analysis in order to reduce the extent of safety issues, which translated into a systematic reduction of the potential for improvement down to near-constant levels, hypothesized to correspond to unavoidable collisions from human error or vehicle failure.
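
Where data permit, the potential-for-improvement measure described above reduces to a simple calculation: observed minus predicted accidents. The sketch below ranks road segments by it, assuming a generic power-form safety performance function for the predicted accident count; the function form, coefficients and data are illustrative, not those of the thesis.

```python
# Sketch: "potential for improvement" (PFI) for road segments, assuming a
# simple safety performance function; the form and coefficients below are
# illustrative placeholders, not the thesis's calibrated model.

def predicted_accidents(aadt: float, length_km: float,
                        alpha: float = 0.0008, beta: float = 0.65) -> float:
    """Illustrative safety performance function: predicted annual accidents
    as a power function of traffic volume (AADT) and segment length."""
    return alpha * (aadt ** beta) * length_km

def potential_for_improvement(observed: float, aadt: float,
                              length_km: float) -> float:
    """PFI = observed - predicted accidents; positive values flag segments
    performing worse than expected for their traffic and length."""
    return observed - predicted_accidents(aadt, length_km)

segments = [  # (observed accidents/yr, AADT, length in km) -- made-up data
    (12, 15000, 2.0),
    (3, 18000, 1.5),
    (9, 9000, 3.0),
]
# Rank segments by PFI so safety funds go to the worst performers first.
ranked = sorted(segments, key=lambda s: -potential_for_improvement(*s))
for obs, aadt, length in ranked:
    pfi = potential_for_improvement(obs, aadt, length)
    print(f"AADT={aadt:>6}  observed={obs:>3}  PFI={pfi:+.2f}")
```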

Relevance: 20.00%

Abstract:

It is well understood that there is variation inherent in all testing techniques, and that all soil and rock materials also contain some degree of natural variability. Less consideration is normally given to the variation associated with natural material heterogeneity within a site, or to the relative condition of the material at the time of testing. This paper assesses the impact of spatial and temporal variability on repeated in-situ testing of a residual soil and rock profile within a single residential site over a full calendar year, and thus over a range of seasonal conditions. This repeated testing demonstrated that, depending on the selected location and the moisture content of the subsurface at the time of testing, up to a 35% variation in the test results can be expected. The results also demonstrated that the in-situ test technique employed has a similarly large measurement and inherent variability error: for the investigated site, up to a 60% variation in normalised results was observed. From these results, it is recommended that the frequency and timing of in-situ tests be considered when deriving geotechnical design parameters from a limited data set.

Relevance: 20.00%

Abstract:

Bounds on the expectation and variance of errors at the output of a multilayer feedforward neural network with perturbed weights and inputs are derived. It is assumed that errors in weights and inputs to the network are statistically independent and small. The bounds obtained are applicable to both digital and analogue network implementations and are shown to be of practical value.
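
As a complement to such analytical bounds, the setting can be checked numerically. The sketch below estimates, by Monte Carlo, the mean and variance of the output error of a small feedforward network under small, statistically independent perturbations of weights and inputs; the network size and perturbation scale are assumptions, and the paper's closed-form bounds are not reproduced here.

```python
import numpy as np

# Sketch: Monte Carlo estimate of output-error statistics for a small
# feedforward network with independently perturbed weights and inputs.
# This illustrates the setting of the paper; the analytical bounds
# themselves are not reproduced.

rng = np.random.default_rng(0)

def mlp(x, W1, W2):
    """Two-layer tanh network: a stand-in multilayer feedforward model."""
    return np.tanh(W2 @ np.tanh(W1 @ x))

n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(size=(n_hid, n_in))
W2 = rng.normal(size=(n_out, n_hid))
x = rng.normal(size=n_in)
y_nominal = mlp(x, W1, W2)

sigma = 1e-3          # assumed small, independent perturbation scale
trials = 10_000
errors = np.empty((trials, n_out))
for t in range(trials):
    dW1 = sigma * rng.normal(size=W1.shape)   # weight perturbations
    dW2 = sigma * rng.normal(size=W2.shape)
    dx = sigma * rng.normal(size=x.shape)     # input perturbation
    errors[t] = mlp(x + dx, W1 + dW1, W2 + dW2) - y_nominal

print("empirical error mean:    ", errors.mean(axis=0))
print("empirical error variance:", errors.var(axis=0))
```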

Relevance: 20.00%

Abstract:

Protein adsorption at solid-liquid interfaces is critical to many applications, including biomaterials, protein microarrays and lab-on-a-chip devices. Despite this general interest, and a large amount of research over the last half-century, protein adsorption cannot be predicted with engineering-level, design-oriented accuracy. Here we describe a Biomolecular Adsorption Database (BAD), freely available online, which archives published protein adsorption data. Piecewise linear regression with a breakpoint applied to the data in the BAD suggests that the input variables to protein adsorption, i.e., protein concentration in solution; protein descriptors derived from primary structure (number of residues, global protein hydrophobicity, range of amino acid hydrophobicity, and isoelectric point); surface descriptors (contact angle); and fluid environment descriptors (pH, ionic strength), correlate well with the output variable, the protein concentration on the surface. Furthermore, neural network analysis revealed that the size of the BAD makes it sufficiently representative, with a neural network-based predictive error of 5% or less. Interestingly, a consistently better fit is obtained if the BAD is divided into two separate sub-sets representing protein adsorption on hydrophilic and hydrophobic surfaces, respectively. Based on these findings, selected entries from the BAD have been used to construct neural network-based estimation routines, which predict the amount of adsorbed protein, the thickness of the adsorbed layer and the surface tension of the protein-covered surface. While the BAD is of general interest, the prediction of the thickness and surface tension of the protein-covered layers is of particular relevance to the design of microfluidic devices.
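
A minimal sketch of the kind of neural-network estimation routine described, using the input descriptors listed in the abstract. The training data below is synthetic and the one-hidden-layer architecture is an assumption, so this illustrates the approach rather than reproducing the BAD routines.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Sketch: neural-network regression from the descriptors named in the
# abstract to adsorbed amount. Data is synthetic; in practice the rows
# would come from BAD entries.

rng = np.random.default_rng(42)
n = 500
# columns: solution conc., n_residues, global hydrophobicity, hydrophobicity
#          range, isoelectric point, contact angle, pH, ionic strength
X = rng.uniform(size=(n, 8))
y = X @ rng.uniform(size=8) + 0.05 * rng.normal(size=n)  # stand-in target

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,),
                                   max_iter=5000, random_state=0))
model.fit(X[:400], y[:400])

pred = model.predict(X[400:])
rel_err = np.abs(pred - y[400:]) / np.abs(y[400:])
print(f"mean relative error: {rel_err.mean():.1%}")  # paper reports <= 5%
```

Splitting the rows into hydrophilic and hydrophobic sub-sets before fitting, as the abstract reports, would simply mean training two such models on the two partitions.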

Relevance: 20.00%

Abstract:

The DVD, Jump into Number, was a joint project between Independent Schools Queensland, Queensland University of Technology and Catholic Education (Diocese of Cairns) aimed at improving mathematical practice in the early years. Independent Schools Queensland Executive Director Dr John Roulston said the invaluable teaching resource features a series of unscripted lessons which demonstrate the possibilities of learning among young Indigenous students. “There is currently a lack of numeracy teaching resources for younger students, especially from pre-Prep to Year 3, which is such an important stage of a child’s early education. Jump into Number is a benchmark for all teachers to learn more about the mathematical development of younger students,” Dr Roulston said.

Relevance: 20.00%

Abstract:

Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on such data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) to PNSD data measured at 25 sites across Brisbane, Australia, in order to identify the optimum technique. A new method, based on a Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performances of the four clustering techniques were compared using the Dunn index and silhouette width validation values; the K-means technique was found to have the highest performance, with five clusters being the optimum, and five clusters were accordingly identified within the data. The diurnal occurrence of each cluster was used, together with other air quality parameters, temporal trends and the physical properties of each cluster, to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. Together, these two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
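
A minimal sketch of the validation step described: fitting K-means for a range of cluster counts and scoring each with the silhouette width. The size spectra below are synthetic log-normal modes rather than the Brisbane measurements, and the GAM parameterisation step is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Sketch: choosing the number of K-means clusters for size-distribution
# spectra via silhouette width, as in the study's validation step.
# Spectra here are synthetic log-normal modes, not the measured PNSD data.

rng = np.random.default_rng(1)
bins = np.logspace(0, 3, 50)              # nm; illustrative size bins

def spectrum(mode_nm, width, n_total):
    """Log-normal number size distribution evaluated on the size bins."""
    return n_total * np.exp(-0.5 * ((np.log(bins) - np.log(mode_nm)) / width) ** 2)

# three synthetic source types: nucleation, traffic, background
modes = [(8, 0.4), (40, 0.5), (120, 0.6)]
X = np.vstack([spectrum(m, w, rng.uniform(0.8, 1.2))
               + 0.02 * rng.normal(size=bins.size)
               for m, w in modes for _ in range(100)])

# Score each candidate cluster count; the best silhouette width wins.
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    print(f"k={k}: silhouette width = {silhouette_score(X, labels):.3f}")
```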

Relevance: 20.00%

Abstract:

We show that the cluster ion concentration (CIC) in the atmosphere is significantly suppressed during events that involve rapid increases in particle number concentration (PNC). Using a neutral cluster and air ion spectrometer, we investigated changes in CIC during three types of particle enhancement processes: new particle formation, a bushfire episode and an intense pyrotechnic display. In all three cases, the total CIC decreased with increasing PNC, with the rate of decrease being greater for negative CIC than for positive. We attribute this to the greater mobility, and hence the higher attachment coefficient, of negative ions over positive ions in air. During the pyrotechnic display, the rapid increase in PNC was sufficient to reduce the CIC of both polarities to zero; at the height of the display, the negative CIC stayed at zero for a full 10 min. Although the PNCs were not significantly different, the CIC during new particle formation did not decrease as much as during the bushfire episode and the pyrotechnic display. We therefore suggest that the rate of increase of PNC, together with particle size, also plays an important role in suppressing CIC in the atmosphere.

Relevance: 20.00%

Abstract:

Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control to achieve prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely. © 2008 Society for Conservation Biology.
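
A minimal sketch of the modelling setup: a discrete-time stochastic Lotka-Volterra system under two of the five strategies (fixed-rate removal and upper-trigger harvest), scored by the expected minimum prey population. All rates, thresholds and noise levels are illustrative placeholders, not the paper's parameters.

```python
import numpy as np

# Sketch: stochastic Lotka-Volterra predator-prey dynamics under two of the
# five control strategies compared in the paper. Rates, thresholds and
# noise scales below are illustrative, not calibrated values.

rng = np.random.default_rng(7)

def simulate(strategy, years=50, reps=200):
    """Mean of the minimum prey population across replicate runs."""
    min_prey = []
    for _ in range(reps):
        prey, pred = 100.0, 20.0
        low = prey
        for _ in range(years):
            # Lotka-Volterra step with additive environmental noise
            prey += prey * (0.3 - 0.01 * pred) + rng.normal(0, 2)
            pred += pred * (0.002 * prey - 0.1) + rng.normal(0, 1)
            if strategy == "fixed_rate":
                pred *= 0.8                      # remove 20% of predators
            elif strategy == "upper_trigger" and pred > 25:
                pred = 25.0                      # cull back to the threshold
            prey, pred = max(prey, 0.0), max(pred, 0.0)
            low = min(low, prey)
        min_prey.append(low)
    return np.mean(min_prey)

for s in ("fixed_rate", "upper_trigger"):
    print(f"{s}: expected minimum prey population = {simulate(s):.1f}")
```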

Relevance: 20.00%

Abstract:

Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
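
The joint-probability step in the Methods can be sketched directly: assuming independence among patches, the species-level risk is the product of the per-population extinction probabilities, evaluated only once fewer than five patches remain. The per-patch probabilities below are illustrative.

```python
import numpy as np

# Sketch of the joint-extinction step described in the Methods: when fewer
# than five suitable patches remain, species-level risk is taken as the
# probability that every remaining population goes extinct, assuming
# independence among patches. Probabilities here are illustrative.

def joint_extinction_risk(patch_extinction_probs, threshold=5):
    """Joint probability that all populations go extinct, evaluated only
    when the number of remaining patches falls below the threshold."""
    if len(patch_extinction_probs) >= threshold:
        return 0.0   # risk not evaluated while enough patches remain
    return float(np.prod(patch_extinction_probs))

print(joint_extinction_risk([0.6, 0.7, 0.5]))   # 3 patches -> 0.21
print(joint_extinction_risk([0.6] * 6))         # 6 patches -> 0.0
```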

Relevance: 20.00%

Abstract:

Integration of biometrics is considered an attractive solution to the issues associated with password-based human authentication, as well as to the secure storage and release of cryptographic keys, one of the critical issues in modern cryptography. However, the widespread adoption of bio-cryptographic solutions is somewhat restricted by the fuzziness associated with biometric measurements. Error control mechanisms must therefore be adopted to ensure that the fuzziness of biometric inputs can be sufficiently countered. In this paper, we outline the existing error control techniques used in bio-cryptography and explain how they are deployed in different types of solutions. Finally, we elaborate on the important factors to be considered when choosing appropriate error correction mechanisms for a particular biometric-based solution.
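
One widely cited construction of the kind such a survey covers is the fuzzy commitment scheme (Juels & Wattenberg, 1999), sketched below with a 3x repetition code standing in for the stronger codes (e.g. BCH) used in practice. This is an illustrative example, not necessarily a scheme treated in the paper.

```python
import numpy as np

# Sketch: fuzzy commitment, one common way to counter biometric fuzziness
# with error correction. A 3x repetition code stands in for stronger codes
# (e.g. BCH); the biometric template here is a random bit string.

rng = np.random.default_rng(3)

def rep3_encode(key_bits):
    return np.repeat(key_bits, 3)                 # each key bit sent 3 times

def rep3_decode(codeword):
    # majority vote over each group of 3 bits
    return (codeword.reshape(-1, 3).sum(axis=1) >= 2).astype(np.uint8)

key = rng.integers(0, 2, 16, dtype=np.uint8)      # cryptographic key bits
enroll = rng.integers(0, 2, 48, dtype=np.uint8)   # biometric at enrolment
commitment = rep3_encode(key) ^ enroll            # stored helper data

# Verification: a fresh reading differs from enrolment in a few bits.
noise = (rng.random(48) < 0.05).astype(np.uint8)  # ~5% bit errors
fresh = enroll ^ noise
recovered = rep3_decode(commitment ^ fresh)       # decoding absorbs the noise
print("key recovered:", np.array_equal(recovered, key))
```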

Relevance: 20.00%

Abstract:

Demand for automated gas metal arc welding (GMAW) is growing, and with it the need for intelligent systems that can ensure the accuracy of the procedure. To date, weld pool geometry has been the factor most commonly used for quality assessment in intelligent welding systems, but it has recently been found that the Mahalanobis Distance (MD) can not only serve this purpose but is also more efficient. In the present paper, an Artificial Neural Network (ANN) is used to predict the MD parameter, and the advantages and disadvantages of other methods are discussed. The Levenberg-Marquardt algorithm was found to be the most effective training algorithm for the GMAW process. The number of neurons plays an important role in optimal network design; using a trial-and-error method, 30 was found to be the optimal number. The model was investigated with different numbers of layers in a Multilayer Perceptron (MLP) architecture, and the optimal result for the aim of this work was obtained with a single hidden layer. Robustness of the system was evaluated by adding noise to the input data and studying its effect on the prediction capability of the network. The experiments for this study were conducted in an automated GMAW setup, integrated with a data acquisition system and prepared in a laboratory, for the welding of steel plate 12 mm in thickness. The accuracy of the network was evaluated by the Root Mean Squared (RMS) error between the measured and estimated values; the low error value (about 0.008) reflects the good accuracy of the model. The comparison of the results predicted by the ANN with the test data set also showed very good agreement, revealing the predictive power of the model. The ANN model offered here for the GMAW process can therefore be used effectively for prediction purposes.
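
A minimal sketch of the described setup: a one-hidden-layer MLP with 30 neurons predicting the MD index, evaluated by RMS error, with noise injected into the inputs to probe robustness. The data is synthetic (the paper uses GMAW sensor data), and scikit-learn's default solver stands in for Levenberg-Marquardt, which scikit-learn does not provide.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Sketch: one-hidden-layer MLP with 30 neurons predicting a Mahalanobis
# Distance quality index, scored by RMS error, with input noise injected
# to probe robustness. Training data below is synthetic.

rng = np.random.default_rng(5)
X = rng.uniform(size=(400, 4))          # stand-ins for welding parameters
y = X @ np.array([0.5, -0.3, 0.8, 0.2]) + 0.01 * rng.normal(size=400)

net = MLPRegressor(hidden_layer_sizes=(30,), max_iter=5000, random_state=0)
net.fit(X[:300], y[:300])

def rms(a, b):
    """Root mean squared error between predictions and targets."""
    return float(np.sqrt(np.mean((a - b) ** 2)))

clean = rms(net.predict(X[300:]), y[300:])
noisy = rms(net.predict(X[300:] + 0.02 * rng.normal(size=(100, 4))), y[300:])
print(f"RMS error clean: {clean:.4f}, with input noise: {noisy:.4f}")
```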

Relevance: 20.00%

Abstract:

Australia has a long history of policy attention to the education of poor and working-class youth (Connell, 1994), yet currently, on standardized measures of educational outcomes, the gaps are widening in ways that relate to social background, including race, location and class. An economic analysis of school choice in Australia reveals that a high proportion of government school students now come from lower Socio-Economic Status (SES) backgrounds (Ryan & Watson, 2004), indicating a trend towards a gradual residualisation of the poor in government schools, with increased private school enrolments a confirmed national trend. The spatial distribution of poverty and its effects on school populations are not unique to Australia (Lupton, 2003; Lipman, 2011; Ryan, 2010). Raffo and colleagues (2010) recently provided a synthesis of socially critical approaches to schooling and poverty, arguing that what is needed are shifts in the balance of power that give those within the educational system some say in the ways schooling is organized. ‘Disadvantaged’ primary schools are not a marginal concern for education systems; they now account for a large and growing number of schools serving an ever-increasing population that has been made redundant, is in precarious part-time work, or is under-employed or unemployed (Thomson, 2002; Smyth, Down et al., 2010). In Australia, the notion of the ‘disadvantaged’ school now refers to those, mostly public schools, being residualised by a politics of parental choice driven by a neoliberalising policy logic (Bonner & Caro, 2007; Hattam & Comber, forthcoming 2014; Thomson & Reid, 2003)...