461 results for statistical techniques


Relevance:

20.00%

Publisher:

Abstract:

Designing systems for multiple stakeholders requires frequent collaboration with those stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from student design teams as well as two case studies, of a prototype for supporting creative communication about design objects and of stakeholder involvement in early design, respectively. In all observations and case studies we found that non-formal techniques supported strong collaboration, resulting in a deep understanding of early design ideas, of their value, and of the feasibility of solutions.

Relevance:

20.00%

Publisher:

Abstract:

Rolling Element Bearings (REBs) are vital components that provide rotating motion in rotating machinery. In slow-speed rotating machines, bearings are normally subjected to heavy static loads, and a catastrophic failure can cause enormous disruption to production and endanger human safety. Owing to the low operating speed, the impact energy generated by the rolling elements on the defective components is not sufficient to produce a detectable vibration response. This is further aggravated by the inability of general measuring instruments to accurately detect and process the weak signals at the initiation of a defect. Furthermore, the weak signals are often corrupted by background noise. This is a serious problem faced by maintenance engineers today, and the inability to detect an incipient failure of the machine can significantly increase the risk of functional failure and costly downtime. This paper presents the application of noise removal techniques for enhancing detection capability in slow-speed REB condition monitoring. Blind deconvolution (BD) and the adaptive line enhancer (ALE) are compared to evaluate their performance in enhancing the source signal with consequential removal of background noise. In the experimental study, incipient defects were seeded on a number of roller bearings and the signals were acquired using an acoustic emission (AE) sensor. Kurtosis and the modified peak ratio (mPR) were used to determine the detectability of signals corrupted by noise.
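
As a flavour of the signal enhancement step, here is a minimal NumPy sketch of an LMS-based adaptive line enhancer together with a kurtosis indicator; the tap count, delay, and step size are illustrative choices, not the paper's settings:

```python
import numpy as np

def adaptive_line_enhancer(x, n_taps=32, delay=5, mu=0.005):
    """LMS-based adaptive line enhancer (ALE).

    Predicts the correlated (e.g. defect-periodic) part of x from a
    delayed copy of itself; broadband noise is unpredictable across the
    decorrelation delay and is rejected. Returns (enhanced, residual).
    """
    w = np.zeros(n_taps)                 # adaptive filter weights
    y = np.zeros(len(x))                 # enhanced (predicted) signal
    e = np.zeros(len(x))                 # prediction error (residual)
    for n in range(n_taps + delay, len(x)):
        # reference vector: n_taps samples ending `delay` steps in the past
        u = x[n - delay - n_taps + 1 : n - delay + 1][::-1]
        y[n] = w @ u
        e[n] = x[n] - y[n]
        w += 2.0 * mu * e[n] * u         # LMS weight update
    return y, e

def kurtosis(sig):
    """Sample kurtosis; impulsive defect signatures raise it above ~3."""
    sig = np.asarray(sig, dtype=float) - np.mean(sig)
    return np.mean(sig**4) / np.mean(sig**2) ** 2
```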

Relevance:

20.00%

Publisher:

Abstract:

This qualitative study explores the methods that chefs use to create innovative, marketable products and compares these findings to other design tools. The study is based on a series of interviews with locally recognised chefs in Minnesota, and on observations of them in their kitchens, in order to understand in detail how they conceive and develop dishes from preliminary concept to final plating and consumption. This paper focuses on idea generation and discusses two key findings: first, the variety of idea generation techniques described by the chefs can be classified under the creativity tool SCAMPER (substitute, combine, adapt, modify/magnify, put to other use, eliminate, reverse/rearrange); second, when innovating new dishes chefs evoke the theory of MAYA, or Most Advanced Yet Acceptable, which implies making novel changes while remaining relatable to the consumer. Other recurring topics in the interviews' discussion of food innovation include play, surprise, and humour.

Relevance:

20.00%

Publisher:

Abstract:

Designing systems for multiple stakeholders requires frequent collaboration with those stakeholders from the start. In many cases at least some stakeholders lack a professional habit of formal modeling. We report observations from two case studies of stakeholder involvement in early design where non-formal techniques supported strong collaboration, resulting in a deep understanding of requirements and of the feasibility of solutions.

Relevance:

20.00%

Publisher:

Abstract:

Bauxite refinery residues are derived from the Bayer process, in which crushed bauxite is digested in concentrated caustic at elevated temperatures. Chemically, the residue comprises, in varying amounts (depending upon the composition of the starting bauxite), oxides of iron and titanium, residual alumina, sodalite, silica, and minor quantities of other metal oxides. In recent years bauxite residues have been neutralised with seawater to reduce their alkalinity, through the precipitation of hydrotalcite-like compounds and other Mg, Ca, and Al hydroxide and carbonate minerals. A combination of X-ray diffraction (XRD) and vibrational spectroscopy techniques, including mid-infrared (IR), Raman, near-infrared (NIR), and UV-Visible, has been used to characterise bauxite residue and seawater-neutralised bauxite residue. Both the ferrous (Fe2+) and ferric (Fe3+) ions within bauxite residue can be identified by their characteristic bands: ferrous ions produce a strong absorption band at around 9000 cm-1, while ferric ions produce two strong bands at 25000 and 14300 cm-1. The presence of adsorbed carbonate and hydroxide anions can be identified at around 5200 and 7000 cm-1, respectively, attributed to overtones of the fundamental vibrations observed in the mid-IR spectra. The complex bands in the Raman and mid-IR spectra around 3500 cm-1 are assigned to the OH stretching vibrations of the various oxides present in bauxite residue, and of water. The combination of carbonate and hydroxyl units and their overtones gives rise to many of the features of the NIR spectra.
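
For illustration only, a tiny Python helper that maps detected peak positions to the tentative assignments quoted above; the window ranges are assumptions chosen simply to bracket the quoted band centres, not reported tolerances:

```python
# Tentative band assignments from the abstract (positions in cm^-1);
# the windows around each centre are illustrative assumptions.
BAND_ASSIGNMENTS = {
    (8500, 9500): "Fe2+ (ferrous) absorption",
    (13800, 14800): "Fe3+ (ferric) absorption",
    (24500, 25500): "Fe3+ (ferric) absorption",
    (5000, 5400): "adsorbed carbonate overtone",
    (6800, 7200): "hydroxide overtone",
}

def assign_bands(peaks_cm1):
    """Return a tentative assignment for each detected peak position."""
    return {
        p: next((label for (lo, hi), label in BAND_ASSIGNMENTS.items()
                 if lo <= p <= hi), "unassigned")
        for p in peaks_cm1
    }

print(assign_bands([9020, 14350, 5190, 7010]))
```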

Relevance:

20.00%

Publisher:

Abstract:

Introduction: Natural product provenance is important in the food, beverage, and pharmaceutical industries, both for consumer confidence and for its health implications. Raman spectroscopy has powerful molecular fingerprinting abilities, and the sharp peaks of Surface Enhanced Raman Spectroscopy (SERS) allow distinction between minimally different molecules, so it should be suitable for this purpose.

Methods: Naturally caffeinated beverages containing Guarana extract, coffee, and, for comparison, Red Bull energy drink as a synthetically caffeinated beverage (20 µL each) were reacted 1:1 for 10 minutes with gold nanoparticles functionalised with an anti-caffeine antibody (ab15221), air dried, and analysed in a micro-Raman instrument. The spectral data were processed using Principal Component Analysis (PCA).

Results: The PCA showed that Guarana-sourced caffeine differed significantly from synthetic caffeine (Red Bull) on component 1 (containing 76.4% of the variance in the data); see Figure 1. The coffee-containing beverages, and in particular Robert Timms (instant coffee), were very similar on component 1, although the barista espresso showed minor variance on this component. Both coffee-sourced caffeine samples differed from Red Bull on component 2 (20% of the variance).

[Figure 1: PCA comparing a naturally caffeinated beverage containing Guarana with coffee.]

Discussion: PCA is an unsupervised multivariate statistical method that determines patterns within data. Figure 1 shows that caffeine in Guarana is notably different from synthetic caffeine. Other researchers have revealed that caffeine in Guarana plants is complexed with tannins. In Figure 1, naturally sourced or lightly processed caffeine (Monster Energy, espresso) is more inherently different from synthetic (Red Bull) or highly processed (Robert Timms) caffeine, which is consistent with this finding and demonstrates the technique's applicability. Guarana provenance is important because Guarana is still largely hand produced and demand is escalating with recognition of its benefits. This could be a powerful technique for Guarana provenance, and it may extend to other industries where provenance and authentication are required, e.g. the wine or natural pharmaceutical industries.
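
A minimal sketch of the PCA step, using plain NumPy SVD on mean-centred spectra; the array shapes and random placeholder data are illustrative, not the study's measurements:

```python
import numpy as np

def pca(spectra, n_components=2):
    """PCA via SVD of mean-centred data.

    spectra: (n_samples, n_wavenumbers) array of SERS intensities.
    Returns (scores, loadings, explained_variance_ratio).
    """
    X = spectra - spectra.mean(axis=0)             # centre each variable
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    evr = (s**2 / np.sum(s**2))[:n_components]     # share of total variance
    return scores, Vt[:n_components], evr

# Placeholder spectra: in the study, rows would be beverage samples.
rng = np.random.default_rng(0)
scores, loadings, evr = pca(rng.normal(size=(6, 500)))
print("explained variance ratio:", np.round(evr, 3))
```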

Relevance:

20.00%

Publisher:

Abstract:

The cotton strip assay (CSA) is an established technique for measuring soil microbial activity. The technique involves burying cotton strips and measuring their tensile strength after a certain time. This gives a measure of the rotting rate, R, of the cotton strips, and R is in turn a measure of soil microbial activity. This paper examines properties of the technique and indicates how the assay can be optimised. Humidity conditioning of the cotton strips before measuring their tensile strength reduced the within- and between-day variance and enabled the distribution of the tensile strength measurements to approximate normality. The test data came from a three-way factorial experiment (two soils, two temperatures, three moisture levels). The cotton strips were buried in the soil for intervals of time ranging up to 6 weeks. This enabled the rate of loss of cotton tensile strength with time to be studied under a range of conditions. An inverse cubic model accounted for greater than 90% of the total variation within each treatment combination, which supports summarising the decomposition process by the single parameter R. The approximate variance of the decomposition rate was estimated from a function incorporating the variance of tensile strength and the derivative of R with respect to tensile strength. This variance function has a minimum when the measured strength is approximately 2/3 of the original strength. The estimates of R are almost unbiased and relatively robust against the cotton strips being left in the soil for more or less than the optimal time. We conclude that the rotting rate R should be measured using the inverse cubic equation, and that the cotton strips should be left in the soil until their strength has been reduced to about 2/3 of its original value.
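
The variance approximation described here is the standard delta method; a minimal sketch in that notation, with measured strength s, original strength s0, and rotting rate R = R(s):

```latex
\operatorname{Var}\!\left(\hat{R}\right) \;\approx\;
\left(\frac{\mathrm{d}R}{\mathrm{d}s}\right)^{\!2} \operatorname{Var}(s)
```

Minimising this expression over burial time is what yields the recommendation to retrieve the strips once s has fallen to roughly two thirds of s0.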

Relevance:

20.00%

Publisher:

Abstract:

The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach owing to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these systems from being used extensively. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs to demanding SHM applications such as modal analysis and damage identification. Based on a brief review, this paper first shows that Data Synchronization Error (DSE) is the most inherent factor amongst the uncertainties of SHM-oriented WSNs. The effects of this factor are then investigated on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques when merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), both of which have been widely applied in the past decade. Accelerations collected by a wired sensor system on a large-scale laboratory bridge model are used as benchmark data after a certain level of noise is added, to account for the higher presence of noise in SHM-oriented WSNs. From this source, a large number of simulations were run to generate multiple DSE-corrupted datasets for statistical analysis. The results of this study show the robustness of FDD, and the precautions needed for the SSI-data family, when dealing with DSE at a relaxed level. Finally, the combination of the preferred OMA techniques, and the use of channel projection for the time-domain OMA technique to cope with DSE, are recommended.
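
As a sketch of the FDD family's core computation (assuming SciPy's csd for cross-spectra; array shapes and parameters are illustrative):

```python
import numpy as np
from scipy.signal import csd

def fdd_first_singular_values(acc, fs, nperseg=1024):
    """Frequency Domain Decomposition (FDD), core step.

    acc: (n_channels, n_samples) array of accelerations.
    Builds the cross-power spectral density matrix G(f) and returns its
    first singular value at each frequency; peaks of s1(f) indicate
    candidate modes, and the corresponding first singular vectors
    approximate the mode shapes.
    """
    n_ch = acc.shape[0]
    f, _ = csd(acc[0], acc[0], fs=fs, nperseg=nperseg)
    G = np.empty((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
    s1 = np.array([np.linalg.svd(Gk, compute_uv=False)[0] for Gk in G])
    return f, s1
```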

Relevance:

20.00%

Publisher:

Abstract:

Recurrence relations in mathematics form a very powerful and compact way of looking at a wide range of relationships. Traditionally, the concept of recurrence has often been a difficult one for the secondary teacher to convey to students. Closely related to the powerful proof technique of mathematical induction, recurrences are able to capture many relationships in formulas much simpler than so-called direct or closed-form formulas. In computer science, recursive coding often has a similar compactness property and, perhaps not surprisingly, suffers from similar problems in the classroom: students often find both the basic concepts and the practicalities elusive. Using models designed to illuminate the relevant principles for students, we offer a range of examples that use the modern spreadsheet environment to illustrate the great expressive and computational power of recurrences.
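
The same point can be made outside the spreadsheet; for instance (an illustrative recurrence, not one taken from the paper), a savings balance with interest and a regular deposit is one short formula filled down a column, or one loop in code:

```python
# Recurrence: b(0) = 1000, b(n) = 1.05 * b(n-1) + 100.
# In a spreadsheet: put 1000 in B1, then fill  =1.05*B1+100  down column B.
def balance(n, b0=1000.0, rate=1.05, deposit=100.0):
    b = b0
    for _ in range(n):
        b = rate * b + deposit   # apply the recurrence once per period
    return b

print(balance(10))  # balance after 10 periods
```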

Relevance:

20.00%

Publisher:

Abstract:

We first classify state-of-the-art approaches to the stream authentication problem in the multicast environment, grouping them into signing and MAC approaches. A new approach for authenticating digital streams using threshold techniques is then introduced. The main advantages of the new approach are that it tolerates packet loss, up to a threshold number of packets, and has minimal space overhead. It is most suitable for multicast applications running over lossy, unreliable communication channels while, at the same time, preserving the security requirements. We use linear equations based on Lagrange polynomial interpolation and combinatorial design methods.
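
A minimal sketch of the threshold idea behind such schemes, using Lagrange interpolation over a prime field (a Shamir-style sharing; the field size and parameters are illustrative, and this is not the paper's exact construction):

```python
import random

P = 2**31 - 1  # Mersenne prime; all arithmetic is over GF(P)

def make_shares(secret, k, n):
    """Encode `secret` as a random degree-(k-1) polynomial and emit n
    shares (one per packet); any k surviving shares suffice to recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for j, (xj, yj) in enumerate(shares):
        num, den = 1, 1
        for m, (xm, _) in enumerate(shares):
            if m != j:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        secret = (secret + yj * num * pow(den, P - 2, P)) % P  # den^-1 mod P
    return secret

shares = make_shares(123456789, k=3, n=5)
print(recover(shares[:3]))   # any 3 of the 5 shares reconstruct the value
```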

Relevance:

20.00%

Publisher:

Abstract:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. It is therefore very important that the probabilities of extreme water levels are accurately evaluated, to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present-day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide, and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study was undertaken in two main stages.

In the first stage, a high-resolution (~10 km along the coast) depth-averaged hydrodynamic model was configured for the whole coastline of Australia using the Danish Hydraulic Institute's Mike21 modelling suite. The model was forced with astronomical tidal levels, derived from the TPXO7.2 global tidal model, and meteorological fields, from the US National Centers for Environmental Prediction's global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output was validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present-day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms. However, as the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resultant water level exceedance probabilities were underestimated around western, northern, and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques.

Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australian region, with characteristics based on the observed tropical cyclones of the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model to predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present-day extreme water level probabilities around the whole coastline of Australia.
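
A minimal sketch of the extreme value step applied at a single grid point, assuming SciPy's GEV implementation; the annual maxima here are synthetic placeholders, not hindcast output:

```python
import numpy as np
from scipy.stats import genextreme

# Placeholder annual maxima standing in for a 61-year hindcast series (m).
rng = np.random.default_rng(1)
annual_maxima = genextreme.rvs(c=-0.1, loc=1.8, scale=0.25,
                               size=61, random_state=rng)

# Fit a GEV distribution to the annual maxima and estimate the level
# with a 1% annual exceedance probability (the "100-year" return level).
c, loc, scale = genextreme.fit(annual_maxima)
level_100yr = genextreme.isf(0.01, c, loc, scale)
print(f"estimated 100-year return level: {level_100yr:.2f} m")
```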

Relevance:

20.00%

Publisher:

Abstract:

This article presents field applications and validations of the controlled Monte Carlo data generation (CMCDG) scheme. This scheme was previously derived to help the Mahalanobis squared distance–based damage identification method cope with data-shortage problems, which often cause inadequate data multinormality and unreliable identification outcomes. To do so, real vibration datasets from two actual civil engineering structures with such data (and identification) problems are selected as test objects, which are then shown to be in need of enhancement to consolidate their conditions. By utilizing the robust probability measures of the data condition indices in CMCDG, and a statistical sensitivity analysis of the Mahalanobis squared distance computational system, well-conditioned synthetic data generated by an optimal CMCDG configuration can be evaluated without bias against data generated by other set-ups and against the original data. The analysis results reconfirm that CMCDG is able to overcome the shortage of observations, improve data multinormality, and enhance the reliability of the Mahalanobis squared distance–based damage identification method, particularly with respect to false-positive errors. The results also highlight the dynamic structure of CMCDG, which makes the scheme highly adaptive to any type of input data with any (original) distributional condition.
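
For context, a minimal sketch of the Mahalanobis squared distance computation at the heart of the damage identification method; the feature data are random placeholders, and the small baseline deliberately mimics the data-shortage condition the scheme addresses:

```python
import numpy as np

def mahalanobis_sq(baseline, candidates):
    """Mahalanobis squared distance of candidate feature vectors from
    the baseline (healthy-state) data; large values flag potential damage."""
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
    d = candidates - mu
    return np.einsum('...i,ij,...j->...', d, cov_inv, d)

rng = np.random.default_rng(2)
baseline = rng.normal(size=(30, 4))            # few observations: covariance
candidates = rng.normal(loc=0.8, size=(5, 4))  # is poorly estimated, which is
print(mahalanobis_sq(baseline, candidates))    # what CMCDG is designed to fix
```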

Relevance:

20.00%

Publisher:

Abstract:

In this paper the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. Lett., 73 (1994), pp. 1311-1315; Phys. Rev. E, 54 (1996), pp. 376-394] is presented in a pedagogical way, to increase its visibility in applied mathematics and to argue for its incorporation into the corresponding graduate curriculum. The method is illustrated with some linear and nonlinear singular perturbation problems. © 2012 Society for Industrial and Applied Mathematics.
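
As a flavour of the method, here is the standard textbook example (not necessarily one of the paper's illustrations): the weakly damped oscillator, where renormalizing the amplitude removes the secular term of the naive expansion:

```latex
% Weakly damped oscillator: y'' + \epsilon y' + y = 0. The naive expansion
% y = y_0 + \epsilon y_1 + \cdots produces a secular (unbounded) term:
y(t) = A\cos(t+\phi) - \frac{\epsilon A}{2}\, t \cos(t+\phi) + O(\epsilon^2).
% Renormalizing the amplitude at an arbitrary time \tau and requiring
% \partial y / \partial \tau = 0 gives the RG equation
\frac{\mathrm{d}A}{\mathrm{d}\tau} = -\frac{\epsilon}{2} A
\quad\Longrightarrow\quad A(t) = A_0\, e^{-\epsilon t/2},
% which reproduces the slow amplitude decay uniformly in t.
```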

Relevance:

20.00%

Publisher:

Abstract:

Following the derivation of amplitude equations through a new two-time-scale method [O'Malley, R. E., Jr. & Kirkinis, E. (2010) A combined renormalization group-multiple scale method for singularly perturbed problems. Stud. Appl. Math. 124, 383-410], we show that a multi-scale method may often be preferable to the method of matched asymptotic expansions for solving singularly perturbed problems. We illustrate this approach with 10 singularly perturbed ordinary and partial differential equations. © 2011 Cambridge University Press.
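
A classic example of the kind of problem treated (illustrative only; not claimed to be one of the paper's ten):

```latex
% Boundary-layer problem on [0,1]:
\epsilon y'' + y' + y = 0, \qquad y(0) = 0, \quad y(1) = 1, \qquad 0 < \epsilon \ll 1.
% Leading-order uniform approximation: the outer solution e^{1-x} plus an
% O(\epsilon)-thick boundary-layer correction at x = 0,
y(x) \sim e^{1-x} - e^{\,1 - x/\epsilon}.
```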