469 results for Probable Number Technique


Relevance: 20.00%

Abstract:

Most surgeons cement the tibial component in total knee replacement surgery. Mid-term registry data from a number of countries, including the United Kingdom and Australia, support the excellent survivorship of cemented tibial components. In spite of this success, results can always be improved, and cementing technique can play a role. Cementing technique on the tibia is not standardized, and surgeons still differ about the best ways to deliver cement into the cancellous bone of the upper tibia. Questions remain regarding whether to use a gun or a syringe to inject the cement into the cancellous bone of the tibial plateau. The ideal depth of cement penetration into the tibial plateau is debated, though most reports suggest 4 mm to 10 mm. Thicker mantles are thought to be dangerous due to the risk of bone necrosis, but there is little in the literature to support this contention...

Relevance: 20.00%

Abstract:

This thesis proposes three novel models which extend the statistical methodology for motor unit number estimation, a clinical neurology technique. Motor unit number estimation is important in the treatment of degenerative muscular diseases and, potentially, spinal injury. Additionally, a recent and untested statistic to enable statistical model choice is found to be a practical alternative for larger datasets. The existing methods for dose finding in dual-agent clinical trials are found to be suitable only for designs of modest dimensions. The model choice case-study is the first of its kind containing interesting results using so-called unit information prior distributions.

Relevance: 20.00%

Abstract:

In an estuary, mixing and dispersion result from the combination of large-scale advection and small-scale turbulence, both of which are complex to estimate. A field study was conducted in a small sub-tropical estuary in which high frequency (50 Hz) turbulence data were recorded continuously for about 48 hours. A triple decomposition technique was introduced to isolate the contributions of tides, resonance and turbulence in the flow field. A striking feature of the data set was the slow fluctuations, which exhibited large amplitudes of up to 50% of the tidal amplitude under neap tide conditions. The triple decomposition technique allowed a characterisation of the broader temporal scales of high frequency fluctuation data sampled over a number of full tidal cycles.
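The decomposition itself can be illustrated with a minimal moving-average sketch. The windows, filter choice and function names below are illustrative assumptions, not the study's actual implementation: the record is split into a slow (tidal) component, an intermediate (resonance) component, and a fast (turbulent) residual.

```python
import numpy as np

def moving_avg(x, window):
    """Centred moving average -- a simple low-pass filter."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def triple_decompose(x, slow_window, fast_window):
    """Split a velocity record into slow (tidal), intermediate (resonance)
    and fast (turbulent) contributions.  Window lengths are in samples and
    are illustrative choices, not the paper's."""
    tide = moving_avg(x, slow_window)      # slowest scales only
    smooth = moving_avg(x, fast_window)    # everything except turbulence
    resonance = smooth - tide              # intermediate scales
    turbulence = x - smooth                # fastest fluctuations
    return tide, resonance, turbulence
```

By construction the three components sum exactly back to the original record, so no signal is lost in the split; only the choice of window lengths decides where the scale boundaries fall.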

Relevance: 20.00%

Abstract:

The estimation of the critical gap has been an issue since the 1970s, when gap acceptance was introduced to evaluate the capacity of unsignalized intersections. The critical gap is the shortest gap that a driver is assumed to accept. A driver's critical gap cannot be measured directly, and a number of techniques have been developed to estimate the mean critical gap of a sample of drivers. This paper reviews the ability of the Maximum Likelihood technique and the Probability Equilibrium Method (PEM) to predict the mean and standard deviation of the critical gap, using a simulation of 100 drivers repeated 100 times for each flow condition. The Maximum Likelihood method gave consistent and unbiased estimates of the mean critical gap, whereas the PEM had a significant bias that depended on the flow in the priority stream. Both methods were reasonably consistent, although the Maximum Likelihood method was slightly better. If drivers are inconsistent, then again the Maximum Likelihood method is superior. A criticism levelled at the Maximum Likelihood method is that a distribution of the critical gap has to be assumed; it was shown that this assumption does not significantly affect its ability to predict the mean and standard deviation of the critical gaps. Finally, the Maximum Likelihood method can produce reasonable estimates with observations from as few as 25 to 30 drivers. A spreadsheet procedure for using the Maximum Likelihood method is provided in this paper. The PEM can be improved if the maximum rejected gap is used.
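The Maximum Likelihood approach can be sketched as follows, assuming a lognormal critical-gap distribution and treating each driver's largest rejected gap and accepted gap as an interval that brackets that driver's (unobservable) critical gap. `ml_critical_gap` is a hypothetical helper written for illustration, not the spreadsheet procedure the paper provides.

```python
import numpy as np
from scipy import optimize, stats

def ml_critical_gap(rejected, accepted):
    """Maximum-likelihood estimate of the mean and standard deviation of the
    critical gap, assuming a lognormal distribution.
    rejected: each driver's largest rejected gap (s)
    accepted: each driver's accepted gap (s)"""
    r = np.asarray(rejected, float)
    a = np.asarray(accepted, float)

    def neg_log_lik(params):
        mu, sigma = params
        if sigma <= 0:
            return np.inf
        # likelihood of driver i: P(r_i < t_c <= a_i) under lognormal(mu, sigma)
        p = (stats.lognorm.cdf(a, s=sigma, scale=np.exp(mu))
             - stats.lognorm.cdf(r, s=sigma, scale=np.exp(mu)))
        return -np.sum(np.log(np.clip(p, 1e-12, None)))

    res = optimize.minimize(neg_log_lik, x0=[np.log(4.0), 0.3],
                            method="Nelder-Mead")
    mu, sigma = res.x
    mean = np.exp(mu + sigma**2 / 2)             # lognormal mean
    sd = mean * np.sqrt(np.exp(sigma**2) - 1)    # lognormal standard deviation
    return mean, sd
```

Because each driver contributes only the interval (largest rejected gap, accepted gap], the method uses censored information rather than a direct measurement, which is exactly why a distributional form must be assumed.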

Relevance: 20.00%

Abstract:

The DVD, Jump into Number, was a joint project between Independent Schools Queensland, Queensland University of Technology and Catholic Education (Diocese of Cairns) aimed at improving mathematical practice in the early years. Independent Schools Queensland Executive Director Dr John Roulston said the invaluable teaching resource features a series of unscripted lessons which demonstrate the possibilities of learning among young Indigenous students. "Currently there is a lack of teaching resources for numeracy in younger students, especially from pre-Prep to Year 3, which is such an important stage of a child's early education. Jump into Number is a benchmark for all teachers to learn more about the mathematical development of younger students," Dr Roulston said.

Relevance: 20.00%

Abstract:

This monograph provides an overview of recruitment learning approaches from a computational perspective. Recruitment learning is a unique machine learning technique that: (1) explains the physical or functional acquisition of new neurons in sparsely connected networks as a biologically plausible neural network method; (2) facilitates the acquisition of new knowledge to build and extend knowledge bases and ontologies as an artificial intelligence technique; (3) allows learning by use of background knowledge and a limited number of observations, consistent with psychological theory.

Relevance: 20.00%

Abstract:

A monolithic stationary phase was prepared via free radical co-polymerization of ethylene glycol dimethacrylate (EDMA) and glycidyl methacrylate (GMA), with pore diameter tailored specifically for plasmid binding, retention and elution. The polymer was functionalized with 2-chloro-N,N-diethylethylamine hydrochloride (DEAE-Cl) for anion-exchange purification of plasmid DNA (pDNA) from clarified lysate obtained from E. coli DH5α-pUC19 culture in a ribonuclease/protease-free environment. Characterization of the monolithic resin showed a porous material, with 68% of the pores existing in the matrix having diameters above 300 nm. The final product isolated from a single-stage 5 min anion-exchange purification was a pure and homogeneous supercoiled (SC) pDNA with no gDNA, RNA or protein contamination, as confirmed by ethidium bromide agarose gel electrophoresis (EtBr-AGE), enzyme restriction analysis and sodium dodecyl sulfate-polyacrylamide gel electrophoresis. This non-toxic technique is cGMP compatible and highly scalable for production of pDNA on a commercial level.

Relevance: 20.00%

Abstract:

Magnetic resonance is a well-established tool for structural characterisation of porous media. Features of pore-space morphology can be inferred from NMR diffusion-diffraction plots or the time-dependence of the apparent diffusion coefficient. Diffusion NMR signal attenuation can be computed from the restricted diffusion propagator, which describes the distribution of diffusing particles for a given starting position and diffusion time. We present two techniques for efficient evaluation of restricted diffusion propagators for use in NMR porous-media characterisation. The first is the Lattice Path Count (LPC). Its physical essence is that the restricted diffusion propagator connecting points A and B in time t is proportional to the number of distinct length-t paths from A to B. By using a discrete lattice, the number of such paths can be counted exactly. The second technique is the Markov transition matrix (MTM). The matrix represents the probabilities of jumps between every pair of lattice nodes within a single timestep. The propagator for an arbitrary diffusion time can be calculated as the appropriate matrix power. For periodic geometries, the transition matrix needs to be defined only for a single unit cell. This makes MTM ideally suited for periodic systems. Both LPC and MTM are closely related to existing computational techniques: LPC, to combinatorial techniques; and MTM, to the Fokker-Planck master equation. The relationship between LPC, MTM and other computational techniques is briefly discussed in the paper. Both LPC and MTM perform favourably compared to Monte Carlo sampling, yielding highly accurate and almost noiseless restricted diffusion propagators. Initial tests indicate that their computational performance is comparable to that of finite element methods. Both LPC and MTM can be applied to complicated pore-space geometries with no analytic solution. 
We discuss the new methods in the context of diffusion propagator calculation in porous materials and model biological tissues.
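The MTM idea can be illustrated for the simplest case: a one-dimensional pore between two reflecting walls. This is a minimal sketch under assumed parameters (lattice size, equal left/right jump probabilities), not the implementation used in the paper.

```python
import numpy as np

def mtm_propagator(n_nodes, n_steps):
    """Restricted-diffusion propagator on a 1D lattice with reflecting walls,
    computed as a power of the Markov transition matrix.  Column i of the
    result is the distribution after n_steps for a walker starting at node i."""
    P = np.zeros((n_nodes, n_nodes))
    for i in range(n_nodes):
        for j in (i - 1, i + 1):
            if 0 <= j < n_nodes:
                P[j, i] = 0.5        # jump left or right with equal probability
    P[0, 0] += 0.5                   # reflecting wall: a blocked jump stays put
    P[-1, -1] += 0.5
    return np.linalg.matrix_power(P, n_steps)
```

The same matrix serves every diffusion time; only the matrix power changes. For a periodic geometry the matrix would be defined on a single unit cell, which is what makes MTM well suited to periodic systems.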

Relevance: 20.00%

Abstract:

We show that the cluster ion concentration (CIC) in the atmosphere is significantly suppressed during events that involve rapid increases in particle number concentration (PNC). Using a neutral cluster and air ion spectrometer, we investigated changes in CIC during three types of particle enhancement processes: new particle formation, a bushfire episode and an intense pyrotechnic display. In all three cases, the total CIC decreased with increasing PNC, with the rate of decrease being greater for negative CIC than for positive CIC. We attribute this to the greater mobility, and hence the higher attachment coefficient, of negative ions over positive ions in the air. During the pyrotechnic display, the rapid increase in PNC was sufficient to reduce the CIC of both polarities to zero. At the height of the display, the negative CIC stayed at zero for a full 10 min. Although the PNCs were not significantly different, the CIC during new particle formation did not decrease as much as during the bushfire episode and the pyrotechnic display. We suggest that the rate of increase of PNC, together with particle size, also plays an important role in suppressing CIC in the atmosphere.

Relevance: 20.00%

Abstract:

Bearing faults are the most common cause of wind turbine failures. The unavailability and maintenance costs of wind turbines are becoming critically important as their numbers in electric networks grow rapidly. Early fault detection can reduce outage time and costs. This paper proposes Anomaly Detection (AD) machine learning algorithms for fault diagnosis of wind turbine bearings. The method was applied to a real data set, and the results are presented in this paper. For validation and comparison purposes, a set of baseline results is produced using the popular one-class SVM method to examine the ability of the proposed technique to detect incipient faults.
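As a rough illustration of the one-class SVM baseline, the sketch below trains on synthetic "healthy" vibration features and flags drifted "faulty" ones. The feature values are invented stand-ins, not the paper's real data set, and the hyperparameters are assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# synthetic healthy-bearing features (e.g. vibration RMS, kurtosis)
healthy = rng.normal(loc=[1.0, 3.0], scale=0.1, size=(500, 2))
# incipient-fault features drift away from the healthy cluster
faulty = rng.normal(loc=[1.8, 4.5], scale=0.1, size=(20, 2))

# train on normal operation only; nu bounds the training outlier fraction
clf = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(healthy)
pred = clf.predict(faulty)   # -1 = anomaly, +1 = normal
```

Training only on normal-operation data is what makes the one-class formulation attractive here: labelled fault examples are rare for turbines in service, so the model learns a boundary around healthy behaviour and flags anything outside it.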

Relevance: 20.00%

Abstract:

This paper addresses a gap within serious game design research: the ambiguity surrounding the process of aligning the instructional objectives of serious games with their core-gameplay, i.e. the moment-to-moment activity that is the core of player interaction. A core-gameplay focused design framework is proposed that can work alongside existing, more broadly focused serious game design frameworks. The framework utilises an inquiry-based approach in which the serious game designer uses key questions to clearly align instructional objectives with the core-gameplay. The use of this design framework is considered in the context of a small section of gameplay from an educational game currently in development. This demonstration shows how instructional objectives can be embedded into a serious game's core-gameplay.

Relevance: 20.00%

Abstract:

Water to air methane emissions from freshwater reservoirs can be dominated by sediment bubbling (ebullitive) events. Previous work to quantify methane bubbling from a number of Australian sub-tropical reservoirs has shown that this pathway can contribute as much as 95% of total emissions. These bubbling events are controlled by a variety of factors, including water depth, surface and internal waves, wind seiching, atmospheric pressure changes and water level changes. Key to quantifying the magnitude of this emission pathway is estimating both the bubbling rate and the areal extent of bubbling. Neither is constant, and both require persistent monitoring over extended time periods before true estimates can be generated. In this paper we present a novel system for persistent monitoring of both bubbling rate and areal extent, using multiple robotic surface chambers and adaptive sampling (grazing) algorithms to automate the quantification process. Individual chambers are self-propelled and self-guided, and communicate with each other without the need for supervised control. They can maintain station at a sampling site for a desired incubation period and continuously monitor, record and report fluxes during the incubation. To exploit the methane sensor's detection capabilities, a chamber can be automatically lowered to decrease the head-space and increase concentration. The grazing algorithms assign a hierarchical order to chambers within a preselected zone. Chambers then converge on the individual recording the highest 15-minute bubbling rate, maintaining a specified distance from each other during each sampling period, before all individuals are required to move to new locations chosen by a sampling algorithm (systematic or adaptive) that exploits prior measurements. This system has been field tested on a large-scale subtropical reservoir, Little Nerang Dam, over monthly timescales.
Using this technique, localised bubbling zones on the water storage were found to produce over 50,000 mg m-2 d-1, and the areal extent ranged from 1.8% to 7% of the total reservoir area. The drivers behind these changes, as well as lessons learnt from the system implementation, are presented. The system exploits relatively cheap materials, sensing and computing, and can be applied to a wide variety of aquatic and terrestrial systems.
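The convergence step of such a grazing algorithm might be sketched as follows. `graze_step`, the step size and the minimum separation are hypothetical assumptions for illustration, not the deployed system's parameters: followers move toward the chamber reporting the highest bubbling rate while stopping short of a minimum separation.

```python
import numpy as np

def graze_step(positions, rates, step=5.0, min_sep=10.0):
    """One update of a hypothetical grazing rule: each follower chamber moves
    up to `step` metres toward the chamber with the highest recorded bubbling
    rate, but never closer than `min_sep` metres to it."""
    pos = np.asarray(positions, float).copy()
    leader = int(np.argmax(rates))    # chamber with the highest 15-min rate
    target = pos[leader]
    for i in range(len(pos)):
        if i == leader:
            continue                  # the leader holds station
        vec = target - pos[i]
        dist = np.linalg.norm(vec)
        if dist > min_sep:
            move = min(step, dist - min_sep)
            pos[i] += vec / dist * move
    return pos
```

A full implementation would alternate this convergence phase with the relocation phase the abstract describes, where all chambers disperse to new sites chosen systematically or adaptively from prior measurements.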

Relevance: 20.00%

Abstract:

Objective: To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Design: Systematic review. Data sources: The electronic databases searched included PubMed, Cinahl, Medline, Google Scholar and Proquest. The bibliography of all relevant articles was examined, and associated articles were identified using a snowballing technique. Selection criteria: For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, and (c) used machine learning approaches to analyse textual data. Methods: The papers identified through the search were screened, resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strengths and limitations of different techniques, and the quality assurance approaches used. Due to heterogeneity between studies, meta-analysis was not performed. Results: Occupational injuries were the focus of half of the machine learning studies, and the most common methods described were Bayesian probability or Bayesian network based methods, used either to predict injury categories or to extract common injury scenarios. Models were evaluated through comparison with gold standard data, content expert evaluation, or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, and was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models, and integration of content and technical knowledge were discussed. Conclusions: The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years.
With advances in data mining techniques, increased capacity for analysis of large databases, and involvement of computer scientists in the injury prevention field, along with more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see a continued growth and advancement in knowledge of text mining in the injury field.
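As a toy illustration of the Bayesian-probability methods the review describes, a naive Bayes classifier can predict injury categories from short narratives. The narratives and labels below are invented examples, not data from any of the reviewed studies.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# invented example narratives with their injury categories
narratives = [
    "worker fell from ladder while painting ceiling",
    "slipped on wet floor in kitchen",
    "hand caught in conveyor belt",
    "finger crushed by press machine",
    "fell down stairs carrying boxes",
    "arm pulled into rotating shaft",
]
labels = ["fall", "fall", "machinery", "machinery", "fall", "machinery"]

# bag-of-words features feeding a multinomial naive Bayes classifier
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(narratives, labels)

prediction = model.predict(["fell off scaffolding onto concrete"])
```

Real surveillance systems work with thousands of narratives and many more categories, which is where the review's caveats about a small number of categories, source data quality, and generalizability come into play.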