391 results for Size reduction
Abstract:
Background: The use of dual growing rods is a fusionless surgical approach to the treatment of early onset scoliosis (EOS), which aims to harness potential growth in order to correct spinal deformity. The purpose of this study was to compare the in-vitro biomechanical response of two different dual rod designs under axial rotation loading. Methods: Six porcine spines were dissected into seven-level thoracolumbar multi-segmental units. Each specimen was mounted and tested in a biaxial Instron machine, undergoing nondestructive left/right axial rotation to peak moments of 4 Nm at a constant rotation rate of 8°/s. A motion tracking system (Optotrak) measured 3D displacements of individual vertebrae. Each spine was tested in an un-instrumented state first and then with appropriately sized semi-constrained growing rods and ‘rigid’ rods in alternating sequence. Range of motion, neutral zone size and stiffness were calculated from the moment-rotation curves, and intervertebral ranges of motion were calculated from the Optotrak data. Findings: Irrespective of test sequence, rigid rods significantly reduced total rotation across all instrumented levels (with increased stiffness), whilst semi-constrained rods exhibited rotation behavior similar to the un-instrumented spine (P < 0.05). Increases in stiffness of 11% and 8% for left and right axial rotation, respectively, and a 15% reduction in total range of motion were recorded with dual rigid rods compared with semi-constrained rods. Interpretation: Based on these findings, semi-constrained growing rods do not increase axial rotation stiffness compared with un-instrumented spines. This is thought to provide a more physiological environment for the growing spine compared to dual rigid rod constructs.
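The derived measures named above (range of motion, neutral zone and stiffness, all taken from a moment-rotation curve) can be illustrated with a minimal sketch. The synthetic curve, the neutral-zone moment threshold and the fit window below are all hypothetical choices, not values from the study.

```python
import numpy as np

def rom_nz_stiffness(moment, rotation, nz_threshold=0.25, fit_window=5):
    """Estimate range of motion, neutral zone and terminal stiffness
    from a single moment-rotation curve (hypothetical data)."""
    moment = np.asarray(moment, float)
    rotation = np.asarray(rotation, float)
    rom = rotation.max() - rotation.min()            # total rotation span
    nz_mask = np.abs(moment) < nz_threshold          # low-load (lax) region
    neutral_zone = rotation[nz_mask].max() - rotation[nz_mask].min()
    # stiffness: slope of a linear fit over the last points before peak moment
    k = np.polyfit(rotation[-fit_window:], moment[-fit_window:], 1)[0]
    return rom, neutral_zone, k

# synthetic curve: stiffening toward a 4 Nm peak, soft around zero load
rot = np.linspace(-10, 10, 41)      # degrees
mom = 0.004 * rot**3                # Nm; cubic -> flat neutral zone near zero
print(rom_nz_stiffness(mom, rot))
```

A stiffer construct (e.g. rigid rods) would show a steeper terminal slope and a smaller rotation span under the same peak moment.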
Abstract:
This study provides validity evidence for the Capture-Recapture (CR) method, borrowed from ecology, as a measure of second language (L2) productive vocabulary size (PVS). Two separate “captures” of productive vocabulary were taken using written word association tasks (WATs). At Time 1, 47 bilinguals provided at least 4 associates to each of 30 high-frequency stimulus words in English, their first language (L1), and in French, their L2. A few days later (Time 2), the procedure was repeated with a different set of stimulus words in each language. Since the WAT was used, both Lex30 and CR PVS scores were calculated in each language. Participants also completed an animacy judgment task assessing the speed and efficiency of lexical access. Results indicated that, in both languages, CR and Lex30 scores were significantly positively correlated (evidence of convergent validity). CR scores were also significantly larger in the L1 and correlated significantly with the speed of lexical access in the L2 (evidence of construct validity). These results point to the validity of the technique for estimating relative L2 PVS; however, CR scores are not a direct indication of absolute vocabulary size. A discussion of the method’s underlying assumptions and their implications for interpretation is provided.
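The capture-recapture idea borrowed from ecology is typically the Lincoln-Petersen estimator: if n1 and n2 distinct word types are "captured" at the two times and m of them overlap, the estimated total vocabulary is n1·n2/m. A minimal sketch with invented word lists (the abstract's caveat applies: this is a relative, not absolute, size estimate):

```python
def capture_recapture(sample1, sample2):
    """Lincoln-Petersen estimate of total vocabulary size from two
    'captures' of produced word types (hypothetical data)."""
    s1, s2 = set(sample1), set(sample2)
    recaptured = len(s1 & s2)       # types seen in both captures
    if recaptured == 0:
        raise ValueError("no overlap: estimate undefined")
    return len(s1) * len(s2) / recaptured

t1 = ["dog", "cat", "tree", "house", "river", "cloud"]
t2 = ["cat", "river", "stone", "bread", "cloud", "lamp"]
print(capture_recapture(t1, t2))   # 6 * 6 / 3 = 12.0
```

The estimator assumes both captures sample independently from the same underlying vocabulary, which is one of the assumptions the study discusses.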
Abstract:
A modification to the PVA-FX hydrogel, whereby the chelating agent xylenol orange was partially bonded to the gelling agent poly-vinyl alcohol, resulted in an 8% reduction in post-irradiation Fe3+ diffusion, adding approximately 1 hour to the useful timespan between irradiation and readout. This xylenol orange functionalised poly-vinyl alcohol hydrogel had an OD dose sensitivity of 0.014 Gy−1 and a diffusion rate of 0.133 mm2 h−1. As this partial bond yields only an incremental improvement, it is proposed that more efficient methods of bonding xylenol orange to poly-vinyl alcohol be investigated to further reduce diffusion in Fricke gels.
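The link between the reported diffusion rate and the useful readout window can be sketched with a simple 1-D Gaussian spreading model, sigma^2 = 2Dt. The blur tolerance below is a hypothetical value chosen only to show how an ~8% reduction in D yields roughly an extra hour; it is not from the paper.

```python
D = 0.133           # mm^2/h, diffusion rate reported for the modified gel

def time_to_blur(sigma_mm, D_mm2_per_h):
    """Hours until 1-D Gaussian diffusion broadening reaches sigma
    (sigma^2 = 2 D t), a simplified model of dose-map blurring."""
    return sigma_mm**2 / (2.0 * D_mm2_per_h)

sigma_tol = 1.75    # mm, hypothetical tolerable blur
t_modified = time_to_blur(sigma_tol, D)
t_unmodified = time_to_blur(sigma_tol, D / 0.92)   # ~8% higher diffusion
print(t_modified - t_unmodified)    # extra useful hours from the 8% reduction
```

Because t scales as 1/D, an 8% drop in diffusion stretches any fixed-blur window by about 8.7%, which matches the order of the ~1 hour gain.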
Abstract:
This thesis examines existing frameworks for energy management in the brewing industry and details the design, development and implementation of a new framework at a modern brewery. The aim of the research was to develop an energy management framework that identifies opportunities in a systematic manner using Systems Engineering concepts and principles. This work led to the Sustainable Energy Management Framework (SEMF). Using the SEMF approach, one of Australia's largest breweries has achieved the number 1 ranking in the world for water use in the production of beer, has improved its KPIs, and has sustained the energy management improvements implemented over the past 15 years. The framework can be adapted to other manufacturing industries in the Australian context and is considered a new concept and a potentially important tool for energy management.
Abstract:
In recent years, multidimensional data have attracted increasing attention from researchers seeking to build better recommender systems. Additional metadata provides algorithms with more details for better understanding the interaction between users and items. While neighbourhood-based Collaborative Filtering (CF) approaches and latent factor models tackle this task effectively in various ways, each utilizes only part of the structure of the data. In this paper, we seek to delve into the different types of relations in the data and to understand the interaction between users and items more holistically. We propose a generic multidimensional CF fusion approach for top-N item recommendations. The proposed approach is capable of incorporating not only the localized user-user and item-item relations but also the latent interaction between all dimensions of the data. Experimental results show significant improvements by the proposed approach in terms of recommendation accuracy.
Abstract:
User profiling is the process of constructing user models which represent the personal characteristics and preferences of customers. User profiles play a central role in many recommender systems, which recommend items to users based on those profiles; the items can be any objects the users are interested in, such as documents, web pages, books, movies, etc. In recent years, multidimensional data have attracted growing attention, from both academia and industry, for building better recommender systems. Additional metadata provides algorithms with more details for better understanding the interactions between users and items. However, most existing user/item profiling techniques for multidimensional data analyze the data by splitting the multidimensional relations, which loses information about the multidimensional structure. In this paper, we propose a user profiling approach using a tensor reduction algorithm, which we show is based on a Tucker2 model. The proposed profiling approach incorporates latent interactions between all dimensions into user profiles, which significantly benefits the quality of neighborhood formation. We further propose to integrate the profiling approach into neighborhood-based collaborative filtering recommender algorithms. Experimental results show significant improvements in terms of recommendation accuracy.
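As a rough illustration of what a Tucker2 model does, the sketch below computes an HOSVD-style Tucker2 of a 3-way tensor in NumPy: factor matrices on two modes, identity on the third. The tensor shape, ranks and algorithm details are illustrative only, not the paper's method.

```python
import numpy as np

def tucker2(X, r1, r2):
    """HOSVD-style Tucker2 of a 3-way tensor X (d1 x d2 x d3):
    orthonormal factors on modes 1 and 2, identity on mode 3."""
    d1, d2, d3 = X.shape
    # left singular vectors of the mode-1 and mode-2 unfoldings
    U1 = np.linalg.svd(X.reshape(d1, -1), full_matrices=False)[0][:, :r1]
    U2 = np.linalg.svd(np.moveaxis(X, 1, 0).reshape(d2, -1),
                       full_matrices=False)[0][:, :r2]
    # core tensor G = X x1 U1^T x2 U2^T
    G = np.einsum('ia,jb,ijk->abk', U1, U2, X)
    return U1, U2, G

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 5, 4))          # e.g. users x items x tags
U1, U2, G = tucker2(X, 3, 2)
X_hat = np.einsum('ia,jb,abk->ijk', U1, U2, G)   # reduced-rank reconstruction
print(G.shape, X_hat.shape)
```

The reduced core (here 3 x 2 x 4) is what a profiling approach could use in place of the raw tensor, retaining interactions across all dimensions in a compressed form.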
Abstract:
Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods which have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on such data. This study applied several clustering techniques (K-means, PAM, CLARA and SOM) in order to identify the optimum technique for PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on a Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and Silhouette width validation values, and the K-means technique was found to perform best, with five clusters being the optimum. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. These two techniques can help researchers immensely in analysing PNSD data for characterisation and source apportionment purposes.
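The clustering step can be illustrated with a minimal K-means (Lloyd's algorithm) applied to synthetic size spectra. Everything below is hypothetical: two lognormal modes stand in for a small-mode ("nucleation", ~10 nm) and a larger-mode ("traffic", ~60 nm) source, and the implementation is a sketch, not the study's code.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal K-means (Lloyd's algorithm) with farthest-point seeding
    for clustering size spectra (rows of X)."""
    centers = [X[0]]
    for _ in range(1, k):
        d2 = np.min(((X[:, None] - np.array(centers)[None]) ** 2).sum(-1), axis=1)
        centers.append(X[np.argmax(d2)])      # seed far from existing centers
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# synthetic spectra: two lognormal modes plus noise (hypothetical sources)
d = np.linspace(1, 100, 40)                   # nm, size bins
mode = lambda mu: np.exp(-0.5 * ((np.log(d) - np.log(mu)) / 0.35) ** 2)
rng = np.random.default_rng(0)
X = np.vstack([mode(10) + 0.05 * rng.normal(size=(20, 40)),
               mode(60) + 0.05 * rng.normal(size=(20, 40))])
labels, centers = kmeans(X, 2)
print(labels)
```

In the study, validation indices (Dunn, Silhouette width) were used to compare techniques and choose the number of clusters; here the two planted modes are recovered directly.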
Abstract:
The development of microfinance in Vietnam since the 1990s has coincided with remarkable progress in poverty reduction. Numerous descriptive studies have illustrated that microfinance is an effective tool for eradicating poverty in Vietnam, but evidence from quantitative studies is mixed. This study contributes to the literature by providing new evidence on the impact of microfinance on poverty reduction in Vietnam, using repeated cross-sectional data from the Vietnam Living Standards Survey (VLSS) over the period 1992–2010. Our results show that micro-loans contribute significantly to household consumption.
Abstract:
Introduced predators can have pronounced effects on naïve prey species; thus, predator control is often essential for conservation of threatened native species. Complete eradication of the predator, although desirable, may be elusive in budget-limited situations, whereas predator suppression is more feasible and may still achieve conservation goals. We used a stochastic predator-prey model based on a Lotka-Volterra system to investigate the cost-effectiveness of predator control to achieve prey conservation. We compared five control strategies: immediate eradication, removal of a constant number of predators (fixed-number control), removal of a constant proportion of predators (fixed-rate control), removal of predators that exceed a predetermined threshold (upper-trigger harvest), and removal of predators whenever their population falls below a lower predetermined threshold (lower-trigger harvest). We looked at the performance of these strategies when managers could always remove the full number of predators targeted by each strategy, subject to budget availability. Under this assumption immediate eradication reduced the threat to the prey population the most. We then examined the effect of reduced management success in meeting removal targets, assuming removal is more difficult at low predator densities. In this case there was a pronounced reduction in performance of the immediate eradication, fixed-number, and lower-trigger strategies. Although immediate eradication still yielded the highest expected minimum prey population size, upper-trigger harvest yielded the lowest probability of prey extinction and the greatest return on investment (as measured by improvement in expected minimum population size per amount spent). Upper-trigger harvest was relatively successful because it operated when predator density was highest, which is when predator removal targets can be more easily met and the effect of predators on the prey is most damaging. This suggests that controlling predators only when they are most abundant is the "best" strategy when financial resources are limited and eradication is unlikely. © 2008 Society for Conservation Biology.
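The modelling setup can be sketched as a discrete-time stochastic Lotka-Volterra system with a removal rule applied each step, comparing no control, fixed-rate control and upper-trigger harvest via the minimum prey population reached. All parameters, thresholds and noise levels below are illustrative, not the paper's, and the paper's performance metric (expected minimum population size per amount spent) is reduced here to the raw minimum.

```python
import random

def simulate(strategy, steps=200, seed=1):
    """Discrete-time stochastic Lotka-Volterra predator-prey sketch with a
    predator-removal rule applied each step (illustrative parameters)."""
    rng = random.Random(seed)
    prey, pred = 100.0, 20.0
    min_prey = prey
    for _ in range(steps):
        pred = max(pred - strategy(pred), 0.0)         # apply control action
        growth = 0.10 * prey * (1 - prey / 500)        # logistic prey growth
        predation = 0.004 * prey * pred
        prey_next = max(prey + growth - predation + rng.gauss(0, 1), 0.0)
        pred = max(pred + 0.002 * prey * pred - 0.05 * pred
                   + rng.gauss(0, 0.5), 0.0)
        prey = prey_next
        min_prey = min(min_prey, prey)
    return min_prey

no_control = lambda p: 0.0
fixed_rate = lambda p: 0.1 * p                 # remove 10% of predators per step
upper_trigger = lambda p: max(p - 10.0, 0.0)   # cull down to a threshold of 10

for name, s in [("none", no_control), ("fixed-rate", fixed_rate),
                ("upper-trigger", upper_trigger)]:
    print(name, round(simulate(s), 1))
```

With these toy parameters the upper-trigger rule keeps the prey minimum high because it removes predators exactly when they are abundant, echoing the abstract's explanation of why that strategy performed well.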
Abstract:
Biopharmaceuticals have been shown to have low delivery and transformation efficiencies. To overcome this, larger doses are administered in order to obtain the desired response, which may lead to toxicity and drug resistance. This paper reports on a continuous particle production method utilizing surface acoustic wave atomization to reliably produce micro- and nanoparticles with physical characteristics that facilitate the cellular uptake of biopharmaceuticals. By producing particles of an optimal size for cellular uptake, the efficacy and specificity of drug-loaded nanoparticles will be increased. Better delivery methods will result in dosage reduction (hence lower cost per dose), reduced toxicity, and fewer problems associated with multidrug resistance due to overdosing.
Abstract:
The standard method for deciding bit-vector constraints is via eager reduction to propositional logic. This is usually done after first applying powerful rewrite techniques. While often efficient in practice, this method does not scale on problems for which top-level rewrites cannot reduce the problem size sufficiently. A lazy solver can target such problems by doing many satisfiability checks, each of which only reasons about a small subset of the problem. In addition, the lazy approach enables a wide range of optimization techniques that are not available to the eager approach. In this paper we describe the architecture and features of our lazy solver (LBV). We provide a comparative analysis of the eager and lazy approaches, and show how they are complementary in terms of the types of problems they can efficiently solve. For this reason, we propose a portfolio approach that runs a lazy and eager solver in parallel. Our empirical evaluation shows that the lazy solver can solve problems none of the eager solvers can and that the portfolio solver outperforms other solvers both in terms of total number of problems solved and the time taken to solve them.
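The portfolio idea (run a lazy and an eager solver in parallel and take whichever answers first) can be sketched with toy stand-ins. The "solvers" below are hypothetical delay functions, not real bit-vector solvers: the eager stand-in's cost grows with problem size, while the lazy stand-in pays a roughly constant overhead.

```python
import concurrent.futures as cf
import time

def run_portfolio(solvers, problem):
    """Run several solver strategies on the same problem in parallel and
    return the name and answer of whichever finishes first."""
    with cf.ThreadPoolExecutor(max_workers=len(solvers)) as pool:
        futures = {pool.submit(fn, problem): name for name, fn in solvers}
        done, _ = cf.wait(futures, return_when=cf.FIRST_COMPLETED)
        winner = next(iter(done))
        return futures[winner], winner.result()

# hypothetical stand-ins for the two solving styles
def eager(problem):
    time.sleep(0.01 * problem)   # cost grows with problem size
    return "sat"

def lazy(problem):
    time.sleep(0.2)              # roughly constant overhead
    return "sat"

print(run_portfolio([("eager", eager), ("lazy", lazy)], problem=1))
```

On a small problem the eager stand-in wins; on a large one the lazy stand-in does, which is exactly the complementarity a portfolio exploits.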
Abstract:
Utilities worldwide are focused on supplying peak electricity demand reliably and cost-effectively, which requires a thorough understanding of all the factors influencing residential electricity use at peak times. An electricity demand reduction project based on comprehensive residential consumer engagement was established within an Australian community in 2008, and by 2011 peak demand had decreased to below pre-intervention levels. This paper applied field data gathered through qualitative in-depth interviews with 22 residential households in the community to a Bayesian Network complex-system model, to examine whether the model could explain the successful peak demand reduction at the case study location. The knowledge and understanding acquired through insights into the major influential factors, and the potential impact of changes to these factors on peak demand, would underpin demand reduction intervention strategies for a wider target group.