989 results for Bouncing ball model


Relevance: 80.00%

Abstract:

2016 was a breakout year for the virtual reality industry, a field in which 3D surveying plays an important role and has received increasing attention. This project establishes and optimizes a WebGL three-dimensional broadcast platform combined with streaming media technology, taking a streaming media server and panoramic video playback in the browser as the application background. It discusses the architecture from the streaming media server to the panoramic media player and analyzes the relevant theoretical problems. The paper focuses on the debugging of the streaming media platform, the construction of the WebGL player environment, the analysis of different sphere models, and 3D mapping technology. The main work is as follows. First, a streaming service platform was built on the EasyDarwin open-source streaming media server; it receives an RTSP stream and forwards HLS-sliced video to clients. Then, a WebGL panoramic video player was written based on the Three.js library with jQuery playback controls, yielding an HTML5 panoramic video player. Next, the latitude-longitude sphere model from the Three.js library was analyzed with respect to the WebGL rendering method, and its drawbacks and the points for improvement were identified. After that, based on the Schneider transform principle, a Schneider sphere projection model was established, and the output OBJ file was converted to a JS file for the media player to read, finally achieving plugin-free, high-precision, real-time panoramic video playback. Finally, the whole project is summarized, and directions for future optimization and market extension are proposed.
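To illustrate the latitude-longitude sphere model analyzed above, the following is a minimal sketch (not the project's code) of how a UV sphere of the kind Three.js's SphereGeometry builds maps an equirectangular panorama frame onto its vertices; all parameters are assumed:

```python
import math

def latlong_sphere(n_lat, n_lon, radius=1.0):
    """Generate vertices and UV coordinates for a latitude-longitude
    (UV) sphere. Each vertex's UV samples an equirectangular panorama,
    which is how a sphere geometry displays a 360-degree video frame."""
    vertices, uvs = [], []
    for i in range(n_lat + 1):
        theta = math.pi * i / n_lat          # polar angle, 0..pi
        for j in range(n_lon + 1):
            phi = 2.0 * math.pi * j / n_lon  # azimuth, 0..2*pi
            x = radius * math.sin(theta) * math.cos(phi)
            y = radius * math.cos(theta)
            z = radius * math.sin(theta) * math.sin(phi)
            vertices.append((x, y, z))
            uvs.append((j / n_lon, 1.0 - i / n_lat))
    return vertices, uvs
```

Note that the vertex density is uniform in angle, so texels are oversampled near the poles and undersampled at the equator; this kind of drawback is what motivates replacing the latitude-longitude model with an alternative projection such as the Schneider sphere model described in the abstract.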

Relevance: 40.00%

Abstract:

Bond's method for ball mill scale-up only gives the mill power draw for a given duty. This method is incompatible with computer modelling and simulation techniques, and it may not be applicable to the design of fine grinding ball mills or of ball mills preceded by autogenous and semi-autogenous grinding mills. Model-based ball mill scale-up methods have not been validated using a wide range of full-scale circuit data, so their accuracy is questionable, and some of these methods also need expensive pilot testing. A new ball mill scale-up procedure is developed which does not have these limitations. The procedure uses data from two laboratory tests to determine the parameters of a ball mill model. A set of scale-up criteria then scales up these parameters, and the scaled-up parameters are used to simulate the steady-state performance of full-scale mill circuits. At the end of the simulation, the scale-up procedure gives the size distribution, the volumetric flowrate and the mass flowrate of all the streams in the circuit, and the mill power draw.
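Simulation-based scale-up of the kind described above works with a size-distribution model of the mill rather than a single power number. As a hedged illustration only (not the authors' actual model or parameters), a first-order batch-grinding population balance can be sketched as:

```python
import numpy as np

def batch_grind(feed, S, B, dt, n_steps):
    """Illustrative first-order batch-grinding population balance.

    feed: mass fractions per size class (coarse -> fine)
    S:    breakage rates per class (1/min)
    B:    lower-triangular appearance matrix; B[i, j] is the fraction
          of broken class-j material reporting to class i
    Explicit Euler time stepping: appearance minus disappearance.
    """
    m = feed.astype(float).copy()
    for _ in range(n_steps):
        broken = S * m                       # mass broken out of each class
        m = m + dt * (B @ broken - broken)   # redistributed to finer classes
    return m
```

Because each column of B sums to one (all broken material reports somewhere finer), total mass is conserved while the distribution shifts toward the fine classes; a scale-up procedure of the type in the abstract adjusts parameters such as S between laboratory and full-scale mills.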

Relevance: 40.00%

Abstract:

A new ball mill scale-up procedure is developed which uses laboratory data to predict the performance of full-scale ball mill circuits. The procedure contains two laboratory tests, which give the data for determining the parameters of a ball mill model. A set of scale-up criteria then scales up these parameters, and the scaled-up parameters are used to simulate the steady-state performance of the full-scale mill circuit. At the end of the simulation, the scale-up procedure gives the size distribution, the volumetric flowrate and the mass flowrate of all the streams in the circuit, and the mill power draw. A worked example shows how the new ball mill scale-up procedure is executed, using laboratory data to predict the performance of a full-scale re-grind mill circuit consisting of a ball mill in closed circuit with hydrocyclones. The full-scale ball mill has a diameter (inside liners) of 1.85 m. The scale-up procedure shows that the full-scale circuit produces a product (hydrocyclone overflow) that has an 80% passing size of 80 μm. The circuit has a recirculating load of 173%. The calculated power draw of the full-scale mill is 92 kW. (C) 2001 Elsevier Science Ltd. All rights reserved.
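The 173% recirculating load quoted above is a ratio of stream mass flows. The flow numbers below are hypothetical (the abstract gives only the percentage), but the arithmetic is the standard one:

```python
def recirculating_load(underflow_t_per_h, product_t_per_h):
    """Circulating load (%) for a closed ball mill / hydrocyclone circuit:
    cyclone underflow mass flow returned to the mill, as a percentage of
    the circuit product (cyclone overflow) mass flow."""
    return 100.0 * underflow_t_per_h / product_t_per_h

# Hypothetical flows consistent with the abstract's 173% figure:
# a 10 t/h circuit product with 17.3 t/h of underflow returned to the
# mill means the mill itself processes 27.3 t/h in total.
load = recirculating_load(17.3, 10.0)   # approximately 173 %
```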

Relevance: 40.00%

Abstract:

A new ball mill scale-up procedure is developed. The procedure has been validated using seven sets of full-scale ball mill data. The largest ball mills in these data have diameters (inside liners) of 6.58 m. The procedure can predict the 80% passing size of the circuit product to within +/-6% of the measured value, with a precision of +/-11% (one standard deviation); the recirculating load to within +/-33% of the mass-balanced value (this error margin is within the uncertainty associated with the determination of the recirculating load); and the mill power to within +/-5% of the measured value. The procedure is applicable to the design of ball mills that are preceded by autogenous (AG) mills, semi-autogenous (SAG) mills, crushers and flotation circuits. The new procedure is more precise and more accurate than Bond's method for ball mill scale-up. It contains no efficiency correction related to the mill diameter, which suggests that, within the range of mill diameters studied, milling efficiency does not vary with mill diameter. This is in contrast with Bond's equation: Bond claimed that milling efficiency increases with mill diameter. (C) 2001 Elsevier Science Ltd. All rights reserved.

Relevance: 40.00%

Abstract:

Bearing performance significantly affects the dynamic behavior and estimated working life of a rotating system. A common bearing type is the ball bearing, which has been under investigation in numerous published studies. The complexity of the ball bearing models described in the literature varies, and model complexity is naturally related to computational burden. In particular, the inclusion of centrifugal forces and gyroscopic moments significantly increases the system degrees of freedom and lengthens the solution time. On the other hand, for low or moderate rotating speeds, these effects can be neglected without significant loss of accuracy. The objective of this paper is to present guidelines for the selection of a suitable bearing model, based on three case studies. To this end, two ball bearing models were implemented: one considers the high-speed forces, and the other neglects them. Both models were used to study three structures, and the simulation results were compared.
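The speed dependence of the neglected effects can be made concrete with the standard textbook expression for the centrifugal force on one ball (this is a generic rolling-bearing formula, not necessarily the exact formulation of the paper's models, and the numbers below are assumed for illustration):

```python
import math

def ball_centrifugal_force(ball_mass_kg, pitch_diameter_m, cage_speed_rad_s):
    """Centrifugal force on one ball orbiting at the cage (orbital) speed:
    F_c = 0.5 * m * d_m * omega_m**2, the orbit radius being half the
    pitch diameter (standard rolling-bearing textbook form)."""
    return 0.5 * ball_mass_kg * pitch_diameter_m * cage_speed_rad_s ** 2

# Assumed numbers: a 10 g ball on a 60 mm pitch circle.
low = ball_centrifugal_force(0.010, 0.060, 2 * math.pi * 3000 / 60)    # 3 krpm cage speed
high = ball_centrifugal_force(0.010, 0.060, 2 * math.pi * 30000 / 60)  # 30 krpm cage speed
# The force scales with speed squared, so `high` is 100x `low` -- which is
# why these effects matter at high speed and can be neglected at low or
# moderate speeds without significant loss of accuracy.
```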

Relevance: 30.00%

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance: 30.00%

Abstract:

The study of the mechanisms of mechanical alloying requires knowledge of the impact characteristics between the ball and vial in the presence of milling powders. In this paper, free-fall experiments have been used to investigate the characteristics of impact events involved in mechanical milling. The effects of milling conditions, including impact velocity, ball size and powder thickness, on the coefficient of restitution and impact force are studied. It is found that the powder has a significant influence on the impact process due to its porous structure. This effect can be demonstrated using a modified Kelvin model. The study also confirms that the impact force is a relevant parameter for characterising the impact event, due to its sensitivity to the milling conditions. (C) 1998 Elsevier Science S.A.
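The two quantities measured above can be sketched numerically. The first function is the usual drop-test definition of the coefficient of restitution; the second integrates a plain (unmodified) Kelvin-Voigt contact, a linear spring in parallel with a dashpot, through one impact. This is an illustrative sketch with assumed parameters, not the paper's modified model, which additionally accounts for the powder layer:

```python
import math

def restitution_from_heights(h_drop, h_rebound):
    """e = sqrt(h_rebound / h_drop), as measured in a free-fall test."""
    return math.sqrt(h_rebound / h_drop)

def kelvin_impact(v_in, mass, k, c, dt=1e-7):
    """Integrate one impact under a Kelvin-Voigt contact law and return
    (rebound_speed, peak_force). x is penetration depth, v its rate."""
    x, v, peak = 0.0, v_in, 0.0
    while True:
        f = k * x + c * v          # spring + dashpot reaction force
        peak = max(peak, f)
        v -= (f / mass) * dt       # explicit Euler on the ball's motion
        x += v * dt
        if x <= 0.0:               # ball has separated from the surface
            return -v, peak        # rebound speed is positive
```

With damping present the rebound speed is below the incoming speed, and the peak force depends on the impact velocity and contact stiffness; this sensitivity is why the abstract singles out impact force as a useful characterising parameter.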

Relevance: 30.00%

Abstract:

1. Although population viability analysis (PVA) is widely employed, forecasts from PVA models are rarely tested. This study in a fragmented forest in southern Australia contrasted field data on patch occupancy and abundance for the arboreal marsupial greater glider Petauroides volans with predictions from a generic spatially explicit PVA model. This work represents one of the first landscape-scale tests of its type. 2. Initially we contrasted field data from a set of eucalypt forest patches totalling 437 ha with a naive null model in which forecasts of patch occupancy were made, assuming no fragmentation effects and based simply on remnant area and measured densities derived from nearby unfragmented forest. The naive null model predicted an average total of approximately 170 greater gliders, considerably greater than the true count (n = 81). 3. Congruence was examined between field data and predictions from PVA under several metapopulation modelling scenarios. The metapopulation models performed better than the naive null model. Logistic regression showed highly significant positive relationships between predicted and actual patch occupancy for the four scenarios (P = 0.001-0.006). When the model-derived probability of patch occupancy was high (0.50-0.75, 0.75-1.00), there was greater congruence between actual patch occupancy and the predicted probability of occupancy. 4. For many patches, probability distribution functions indicated that model predictions for animal abundance in a given patch were not outside those expected by chance. However, for some patches the model either substantially over-predicted or under-predicted actual abundance. Some important processes, such as inter-patch dispersal, that influence the distribution and abundance of the greater glider may not have been adequately modelled. 5. Additional landscape-scale tests of PVA models, on a wider range of species, are required to assess further predictions made using these tools. 
This will help determine those taxa for which predictions are and are not accurate and give insights for improving models for applied conservation management.
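The naive null model of point 2 is simple arithmetic: expected patch abundance is the density measured in nearby unfragmented forest times the remnant area, with no fragmentation effect. The density and patch areas below are hypothetical, chosen only so the totals match the figures quoted in the abstract (437 ha of patches, roughly 170 predicted gliders):

```python
# Hypothetical inputs; only the totals are taken from the abstract.
density_per_ha = 0.39                               # gliders/ha, assumed
patch_areas_ha = [120.0, 85.0, 60.0, 97.0, 75.0]    # assumed remnants, 437 ha total

# Naive null model: expected count per patch = density * area.
expected = [density_per_ha * a for a in patch_areas_ha]
total_expected = sum(expected)   # about 170, versus the true count of 81
```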

Relevance: 30.00%

Abstract:

Predictions of flow patterns in a 600-mm scale model SAG mill made using four classes of discrete element method (DEM) models are compared to experimental photographs. The accuracy of the various models is assessed using quantitative data on shoulder, toe and vortex center positions taken from ensembles of both experimental and simulation results. These detailed comparisons reveal the strengths and weaknesses of the various models for simulating mills and allow the effect of different modelling assumptions to be quantitatively evaluated. In particular, very close agreement is demonstrated between the full 3D model (including the end wall effects) and the experiments. It is also demonstrated that the traditional two-dimensional circular-particle DEM model under-predicts the shoulder, toe and vortex center positions by around 10 degrees, as well as the power draw. The effect of particle shape and of the dimensionality of the model is also assessed, with particle shape predominantly affecting the shoulder position, while the dimensionality of the model affects mainly the toe position. Crown Copyright (C) 2003 Published by Elsevier Science B.V. All rights reserved.
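At the core of DEM mill models of the kind compared above is a particle contact law. A common choice is the linear spring-dashpot normal force, sketched below with the dashpot coefficient chosen so that a binary collision reproduces a target coefficient of restitution; this is a generic hedged illustration, not the specific contact model used in the paper:

```python
import math

def normal_force(overlap, rel_vel, k_n, m_eff, e):
    """Linear spring-dashpot normal contact force for DEM.

    overlap: geometric overlap of the two particles (m), > 0 in contact
    rel_vel: normal relative approach velocity (m/s)
    k_n:     normal spring stiffness (N/m)
    m_eff:   effective mass of the pair (kg)
    e:       target coefficient of restitution (0 < e < 1)
    """
    if overlap <= 0.0:
        return 0.0                 # no force when not in contact
    # Damping chosen so a binary collision has restitution e.
    c_n = -2.0 * math.log(e) * math.sqrt(k_n * m_eff) \
          / math.sqrt(math.pi ** 2 + math.log(e) ** 2)
    return k_n * overlap + c_n * rel_vel
```

Summing such pairwise forces (plus tangential friction and gravity) over all balls, and integrating the motion, yields the charge shapes whose shoulder, toe and vortex center positions the paper compares against photographs.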

Relevance: 30.00%

Abstract:

Thrust ball bearings lubricated with several different greases were tested on a modified Four-Ball Machine, where the Four-Ball arrangement was replaced by a bearing assembly. The friction torque and operating temperatures in a thrust ball bearing were measured during the tests. At the end of each test a grease sample was analyzed through ferrographic techniques in order to quantify and evaluate bearing wear. A rolling bearing friction torque model was used and the coefficient of friction in full film lubrication was determined for each grease, depending on the operating conditions. The experimental results obtained showed that grease formulation had a very significant influence on friction torque and operating temperature. The friction torque depends on the viscosity of the grease base oil, on its nature (mineral, ester, PAO, etc.), on the coefficient of friction in full film conditions, but also on the interaction between grease thickener and base oil, which affected contact replenishment and contact starvation, and thus influenced the friction torque.
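The structure of a rolling-bearing friction torque model of the kind applied above can be illustrated with the classical Palmgren form: a speed- and viscosity-dependent term plus a load-dependent term. This is a generic textbook sketch with assumed coefficients, not the specific grease model of the paper:

```python
def palmgren_torque(f0, visc_cSt, n_rpm, dm_mm, f1, load_N):
    """Classical Palmgren-type bearing friction torque, in N*mm.

    M0 = 1e-7 * f0 * (nu*n)**(2/3) * dm**3   (for nu*n >= 2000)
    M1 = f1 * P * dm
    with nu the lubricant kinematic viscosity (cSt), n the speed (rpm),
    dm the bearing mean diameter (mm) and P the equivalent load (N).
    f0 and f1 are tabulated bearing/lubrication factors.
    """
    vn = visc_cSt * n_rpm
    if vn >= 2000.0:
        m0 = 1e-7 * f0 * vn ** (2.0 / 3.0) * dm_mm ** 3
    else:
        m0 = 160e-7 * f0 * dm_mm ** 3      # low speed/viscosity regime
    m1 = f1 * load_N * dm_mm
    return m0 + m1
```

The viscosity term M0 is where the grease base oil viscosity and nature enter, consistent with the abstract's finding that grease formulation strongly influences the measured friction torque.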

Relevance: 30.00%

Abstract:

The discussion of possible scenarios for the future of Quality is on the priority list of the major quality practitioners' societies. The main theme of the 58th Congress of the EOQ (European Organization for Quality), held in June 2014 in Göteborg, was "Managing Challenges in Quality Leadership" (EOQ, 2014), and the ASQ (American Society for Quality) appointed "the Future of Quality" as the theme of the November 2015 issue of Quality Progress magazine (ASQ, 2015). In addition, the ISO 9001:2008 revision process carried out by ISO/TC 176 aims to ensure that the ISO 9001:2015 International Standard remains stable for the next 10 years (ISO, 2014), contributing to an increased discussion on the future of quality. The purpose of this research is to review available Quality Management approaches and to outline, adding an academic perspective, expected developments for Quality within the 21st century. The paper follows a qualitative approach, although data from international organizations is used. A literature review has been undertaken on past and potential future trends in quality management. Based on these findings, a model is proposed for the development of organizational quality management, and propositions for the future of quality management are advanced. Firstly, a state of the art of existing Quality Management approaches is presented, covering, for example, Total Quality Management (TQM) and the Quality Gurus, the ISO 9000 International Standards series (with an outline of the expected changes for ISO 9001:2015), Six Sigma, and Business Excellence Models. Secondly, building on theoretical and managerial approaches, a two-dimensional matrix - Quality Engineering (QE, the technical aspects of quality) versus Quality Management (QM, the soft aspects of quality) - is presented, outlining five proposed characterizations of Quality maturity levels and giving insights for applications and future developments.
The literature review highlights that QM and QE may address similar quality issues, but their approaches differ in scope, breadth and intensity, and they ought to complement and reciprocally reinforce one another. The challenges organizations face within the 21st century involve stronger uncertainty, complexity and differentiation. Two main propositions are advanced as relevant for 21st-century Quality: (1) the importance of QM for the sustainable success of organizations will increase, and organizations should be aware of the larger ecosystem to be managed for improvement, possibly leading to the emergence of a new Quality paradigm, the Civilizational Excellence paradigm; (2) QE should get more attention from QM, and Quality professionals will have to: a) master and apply the Quality tools (basic, intermediate and advanced) in wider contexts and in additional depth; b) have the soft skills needed for success; and c) be results-oriented and better understand and demonstrate the relationships between approaches and results. These propositions challenge both scholars and practitioners to a sustained and supported discussion on the future of Quality. "All things are ready, if our mind be so." (Shakespeare, Henry V, circa 1599).

Relevance: 30.00%

Abstract:

We perform Monte Carlo simulations of the three-dimensional Ising model at the critical temperature and zero magnetic field. We simulate the system in a ball with free boundary conditions on the two-dimensional spherical boundary. Our results for one- and two-point functions in this geometry are consistent with the predictions from the conjectured conformal symmetry of the critical Ising model.
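A minimal single-spin-flip Metropolis sketch of the simulated system follows. For brevity it uses a small periodic cube rather than the paper's ball geometry with free spherical boundary, and the lattice size and sweep count are assumed:

```python
import math
import random

def metropolis_ising3d(L=6, beta=0.2216544, sweeps=50, seed=1):
    """Metropolis Monte Carlo for the 3D Ising model (J = 1, zero field)
    on an L^3 periodic cube. beta is near the critical inverse
    temperature of the 3D Ising model. Returns the magnetization per
    spin after `sweeps` lattice sweeps from an all-up start."""
    rng = random.Random(seed)
    s = [[[1 for _ in range(L)] for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps * L ** 3):
        x, y, z = rng.randrange(L), rng.randrange(L), rng.randrange(L)
        nn = (s[(x + 1) % L][y][z] + s[(x - 1) % L][y][z]
              + s[x][(y + 1) % L][z] + s[x][(y - 1) % L][z]
              + s[x][y][(z + 1) % L] + s[x][y][(z - 1) % L])
        dE = 2.0 * s[x][y][z] * nn           # energy cost of flipping
        if dE <= 0.0 or rng.random() < math.exp(-beta * dE):
            s[x][y][z] = -s[x][y][z]
    total = sum(s[x][y][z] for x in range(L)
                for y in range(L) for z in range(L))
    return total / L ** 3
```

Measuring one- and two-point functions of such spins near a curved free boundary, as the paper does, is what allows the comparison against conformal-symmetry predictions.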

Relevance: 30.00%

Abstract:

ABSTRACT: q-Space-based techniques such as diffusion spectrum imaging, q-ball imaging, and their variations have been used extensively in research for their desired capability to delineate complex neuronal architectures such as multiple fiber crossings in each of the image voxels. The purpose of this article was to provide an introduction to the q-space formalism and the principles of basic q-space techniques together with the discussion on the advantages as well as challenges in translating these techniques into the clinical environment. A review of the currently used q-space-based protocols in clinical research is also provided.
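The q-space coordinate that these techniques sample is set by the diffusion-gradient amplitude and pulse duration via the standard definition q = γδG/(2π). A small sketch with assumed, clinically plausible scanner numbers:

```python
import math

GAMMA_PROTON = 2.675221e8   # proton gyromagnetic ratio, rad s^-1 T^-1

def q_value(G_T_per_m, delta_s):
    """q = gamma * delta * G / (2*pi): the q-space coordinate (m^-1) set
    by diffusion gradient amplitude G (T/m) and pulse duration delta (s).
    q-space methods such as q-ball imaging sample the signal over such q
    values and reconstruct diffusion orientation information from them."""
    return GAMMA_PROTON * delta_s * G_T_per_m / (2.0 * math.pi)

# Assumed clinical-scanner numbers: 40 mT/m gradient, 20 ms pulse.
q = q_value(0.040, 0.020)   # on the order of 3e4 m^-1, probing tens of microns
```

The need for large q (strong, long gradients) at short echo times is one source of the clinical-translation challenges the article discusses.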

Relevance: 30.00%

Abstract:

Computed Tomography (CT) represents the standard imaging modality for tumor volume delineation in radiotherapy treatment planning of retinoblastoma, despite some inherent limitations. CT is very useful in providing physical density information for dose calculation and morphological volumetric information, but it has a low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is not currently integrated into treatment planning; it is used only for diagnosis and follow-up. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in the 3D US, the segmentation of the organs at risk (OAR) in the CT, and the registration of both modalities. In this paper, we present some preliminary results in this direction. We present a 3D active contour-based segmentation of the eye ball and the lens in CT images; the approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. We then present two approaches for the fusion of 3D CT and US images: (i) a landmark-based transformation, and (ii) an object-based transformation that makes use of eye ball contour information in the CT and US images.
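One standard way to implement a landmark-based transformation for CT-US fusion is a least-squares rigid fit (Kabsch/Procrustes) between corresponding landmark sets; the paper's exact algorithm may differ, so this is a hedged sketch:

```python
import numpy as np

def landmark_rigid_transform(P, Q):
    """Least-squares rigid transform mapping landmarks P onto Q (both
    n x 3 arrays of corresponding points, e.g. picked in CT and US).
    Returns rotation R and translation t with Q ~= P @ R.T + t."""
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

An object-based alternative, as in approach (ii), would instead fit the transform to the extracted eye ball contours rather than to discrete landmark pairs.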

Relevance: 30.00%

Abstract:

For radiotherapy treatment planning of retinoblastoma in childhood, Computed Tomography (CT) represents the standard method for tumor volume delineation, despite some inherent limitations. CT is very useful in providing physical density information for dose calculation and morphological volumetric information, but it has a low sensitivity in assessing tumor viability. On the other hand, 3D ultrasound (US) allows a highly accurate definition of the tumor volume thanks to its high spatial resolution, but it is not currently integrated into treatment planning; it is used only for diagnosis and follow-up. Our ultimate goal is the automatic segmentation of the gross tumor volume (GTV) in the 3D US, the segmentation of the organs at risk (OAR) in the CT, and the registration of both. In this paper, we present some preliminary results in this direction. We present a 3D active contour-based segmentation of the eye ball and the lens in CT images; the approach incorporates prior knowledge of the anatomy by using a 3D geometrical eye model. The automated segmentation results are validated by comparison with manual segmentations. Then, for the fusion of 3D CT and US images, we present two approaches: (i) a landmark-based transformation, and (ii) an object-based transformation that makes use of eye ball contour information in the CT and US images.