879 results for Conflict-based method
Abstract:
Since the early days of 3D computer vision, it has been necessary to reduce the data to a tractable size while preserving the important aspects of the scene. This is becoming even more relevant with the new low-cost RGB-D sensors, which provide a stream of color and 3D data at approximately 30 frames per second. Many applications use these sensors and need a preprocessing step that downsamples the data in order to reduce processing time or to improve the data (e.g., by reducing noise or enhancing the important features). In this paper, we present a comparison of downsampling techniques based on different principles. Concretely, five downsampling methods are included: a bilinear-based method, a normal-based method, a color-based method, a combination of the normal- and color-based samplings, and a growing neural gas (GNG)-based approach. For the comparison, two different models acquired with the Blensor software have been used. Moreover, to evaluate the effect of the downsampling in a real application, a 3D non-rigid registration is performed on the sampled data. From the experimentation we conclude that, depending on the purpose of the application, some of the sampling kernels can drastically improve the results. Bilinear- and GNG-based methods provide homogeneous point clouds, whereas color-based and normal-based methods provide datasets with a higher density of points in areas with specific features. In the non-rigid application, using a color-based sampled point cloud makes it possible to properly register two datasets in cases where intensity data are relevant to the model, outperforming the results obtained with a purely homogeneous sampling.
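As a hedged illustration of the kind of preprocessing this abstract compares, the sketch below implements a generic homogeneous voxel-grid downsampling in Python; it is not one of the paper's five kernels, and the function name, voxel size and synthetic cloud are assumptions for illustration only.

```python
# Minimal sketch of a generic homogeneous point-cloud downsampling (voxel-grid
# averaging). Only an illustration of this class of preprocessing; it is not
# one of the paper's five specific sampling kernels.
import numpy as np

def voxel_downsample(points, voxel_size):
    """points: (N, 3) array; returns one averaged point per occupied voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)      # voxel index of each point
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()                                   # 1-D voxel id per point
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.bincount(inverse).astype(float)
    np.add.at(sums, inverse, points)                            # accumulate points per voxel
    return sums / counts[:, None]                               # centroid of each voxel

if __name__ == "__main__":
    cloud = np.random.rand(10_000, 3)          # synthetic cloud in a unit cube
    small = voxel_downsample(cloud, voxel_size=0.1)
    print(cloud.shape, "->", small.shape)      # roughly 1000 voxels at this density
```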
Abstract:
Aim: The aim of this study was to assess the discriminatory power and potential turnaround time (TAT) of a PCR-based method for the detection of methicillin-resistant Staphylococcus aureus (MRSA) from screening swabs. Methods: Screening swabs were examined using the current laboratory protocol of direct culture on mannitol salt agar supplemented with oxacillin (MSAO-direct). The PCR method involved pre-incubation in broth for 4 hours followed by a multiplex PCR with primers directed to the mecA and nuc genes of MRSA. The reference standard was determined by pre-incubation in broth for 4 hours followed by culture on MSAO (MSAO-broth). Results: A total of 256 swabs were analysed. The rates of detection of MRSA using MSAO-direct, MSAO-broth and PCR were 10.2%, 13.3% and 10.2%, respectively. For PCR, the sensitivity, specificity, positive predictive value and negative predictive value were 66.7% (95% CI 51.9-83.3%), 98.6% (95% CI 97.1-100%), 84.6% (95% CI 76.2-100%) and 95.2% (95% CI 92.4-98.0%), respectively; these results were almost identical to those obtained with MSAO-direct. The agreement between MSAO-direct and PCR was 61.5% (95% CI 42.8-80.2%) for positive results, 95.6% (95% CI 93.0-98.2%) for negative results and 92.2% (95% CI 88.9-95.5%) overall. Conclusions: (1) The discriminatory power of PCR and MSAO-direct is similar, but the level of agreement, especially for true positive results, is low. (2) The potential TAT of the PCR method provides a marked advantage over conventional methods. (3) Further modifications to the PCR method, such as increased broth incubation time, use of a selective broth and adaptation to real-time PCR, may lead to improvements in sensitivity and TAT.
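For reference, the sketch below shows how the reported diagnostic figures (sensitivity, specificity, PPV, NPV) follow from a 2x2 contingency table; the counts used are hypothetical, since the abstract reports only percentages, not raw counts.

```python
# Minimal sketch of how diagnostic accuracy figures are computed from a 2x2
# contingency table. The counts below are hypothetical illustrations only.

def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV as fractions."""
    sensitivity = tp / (tp + fn)   # true positives among all reference-positive swabs
    specificity = tn / (tn + fp)   # true negatives among all reference-negative swabs
    ppv = tp / (tp + fp)           # probability a PCR-positive swab is truly positive
    npv = tn / (tn + fn)           # probability a PCR-negative swab is truly negative
    return sensitivity, specificity, ppv, npv

if __name__ == "__main__":
    # Hypothetical counts for a 256-swab screen (not the study's actual data).
    sens, spec, ppv, npv = diagnostic_metrics(tp=22, fp=4, fn=11, tn=219)
    print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%} NPV={npv:.1%}")
```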
Prediction of slurry transport in SAG mills using SPH fluid flow in a dynamic DEM based porous media
Abstract:
DEM modelling of the motion of the coarse fractions of the charge inside SAG mills has now been well established for more than a decade. In these models the effect of slurry has broadly been ignored due to its complexity. Smoothed particle hydrodynamics (SPH) provides a particle-based method for modelling complex free-surface fluid flows and is well suited to modelling fluid flow in mills. Previous modelling has demonstrated the powerful ability of SPH to capture dynamic fluid flow effects such as lifters crashing into slurry pools, fluid draining from lifters, flow through grates and pulp lifter discharge. However, all these examples were limited to modelling only the slurry in the mill, without the charge. In this paper, we represent the charge as a dynamic porous medium through which the SPH fluid is then able to flow. The porous media properties (specifically the spatial distributions of porosity and velocity) are predicted by time-averaging the mill charge predicted by a large-scale DEM model. This allows prediction of transient and steady-state slurry distributions in the mill and allows their variation with operating parameters, slurry viscosity and slurry volume to be explored. (C) 2006 Published by Elsevier Ltd.
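As a rough, hedged sketch of the time-averaging idea only (not the authors' implementation), the snippet below bins DEM particle positions onto a 2D grid and averages over snapshots to obtain a porosity field; all names, shapes and the grid resolution are assumptions.

```python
# Minimal sketch: estimating a spatial porosity field by time-averaging DEM
# particle positions onto a regular 2D grid. Illustrative only; not the paper's code.
import numpy as np

def porosity_field(snapshots, particle_area, domain, shape=(64, 64)):
    """snapshots: sequence of (N, 2) arrays of particle centre positions over time.
    particle_area: cross-sectional area occupied by one DEM particle (m^2).
    domain: ((xmin, xmax), (ymin, ymax)). Returns porosity = 1 - solid fraction per cell."""
    (xmin, xmax), (ymin, ymax) = domain
    cell_area = ((xmax - xmin) / shape[0]) * ((ymax - ymin) / shape[1])
    solid = np.zeros(shape)
    for pos in snapshots:
        hist, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=shape,
                                    range=[[xmin, xmax], [ymin, ymax]])
        solid += hist * particle_area / cell_area
    solid /= len(snapshots)                  # time average of the solid fraction
    return np.clip(1.0 - solid, 0.0, 1.0)    # porosity seen by the SPH fluid

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    snaps = [rng.uniform(0.0, 1.0, size=(500, 2)) for _ in range(10)]  # fake DEM snapshots
    print(porosity_field(snaps, particle_area=2e-4, domain=((0, 1), (0, 1))).mean())
```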
Abstract:
This study was undertaken to develop a simple laboratory-based method for simulating the freezing profiles of beef trim so that their effect on E. coli O157 survival could be better assessed. A commercially available apparatus of the type used for freezing embryos, together with an associated temperature logger and software, was used for this purpose, with a -80 °C freezer as a heat sink. Four typical beef trim freezing profiles, of different starting temperatures or lengths, were selected and modelled as straight lines for ease of manipulation. A further theoretical profile with an extended freezing plateau was also developed. The laboratory-based setup worked well and the modelled freezing profiles fitted closely to the original data. No change in numbers of any of the strains was apparent for the three simulated profiles of different lengths starting at 25 °C. Slight but significant (P < 0.05) decreases in numbers (~0.2 log cfu g⁻¹) of all strains were apparent for a profile starting at 12 °C. A theoretical version of this profile with the freezing plateau phase extended from 11 h to 17 h resulted in significant (P < 0.05) decreases in numbers (~1.2 log cfu g⁻¹) of all strains. Results indicated possible avenues for future research in controlling this pathogen. The method developed in this study proved a useful and cost-effective way of simulating freezing profiles of beef trim. (c) 2005 Elsevier B.V. All rights reserved.
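A minimal sketch of the piecewise-linear profile representation described above, with made-up breakpoints; it only illustrates how a modelled freezing profile can be queried at arbitrary times and is not the study's apparatus software.

```python
# Minimal sketch (hypothetical values): a beef-trim freezing profile modelled as
# straight-line segments, queried by linear interpolation. Breakpoints are illustrative.
import numpy as np

# (time in hours, temperature in degrees C) breakpoints of a modelled profile:
# chill from 12 C, hold on a freezing plateau, then fall towards storage temperature.
profile = np.array([(0.0, 12.0), (3.0, -1.0), (14.0, -2.0), (24.0, -18.0)])

def temperature_at(t_hours, profile=profile):
    """Linear interpolation between the profile breakpoints."""
    return np.interp(t_hours, profile[:, 0], profile[:, 1])

if __name__ == "__main__":
    for t in (0, 6, 14, 24):
        print(f"t = {t:4.1f} h -> {temperature_at(t):6.1f} C")
```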
Abstract:
A sensitive quantitative reversed-phase HPLC method is described for measuring bacterial proteolysis and proteinase activity in UHT milk. The analysis is performed on a TCA filtrate of the milk. The optimum concentration of TCA was found to be 4%; at lower concentrations, non-precipitated protein blocked the HPLC while higher concentrations yielded lower amounts of peptides. The method showed greater sensitivity and reproducibility than a fluorescamine-based method. Quantification of the HPLC method was achieved by use of an external dipeptide standard or a standard proteinase. (c) 2006 Elsevier Ltd. All rights reserved.
Abstract:
We have developed an alignment-free method that calculates phylogenetic distances using a maximum-likelihood approach for a model of sequence change on patterns that are discovered in unaligned sequences. To evaluate the phylogenetic accuracy of our method, and to conduct a comprehensive comparison of existing alignment-free methods (freely available as the Python package decaf+py at http://www.bioinformatics.org.au), we have created a data set of reference trees covering a wide range of phylogenetic distances. Amino acid sequences were evolved along the trees and input to the tested methods; from their calculated distances we inferred trees whose topologies we compared to the reference trees. We find our pattern-based method statistically superior to all other tested alignment-free methods. We also demonstrate the general advantage of alignment-free methods over an approach based on automated alignments when sequences violate the assumption of collinearity. Similarly, we compare methods on empirical data from an existing alignment benchmark set that we used to derive reference distances and trees. Our pattern-based approach yields distances that show a linear relationship to reference distances over a substantially longer range than other alignment-free methods. The pattern-based approach outperforms the other alignment-free methods, and its phylogenetic accuracy is statistically indistinguishable from that of alignment-based distances.
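To make the alignment-free idea concrete, the sketch below computes a simple k-mer frequency distance between unaligned protein sequences; this is explicitly not the paper's maximum-likelihood pattern method, only a generic stand-in, and the sample sequences are arbitrary.

```python
# Minimal sketch of a generic alignment-free distance between two unaligned
# sequences, based on k-mer frequency vectors. NOT the maximum-likelihood pattern
# method of the paper; it only illustrates the alignment-free idea.
from collections import Counter
from itertools import product
import math

def kmer_freqs(seq, k=3, alphabet="ACDEFGHIKLMNPQRSTVWY"):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(sum(counts.values()), 1)
    return {"".join(p): counts.get("".join(p), 0) / total
            for p in product(alphabet, repeat=k)}

def kmer_distance(seq_a, seq_b, k=3):
    """Euclidean distance between k-mer frequency profiles (0 = identical profiles)."""
    fa, fb = kmer_freqs(seq_a, k), kmer_freqs(seq_b, k)
    return math.sqrt(sum((fa[w] - fb[w]) ** 2 for w in fa))

if __name__ == "__main__":
    print(kmer_distance("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                        "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEK"))
```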
Abstract:
Background/aims: Macular pigment is thought to protect the macula against exposure to light and oxidative stress, both of which may play a role in the development of age-related macular degeneration. The aim was to clinically evaluate a novel cathode-ray-tube-based method for measurement of macular pigment optical density (MPOD) known as apparent motion photometry (AMP). Methods: The authors took repeat readings of MPOD centrally (0°) and at 3° eccentricity for 76 healthy subjects (mean (±SD) age 26.5±13.2 years, range 18-74 years). Results: The overall mean MPOD for the cohort was 0.50±0.24 at 0° and 0.28±0.20 at 3° eccentricity; these values were significantly different (t=-8.905, p<0.001). The coefficients of repeatability were 0.60 and 0.48 for the 0° and 3° measurements, respectively. Conclusions: The data suggest that when the same operator takes repeated 0° AMP MPOD readings over time, only changes of more than 0.60 units can be classed as clinically significant. In other words, AMP is not suitable for monitoring changes in MPOD over time, as increases of this magnitude would not be expected even in response to dietary modification or nutritional supplementation.
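The coefficient of repeatability quoted above is commonly computed, in Bland-Altman fashion, as 1.96 times the standard deviation of the differences between paired repeat readings; the sketch below shows that calculation on made-up MPOD values, and the study's exact procedure may differ.

```python
# Minimal sketch: coefficient of repeatability (CoR) as commonly defined in
# Bland-Altman analysis (1.96 x SD of the differences between paired repeat
# readings). MPOD values below are invented for illustration.
import numpy as np

def coefficient_of_repeatability(first, second):
    diffs = np.asarray(first, float) - np.asarray(second, float)
    return 1.96 * diffs.std(ddof=1)

if __name__ == "__main__":
    first_reading  = [0.45, 0.62, 0.30, 0.51, 0.70, 0.38]   # hypothetical MPOD, visit 1
    second_reading = [0.55, 0.40, 0.42, 0.60, 0.52, 0.31]   # hypothetical MPOD, visit 2
    print(f"CoR = {coefficient_of_repeatability(first_reading, second_reading):.2f}")
```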
Abstract:
We present a novel market-based method, inspired by retail markets, for resource allocation in fully decentralised systems where agents are self-interested. Our market mechanism requires no coordinating node or complex negotiation. The stability of outcome allocations, those at equilibrium, is analysed and compared for three buyer behaviour models. In order to capture the interaction between self-interested agents, we propose the use of competitive coevolution. Our approach is highly scalable and may be tuned to achieve specified outcome resource allocations. We demonstrate the behaviour of our approach in simulation, where evolutionary market agents act on behalf of service-providing nodes to adaptively price their resources over time in response to market conditions. We show that this leads the system to the predicted outcome resource allocation. Furthermore, the system remains stable in the presence of small changes in price when buyers' decision functions degrade gracefully. © 2009 The Author(s).
Abstract:
Online communities (OCs) are an expanding social phenomenon attracting increasing interest from marketing practitioners, and community managers aim to increase OCs' social capital. The diversity of the individuals interacting in OCs provokes considerable conflict. However, the influence of online conflict on OCs' social capital is not clear, as research indicates both positive and negative effects. This research aims to explain these contradictory effects by conceptualizing conflict as drama and developing a typology of online conflict. Based on netnographic investigations of a forum, four types of conflict are distinguished depending on the valence of emotions and the type of members involved. The research contributes to the literature on OC dynamics and is of particular interest to community managers working in any company or organization.
Abstract:
We consider a finite-state-automata-based method for solving a system of linear Diophantine equations with coefficients from the set {-1, 0, 1} and solutions in {0, 1}.
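Since the abstract does not spell out the automaton construction, the sketch below gives only a brute-force reference solver over {0, 1} assignments, useful for checking any cleverer method on small systems; it is not the automata-based algorithm itself.

```python
# Minimal sketch (not the automata construction from the paper): a brute-force
# reference solver that enumerates all {0,1} assignments x satisfying A x = b,
# where the entries of A are drawn from {-1, 0, 1}. A baseline for small systems.
from itertools import product

def solve_01(A, b):
    """Yield every 0/1 vector x with A x == b (A: list of rows, b: list of targets)."""
    n = len(A[0])
    for x in product((0, 1), repeat=n):
        if all(sum(a_ij * x_j for a_ij, x_j in zip(row, x)) == b_i
               for row, b_i in zip(A, b)):
            yield x

if __name__ == "__main__":
    # x1 - x2 + x3 = 1,  x1 + x2 = 1   (all coefficients in {-1, 0, 1})
    A = [[1, -1, 1],
         [1,  1, 0]]
    b = [1, 1]
    print(list(solve_01(A, b)))   # -> [(1, 0, 0)]
```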
Abstract:
The scope of this paper is to present a Pulse Width Modulation (PWM) based method for Active Power (AP) and Reactive Power (RP) measurements as can be applied in power meters. The main aim of the material presented is twofold: first, to present a realization methodology for the proposed algorithm, and second, to verify the algorithm's robustness and validity. The method takes advantage of the fact that the frequencies present in a power line lie in a specific fundamental frequency range (a range centred on 50 Hz or 60 Hz) and that, when harmonics are present, the frequencies dominating the power line spectrum can be specified on the basis of the fundamental. In contrast to a number of existing methods, the presented method requires no time delay or shifting of the input signal, and the π/2 time shift of the current signal with respect to the voltage signal required by many existing measurement techniques does not apply to the PWM method either.
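For context, the sketch below evaluates active and reactive power from sampled waveforms using the classical definitions, including the quarter-cycle (π/2) shift that the PWM method avoids; the signal parameters are illustrative assumptions, and this is not the paper's PWM algorithm.

```python
# Minimal sketch of the classical sample-based definitions of active power (P)
# and reactive power (Q). NOT the PWM method of the paper: it uses the pi/2
# shift that the paper's approach avoids. Signal parameters are made up.
import numpy as np

f0, fs, cycles = 50.0, 10_000.0, 10             # fundamental, sample rate, window length
t = np.arange(int(fs * cycles / f0)) / fs
v = 325.0 * np.sin(2 * np.pi * f0 * t)                     # ~230 V RMS mains voltage
i = 7.0 * np.sin(2 * np.pi * f0 * t - np.pi / 6)           # current lagging by 30 degrees

P = np.mean(v * i)                               # active power: mean instantaneous power
v_shift = 325.0 * np.sin(2 * np.pi * f0 * t - np.pi / 2)   # voltage delayed by a quarter cycle
Q = np.mean(v_shift * i)                         # reactive power via the quarter-cycle shift

print(f"P = {P:.1f} W, Q = {Q:.1f} var")         # ~985 W and ~569 var for this 30-degree lag
```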
Abstract:
Objective: Images on food and dietary supplement packaging might lead people to infer (appropriately or inappropriately) certain health benefits of those products. Research on this issue largely involves direct questions, which could (a) elicit inferences that would not be made unprompted, and (b) fail to capture inferences made implicitly. Using a novel memory-based method, in the present research we explored whether packaging imagery elicits health inferences without prompting, and the extent to which these inferences are made implicitly. Method: In 3 experiments, participants saw fictional product packages accompanied by written claims. Some packages contained an image that implied a health-related function (e.g., a brain), and some contained no image. Participants studied these packages and claims, and subsequently their memory for seen and unseen claims was tested. Results: When a health image was featured on a package, participants often subsequently recognized health claims that, despite being implied by the image, were not actually presented. In Experiment 2, these recognition errors persisted despite an explicit warning against treating the images as informative. In Experiment 3, these findings were replicated in a large consumer sample from 5 European countries, and with a cued-recall test. Conclusion: These findings confirm that images can act as health claims by leading people to infer health benefits without prompting. These inferences appear often to be implicit and could therefore be highly pervasive. The data underscore the importance of regulating imagery on product packaging; memory-based methods represent innovative ways to measure how leading (or misleading) specific images can be. (PsycINFO Database Record (c) 2016 APA, all rights reserved)
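A hedged sketch of how such a memory-based measure can be scored: comparing false-recognition rates for unseen claims that were image-implied versus unseen control claims. The data layout and values are invented for illustration and are not the authors' analysis.

```python
# Minimal sketch (hypothetical data layout, not the authors' analysis code): score
# the rate at which unseen but image-implied health claims are falsely "recognized",
# compared with unseen claims that were not implied by any image.
from statistics import mean

# Each trial: (claim_was_presented, claim_was_implied_by_image, participant_said_seen)
trials = [
    (False, True,  True),    # unseen but image-implied claim, falsely recognized
    (False, True,  False),
    (False, False, False),   # unseen and not implied, correctly rejected
    (False, False, False),
    (True,  False, True),    # genuinely presented claim, correctly recognized
]

def false_alarm_rate(trials, implied):
    responses = [said_seen for presented, was_implied, said_seen in trials
                 if not presented and was_implied == implied]
    return mean(responses) if responses else float("nan")

print("false alarms, image-implied claims:", false_alarm_rate(trials, implied=True))
print("false alarms, control claims:      ", false_alarm_rate(trials, implied=False))
```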
Abstract:
Nowadays, with the use of social media becoming widespread, increasingly more people gather online to share their passion for specific consumption activities. Despite this shared passion, conflicts frequently erupt in online communities of consumption (OCC). A systematic review of the literature revealed that a substantial body of knowledge has developed on OCC conflict. Different types of conflict unfolding in an OCC context have been distinguished, various drivers of conflict identified, and various consequences outlined at the individual level (experiential value) and the community level (collective engagement and community culture). However, the specificity of conflicts unfolding in an OCC context has not been conceptualized, and past research is inconclusive as to where and when OCC conflict creates or destroys value in communities. This research provides a theory of OCC conflict and its impact on value formation by conceptualizing OCC conflicts as performances. The theory was developed by conducting a netnography of a clubbing forum. Close to 20,000 forum posts and 250 pages of interview transcripts and field notes were collected over 27 months and analysed following the principles of grounded theory. Four types of conflict performances are distinguished (personal, played, reality show and trolling conflict) based on the clarity of the performance. Each type of conflict performance is positioned with regard to its roots and its consequences for value formation. This research develops knowledge on disharmonious interactions in OCCs, contributing to a less utopian perspective of OCCs. It indicates how conflict is not only a byproduct of consumption but also a phenomenon that is itself consumed. It also introduces the concept of performance clarity to the literature on performance consumption. Finally, this research provides guidelines to community managers on how to manage conflict and raises ethical issues regarding the management of conflict on social media.
Abstract:
Recent technological developments have made it possible to design various microdevices in which fluid flow and heat transfer are involved. For the proper design of such systems, the governing physics needs to be investigated. Due to the difficulty of studying complex geometries at micro scales using experimental techniques, computational tools are developed to analyze and simulate flow and heat transfer in microgeometries. However, conventional numerical methods based on the Navier-Stokes equations fail to predict some aspects of microflows, such as the nonlinear pressure distribution, increased mass flow rate, slip flow and temperature jump at the solid boundaries. This necessitates the development of new computational methods, based on kinetic theory, that are both accurate and computationally efficient. In this study, the lattice Boltzmann method (LBM) was used to investigate flow and heat transfer in micro-sized geometries. The LBM is based on the Boltzmann equation, which is valid over the whole range of rarefaction regimes observed in microflows. Results were obtained for isothermal channel flows at Knudsen numbers higher than 0.01 at different pressure ratios. LBM solutions for micro-Couette and micro-Poiseuille flow were found to be in good agreement with the analytical solutions valid in the slip flow regime (0.01 < Kn < 0.1) and with direct simulation Monte Carlo solutions valid in the transition regime (0.1 < Kn < 10) for the pressure distribution and velocity field. The isothermal LBM was further extended to simulate flows including heat transfer. The method was first validated for continuum channel flows with and without constrictions by comparing the thermal LBM results against accurate solutions obtained from analytical equations and the finite element method. Finally, the capability of the thermal LBM was improved by adding the effect of rarefaction, and the method was used to analyze the behavior of gas flow in microchannels. The major finding of this research is that the newly developed particle-based method described here can be used as an alternative numerical tool to study non-continuum effects observed in micro-electro-mechanical systems (MEMS).
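The snippet below is a minimal, hedged illustration of the core lattice Boltzmann update (equilibrium, BGK collision, streaming) on a D2Q9 lattice with periodic boundaries; it omits the slip and temperature-jump boundary treatments discussed above and is not the thesis code.

```python
# Minimal D2Q9 lattice Boltzmann (BGK) sketch in lattice units, showing the core
# collide-and-stream update. Periodic boundaries only; no slip or temperature-jump
# wall models, so this is an illustration rather than the thesis implementation.
import numpy as np

nx, ny, tau, steps = 64, 32, 0.8, 200
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy          # c_i . u
    usq = ux**2 + uy**2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

# start from rest with a small density bump so the field actually evolves
rho = np.ones((nx, ny)); rho[nx//2, ny//2] += 0.05
f = equilibrium(rho, np.zeros((nx, ny)), np.zeros((nx, ny)))

for _ in range(steps):
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f += (equilibrium(rho, ux, uy) - f) / tau                      # BGK collision
    for i in range(9):                                             # streaming step
        f[i] = np.roll(f[i], shift=(c[i, 0], c[i, 1]), axis=(0, 1))

print("mass conserved:", np.isclose(f.sum(), nx*ny + 0.05))
```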