994 results for interplay theory experiment


Relevance: 30.00%

Abstract:

Mineral processing plants use two main processes: comminution and separation. The objective of the comminution process is to break complex particles consisting of numerous minerals into smaller, simpler particles in which each particle consists primarily of only one mineral. The process by which the mineral composition distribution in particles changes due to breakage is called 'liberation'. The purpose of separation is to separate particles consisting of valuable mineral from those containing nonvaluable mineral. The energy required to break particles to fine sizes is expensive, and therefore the mineral processing engineer must design the circuit so that the breakage of liberated particles is reduced in favour of breaking composite particles. In order to effectively optimize a circuit through simulation it is necessary to predict how the mineral composition distributions change due to comminution. Such a model is called a 'liberation model for comminution'. It was generally considered that such a model should incorporate information about the ore, such as its texture. However, the relationship between the feed and product particles can be estimated using a probability method, where the probability is defined as the probability that a feed particle of a particular composition and size will form a product particle of a particular size and composition. The model is based on maximizing the entropy of this probability subject to mass and composition constraints. This methodology allows a liberation model to be developed not only for binary particles but also for particles consisting of many minerals. Results from applying the model to a real plant ore are presented. A laboratory ball mill was used to break particles, and the results from this experiment were used to estimate the kernel that represents the relationship between parent and progeny particles. A second feed, consisting primarily of heavy particles subsampled from the main ore, was then ground through the same mill. The results from the first experiment were used to predict the product of the second experiment. The agreement between the predicted and actual results is very good. It is therefore recommended that more extensive validation be carried out to fully evaluate the substance of the method. (C) 2003 Elsevier Ltd. All rights reserved.
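
The entropy-maximization step described above can be sketched in a few lines. The grade grid, feed grade and single grade-conservation constraint below are illustrative assumptions of ours, not the constraints used in the paper; the sketch only shows the shape of the calculation, namely maximizing the entropy of the product-composition probabilities subject to normalization and a composition constraint.

    # Sketch: maximum-entropy estimate of the probability that a parent particle
    # of mean grade g_feed breaks into product particles of grade class g_j.
    # Grid, feed grade and constraints are hypothetical, for illustration only.
    import numpy as np
    from scipy.optimize import minimize

    g = np.linspace(0.0, 1.0, 12)       # product composition (grade) classes
    g_feed = 0.3                        # mean grade of the parent particle (assumed)

    def neg_entropy(p):
        p = np.clip(p, 1e-12, None)
        return np.sum(p * np.log(p))    # minimizing this maximizes the entropy

    constraints = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},   # mass (probability) constraint
        {"type": "eq", "fun": lambda p: p @ g - g_feed},  # composition (grade) conservation
    ]
    p0 = np.full(g.size, 1.0 / g.size)
    res = minimize(neg_entropy, p0, bounds=[(0.0, 1.0)] * g.size,
                   constraints=constraints)
    kernel_row = res.x   # one row of a breakage kernel: P(product grade | parent grade)

Repeating such a calculation for every parent size and grade class would give the kind of kernel that the first ball-mill experiment is used to estimate.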

Relevance: 30.00%

Abstract:

The data structure of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. This research develops a methodology for evaluating, ex ante, the relative desirability of alternative data structures for end-user queries. This research theorizes that the data structure that yields the lowest weighted average complexity for a representative sample of information requests is the most desirable data structure for end-user queries. The theory was tested in an experiment that compared queries against two different relational database schemas. As theorized, end users querying the data structure associated with the less complex queries performed better. Complexity was measured using three different Halstead metrics, and each of the three metrics provided excellent predictions of end-user performance. This research supplies strong evidence that organizations can use complexity metrics to evaluate, ex ante, the desirability of alternative data structures. Organizations can use these evaluations to enhance the efficient and effective retrieval of information by creating data structures that minimize end-user query complexity.
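
As an illustration of the kind of measurement involved, here is a rough sketch of Halstead counts for a single SQL query. The operator/operand classification is deliberately simplified and the token rules are an assumption of ours, not the coding scheme used in the study.

    # Rough sketch: Halstead length, difficulty and effort for one SQL query.
    # The operator/operand split below is simplified for illustration.
    import math, re

    def halstead(query, operators=("SELECT", "FROM", "WHERE", "JOIN", "ON", "=", "AND")):
        tokens = re.findall(r"[A-Za-z_][A-Za-z_0-9.]*|[=<>]+|\S", query)
        ops = [t for t in tokens if t.upper() in operators]
        opnds = [t for t in tokens if t.upper() not in operators]
        n1, n2 = len(set(o.upper() for o in ops)), len(set(opnds))
        N1, N2 = len(ops), len(opnds)
        length = N1 + N2                                   # program length N
        volume = length * math.log2(n1 + n2) if n1 + n2 else 0.0
        difficulty = (n1 / 2) * (N2 / n2) if n2 else 0.0   # D = (n1/2) * (N2/n2)
        effort = difficulty * volume                       # E = D * V
        return length, difficulty, effort

    print(halstead("SELECT name FROM employee JOIN dept ON employee.dept_id = dept.id"))

Averaging such scores over a weighted, representative sample of queries is what allows two candidate data structures to be compared before either is built.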

Relevance: 30.00%

Abstract:

The schema of an information system can significantly impact the ability of end users to efficiently and effectively retrieve the information they need. Obtaining quickly the appropriate data increases the likelihood that an organization will make good decisions and respond adeptly to challenges. This research presents and validates a methodology for evaluating, ex ante, the relative desirability of alternative instantiations of a model of data. In contrast to prior research, each instantiation is based on a different formal theory. This research theorizes that the instantiation that yields the lowest weighted average query complexity for a representative sample of information requests is the most desirable instantiation for end-user queries. The theory was validated by an experiment that compared end-user performance using an instantiation of a data structure based on the relational model of data with performance using the corresponding instantiation of the data structure based on the object-relational model of data. Complexity was measured using three different Halstead metrics: program length, difficulty, and effort. For a representative sample of queries, the average complexity using each instantiation was calculated. As theorized, end users querying the instantiation with the lower average complexity made fewer semantic errors, i.e., were more effective at composing queries. (c) 2005 Elsevier B.V. All rights reserved.
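
In symbols (our notation, not the paper's), the selection rule amounts to computing, for each candidate instantiation, the weighted average complexity over the representative sample of information requests and choosing the instantiation that minimizes it:

    \bar{C} \;=\; \frac{\sum_{i=1}^{n} w_i \, C_i}{\sum_{i=1}^{n} w_i}

where C_i is the Halstead complexity (program length, difficulty, or effort) of the query answering request i under that instantiation, and w_i is the weight of the request (for example, its relative frequency).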

Relevance: 30.00%

Abstract:

In the absence of an external frame of reference (i.e., in background-independent theories such as general relativity), physical degrees of freedom must describe relations between systems. Using a simple model, we investigate how such a relational quantum theory naturally arises by promoting reference systems to the status of dynamical entities. Our goal is twofold. First, we demonstrate, using elementary quantum theory, how any quantum mechanical experiment admits a purely relational description at a fundamental level. Second, we describe how the original non-relational theory approximately emerges from the fully relational theory when reference systems become semi-classical. Our technique is motivated by a Bayesian approach to quantum mechanics, and relies on the noiseless subsystem method of quantum information science used to protect quantum states against undesired noise. The relational theory naturally predicts a fundamental decoherence mechanism, so an arrow of time emerges from a time-symmetric theory. Moreover, our model circumvents the problem of the collapse of the wave packet, as the probability interpretation is only ever applied to diagonal density operators. Finally, the physical states of the relational theory can be described in terms of spin networks, introduced by Penrose as a combinatorial description of geometry and widely studied in the loop formulation of quantum gravity. Thus, our simple bottom-up approach (starting from the semiclassical limit to derive the fully relational quantum theory) may offer interesting insights into the low-energy limit of quantum gravity.
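
The fundamental decoherence mechanism and the restriction to diagonal density operators can be illustrated with a toy calculation of our own (not the paper's model): if the external phase reference is unavailable, all physically accessible predictions come from the state averaged over the unknown frame orientation, and that average is diagonal.

    # Toy illustration (ours, not the paper's model): losing an external phase
    # reference amounts to averaging over an unknown relative phase, which removes
    # the off-diagonal terms of the density matrix, a simple decoherence mechanism.
    import numpy as np

    psi = np.array([1.0, 1.0]) / np.sqrt(2)           # |+> relative to some external frame
    rho = np.outer(psi, psi.conj())                   # coherent state: off-diagonals = 0.5

    phases = np.linspace(0, 2 * np.pi, 400, endpoint=False)
    rho_rel = np.zeros((2, 2), dtype=complex)
    for phi in phases:                                # average over the unknown frame orientation
        U = np.diag([1.0, np.exp(1j * phi)])
        rho_rel += U @ rho @ U.conj().T
    rho_rel /= phases.size

    print(np.round(rho, 3))       # [[0.5, 0.5], [0.5, 0.5]]
    print(np.round(rho_rel, 3))   # ~[[0.5, 0.0], [0.0, 0.5]]  (diagonal)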

Relevance: 30.00%

Abstract:

This paper describes an experiment in the design of distributed programs. It is based on the theory of Owicki and Gries extended with rules for reasoning about message passing. The experiment is designed to test the effectiveness of the extended theory for designing distributed programs.

Relevance: 30.00%

Abstract:

In this thesis work we develop a new generative model of social networks belonging to the family of Time-Varying Networks. Correctly modelling the mechanisms that shape the growth of a network and the dynamics of edge activation and inactivation is of central importance in network science. Indeed, by means of generative models that mimic the real-world dynamics of contacts in social networks it is possible to forecast the outcome of an epidemic process, optimize an immunization campaign, or optimally spread information among individuals. This task can now be tackled by taking advantage of the recent availability of large-scale, high-quality and time-resolved datasets. This wealth of digital data has allowed us to deepen our understanding of the structure and properties of many real-world networks. Moreover, the empirical evidence of a temporal dimension in networks prompted the switch of paradigm from a static representation of graphs to a time-varying one. In this work we exploit the Activity-Driven paradigm (a modelling tool belonging to the family of Time-Varying Networks) to develop a general dynamical model that encodes two fundamental mechanisms shaping the topology and temporal structure of social networks: social capital allocation and burstiness. The former accounts for the fact that individuals do not invest their time and social interactions at random, but rather allocate them toward already known nodes of the network. The latter accounts for the heavy-tailed distributions of inter-event times in social networks. We empirically measure the properties of these two mechanisms in seven real-world datasets, develop a data-driven model, and solve it analytically. We then check the results against numerical simulations and test our predictions on real-world datasets, finding good agreement between the two. Moreover, we find and characterize a non-trivial interplay between burstiness and social capital allocation in the parameter phase space. Finally, we present a novel approach to the development of a complete generative model of Time-Varying Networks. This model is inspired by Kauffman's theory of the adjacent possible and is based on a generalized version of the Pólya urn. Remarkably, most of the complex and heterogeneous features of real-world social networks are naturally reproduced by this dynamical model, together with many higher-order topological properties (clustering coefficient, community structure, etc.).
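
A minimal sketch of the two ingredients named above, with illustrative parameter values rather than the ones measured from the seven datasets: a reinforcement rule in which a node that already has k ties starts a new one with probability c/(c + k) (social capital allocation), and power-law inter-event times (burstiness).

    # Sketch of an activity-driven contact process with memory and burstiness.
    # Parameter values are illustrative, not fitted to the empirical datasets.
    import random

    N, C, ALPHA, T_MAX = 500, 1.0, 2.5, 200.0

    def powerlaw_waiting_time(alpha=ALPHA, t_min=1.0):
        # inverse-transform sample of P(tau) ~ tau**(-alpha) for tau >= t_min
        return t_min * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0))

    next_event = {i: powerlaw_waiting_time() for i in range(N)}   # bursty activation clocks
    contacts = {i: set() for i in range(N)}                       # egocentric networks

    t = 0.0
    while t < T_MAX:
        i = min(next_event, key=next_event.get)                   # node that activates next
        t = next_event[i]
        k = len(contacts[i])
        if k < N - 1 and random.random() < C / (C + k):           # allocate capital to a new tie
            j = random.choice([n for n in range(N) if n != i and n not in contacts[i]])
        else:                                                      # reinforce an existing tie
            j = random.choice(list(contacts[i]))
        contacts[i].add(j)
        contacts[j].add(i)
        next_event[i] = t + powerlaw_waiting_time()                # next bursty activation of i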

Relevance: 30.00%

Abstract:

Brand extensions are increasingly used by multinational corporations in emerging markets such as China. However, understanding how consumers in emerging markets evaluate brand extensions is hampered by a lack of research in emerging-market contexts. To address this knowledge void, we built on an established Western brand extension evaluation framework, namely Aaker and Keller (1990), and extended the model by incorporating two new factors: perceived fit based on brand image consistency, and competition intensity in the brand extension category. These two factors were added in recognition of the unique considerations of consumers in emerging markets when evaluating brand extensions. The extended model was tested in an empirical experiment with consumers in China. The results partly validated the Aaker and Keller model, and also provided evidence that both newly added factors significantly influenced consumers' evaluations of brand extensions. More importantly, one of the new factors, consumer-perceived fit based on brand image consistency, was found to be more significant than all the factors in Aaker and Keller's original model, suggesting that the Aaker and Keller model may be limited in explaining how consumers in emerging markets evaluate brand extensions. Further research implications and limitations are discussed in the paper.
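
For concreteness, the extended model can be read as a regression of extension evaluation on the original Aaker and Keller predictors plus the two added factors. The sketch below is ours: the variable names are shorthand for the study's constructs and the data are random placeholders, so it only shows the form of the analysis, not its measures or results.

    # Sketch of the extended evaluation model as a regression (our illustration).
    # Column names are shorthand for the constructs; the data are placeholders.
    import numpy as np, pandas as pd, statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 200
    df = pd.DataFrame({
        "quality":        rng.normal(size=n),   # perceived quality of the parent brand
        "fit_transfer":   rng.normal(size=n),   # A&K fit: transferability of skills
        "fit_complement": rng.normal(size=n),   # A&K fit: complementarity
        "fit_substitute": rng.normal(size=n),   # A&K fit: substitutability
        "fit_image":      rng.normal(size=n),   # added factor: brand-image consistency
        "competition":    rng.normal(size=n),   # added factor: competition intensity
    })
    df["evaluation"] = rng.normal(size=n)       # placeholder outcome variable

    model = smf.ols("evaluation ~ quality + fit_transfer + fit_complement"
                    " + fit_substitute + fit_image + competition", data=df).fit()
    print(model.summary())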

Relevance: 30.00%

Abstract:

Over recent years, evidence has been accumulating in favour of the importance of long-term information as a variable which can affect the success of short-term recall. Lexicality, word frequency, imagery and meaning have all been shown to augment short-term recall performance. Two competing theories as to the causes of this long-term memory influence are outlined and tested in this thesis. The first approach is the order-encoding account, which ascribes the effect to the usage of resources at encoding, hypothesising that word lists which require less effort to process will benefit from increased levels of order encoding, in turn enhancing recall success. The alternative view, trace redintegration theory, suggests that order is automatically encoded phonologically, and that long-term information can only influence the interpretation of the resultant memory trace. The free recall experiments reported here attempted to determine the importance of order encoding as a facilitatory framework and to determine the locus of the effects of long-term information in free recall. Experiments 1 and 2 examined the effects of word frequency and semantic categorisation over a filled delay, and Experiments 3 and 4 did the same for immediate recall. Free recall was improved by both long-term factors tested. Order information was not used over a short filled delay, but was evident in immediate recall. Furthermore, both long-term factors increased the amount of order information retained. Experiment 5 induced an order-encoding effect over a filled delay, leaving a picture of short-term processes that are closely associated with long-term processes, and that fit conceptions of short-term memory as part of language processing rather better than either the encoding-based or the retrieval-based models. Experiments 6 and 7 aimed to determine to what extent phonological processes were responsible for the pattern of results observed. Articulatory suppression affected the encoding of order information where speech rate had no direct influence, suggesting that ease of lexical access is the most important factor in the influence of long-term memory on immediate recall tasks. The evidence presented in this thesis does not offer complete support for either the retrieval-based account or the order-encoding account of long-term influence. Instead, the evidence sits best with models that are based upon language processing. The path urged for future research is to find ways in which this diffuse model can be better specified and can take account of the versatility of the human brain.

Relevance: 30.00%

Abstract:

Self-similar optical pulses (or “similaritons”) with a parabolic intensity profile can be found as asymptotic solutions of the nonlinear Schrödinger equation in a gain medium such as a fiber amplifier or laser resonator. These solutions represent an example, of wide-ranging significance, of dissipative nonlinear structures in optics. Here, we address some issues related to the formation and evolution of parabolic pulses in a fiber gain medium by means of semi-analytic approaches. In particular, the effect of third-order dispersion on the structure of the asymptotic solution is examined. Our analysis is based on the resolution of ordinary differential equations, which enable us to describe with sufficient accuracy the main propagation properties and structural characteristics of the pulse that are observable in direct numerical simulations of the underlying partial differential equation model.
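
The direct numerical simulations mentioned above are typically split-step integrations of the nonlinear Schrödinger equation with gain. The short sketch below is our own, with illustrative parameters and sign conventions stated in the comments; it shows the basic scheme under which an input pulse reshapes toward the parabolic similariton, with the third-order dispersion term included as a perturbation.

    # Split-step sketch of the governing equation (our illustration; parameters
    # and the input pulse are arbitrary):
    #   A_z = -i(beta2/2) A_tt + (beta3/6) A_ttt + i*gamma*|A|^2 A + (g/2) A
    # With normal dispersion (beta2 > 0) and gain, the pulse evolves toward the
    # parabolic similariton; beta3 perturbs that asymptotic shape.
    import numpy as np

    nt, T = 2 ** 12, 200.0
    t = np.linspace(-T / 2, T / 2, nt, endpoint=False)
    w = 2 * np.pi * np.fft.fftfreq(nt, d=t[1] - t[0])

    beta2, beta3, gamma, g = 0.025, 0.001, 0.005, 0.4    # illustrative values
    dz, nz = 0.01, 1000                                  # propagate to z = 10

    A = np.exp(-t ** 2).astype(complex)                  # arbitrary input pulse
    lin = np.exp(dz * (1j * beta2 * w ** 2 / 2 - 1j * beta3 * w ** 3 / 6 + g / 2))
    for _ in range(nz):
        A = np.fft.ifft(lin * np.fft.fft(A))             # dispersion + gain step
        A *= np.exp(1j * gamma * np.abs(A) ** 2 * dz)    # Kerr nonlinearity step
    # |A|**2 now approximates a parabola near its peak (the similariton regime).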

Relevance: 30.00%

Abstract:

Given the growing number of wrongful convictions involving faulty eyewitness evidence and the strong reliance by jurors on eyewitness testimony, researchers have sought to develop safeguards to decrease erroneous identifications. While decades of eyewitness research have led to numerous recommendations for the collection of eyewitness evidence, less is known regarding the psychological processes that govern identification responses. The purpose of the current research was to expand the theoretical knowledge of eyewitness identification decisions by exploring two separate memory theories: signal detection theory and dual-process theory. This was accomplished by examining both system and estimator variables in the context of a novel lineup recognition paradigm. Both theories were also examined in conjunction with confidence to determine whether it might add significantly to the understanding of eyewitness memory.

In two separate experiments, both an encoding-based and a retrieval-based manipulation were chosen to examine the application of theory to eyewitness identification decisions. Dual-process estimates were measured through the use of remember-know judgments (Gardiner & Richardson-Klavehn, 2000). In Experiment 1, the effects of divided attention and lineup presentation format (simultaneous vs. sequential) were examined. In Experiment 2, perceptual distance and lineup response deadline were examined. Overall, the results indicated that discrimination and remember judgments (recollection) were generally affected by variations in encoding quality, while response criterion and know judgments (familiarity) were generally affected by variations in retrieval options. Specifically, as encoding quality improved, discrimination ability and judgments of recollection increased; and as the retrieval task became more difficult, there was a shift toward lenient choosing and more reliance on familiarity.

The application of signal detection theory and dual-process theory in the current experiments produced predictable results for both system and estimator variables. These theories were also compared to measures of general confidence, calibration, and diagnosticity. Applying the additional confidence measures in conjunction with signal detection theory and dual-process theory gave a more in-depth explanation than either theory alone. The general conclusion, therefore, is that eyewitness identifications can be understood in a more complete manner by applying theory and examining confidence. Future directions and policy implications are discussed.
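
For readers unfamiliar with the signal-detection quantities referred to above, discrimination (d') and response criterion (c) are computed from hit and false-alarm rates; the sketch below uses hypothetical rates, not data from these experiments.

    # Sketch: discrimination (d') and response criterion (c) from hit and
    # false-alarm rates. The example rates are hypothetical.
    from scipy.stats import norm

    def sdt_measures(hit_rate, fa_rate):
        zH, zF = norm.ppf(hit_rate), norm.ppf(fa_rate)
        d_prime = zH - zF               # discrimination ability
        criterion = -0.5 * (zH + zF)    # response bias: > 0 means conservative choosing
        return d_prime, criterion

    print(sdt_measures(hit_rate=0.72, fa_rate=0.28))   # e.g. d' ~ 1.17, c ~ 0.0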

Relevance: 30.00%

Abstract:

English has been taught as a core and compulsory subject in China for decades. Recently, the demand for English in China has increased dramatically; China now has the world's largest English-learning population. The traditional English-teaching method cannot continue to be the only approach because it focuses merely on reading, grammar and translation, which cannot meet English learners' and users' needs (i.e., communicative competence and skills in speaking and writing).

This study was conducted to investigate whether the Picture-Word Inductive Model (PWIM), a new pedagogical method using pictures and inductive thinking, would benefit English learners in China in terms of potentially higher output in speaking and writing. Using Cognitive Load Theory (CLT), specifically its redundancy effect, as a gauge, I investigated whether processing words and a picture concurrently would present a cognitive overload for English learners in China.

I conducted a mixed-methods research study. A quasi-experiment (pretest, a seven-week intervention, and posttest) was conducted with 234 students in four groups in Lianyungang, China (58 fourth graders and 57 seventh graders as the experimental group taught with PWIM, and 59 fourth graders and 60 seventh graders as the control group taught with the traditional method). No significant difference in the effect of PWIM on vocabulary acquisition was found at either grade level. Observations, questionnaires with open-ended questions, and interviews were deployed to answer the three remaining research questions. A few students felt cognitively overloaded when they encountered too many writing samples, too many new words at one time, repeated words, mismatches between words and pictures, and so on. Many students listed and exemplified numerous strengths of PWIM, while a few mentioned weaknesses. The students expressed the view that PWIM had a positive effect on their English teaching.

As integrated inferences, the qualitative findings were used to explain the quantitative result that there were no significant differences in the effects of PWIM between the experimental and control groups at either grade level, in terms of four contextual aspects: time constraints on PWIM implementation, teachers' resistance, how PWIM was used, and the implementation of PWIM in classes of over 55 students.

Relevance: 30.00%

Abstract:

Despite the fact that ocean acidification is considered to be especially pronounced in the Southern Ocean, little is known about the CO2-dependent physiological processes and interactions of key Antarctic phytoplankton species. We therefore studied the effects of CO2 partial pressure (PCO2; 16.2, 39.5, and 101.3 Pa) on growth and photosynthetic carbon acquisition in the bloom-forming species Chaetoceros debilis, Pseudo-nitzschia subcurvata, Fragilariopsis kerguelensis, and Phaeocystis antarctica. Using membrane-inlet mass spectrometry, photosynthetic O2 evolution and inorganic carbon (Ci) fluxes were determined as a function of CO2 concentration. Only the growth of C. debilis was enhanced under high PCO2. Analysis of the carbon concentrating mechanism (CCM) revealed the operation of very efficient CCMs (i.e., high Ci affinities) in all species, but there were species-specific differences in the CO2-dependent regulation of individual CCM components (i.e., CO2 and HCO3- uptake kinetics, carbonic anhydrase activities). Gross CO2 uptake rates appear to increase with the cell surface-area-to-volume ratio. Species competition experiments with C. debilis and P. subcurvata under different PCO2 levels confirmed the CO2-stimulated growth of C. debilis observed in the monospecific incubations, also in the presence of P. subcurvata. Independent of PCO2, high initial cell abundances of P. subcurvata led to reduced growth rates of C. debilis. For a better understanding of future changes in phytoplankton communities, CO2-sensitive physiological processes need to be identified, but species interactions must also be taken into account because their interplay determines the success of a species.
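
The Ci affinities mentioned above are conventionally quantified by fitting Michaelis-Menten kinetics to the MIMS-derived rate-versus-Ci curves, where a low half-saturation constant K1/2 indicates a high affinity (an efficient CCM). The sketch below uses placeholder numbers, not the measured data.

    # Sketch: quantify Ci affinity by fitting Michaelis-Menten kinetics to a
    # photosynthesis-versus-Ci curve (low K1/2 = high affinity, efficient CCM).
    # The data points below are placeholders, not measured values.
    import numpy as np
    from scipy.optimize import curve_fit

    def michaelis_menten(ci, vmax, k_half):
        return vmax * ci / (k_half + ci)

    ci = np.array([5, 10, 25, 50, 100, 250, 500, 1000.0])   # umol/L, placeholder
    rate = np.array([30, 48, 72, 85, 92, 97, 99, 100.0])    # placeholder O2 evolution

    (vmax, k_half), _ = curve_fit(michaelis_menten, ci, rate, p0=(100.0, 20.0))
    print(f"Vmax = {vmax:.1f}, K1/2 = {k_half:.1f} umol/L")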

Relevance: 30.00%

Abstract:

We present theory, numerical simulations and experimental observations of a 1D optical wave system. We show that this system is of a dual-cascade type, namely, the energy cascades directly to small scales while the photons, or wave action, cascade inversely to large scales. In the optical context the inverse cascade is particularly interesting because it corresponds to the condensation of photons. We show that the cascades are induced by a six-wave resonant interaction process described by weak turbulence theory. Starting with weakly nonlinear randomized waves as an initial condition, there exists an inverse cascade of photons towards the lowest wavenumbers. During the cascade the nonlinearity becomes strong at low wavenumbers and, due to its focusing nature, leads to modulational instability resulting in the formation of solitons. Further interaction of the solitons among themselves and with incoherent waves leads to the final condensate state dominated by a single strong soliton. In addition, we show numerically the existence of the direct energy cascade and that it agrees with the wave turbulence prediction.
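
The dual-cascade behaviour follows from the coexistence of two positive invariants of the weakly nonlinear field. In standard wave-turbulence notation (ours, with n_k the wave-action spectrum and omega_k the linear dispersion relation) they are

    N = \int n_k \, \mathrm{d}k , \qquad E = \int \omega_k \, n_k \, \mathrm{d}k ,

both of which are conserved by the six-wave (three-to-three) resonant interactions, so energy flows directly toward high wavenumbers while wave action (photon number) flows inversely toward low wavenumbers, i.e., toward the condensate.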
