963 results for continuous-time models


Relevance: 30.00%

Abstract:

The freezing times of fruit pulp models packed and conditioned in multi-layered boxes were evaluated under conditions similar to those employed commercially. Estimating the freezing time is difficult in practice due to the presence of significant voids in the boxes, whose influence may be analyzed by various methods. In this study, a procedure for estimating freezing time using models described in the literature was compared with experimental time/temperature measurements. The results show that airflow through the packages is a significant parameter for freezing time estimation. When the presence of preferential channels was considered, the freezing times predicted by the models could be 10% lower than the experimental values, depending on the method. Isotherms traced as a function of the location of the samples inside the boxes showed displacement of the thermal center relative to the geometric center of the product.
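
The abstract does not name the prediction models used, but a classic starting point for freezing time estimation in the literature is Plank's equation. A minimal sketch, assuming slab geometry and purely illustrative property values (none taken from the study):

```python
# Hypothetical sketch: Plank's equation for the freezing time of a
# slab-shaped product. All property values below are illustrative.

def plank_freezing_time(rho, latent_heat, T_freeze, T_air, thickness, h, k,
                        P=0.5, R=0.125):
    """Freezing time (s) for an infinite slab (shape factors P=1/2, R=1/8).

    rho         -- density of the product (kg/m^3)
    latent_heat -- latent heat of fusion (J/kg)
    T_freeze    -- initial freezing temperature (degC)
    T_air       -- freezing-medium temperature (degC)
    thickness   -- full slab thickness (m)
    h           -- surface heat transfer coefficient (W/m^2.K)
    k           -- thermal conductivity of the frozen product (W/m.K)
    """
    dT = T_freeze - T_air
    return (rho * latent_heat / dT) * (P * thickness / h + R * thickness**2 / k)

# Example with plausible values for a packed fruit-pulp slab:
t = plank_freezing_time(rho=1000, latent_heat=250e3, T_freeze=-1.0,
                        T_air=-30.0, thickness=0.05, h=25.0, k=1.5)
print(f"Estimated freezing time: {t / 3600:.1f} h")
```

The airflow and preferential-channel effects reported above would enter such a calculation mainly through the surface heat transfer coefficient h.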

Relevance: 30.00%

Abstract:

The costs of health care are rising in many countries. To provide affordable and effective health care solutions, new technologies and approaches are constantly being developed. In this research, video games are presented as one possible solution. Video games are engaging, and nowadays many people spend considerable time playing them. In addition, recent studies have pointed out that video games can have notable health benefits. Health games have already been developed, used in practice, and researched. However, the bulk of health game studies have been concerned with the design or the effectiveness of the games; no actual business studies have been conducted on the subject, even though health games often lack commercial success despite their health benefits. This thesis seeks to fill this gap. The specific aim of this thesis is to develop a conceptual business model framework and to use it empirically in exploratory research on medical game business models. In the first stage of this research, a literature review was conducted and the existing literature analyzed and synthesized into a conceptual business model framework consisting of six dimensions. The motivation behind the synthesis is the ongoing ambiguity around the business model concept. In the second stage, 22 semi-structured interviews were conducted with professionals from across the value network for medical games. The business model framework was present at all stages of the empirical research: in the data collection stage, the framework acted as a guiding instrument, focusing the interview process; the interviews were then coded and analyzed using the framework as a structure; and the results were reported following the structure of the framework. In the results, the interviewees highlighted several important considerations and issues for medical games concerning the six dimensions of the business model framework. Based on the key findings of this research, several key components of business models for medical games were identified and illustrated in a single figure. Furthermore, five notable challenges for business models for medical games were presented, and possible solutions to the challenges were postulated. Theoretically, these findings provide pioneering information on the previously unexplored subject of business models for medical games. Moreover, the conceptual business model framework and its use in the novel context of medical games contribute to the business model literature. Regarding practice, this thesis further accentuates that medical games can offer notable benefits to several stakeholder groups and offers advice to companies seeking to commercialize these games.

Relevance: 30.00%

Abstract:

Solvent extraction of calcium and magnesium impurities from a lithium-rich brine (Ca ~ 2,000 ppm, Mg ~ 50 ppm, Li ~ 30,000 ppm) was investigated using a continuous counter-current solvent extraction mixer-settler set-up. The literature review covers the resources, demand and production methods of Li, followed by the basics of solvent extraction. The experimental section includes batch experiments to determine the pH isotherms of three extractants: D2EHPA, Versatic 10 and LIX 984, at concentrations of 0.52, 0.53 and 0.50 M in kerosene, respectively. Based on the pH isotherms, LIX 984 showed no affinity for extraction of Mg and Ca at pH ≤ 8, while D2EHPA and Versatic 10 were effective in extracting Ca and Mg. Based on the constructed pH isotherms, loading isotherms of D2EHPA (at pH 3.5 and 3.9) and Versatic 10 (at pH 7 and 8) were further investigated. Furthermore, based on the McCabe-Thiele method, two extraction stages and one stripping stage (using HCl at a concentration of 2 M for Versatic 10 and 3 M for D2EHPA) were used in continuous runs. The merits of Versatic 10 in comparison to D2EHPA are higher selectivity for Ca and Mg, faster phase disengagement, no detrimental change in viscosity due to the sheer amount of metal extracted, and lower acidity in stripping. On the other hand, D2EHPA has lower aqueous solubility and is capable of removing Mg and Ca simultaneously even at higher Ca loading (A/O in continuous runs > 1). In general, a short residence time (~2 min), low temperature (~23 °C), low pH (6.5-7.0 for Versatic 10 and 3.5-3.7 for D2EHPA) and a moderately low A/O ratio (< 1:1) removed 100% of the Ca and nearly 100% of the Mg while keeping Li losses below 4%, much lower than conventional precipitation, in which 20% of the Li is lost.
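
The stage counts above come from the McCabe-Thiele construction on the measured loading isotherms. A minimal sketch of that construction, assuming an illustrative Langmuir-shaped isotherm rather than the actual D2EHPA or Versatic 10 data:

```python
# Hypothetical McCabe-Thiele stage count for counter-current solvent
# extraction. The Langmuir-type isotherm below is illustrative only; the
# study's measured loading isotherms would be substituted here.

A_MAX, B = 12.0, 0.8          # illustrative isotherm parameters

def equilibrium_org(aq):
    """Organic-phase loading (g/L) in equilibrium with aqueous conc. aq."""
    return A_MAX * aq / (B + aq)

def equilibrium_aq(org):
    """Inverse isotherm: aqueous conc. in equilibrium with loading org."""
    return B * org / (A_MAX - org)

def count_stages(feed_aq, raffinate_aq, a_to_o):
    """Step off ideal counter-current stages (McCabe-Thiele).

    Operating line from a solute balance with barren stripped organic:
        org = a_to_o * (aq - raffinate_aq)
    """
    stages, aq = 0, feed_aq
    while aq > raffinate_aq and stages < 25:
        org = a_to_o * (aq - raffinate_aq)  # operating line at this point
        aq = equilibrium_aq(org)            # step down to the isotherm
        stages += 1
    return stages

# e.g. ~2 stages at an A/O below 1, consistent in spirit with the abstract:
print(count_stages(feed_aq=2.0, raffinate_aq=0.01, a_to_o=0.8))
```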

Relevance: 30.00%

Abstract:

Freezing of poultry cuts in continuous convective air blast tunnels is normally performed with the products protected by Low Density Polyethylene (LDPE) film as primary packaging and Corrugated Cardboard Boxes (CCB) as secondary packaging. The objective of this work was to investigate the influence of this secondary packaging on the freezing of poultry cuts in continuous convective air blast tunnels. The study was performed by replacing CCB with Perforated Metal Boxes (PMB) in order to remove the packaging thermal resistance. The assays, performed in an industrial plant, demonstrated that the CCB used commercially for meat freezing have a high heat transfer resistance. Their replacement with PMB can lead to shorter freezing times and spatially homogeneous freezing; reductions of up to 45% in freezing times were observed using PMB. The plateau of the temperature curve, related to the freezing of free water, was significantly shortened using PMB, which is expected to lead to better product quality after thawing. As the products were protected by the LDPE films as primary packaging, their appearance was not affected. The results presented in this work indicate that replacing CCB with PMB can be an excellent alternative for reducing freezing time and improving freezing homogeneity in industrial air blast tunnels, and the approach could also be applied to other products.
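
The packaging effect can be reasoned about as thermal resistances in series: the cardboard wall adds a conductive resistance to the convective air film, lowering the effective heat transfer coefficient at the product surface. A minimal sketch with illustrative values (not measurements from the plant trials):

```python
# Hypothetical illustration of why corrugated cardboard slows freezing:
# wall resistances in series reduce the effective heat transfer coefficient.
# All values are illustrative, not measurements from the assays.

def effective_h(h_air, layers):
    """Effective heat transfer coefficient (W/m^2.K) for wall layers in
    series with the convective air film.

    layers -- list of (thickness_m, conductivity_W_per_mK) tuples
    """
    resistance = 1.0 / h_air + sum(t / k for t, k in layers)
    return 1.0 / resistance

h_air = 30.0                        # air-blast film coefficient
cardboard = [(0.004, 0.06)]         # 4 mm corrugated board, low conductivity
ldpe = [(0.00005, 0.33)]            # 50 um LDPE film, nearly negligible

print(f"bare LDPE : {effective_h(h_air, ldpe):.1f} W/m^2.K")
print(f"LDPE + CCB: {effective_h(h_air, ldpe + cardboard):.1f} W/m^2.K")
# The board's added resistance dominates, consistent with the shorter
# freezing times observed when CCB is replaced by perforated metal boxes.
```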

Relevance: 30.00%

Abstract:

Restructuring by adding sodium alginate or microbial transglutaminase (MTGase) using cold gelation technology makes it possible to obtain many different raw products from minced and/or chopped fish muscle that are suitable for use as the basis of new restructured products with different physicochemical properties and even different compositions. Special consideration must be given to their shelf-life and to the changes that may take place during chilling, both in visual appearance and in physicochemical properties. During chilled storage at low temperature (5 °C) of restructured models made with different muscle particle sizes and compositions, microbial growth limited the shelf-life to 7-14 days. Mechanical properties increased (p < 0.05) during that time, with higher values observed in samples made by binding small muscle particles than in those made by homogenization. There was no clear increase in cooking yield or purge loss, and no significant colour change (p > 0.05) was detected during storage.

Relevance: 30.00%

Abstract:

The combined effects of tumbling marination method (vacuum continuous tumbling marination, CT; vacuum intermittent tumbling marination, IT) and effective tumbling time (4, 6, 8 and 10 h) on the quality characteristics of prepared boneless pork chops were investigated. The results showed that, regardless of tumbling time, the CT method significantly increased the pH, product yield, cohesiveness, resilience, sensory tenderness and overall flavor (p < 0.05) compared with the IT method, and also significantly decreased the pressing loss, cooking loss, shear force value (SFV), hardness and chewiness (p < 0.05). As the effective tumbling time increased from 4 h to 10 h, the product yield and sensory attributes of the prepared pork chops first increased and then decreased, whereas the pressing loss, cooking loss, SFV, hardness and chewiness first decreased and then increased. Additionally, an interaction between tumbling method and effective tumbling time was observed. These results suggest that CT for 8 h gives the best quality characteristics for prepared pork chops and should be adopted.

Relevance: 30.00%

Abstract:

The aim of this study is to propose a stochastic model for commodity markets linked with the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets in which particles represent market participants. A discontinuity is included in the model through an interacting kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, confirmed by the ability of the model to reproduce price spikes when their effects occur over a sufficiently long period of time.
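
The abstract does not spell out the equations; one standard way to write a Heaviside-interaction particle system and its Burgers-type limit (in the spirit of classical propagation-of-chaos results, with notation assumed here for illustration) is:

```latex
% Assumed notation, for illustration; not taken verbatim from the paper.
\[
  dX_t^{i} \;=\; \frac{1}{N}\sum_{j=1}^{N} H\!\left(X_t^{i}-X_t^{j}\right)dt
  \;+\; \sigma\, dW_t^{i}, \qquad i=1,\dots,N,
\]
% where H is the Heaviside function and the W^i are independent Brownian
% motions. As N grows, the empirical distribution function
\[
  u(t,x) \;=\; \lim_{N\to\infty} \frac{1}{N}\,\#\{\, i : X_t^{i}\le x \,\}
\]
% solves a viscous Burgers equation:
\[
  \partial_t u + u\,\partial_x u \;=\; \frac{\sigma^{2}}{2}\,\partial_{xx} u .
\]
```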

Relevance: 30.00%

Abstract:

The purpose of this qualitative study was to understand the relationships between creativity and the working artist/teacher employed by an art college. The topic emerged from my job as an instructor at The Ontario College of Art, which was used as the primary data resource and provided the highest caliber of professionals to choose from. Existent data generated by the research of Cawelti, Rappaport, and Wood (1992) were used to facilitate the study. Those data were generated by a group of 5 faculty members from The University of Northern Iowa, recognized for their expertise in the arts (a painter, a poet, a sculptor, a novelist, and a photographer). They were asked to respond to the following statement: "In as much detail as you like, list the things that you did, thought, or felt the last time you created an artistic product." Cawelti, Rappaport, and Wood (1992) produced three models of the creative process, each building on the previous, with the resultant third being, in my opinion, an excellent illustration (text/visual) of the creative process. Model three (Appendix D) presented a "multi-dimensional view of the creative process: time, space, observability, and consciousness" (p. 90). Model three utilized a visual mapping device along the bottom of the page linked to text segments above. The visual and the text were interrelated so that they harmonized into a comprehensive "picture." The participants of this qualitative study were asked to consider model three from their professional perspective as artist/teachers. The interpretive sciences directed the methodology. The hermeneutic circle of continuous reflection from the whole to the part and back to the whole was an important aspect of the data analyses. Four members of the Foundation Department at The Ontario College of Art were the key participants. A series of conversational interviews was the primary source of data collection, augmented by observation, fieldnotes, and follow-up telephone interviews. Transcripts of interviews were returned to participants for reflection, and the telephone was used to discuss any additional points raised. Analysis consisted of coding and organizing data according to emerging themes. These themes formed the basis for the narrative stories. The text of the narrative stories was given back to each participant for further comment. Revisions were made until both the researcher and the participants felt that the stories reflected reality. The resultant whole was critiqued from the researcher's perspective. The significance of this study is discussed as it pertains to the working artist/teacher, and areas in need of further study are pointed out.

Relevance: 30.00%

Abstract:

A feature-based fitness function is applied in a genetic programming system to synthesize stochastic gene regulatory network models whose behaviour is defined by a time course of protein expression levels. Typically, when targeting time series data, the fitness function is based on a sum of errors involving the values of the fluctuating signal. While this approach is successful in many instances, its performance can deteriorate in the presence of noise. This thesis explores a fitness measure determined from a set of statistical features characterizing the time series' sequence of values, rather than from the actual values themselves. Through a series of experiments involving symbolic regression with added noise and gene regulatory network models based on the stochastic π-calculus, the measure is shown to successfully target oscillating and non-oscillating signals. This practical and versatile fitness function offers an alternate approach, worthy of consideration for use in algorithms that evaluate noisy or stochastic behaviour.
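
The thesis's exact feature set is not listed in the abstract; a minimal sketch of the idea contrasts a pointwise sum-of-errors with a distance between assumed summary features (mean, spread, dominant frequency):

```python
# Sketch of a feature-based fitness versus a sum-of-errors fitness for a
# noisy time series. The feature set here is illustrative; the thesis's
# actual statistical features may differ.
import numpy as np

def sum_of_errors(candidate, target):
    """Classic pointwise fitness: sensitive to noise and phase shifts."""
    return float(np.sum((candidate - target) ** 2))

def features(signal):
    """Summary statistics characterizing the signal's behaviour."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    dominant = float(np.argmax(spectrum))      # dominant frequency bin
    return np.array([signal.mean(), signal.std(), dominant])

def feature_fitness(candidate, target):
    """Distance between feature vectors rather than raw values."""
    return float(np.sum((features(candidate) - features(target)) ** 2))

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 256)
target = np.sin(t)
candidate = np.sin(t + 0.5) + rng.normal(0, 0.1, t.size)  # shifted, noisy

# Pointwise error is large despite near-identical dynamics; the
# feature-based measure judges the oscillatory behaviour as close.
print(sum_of_errors(candidate, target), feature_fitness(candidate, target))
```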

Relevance: 30.00%

Abstract:

If you want to know whether a property is true or not in a specific algebraic structure, you need to test that property on the given structure. This can be done by hand, which can be cumbersome and error-prone. In addition, the time consumed in testing depends on the size of the structure to which the property is applied. We present an implementation of a system for finding counterexamples and testing properties of models of first-order theories. This system is intended to provide a convenient and paperless environment for researchers and students investigating or studying such models, and algebraic structures in particular. To implement a first-order theory in the system, a suitable first-order language and some axioms are required. The components of a language are given by a collection of variables, a set of predicate symbols, and a set of operation symbols. Variables and operation symbols are used to build terms. Terms, predicate symbols, and the usual logical connectives are used to build formulas. A first-order theory then consists of a language together with a set of closed formulas, i.e. formulas without free occurrences of variables. The set of formulas is also called the axioms of the theory. The system uses several different formats to allow the user to specify languages, to define axioms and theories, and to create models. Besides the obvious operations and tests on these structures, we have introduced the notion of a functor between classes of models in order to generate more complex models from given ones automatically. As an example, we use the system to create several lattice structures starting from a model of the theory of pre-orders.
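
As a minimal sketch of what such property testing looks like, the following hypothetical snippet brute-forces one axiom (associativity) over a small finite model and reports a counterexample when one exists; the system's actual input formats for languages and theories are richer than this:

```python
# Minimal sketch of testing a first-order property on a finite model by
# exhaustive search over all variable assignments.
from itertools import product

def is_associative(universe, op):
    """Check forall x,y,z: op(op(x,y),z) == op(x,op(y,z)).
    Return a counterexample triple, or None if the axiom holds."""
    for x, y, z in product(universe, repeat=3):
        if op(op(x, y), z) != op(x, op(y, z)):
            return (x, y, z)
    return None

u = range(4)

# Integers mod 4 under addition: associative, no counterexample.
print(is_associative(u, lambda a, b: (a + b) % 4))   # -> None

# Subtraction mod 4: not associative, a counterexample is found.
print(is_associative(u, lambda a, b: (a - b) % 4))   # -> (0, 0, 1)
```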

Relevance: 30.00%

Abstract:

Complex networks can arise naturally and spontaneously from all things that act as part of a larger system. From the patterns of socialization between people to the way biological systems organize themselves, complex networks are ubiquitous, but they are currently poorly understood. A number of human-designed algorithms have been proposed to describe the organizational behaviour of real-world networks, and breakthroughs in genetics, medicine, epidemiology, neuroscience, telecommunications and the social sciences have recently resulted. These algorithms, called graph models, represent significant human effort. Deriving accurate graph models is non-trivial, time-intensive, and challenging, and may only yield useful results for very specific phenomena. An automated approach can greatly reduce the human effort required and, if effective, provide a valuable tool for understanding the large decentralized systems of interrelated things around us. To the best of the author's knowledge, this thesis proposes the first method for the automatic inference of graph models for complex networks with varied properties, with and without community structure; it is also the first application of genetic programming to the automatic inference of graph models. The system and methodology were tested against benchmark data and shown to be capable of reproducing close approximations to well-known algorithms designed by humans. Furthermore, when used to infer a model for real biological data, the resulting model was more representative than models currently used in the literature.
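
The abstract does not detail how an inferred model is scored against a target network; one plausible minimal sketch compares summary statistics of a generated graph with those of the target, here computed with networkx (the statistics chosen are assumptions, not the thesis's feature set):

```python
# Sketch of one plausible fitness measure for inferred graph models:
# distance between structural statistics of generated and target graphs.
import networkx as nx

def graph_features(G):
    """A few cheap structural statistics of a graph."""
    return [
        nx.density(G),
        nx.average_clustering(G),
        sum(d for _, d in G.degree()) / G.number_of_nodes(),  # mean degree
    ]

def model_fitness(generated, target):
    """Lower is better: squared distance between feature vectors."""
    return sum((a - b) ** 2 for a, b in
               zip(graph_features(generated), graph_features(target)))

# A candidate generator (here simply Erdos-Renyi, standing in for an
# evolved program) is scored against the observed network:
target = nx.watts_strogatz_graph(200, 6, 0.1, seed=1)
candidate = nx.erdos_renyi_graph(200, 0.03, seed=1)
print(model_fitness(candidate, target))
```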

Relevance: 30.00%

Abstract:

This lexical decision study with eye tracking of Japanese two-kanji-character words investigated the order in which a whole two-character word and its morphographic constituents are activated in the course of lexical access, the relative contributions of the left and the right characters in lexical decision, the depth to which semantic radicals are processed, and how nonlinguistic factors affect lexical processes. Mixed-effects regression analyses of response times and subgaze durations (i.e., first-pass fixation time spent on each of the two characters) revealed joint contributions of morphographic units at all levels of the linguistic structure with the magnitude and the direction of the lexical effects modulated by readers’ locus of attention in a left-to-right preferred processing path. During the early time frame, character effects were larger in magnitude and more robust than radical and whole-word effects, regardless of the font size and the type of nonwords. Extending previous radical-based and character-based models, we propose a task/decision-sensitive character-driven processing model with a level-skipping assumption: Connections from the feature level bypass the lower radical level and link up directly to the higher character level.
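
As a minimal sketch of the kind of mixed-effects regression reported (not the study's actual model specification; all column names and the data file are hypothetical):

```python
# Hypothetical sketch: response time modeled by whole-word and per-character
# frequency predictors, with random intercepts for participants.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("lexical_decision.csv")   # hypothetical data file

model = smf.mixedlm(
    "log_rt ~ word_freq + left_char_freq + right_char_freq + font_size",
    data=df,
    groups=df["participant"],              # random intercept per participant
)
result = model.fit()
print(result.summary())
```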

Relevance: 30.00%

Abstract:

Accelerated life testing (ALT) is widely used to obtain reliability information about a product within a limited time frame. The Cox proportional hazards (PH) model is often utilized for reliability prediction. My master's thesis research focuses on designing accelerated life testing experiments for reliability estimation. We consider multiple step-stress ALT plans with censoring. The optimal stress levels and the times of changing the stress levels are investigated. We discuss the optimal designs under three optimality criteria: D-, A- and Q-optimality. We note that the classical designs are optimal only if the assumed model is correct. Because ALT data are collected at stress levels higher than the normal operating condition, prediction involves extrapolation, and the assumed model cannot be tested. Therefore, to allow for possible imprecision in the assumed PH model, the construction of robust designs is also explored.
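
In assumed notation, the PH model and a simple two-level step-stress plan look as follows; the design problem is then to choose the stress levels and the change time to optimize a D-, A- or Q-criterion on the information matrix:

```latex
% Assumed notation, for illustration. Under the Cox PH model the hazard at
% stress level x is
\[
  h(t \mid x) \;=\; h_0(t)\,\exp\!\left(\beta^{\top} x\right),
\]
% and for a step-stress plan that switches from stress x_1 to the higher
% stress x_2 at time tau,
\[
  h(t) \;=\;
  \begin{cases}
    h_0(t)\,\exp\!\left(\beta^{\top} x_1\right), & t < \tau,\\[2pt]
    h_0(t)\,\exp\!\left(\beta^{\top} x_2\right), & t \ge \tau .
  \end{cases}
\]
```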

Relevance: 30.00%

Abstract:

Latent variable models in finance originate both from asset pricing theory and from time series analysis. These two strands of literature appeal to two different concepts of latent structures, both useful for reducing the dimension of a statistical model specified for a multivariate time series of asset prices. In the CAPM or APT beta pricing models, the dimension reduction is cross-sectional in nature, while in time-series state-space models, dimension is reduced longitudinally by assuming conditional independence between consecutive returns given a small number of state variables. In this paper, we use the concept of the Stochastic Discount Factor (SDF), or pricing kernel, as a unifying principle to integrate these two concepts of latent variables. Beta pricing relations amount to characterizing the factors as a basis of a vector space for the SDF. The coefficients of the SDF with respect to the factors are specified as deterministic functions of some state variables which summarize their dynamics. In beta pricing models, it is often said that only factorial risk is compensated, since the remaining idiosyncratic risk is diversifiable. Implicitly, this argument can be interpreted as a conditional cross-sectional factor structure, that is, conditional independence between the contemporaneous returns of a large number of assets given a small number of factors, as in standard factor analysis. We provide this unifying analysis in the context of conditional equilibrium beta pricing as well as asset pricing with stochastic volatility, stochastic interest rates and other state variables. We address the general issue of econometric specification of dynamic asset pricing models, covering the modern literature on conditionally heteroskedastic factor models as well as equilibrium-based asset pricing models with an intertemporal specification of preferences and market fundamentals. We interpret various instantaneous causality relationships between state variables and market fundamentals as leverage effects and discuss their central role in the validity of standard CAPM-like stock pricing and preference-free option pricing.
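
In assumed notation, the two ingredients being unified are the SDF pricing relation and its factor representation with state-dependent coefficients:

```latex
\[
  E\!\left[\, m_{t+1}\, R_{i,t+1} \mid I_t \right] \;=\; 1,
  \qquad
  m_{t+1} \;=\; \sum_{k=1}^{K} \lambda_k(s_t)\, f_{k,t+1},
\]
```

where R_{i,t+1} is the gross return on asset i, the factors f_k span the SDF, and the coefficients λ_k are deterministic functions of the state variables s_t, as described in the abstract. Beta pricing relations then follow by projecting returns on the factors.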

Relevance: 30.00%

Abstract:

This paper proves a new representation theorem for domains with both discrete and continuous variables. The result generalizes Debreu's well-known representation theorem on connected domains. A strengthening of the standard continuity axiom is used in order to guarantee the existence of a representation. A generalization of the main theorem and an application of the more general result are also presented.
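
In standard notation, a representation in Debreu's sense means a continuous utility function u : X → R with

```latex
\[
  x \succsim y \;\Longleftrightarrow\; u(x) \ge u(y)
  \qquad \text{for all } x, y \in X ,
\]
```

and the contribution here is to guarantee that such a u exists when the domain X mixes discrete and continuous variables, via the strengthened continuity axiom.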