900 results for Time inventory models
Abstract:
Bioanalytical data from a bioequivalence study were used to develop limited-sampling strategy (LSS) models for estimating the area under the plasma concentration versus time curve (AUC) and the peak plasma concentration (Cmax) of 4-methylaminoantipyrine (MAA), an active metabolite of dipyrone. Twelve healthy adult male volunteers received single 600 mg oral doses of dipyrone in two formulations at a 7-day interval in a randomized, crossover protocol. Plasma concentrations of MAA (N = 336), measured by HPLC, were used to develop LSS models. Linear regression analysis and a "jack-knife" validation procedure revealed that the AUC0-∞ and the Cmax of MAA can be accurately predicted (R²>0.95, bias <1.5%, precision between 3.1 and 8.3%) by LSS models based on two sampling times. Validation tests indicate that the most informative 2-point LSS models developed for one formulation provide good estimates (R²>0.85) of the AUC0-∞ or Cmax for the other formulation. LSS models based on three sampling points (1.5, 4 and 24 h), but using different coefficients for AUC0-∞ and Cmax, predicted the individual values of both parameters for the enrolled volunteers (R²>0.88, bias = -0.65 and -0.37%, precision = 4.3 and 7.4%) as well as for plasma concentration data sets generated by simulation (R²>0.88, bias = -1.9 and 8.5%, precision = 5.2 and 8.7%). Bioequivalence assessment of the dipyrone formulations based on the 90% confidence interval of log-transformed AUC0-∞ and Cmax provided similar results when either the best-estimated or the LSS-derived metrics were used.
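To illustrate the general technique, a two-point LSS model is an ordinary least-squares regression of the reference AUC on the concentrations at two sampling times, validated by leave-one-out ("jack-knife") prediction. The sketch below uses entirely synthetic data (the concentrations, coefficients, and noise level are invented; the study's actual sampling times and fitted coefficients are not reproduced here):

```python
import numpy as np

# Hypothetical illustration: fit a 2-point limited-sampling model
# AUC_hat = b0 + b1*C(t1) + b2*C(t2) by ordinary least squares,
# then validate it with leave-one-out ("jack-knife") prediction.
rng = np.random.default_rng(0)
n = 12                                   # subjects
c_t1 = rng.uniform(5, 15, n)             # concentration at first sampling time
c_t2 = rng.uniform(1, 5, n)              # concentration at second sampling time
auc = 2.0 * c_t1 + 8.0 * c_t2 + rng.normal(0, 0.5, n)  # synthetic reference AUC

X = np.column_stack([np.ones(n), c_t1, c_t2])
preds = np.empty(n)
for i in range(n):                       # jack-knife: refit without subject i
    keep = np.arange(n) != i
    beta, *_ = np.linalg.lstsq(X[keep], auc[keep], rcond=None)
    preds[i] = X[i] @ beta

bias = np.mean((preds - auc) / auc) * 100          # mean percent prediction error
precision = np.mean(np.abs(preds - auc) / auc) * 100
r2 = np.corrcoef(preds, auc)[0, 1] ** 2
```

The bias/precision summaries computed from the leave-one-out predictions correspond to the prediction-error metrics reported in the abstract.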
Abstract:
This research examines statistical methods that help increase the demand-forecasting accuracy of company X's forecasting model. The current forecasting process was analyzed in detail and, as a result, a graphical scheme of its logical algorithm was developed. Based on the analysis of the algorithm and its forecasting errors, the potential directions for future improvements to the model's accuracy were compiled into a complete list. Three improvement directions were chosen for further practical research, and on their basis three test models were created and verified. The novelty of this work lies in the methodological approach of the original analysis of the model, which identified its critical points, as well as in the uniqueness of the developed test models. The results of the study formed the basis of a grant from the Government of St. Petersburg.
Abstract:
The serious neuropsychological repercussions of hepatic encephalopathy have led to the creation of several experimental models in order to better understand the pathogenesis of the disease. In the present investigation, two possible causes of hepatic encephalopathy, cholestasis and portal hypertension, were chosen to study the behavioral impairments caused by the disease using an object recognition task. This working memory test is based on a paradigm of spontaneous delayed non-matching to sample and was performed 60 days after surgery. Male Wistar rats (225-250 g) were divided into three groups: two experimental groups, microsurgical cholestasis (N = 20) and extrahepatic portal hypertension (N = 20), and a control group (N = 20). A mild alteration of the recognition memory occurred in rats with cholestasis compared to control rats and portal hypertensive rats. The latter group showed the poorest performance on the basis of the behavioral indexes tested. In particular, only the control group spent significantly more time exploring novel objects compared to familiar ones (P < 0.001). In addition, the portal hypertension group spent the shortest time exploring both the novel and familiar objects (P < 0.001). These results suggest that the existence of portosystemic collateral circulation per se may be responsible for subclinical encephalopathy.
Abstract:
Erythrina velutina (EV) and Erythrina mulungu (EM), popularly used in Brazil as tranquilizing agents, were studied. The effects of acute and chronic oral treatment with a water:alcohol extract of EV (7:3, ground stem bark; acute = 100, 200, 400 mg/kg; chronic = 50, 100, 200 mg/kg) were evaluated in rats (N = 11-12) submitted to the elevated T-maze (for avoidance and escape measurements) model of anxiety. This model was selected for its presumed capacity to elicit specific subtypes of anxiety disorders recognized in clinical practice: avoidance has been related to generalized anxiety and escape to panic. Additionally, animals were treated with the same doses of EV and EM (water:alcohol 7:3, inflorescence extract) and submitted to the forced swim test for the evaluation of antidepressant activity (N = 7-10). Both treatment regimens with EV reduced elevated T-maze avoidance latencies, without altering escape, in a way similar to the reference drug diazepam (avoidance 1, mean ± SEM, acute study: 131.1 ± 45.5 (control), 9.0 ± 3.3 (diazepam), 12.7 ± 2.9 (200 mg/kg), 28.8 ± 15.3 (400 mg/kg); chronic study: 131.7 ± 46.9 (control), 35.8 ± 29.7 (diazepam), 24.4 ± 10.4 (50 mg/kg), 29.7 ± 11.5 (200 mg/kg)). Neither EV nor EM altered measurements performed in the forced swim test, in contrast to the reference drug imipramine, which significantly decreased immobility time after chronic treatment. These results were not due to motor alterations, since no significant effects were detected in an open field. These observations suggest that EV exerts anxiolytic-like effects on a specific subset of defensive behaviors which have been associated with generalized anxiety disorder.
Abstract:
Coronary artery disease is an atherosclerotic disease that leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, a process known as “adverse remodelling”. This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model had both induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were caused by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods for the measurement of myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, a hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. Coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. Large animal models were also used in the testing of novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake.
In the heart failure models, chronic myocardial infarction led to worsening systolic function, cardiac remodelling and decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and subsequently infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies concerning disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.
Abstract:
Prenatal immune challenge (PIC) in pregnant rodents produces offspring with abnormalities in behavior, histology, and gene expression that are reminiscent of schizophrenia and autism. Based on this, the goal of this article was to review the main contributions of PIC models, especially the one using the viral-mimetic particle polyriboinosinic-polyribocytidylic acid (poly-I:C), to the understanding of the etiology, biological basis and treatment of schizophrenia. This systematic review consisted of a search of available web databases (PubMed, SciELO, LILACS, PsycINFO, and ISI Web of Knowledge) for original studies published in the last 10 years (May 2001 to October 2011) concerning animal models of PIC, focusing on those using poly-I:C. The results showed that the PIC model with poly-I:C is able to mimic the prodrome and both the positive and negative/cognitive dimensions of schizophrenia, depending on the specific gestation time window of the immune challenge. The model resembles the neurobiology and etiology of schizophrenia and has good predictive value. In conclusion, this model is a robust tool for the identification of novel molecular targets during prenatal life, adolescence and adulthood that might contribute to the development of preventive and/or treatment strategies (targeting specific symptoms, i.e., positive or negative/cognitive) for this devastating mental disorder, also presenting biosafety as compared to viral infection models. One limitation of this model is the incapacity to model the full spectrum of immune responses normally induced by viral exposure.
Abstract:
Maintenance of thermal homeostasis in rats fed a high-fat diet (HFD) is associated with changes in their thermal balance: the thermodynamic relationship between heat dissipation and energy storage is altered by the ingestion of a high-energy diet. Observation of core temperature records in humans and rodents reveals time-series characteristics, such as self-dependence and stationarity, that are well suited to stochastic analysis. To identify this change, we used, for the first time, a stochastic autoregressive model, whose concepts match those of the physiological systems involved, and applied it to male HFD rats compared with age-matched male controls fed standard chow (n=7 per group). By analyzing the recorded temperature time series, we were able to identify when thermal homeostasis was affected by the new diet. The autoregressive (AR) time-series model was used to predict the occurrence of thermal homeostasis, and it proved to be very effective in distinguishing this physiological disorder. We therefore infer from our results that maximum entropy distribution, as a means of stochastically characterizing temperature time-series records, may become an important early tool to aid in the diagnosis and prevention of metabolic diseases, owing to its ability to detect small variations in the thermal profile.
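The core idea can be sketched in a few lines: fit an AR(1) model x[t] = c + φ·x[t−1] + e[t] to a core-temperature series by least squares and compare the fitted dynamics between diet groups. This is an illustrative sketch with simulated series, not the authors' code; the parameter values are invented:

```python
import numpy as np

# Illustrative sketch (invented parameters): simulate two core-temperature
# series with different AR(1) dynamics, fit AR(1) by least squares, and
# compare the fitted coefficient and innovation variance between groups.
rng = np.random.default_rng(1)

def simulate_ar1(c, phi, sigma, n=500):
    x = np.empty(n)
    x[0] = c / (1 - phi)                  # start at the stationary mean
    for t in range(1, n):
        x[t] = c + phi * x[t - 1] + rng.normal(0, sigma)
    return x

def fit_ar1(x):
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    beta, *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    resid = x[1:] - X @ beta
    return beta[1], resid.var()           # phi estimate, innovation variance

control = simulate_ar1(c=3.7, phi=0.90, sigma=0.02)   # stationary mean ~37 °C
hfd = simulate_ar1(c=7.4, phi=0.80, sigma=0.05)       # same mean, altered dynamics

phi_c, var_c = fit_ar1(control)
phi_h, var_h = fit_ar1(hfd)
```

A shift in the fitted AR coefficient or innovation variance, at an unchanged mean temperature, is the kind of small variation in thermal profile the abstract refers to.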
Abstract:
The viscoelastic properties of edible films can provide information about the structure of the biopolymers used. The objective of this work was to test three simple models from linear viscoelastic theory (Maxwell, Generalized Maxwell with two units in parallel, and Burgers) against the results of stress relaxation tests on edible films of myofibrillar proteins of Nile tilapia. The films were prepared by a casting technique and pre-conditioned at 58% relative humidity and 22 °C for 4 days. The test samples (15 mm x 118 mm) were submitted to stress relaxation tests in a TA.XT2i texture analyzer. The strain imposed on the samples was 1%, ensuring that the tests remained within the linear viscoelastic domain. The models were fitted to the experimental data (stress x time) by nonlinear regression. The Generalized Maxwell model with two units in parallel and the Burgers model represented the stress relaxation curves satisfactorily. The viscoelastic properties varied in a manner that was only weakly dependent on the thickness of the films.
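In a stress relaxation test at constant strain, the two-unit Generalized Maxwell model (ignoring any equilibrium spring) reduces to a sum of two decaying exponentials, σ(t) = σ1·exp(−t/τ1) + σ2·exp(−t/τ2), which can be fitted by nonlinear regression. A minimal sketch with invented parameter values (not the films' actual moduli or relaxation times):

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch with assumed values: fit a two-unit Generalized Maxwell model
# sigma(t) = s1*exp(-t/tau1) + s2*exp(-t/tau2) to a synthetic
# stress-relaxation curve by nonlinear least squares.
def gen_maxwell(t, s1, tau1, s2, tau2):
    return s1 * np.exp(-t / tau1) + s2 * np.exp(-t / tau2)

t = np.linspace(0, 100, 200)                  # time, s
true = gen_maxwell(t, 1.2, 4.0, 0.8, 60.0)    # hypothetical stress, MPa
rng = np.random.default_rng(2)
stress = true + rng.normal(0, 0.005, t.size)  # add measurement noise

p0 = [1.0, 5.0, 1.0, 50.0]                    # initial parameter guesses
popt, _ = curve_fit(gen_maxwell, t, stress, p0=p0)
s1, tau1, s2, tau2 = popt
```

The fitted relaxation times τ1, τ2 separate a fast and a slow relaxation mechanism; the Burgers model would add a purely viscous (linearly decaying compliance) element to the same fitting procedure.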
Abstract:
The freezing times of fruit pulp models packed and conditioned in multi-layered boxes were evaluated under conditions similar to those employed commercially. Estimating the freezing time is difficult in practice due to the presence of significant voids in the boxes, whose influence may be analyzed by various methods. In this study, a procedure for estimating freezing time using models described in the literature was compared with experimental measurements obtained by collecting time/temperature data. The results show that airflow through the packages is a significant parameter for freezing time estimation. When the presence of preferential channels was considered, the freezing time predicted by the models could be up to 10% lower than the experimental values, depending on the method. The isotherms traced as a function of the location of the samples inside the boxes showed a displacement of the thermal center relative to the geometric center of the product.
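The abstract does not name the literature models used; one classical example of this family is Plank's equation, which estimates freezing time from the product's latent heat, the temperature driving force, and conduction/convection resistances. A hedged sketch with hypothetical fruit-pulp property values (slab geometry, shape factors P = 1/2 and R = 1/8):

```python
# Illustrative only: Plank's equation, a classical literature model for
# freezing-time estimation (not necessarily one of the models compared
# in the study). Slab geometry: P = 1/2, R = 1/8.
def plank_freezing_time(rho, latent_heat, T_f, T_a, thickness, h, k,
                        P=0.5, R=0.125):
    """Freezing time in seconds for a slab of given thickness (m).

    rho: density (kg/m3); latent_heat: J/kg; T_f: initial freezing
    point (°C); T_a: freezing-medium temperature (°C); h: surface heat
    transfer coefficient (W/m2.K); k: frozen thermal conductivity (W/m.K).
    """
    return (rho * latent_heat / (T_f - T_a)) * (
        P * thickness / h + R * thickness**2 / k
    )

# Hypothetical values for a fruit-pulp slab in a blast freezer.
t_s = plank_freezing_time(rho=1000, latent_heat=300e3, T_f=-1.5,
                          T_a=-30, thickness=0.05, h=20, k=1.1)
t_hours = t_s / 3600
```

Voids and preferential air channels in the boxes effectively change h in such models, which is consistent with the abstract's finding that airflow through the packages is a significant parameter.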
Abstract:
The costs of health care are going up in many countries. In order to provide affordable and effective health care solutions, new technologies and approaches are constantly being developed. In this research, video games are presented as a possible solution to the problem. Video games are fun, and nowadays most people like to spend time on them. In addition, recent studies have pointed out that video games can have notable health benefits. Health games have already been developed, used in practice, and researched. However, the bulk of health game studies have been concerned with the design or the effectiveness of the games; no actual business studies have been conducted on the subject, even though health games often lack commercial success despite their health benefits. This thesis seeks to fill this gap. The specific aim of this thesis is to develop a conceptual business model framework and empirically use it in explorative medical game business model research. In the first stage of this research, a literature review was conducted and the existing literature analyzed and synthesized into a conceptual business model framework consisting of six dimensions. The motivation behind the synthesis is the ongoing ambiguity around the business model concept. In the second stage, 22 semi-structured interviews were conducted with different professionals within the value network for medical games. The business model framework was present in all stages of the empirical research: First, in the data collection stage, the framework acted as a guiding instrument, focusing the interview process. Then, the interviews were coded and analyzed using the framework as a structure. The results were then reported following the structure of the framework. In the results, the interviewees highlighted several important considerations and issues for medical games concerning the six dimensions of the business model framework. 
Based on the key findings of this research, several key components of business models for medical games were identified and illustrated in a single figure. Furthermore, five notable challenges for business models for medical games were presented, and possible solutions for the challenges were postulated. Theoretically, these findings provide pioneering information on the untouched subject of business models for medical games. Moreover, the conceptual business model framework and its use in the novel context of medical games provide a contribution to the business model literature. Regarding practice, this thesis further accentuates that medical games can offer notable benefits to several stakeholder groups and offers advice to companies seeking to commercialize these games.
Abstract:
Restructuring by adding sodium alginate or microbial transglutaminase (MTGase) using cold gelation technology makes it possible to obtain many different raw products from minced and/or chopped fish muscle that are suitable for use as the basis of new restructured products with different physicochemical properties and even different compositions. Special consideration must be given to their shelf-life and the changes that may take place during chilling, both in visual appearance and in physicochemical properties. During chilled storage at low temperature (5 °C) of restructured models made with different muscle particle sizes and compositions, microbial growth limited the shelf-life to 7-14 days. Mechanical properties increased (p < 0.05) during that time, and higher values were observed in samples elaborated by joining small muscle particles than in those elaborated by homogenization. There was no clear increase in cooking yield or purge loss, and no significant colour change (p > 0.05) was detected during storage.
Stochastic particle models: mean reversion and Burgers dynamics. An application to commodity markets
Abstract:
The aim of this study is to propose a stochastic model for commodity markets linked with the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets, in which particles represent market participants. A discontinuity is included in the model through an interaction kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, confirmed by the ability of the model to reproduce price spikes when they occur over a sufficiently long period of time.
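A minimal sketch of such an interacting particle system (the exact drift and diffusion of the paper are not reproduced; this is an assumed form chosen to show the mechanism): each particle's drift is the empirical fraction of particles below it, i.e. a Heaviside interaction kernel, simulated by Euler-Maruyama:

```python
import numpy as np

# Assumed dynamics, for illustration only:
#   dX_i = (1/N) * sum_j H(X_i - X_j) dt + sigma dW_i,
# with H the Heaviside function (H(0) = 1 here). The drift of particle i
# is then its empirical rank, which is the mechanism connecting the
# particle system to a Burgers-type PDE as N grows large.
rng = np.random.default_rng(3)
N, sigma, dt, steps = 200, 0.1, 0.01, 500

x = rng.normal(0.0, 1.0, N)              # initial "prices" of N traders
for _ in range(steps):
    # drift[i] = fraction of particles j with x[j] <= x[i]
    drift = (x[:, None] >= x[None, :]).mean(axis=1)
    x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(0, 1, N)

mean_price = x.mean()
```

As N increases, the empirical distribution of the particles concentrates around a deterministic profile, which is the law-of-large-numbers convergence to the Burgers-type PDE stated in the abstract.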
Abstract:
This thesis was conducted on assignment by a multinational chemical corporation as a case study. The purpose of this study is to find ways to improve the purchasing process for small purchases at the case company, the improvements sought being mainly cost and time savings. The purchasing process is the process that starts from the requisition of goods or services and ends when the invoice is paid. In this thesis, purchases with a value of less than €1000 are considered small. The theoretical framework of the thesis consists of a general theoretical view of the costs and performance of the purchasing process, different types of purchasing processes, and a model for improving purchasing processes. The categorization into small and large purchases is the most important, followed by the division between direct and indirect purchases. Models that provide a more strategic perspective for categorization were also found to be useful. Auditing and managerial control are important parts of the purchasing process. When the transaction costs of purchasing are considered from a cost-benefit perspective, large and small purchases should not follow the same processes. Purchasing cards, e-procurement and vendor-managed inventory are seen as alternatives to the traditional purchasing process. The empirical data collection was done by interviewing company employees who take part in the purchasing process in their daily work. The interviews had open-ended questions, and the answers were coded and analyzed. The results consist of a process description and assessment as well as suggestions for potential improvements. At the case company, the basic purchasing process was similar to the traditional purchasing process, carried out entirely with computers and online. For some categories, more sophisticated e-procurement solutions were already in use.
To improve the current e-procurement-based solutions, elimination of the authorization workflow and better information exchange can be seen as potential improvements for most of the case purchases. Purchasing cards and a lightweight form of vendor-managed inventory can be seen as potential improvements for some categories. Implementing the changes incurs at least some cost, and the benefits might be hard to measure. This thesis has revealed that small purchases have potential for significant cost and time savings at the case company.
Abstract:
A feature-based fitness function is applied in a genetic programming system to synthesize stochastic gene regulatory network models whose behaviour is defined by a time course of protein expression levels. Typically, when targeting time series data, the fitness function is based on a sum of errors involving the values of the fluctuating signal. While this approach is successful in many instances, its performance can deteriorate in the presence of noise. This thesis explores a fitness measure determined from a set of statistical features characterizing the time series' sequence of values, rather than the actual values themselves. Through a series of experiments involving symbolic regression with added noise and gene regulatory network models based on the stochastic π-calculus, it is shown to successfully target oscillating and non-oscillating signals. This practical and versatile fitness function offers an alternative approach, worthy of consideration for use in algorithms that evaluate noisy or stochastic behaviour.
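The contrast between the two fitness measures can be sketched directly. Below, a pointwise sum-of-errors is compared with a feature-based measure built from a few hypothetical summary statistics (mean, standard deviation, dominant frequency bin); the feature set here is invented for illustration and is not the thesis's actual feature set:

```python
import numpy as np

# Illustrative comparison: pointwise fitness vs. feature-based fitness.
def sum_of_errors(target, candidate):
    return np.abs(target - candidate).sum()

def features(series):
    # Hypothetical feature vector: mean, std, dominant frequency bin.
    spectrum = np.abs(np.fft.rfft(series - series.mean()))
    return np.array([series.mean(), series.std(), spectrum.argmax()])

def feature_fitness(target, candidate):
    return np.abs(features(target) - features(candidate)).sum()

rng = np.random.default_rng(4)
t = np.linspace(0, 10, 512)
clean = np.sin(2 * np.pi * t)                  # underlying oscillation
noisy = clean + rng.normal(0, 0.3, t.size)     # same signal, observed noisily
shifted = np.sin(2 * np.pi * t + np.pi)        # same statistics, out of phase

# The pointwise measure penalizes the phase-shifted twin heavily even
# though it oscillates identically; the feature-based measure sees
# nearly identical mean, spread, and dominant frequency.
```

This is why feature-based fitness is attractive for stochastic simulations: two runs of the same model never match pointwise, but they share the same statistical signature.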
Abstract:
If you want to know whether a property is true or not in a specific algebraic structure, you need to test that property on the given structure. This can be done by hand, which can be cumbersome and error-prone. In addition, the time consumed in testing depends on the size of the structure to which the property is applied. We present an implementation of a system for finding counterexamples and testing properties of models of first-order theories. This system is intended to provide a convenient and paperless environment for researchers and students investigating or studying such models, and algebraic structures in particular. To implement a first-order theory in the system, a suitable first-order language and some axioms are required. The components of a language are given by a collection of variables, a set of predicate symbols, and a set of operation symbols. Variables and operation symbols are used to build terms. Terms, predicate symbols, and the usual logical connectives are used to build formulas. A first-order theory then consists of a language together with a set of closed formulas, i.e. formulas without free occurrences of variables. This set of formulas is also called the axioms of the theory. The system uses several different formats to allow the user to specify languages, to define axioms and theories, and to create models. Besides the obvious operations and tests on these structures, we have introduced the notion of a functor between classes of models in order to generate more complex models from given ones automatically. As an example, we will use the system to create several lattice structures starting from a model of the theory of pre-orders.
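A toy version of the core idea (not the described system) is easy to state: on a finite model, a closed universally quantified formula can be tested by exhaustive enumeration, returning a counterexample when it fails. Here the "model" is a binary operation given by a table, and the properties are commutativity and associativity:

```python
from itertools import product

# Toy property tester on a finite model: a binary operation is given by
# a table op, with op[a][b] the result of applying it to (a, b).
def check_commutative(elements, op):
    """Return a counterexample (a, b) with a*b != b*a, or None."""
    for a, b in product(elements, repeat=2):
        if op[a][b] != op[b][a]:
            return (a, b)
    return None

def check_associative(elements, op):
    """Return a counterexample (a, b, c) with (a*b)*c != a*(b*c), or None."""
    for a, b, c in product(elements, repeat=3):
        if op[op[a][b]][c] != op[a][op[b][c]]:
            return (a, b, c)
    return None

# Z_3 under addition mod 3: both properties hold.
add3 = [[(i + j) % 3 for j in range(3)] for i in range(3)]
# Subtraction mod 3: associativity and commutativity both fail.
sub3 = [[(i - j) % 3 for j in range(3)] for i in range(3)]

elems = range(3)
```

The exhaustive loops make the cost dependence on structure size explicit: testing an n-ary axiom over a model with k elements takes k^n evaluations, which is exactly why automating the check pays off.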