918 results for Optimal Sampling Time
Abstract:
Mixed models may be defined with or without reference to sampling, and can be used to predict realized random effects, as when estimating the latent values of study subjects measured with response error. When the model is specified without reference to sampling, a simple mixed model includes two random variables, one stemming from an exchangeable distribution of latent values of study subjects and the other from the study subjects' response error distributions. Positive probabilities are assigned both to potentially realizable responses and to artificial responses that are not potentially realizable, resulting in artificial latent values. In contrast, finite population mixed models represent the two-stage process of sampling subjects and measuring their responses, where positive probabilities are assigned only to potentially realizable responses. A comparison of the estimators over the same potentially realizable responses indicates that the optimal linear mixed model estimator (the usual best linear unbiased predictor, BLUP) is often (but not always) more accurate than the comparable finite population mixed model estimator (the FPMM BLUP). We examine a simple example and provide the basis for a broader discussion of the role of conditioning, sampling, and model assumptions in developing inference.
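To make the contrast concrete, a hedged sketch (the notation is assumed here, not quoted from the paper): in a balanced one-way mixed model y_ij = mu + b_i + e_ij, with b_i ~ N(0, sigma_b^2) and e_ij ~ N(0, sigma_e^2), the usual BLUP of a subject's realized latent value shrinks that subject's mean toward the overall mean:

\widehat{\mu + b_i} = \hat{\mu} + \frac{\sigma_b^2}{\sigma_b^2 + \sigma_e^2 / n}\,(\bar{y}_{i\cdot} - \hat{\mu})

where n is the number of measurements per subject. The FPMM BLUP modifies this shrinkage so that probability is placed only on responses realizable under the two-stage sampling design, which is what makes a comparison of the two estimators over the same set of potentially realizable responses meaningful.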
Abstract:
There are several versions of the lognormal distribution in the statistical literature; one of them is based on the exponential transformation of the generalized normal (GN) distribution. This paper presents the Bayesian analysis of the generalized lognormal distribution (logGN), considering independent non-informative Jeffreys priors for the parameters, as well as the procedure for implementing the Gibbs sampler to obtain the posterior distributions of the parameters. The results are used to analyze failure time models with right-censored and uncensored data. The proposed method is illustrated using real failure time data for computers.
Abstract:
The analytical determination of atmospheric pollutants still presents challenges due to the low-level concentrations (frequently in the µg m(-3) range) and their variations with sampling site and time. In this work, a capillary membrane diffusion scrubber (CMDS) was scaled down to match capillary electrophoresis (CE), a quick separation technique that requires nothing more than a few nanoliters of sample and, when combined with capacitively coupled contactless conductometric detection (C4D), is particularly favorable for ionic species that do not absorb in the UV-vis region, like the target analytes formaldehyde, formic acid, acetic acid and ammonium. The CMDS was coaxially assembled inside a PTFE tube and fed with acceptor phase (deionized water for species with a high Henry's constant, such as formaldehyde and carboxylic acids, or acidic solution for ammonia sampling, with equilibrium displacement to the non-volatile ammonium ion) at a low flow rate (8.3 nL s(-1)), while the sample was aspirated through the annular gap of the concentric tubes at 25 mL s(-1). A second unit, in all respects similar to the CMDS, was operated as a capillary membrane diffusion emitter (CMDE), generating a gas flow with known concentrations of ammonia for the evaluation of the CMDS. The fluids of the system were driven by inexpensive aquarium air pumps, and the collected samples were stored in vials cooled by a Peltier element. Complete protocols were developed for the analysis in air of NH3, CH3COOH, HCOOH and, with a derivatization setup, CH2O, by associating CMDS collection with determination by CE-C4D. The ammonia concentrations obtained by electrophoresis were checked against the reference spectrophotometric method based on Berthelot's reaction. Sensitivity enhancements of this reference method were achieved by using a modified Berthelot reaction, solenoid micro-pumps for liquid propulsion, and a long optical path cell based on a liquid core waveguide (LCW). All techniques and methods of this work are in line with green analytical chemistry trends.
Abstract:
Compared to other volatile carbonylic compounds present in outdoor air, formaldehyde (CH2O) is the most toxic, deserving more attention in terms of indoor and outdoor air quality legislation and control. The analytical determination of CH2O in air still presents challenges due to the low-level concentration (in the sub-ppb range) and its variation with sampling site and time. Of the many available analytical methods for carbonylic compounds, the most widespread is the time-consuming collection in cartridges impregnated with 2,4-dinitrophenylhydrazine, followed by analysis of the formed hydrazones by HPLC. The present work proposes the use of polypropylene hollow porous capillary fibers to achieve efficient CH2O collection. The Oxyphan (R) fiber (designed for blood oxygenation) was chosen for this purpose because it presents good mechanical resistance, a high density of very fine pores, and a high ratio of collection area to volume of the acceptor fluid in the tube, all favorable for the development of an air sampling apparatus. The collector device consists of a Teflon pipe inside which a bundle of polypropylene microporous capillary membranes was introduced. While the acceptor passes at a low flow rate through the capillaries, the sampled air circulates around the fibers, impelled by a low-flow membrane pump (of the type used for aquarium ventilation). The coupling of this sampling technique with the selective and quantitative determination of CH2O, in the form of hydroxymethanesulfonate (HMS) after derivatization with HSO3-, by capillary electrophoresis with capacitively coupled contactless conductivity detection (CE-C4D), enabled the development of a complete analytical protocol for the evaluation of CH2O in air.
Abstract:
Connecting different electrical, network and data devices with minimum cost along the shortest path is a complex job. In huge buildings, where the devices are placed at different locations on different floors and only some specific routes are available for running the cables and buses, the shortest path search becomes more complex. The aim of this thesis project is to develop an application which identifies the best path to connect all objects or devices by following the specific routes. To address the above issue, we adopted three algorithms, Greedy Algorithm, Simulated Annealing and Exhaustive Search, and analyzed their results. The given problem is similar to the Travelling Salesman Problem. Exhaustive search gives the exact answer, as it checks each and every possibility, but it is an impractical solution because of its huge time consumption: if the number of objects grows beyond 12, it takes hours to find the shortest path. Simulated annealing emerged with promising results at a lower time cost. Because of its probabilistic nature, simulated annealing may return a non-optimal solution, but it gives a near-optimal solution in a reasonable duration. The greedy algorithm is not a good choice for this problem. Thus, simulated annealing proved to be the best algorithm for this problem. The project has been implemented in the C language, taking its input from and storing its output in an Excel workbook.
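As a hedged illustration of the approach this thesis describes (the actual implementation is in C and handles building-specific routing constraints; this is only a minimal sketch of simulated annealing on a TSP-style instance, written in Python for brevity):

import math, random

def tour_length(tour, dist):
    # Total length of the closed tour under the distance matrix.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing(dist, T0=10.0, cooling=0.999, steps=20000, seed=1):
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    cur_len = tour_length(tour, dist)
    best, best_len = list(tour), cur_len
    T = T0
    for _ in range(steps):
        # Propose a 2-opt style move: reverse a random segment of the tour.
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        cand_len = tour_length(cand, dist)
        # Always accept improvements; accept worse tours with Boltzmann probability.
        if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / T):
            tour, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = list(tour), cur_len
        T *= cooling  # geometric cooling schedule
    return best, best_len

# Usage: 12 random points (the size the thesis cites as the practical limit
# for exhaustive search) with Euclidean distances.
pts = [(random.random(), random.random()) for _ in range(12)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
print(simulated_annealing(dist))

The acceptance rule is what separates simulated annealing from the greedy algorithm: early on, when T is large, worsening moves are accepted often enough to escape local minima; as T decays, the search becomes increasingly greedy.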
Abstract:
Research question: How has the presence of part-time (PT) employees affected the role of managers in the Swedish food retail business? Research purpose: The purpose of this paper was to describe the change that accompanies part-time employment from a management perspective, and particularly to describe how the presence of part-time employment has influenced the role of the manager within the Swedish food retail business. Conceptual framework: The main focus of this chapter is directed towards the role of managers. The basis of the conceptual framework consists of the model developed by Mintzberg, including the ten managerial roles, Quinn's eight leadership roles, and how the presence of PT employment might affect these roles. Methodology: In this paper, the authors adopted a qualitative design and used narrative inquiry as a research strategy in order to gain a deep understanding of the context. Semi-structured interviews were collected through self-selection sampling, and the total number of participants was ten. Conclusions: Based on the findings of this paper, the presence of PT employees has not influenced and changed the role of managers. The changes that have influenced the role of the managers consist of the increased workload, the delegation of tasks and responsibilities, changed positions, the change of the organisational structure of the individual store, and the increased workforce.
Abstract:
Bin planning (arrangement) is a key factor in the timber industry. Improper planning of the storage bins may lead to inefficient transportation of resources, which threatens the overall efficiency and thereby limits the profit margins of sawmills. To address this challenge, a simulation model has been developed. However, as numerous alternatives are available for arranging bins, simulating all possibilities would take an enormous amount of time and is computationally infeasible. A discrete-event simulation model incorporating meta-heuristic algorithms has therefore been investigated in this study. Preliminary investigations indicate that the results achieved by the GA-based simulation model are promising and better than those of the other meta-heuristic algorithm. Further, a sensitivity analysis has been performed on the GA-based optimal arrangement, which contributes to gaining insights and knowledge about the real system and ultimately leads to improved and enhanced efficiency in sawmill yards. It is expected that the results achieved in this work will support timber industries in making optimal decisions with respect to the arrangement of storage bins in a sawmill yard.
Abstract:
Drinking water utilities in urban areas are focused on finding smart solutions to the new challenges in their real-time operation posed by limited water resources, intensive energy requirements, a growing population, a costly and ageing infrastructure, increasingly stringent regulations, and increased attention towards the environmental impact of water use. Such challenges force water managers to monitor and control not only water supply and distribution, but also consumer demand. This paper presents and discusses novel methodologies and procedures towards an integrated water resource management system based on advanced ICT technologies of automation and telecommunications for largely improving the efficiency of drinking water networks (DWN) in terms of water use, energy consumption, water loss minimization, and water quality guarantees. In particular, the paper addresses the first results of the European project EFFINET (FP7-ICT2011-8-318556), devoted to the monitoring and control of the DWN in Barcelona (Spain). Results are split into two levels according to different management objectives: (i) the monitoring level is concerned with all the aspects involved in the observation of the current state of the system and the detection/diagnosis of abnormal situations; it is achieved through sensors and communications technology, together with mathematical models; (ii) the control level is concerned with computing the best suitable and admissible control strategies for network actuators so as to optimize a given set of operational goals related to the performance of the overall system. This level covers network control (optimal management of water and energy) and demand management (smart metering, efficient supply). The consideration of the Barcelona DWN as the case study will make it possible to prove the general applicability of the proposed integrated ICT solutions and their effectiveness in the management of DWNs, with considerable savings in electricity costs and reduced water loss, while ensuring the high European standards of water quality to citizens.
Abstract:
Drinking water distribution networks risk exposure to malicious or accidental contamination. Several levels of response are conceivable. One of them consists of installing a sensor network to monitor the system in real time. Once contamination has been detected, it is also important to take appropriate counter-measures. In the SMaRT-OnlineWDN project, this relies on modeling to predict both hydraulics and water quality. Use of an online model makes identification of the contaminant source and simulation of the contaminated area possible. The objective of this paper is to present SMaRT-OnlineWDN experience and research results for hydraulic state estimation with a sampling frequency of a few minutes. A least squares problem with bound constraints is formulated to adjust the demand class coefficients to best fit the observed values at a given time. The criterion is a Huber function, to limit the influence of outliers. A Tikhonov regularization is introduced to take prior information on the parameter vector into account. Then the Levenberg-Marquardt algorithm, which uses derivative information to limit the number of iterations, is applied. Confidence intervals for the state prediction are also given. The results are presented and discussed on real networks in France and Germany.
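A hedged reconstruction of the estimation problem as described (the symbols are assumed here, not quoted from the paper): with demand class coefficients x, residuals r_i(x) between observed and modeled values, prior estimate x_0, regularization weight lambda, and bounds l <= x <= u, the adjusted coefficients solve

\min_{l \le x \le u} \; \sum_i \rho_\delta\big(r_i(x)\big) + \lambda \, \lVert x - x_0 \rVert^2,
\qquad
\rho_\delta(r) = \begin{cases} \tfrac{1}{2} r^2 & |r| \le \delta \\ \delta\,(|r| - \tfrac{1}{2}\delta) & |r| > \delta \end{cases}

The Huber function rho_delta is quadratic near zero and linear in the tails, which is what limits the influence of outliers, while the Tikhonov term keeps the solution close to the prior parameter vector; Levenberg-Marquardt then exploits the derivative information of this smooth objective.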
Abstract:
Most water distribution systems (WDS) need rehabilitation due to aging infrastructure leading to decreasing capacity, increasing leakage and consequently low performance of the WDS. An appropriate strategy specifying the location and time of pipeline rehabilitation in a WDS under a limited budget is the main challenge, one which has been addressed frequently by researchers and practitioners. On the other hand, the selection of appropriate rehabilitation techniques and material types is another main issue which has yet to be addressed properly. The latter can affect the environmental impacts of a rehabilitation strategy meeting the challenges of global warming mitigation and consequent climate change. This paper presents a multi-objective optimization model for a rehabilitation strategy in WDS addressing the abovementioned criteria, mainly focused on greenhouse gas (GHG) emissions, either directly from fossil fuel and electricity or indirectly from the embodied energy of materials. Thus, the objective functions are to minimise: (1) the total cost of rehabilitation, including capital and operational costs; (2) the leakage amount; (3) GHG emissions. The Pareto optimal front containing optimal solutions is determined using the Non-dominated Sorting Genetic Algorithm (NSGA-II). Decision variables in this optimisation problem are classified into a number of groups: (1) the percentage proportion of each rehabilitation technique each year; (2) the material types of new pipeline for rehabilitation each year. The rehabilitation techniques used here include replacement, rehabilitation and lining, cleaning, and pipe duplication. The developed model is demonstrated through its application to the Mahalat WDS, located in the central part of Iran. The rehabilitation strategy is analysed for a 40-year planning horizon. A number of conventional techniques for selecting pipes for rehabilitation are analysed in this study. The results show that the optimal rehabilitation strategy considering GHG emissions is able to reduce total expenses and efficiently decrease the leakage amount from the WDS whilst meeting environmental criteria.
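For readers unfamiliar with NSGA-II, its distinguishing step is sorting candidate solutions into non-dominated fronts by their objective vectors, here (cost, leakage, GHG emissions). A minimal, self-contained sketch of that sorting for minimization problems (illustrative only; not the paper's implementation, and full NSGA-II adds crowding-distance selection and genetic operators):

def dominates(a, b):
    # a dominates b if it is no worse in every objective and better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_fronts(objectives):
    # objectives: list of tuples such as (cost, leakage, ghg), all to be minimized.
    remaining = list(range(len(objectives)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

# Three hypothetical strategies scored as (cost, leakage, GHG):
# the second is dominated by the first, so it falls to the second front.
print(pareto_fronts([(100, 5, 20), (110, 6, 25), (80, 7, 25)]))  # [[0, 2], [1]]

The first front returned is the Pareto optimal set the abstract refers to: no strategy in it can improve one objective without worsening another.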
Abstract:
Several works in the shopping-time and in the human-capital literature, due to the nonconcavity of the underlying Hamiltonian, use first-order conditions in dynamic optimization to characterize necessity, but not sufficiency, in intertemporal problems. In this work I choose one paper in each of these two areas and show that optimality can be characterized by means of a simple application of Arrow's (1968) sufficiency theorem.
Abstract:
When policy rules are changed, the effect of nominal rigidities should be modelled through endogenous pricing rules. We endogenize a Taylor (1979) type pricing rule to examine the output effects of monetary disinflations. We derive optimal fixed-price time-dependent rules in inflationary steady states and during disinflations. We also develop a methodology to aggregate individual pricing rules which vary through the disinflation. This allows us to reevaluate the output costs of monetary disinflation, including aspects such as the role of the initial level of inflation and the importance of the degree of credibility of the policy change.
Abstract:
We consider the problem of time consistency of the Ramsey monetary and fiscal policies in an economy without capital. Following Lucas and Stokey (1983) we allow the government at date t to leave its successor at t + 1 a profile of real and nominal debt of all maturities, as a way to influence its decisions. We show that the Ramsey policies are time consistent if and only if the Friedman rule is the optimal Ramsey policy.
Abstract:
We discuss a general approach to building non-asymptotic confidence bounds for stochastic optimization problems. Our principal contribution is the observation that a Sample Average Approximation of a problem supplies upper and lower bounds for the optimal value of the problem which are essentially better than the quality of the corresponding optimal solutions. At the same time, such bounds are more reliable than “standard” confidence bounds obtained through the asymptotic approach. We also discuss bounding the optimal value of MinMax Stochastic Optimization and stochastically constrained problems. We conclude with a small simulation study illustrating the numerical behavior of the proposed bounds.
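A hedged toy illustration of the bounding scheme (the standard replication recipe, not the authors' code): minimize f(x) = E[(x - xi)^2] with xi ~ N(0, 1), whose true optimal value is 1 at x* = 0. Averaging the optimal values of M independent SAA problems estimates a lower bound on the true optimum (for minimization, E[v_N] <= v*), while evaluating one fixed candidate solution on a fresh large sample gives an unbiased estimate of its true cost, hence an upper bound on v* in expectation:

import random, statistics

def saa_bounds(M=50, N=100, N_eval=100000, seed=0):
    rng = random.Random(seed)
    saa_values, candidates = [], []
    for _ in range(M):
        xs = [rng.gauss(0.0, 1.0) for _ in range(N)]
        m = statistics.fmean(xs)          # SAA minimizer: the sample mean
        saa_values.append(statistics.fmean((x - m) ** 2 for x in xs))
        candidates.append(m)
    lower = statistics.fmean(saa_values)  # estimates E[v_N] <= v* = 1
    x_hat = candidates[0]                 # any fixed candidate solution
    fresh = (rng.gauss(0.0, 1.0) for _ in range(N_eval))
    upper = statistics.fmean((z - x_hat) ** 2 for z in fresh)  # estimates f(x_hat) >= v*
    return lower, upper

print(saa_bounds())  # expect a bracket around the true optimal value 1.0

In practice one would attach normal-approximation confidence margins to both estimates; the abstract's point is that this bracket on the optimal value is tighter and more reliable than what the quality of the candidate solutions alone would suggest.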
Abstract:
The extracellular glycerol kinase gene from Saccharomyces cerevisiae (GUT1) was cloned into the expression vector pPICZalphaA and integrated into the genome of the methylotrophic yeast Pichia pastoris X-33. The presence of the GUT1 insert was confirmed by PCR analysis. Four clones were selected and the functionality of the recombinant enzyme was assayed. Among the tested clones, one exhibited glycerol kinase activity of 0.32 U/mL, with a specific activity of 0.025 U/mg of protein. A medium optimized for maximum biomass production by recombinant Pichia pastoris in shaker cultures was initially explored, using 2.31 % (by volume) glycerol as the carbon source. Optimization was carried out by response surface methodology (RSM). In preliminary experiments, following a Plackett-Burman design, glycerol volume fraction (phi(Gly)) and growth time (t) were selected as the most important factors in biomass production. Therefore, subsequent experiments, carried out to optimize biomass production, followed a central composite rotatable design as a function of phi(Gly) and time. Glycerol volume fraction proved to have a significant positive linear effect on biomass production. Time was also a significant factor (at the linear positive and quadratic levels) in biomass production. Experimental data were well fitted by a convex surface representing a second-order polynomial model, in which biomass is a function of both factors (R(2)=0.946). The yield and specific activity of glycerol kinase were mainly affected by the additions of glycerol and methanol to the medium. The optimized medium composition for enzyme production was: 1 % yeast extract, 1 % peptone, 100 mM potassium phosphate buffer, pH=6.0, 1.34 % yeast nitrogen base (YNB), 4x10(-5) % biotin, 1 % methanol and 1 % glycerol, reaching 0.89 U/mL of glycerol kinase activity and 14.55 g/L of total protein in the medium after 48 h of growth.
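The second-order polynomial model mentioned above has, for the two coded factors x_1 (glycerol volume fraction) and x_2 (growth time), the standard central-composite form (only the model shape is shown here; the coefficient values are those estimated in the study):

Y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_{11} x_1^2 + \beta_{22} x_2^2 + \beta_{12} x_1 x_2 + \varepsilon

where Y is the biomass response; the linear terms capture the reported positive effects of glycerol fraction and time, and the quadratic terms give the fitted surface the curvature reflected in the R(2)=0.946 fit.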