940 results for Multi-cycle, Expectation, and Conditional Estimation Method


Relevance:

100.00%

Publisher:

Abstract:

This work studied the drying kinetics of the organic fraction of municipal solid waste (MSW) samples with different initial moisture contents and presents a new method for determining drying kinetic parameters. A series of drying experiments at different temperatures was performed using a thermogravimetric technique. Based on the modified Page drying model and the general pattern search method, a new drying kinetic method was developed that uses multiple isothermal drying curves simultaneously. The new method fitted the experimental data more accurately than the traditional method, and drying kinetic behavior under extrapolated conditions was also predicted and validated. The new method indicated that the drying activation energies for the samples with initial moisture contents of 31.1% and 17.2% on a wet basis were 25.97 and 24.73 kJ mol⁻¹, respectively. These results are useful for drying process simulation and industrial dryer design, and the method can also be applied to determine the drying parameters of other materials with high reliability.
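
The joint-fitting idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes one common variant of the modified Page form, MR = exp(-(kt)^n), with the exponent n shared across temperatures, fits synthetic noiseless curves with a basic compass-type pattern search, and recovers the activation energy from the two fitted rate constants via the Arrhenius relation. All numerical values are invented.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def page_mr(t, k, n):
    # Modified Page model: moisture ratio MR = exp(-(k*t)**n)
    return np.exp(-(k * t) ** n)

def sse(params, curves):
    # Sum of squared errors over both isothermal curves, sharing n
    k1, k2, n = params
    (t1, mr1), (t2, mr2) = curves
    return (np.sum((page_mr(t1, k1, n) - mr1) ** 2)
            + np.sum((page_mr(t2, k2, n) - mr2) ** 2))

def pattern_search(f, x0, step, args, tol=1e-9, max_iter=20000):
    # Basic compass (pattern) search: probe +/- along each coordinate,
    # accept improving moves, and halve the step when nothing improves.
    x = np.asarray(x0, float)
    step = np.asarray(step, float)
    fx = f(x, args)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for s in (1.0, -1.0):
                trial = x.copy()
                trial[i] += s * step[i]
                ft = f(trial, args)
                if ft < fx:
                    x, fx = trial, ft
                    improved = True
        if not improved:
            step = step / 2.0
            if np.all(step < tol):
                break
    return x

# Synthetic isothermal drying curves at two temperatures (t in minutes)
T1, T2, Ea_true, A, n_true = 323.0, 343.0, 25970.0, 200.0, 1.1
t = np.linspace(5, 200, 40)
k1_true = A * np.exp(-Ea_true / (R * T1))
k2_true = A * np.exp(-Ea_true / (R * T2))
curves = ((t, page_mr(t, k1_true, n_true)), (t, page_mr(t, k2_true, n_true)))

# Fit both curves at once, then get Ea from the Arrhenius relation
k1, k2, n = pattern_search(sse, x0=[0.02, 0.03, 1.0],
                           step=[0.01, 0.01, 0.2], args=curves)
Ea_fit = R * np.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)
```

Fitting all isothermal curves simultaneously constrains the shared parameters far better than fitting each curve in isolation, which is the essential advantage the abstract describes.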

Relevance:

100.00%

Publisher:

Abstract:

Intermittent exporting is something of a puzzle. In theory, exporting represents a major commitment, and is often the starting point for further internationalisation. However, intermittent exporters exit and subsequently re-enter exporting, sometimes frequently. We develop a conceptual model to explain how firm characteristics and market conditions interact to affect the decision to exit and re-enter exporting, and model this process using an extensive dataset of French manufacturing firms from 1997 to 2007. As anticipated, smaller and less productive firms are more likely to exit exporting, and react more strongly to changes in both domestic and foreign markets than larger firms. Exit and re-entry are closely linked. Firms with a low exit probability also have a high likelihood of re-entry, and vice versa. However, the way in which firms react to market conditions at the time of exit matters greatly in determining the likelihood of re-entry: thus re-entry depends crucially on the strategic rationale for exit. Our analysis helps explain the opportunistic and intermittent exporting of (mainly) small firms, the demand conditions under which intermittent exporting is most likely to occur, and the firm attributes most likely to give rise to such behavior.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation contributes to the rapidly growing empirical research area in the field of operations management. It contains two essays, tackling two different sets of operations management questions which are motivated by and built on field data sets from two very different industries --- air cargo logistics and retailing.

The first essay, based on a data set obtained from a world-leading third-party logistics company, develops a novel and general Bayesian hierarchical learning framework for estimating customers' spillover learning, that is, customers' learning about the quality of a service (or product) from their previous experiences with similar yet not identical services. We then apply the model to the data set to study how customers' experiences from shipping on a particular route affect their future decisions about shipping not only on that route, but also on other routes serviced by the same logistics company. We find that customers indeed borrow experiences from similar but different services to update the quality beliefs that determine their future purchase decisions, and that these service quality beliefs have a significant impact on purchasing behavior. Moreover, customers are risk averse: they are averse not only to experience variability but also to belief uncertainty (i.e., customers' uncertainty about their own beliefs), and belief uncertainty affects customers' utilities more than experience variability does.
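
A minimal sketch of the spillover idea (not the essay's actual hierarchical model): treat quality beliefs on two routes as a correlated Gaussian prior, so that one noisy service experience on the first route shifts both the mean belief and the belief uncertainty about the second route through the prior correlation. The correlation value and observation noise below are invented.

```python
import numpy as np

# Correlated Gaussian prior over perceived quality on two routes
mu = np.array([0.0, 0.0])                # prior mean beliefs
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.0]])           # 0.6 encodes route similarity

# One noisy service experience observed on route 0
y, obs_var = 1.2, 0.5
H = np.array([[1.0, 0.0]])               # observation picks out route 0

# Standard Gaussian (Kalman-style) posterior update
S = H @ Sigma @ H.T + obs_var            # innovation variance
K = Sigma @ H.T @ np.linalg.inv(S)       # gain
mu_post = mu + K @ (np.array([y]) - H @ mu)
Sigma_post = Sigma - K @ H @ Sigma
```

After the update, the belief about route 1 has moved toward the observed quality and its variance has shrunk, even though route 1 was never experienced directly; that is the spillover mechanism in its simplest form.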

The second essay is based on a data set obtained from a large Chinese supermarket chain, which contains sales as well as both wholesale and retail prices of unpackaged perishable vegetables. Recognizing the special characteristics of this particular product category, we develop a structural estimation model in a discrete-continuous choice framework. Building on this framework, we then study an optimization model for joint pricing and inventory management of multiple products, which aims at improving the company's profit from direct sales while reducing food waste and thus improving social welfare.
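
The joint pricing and inventory decision can be illustrated with a toy single-product version (not the essay's multi-product model): demand is assumed linear in price with random shocks, unsold perishable units are wasted, and a grid search picks the price/order-quantity pair with the highest simulated expected profit. Every number here is invented.

```python
import numpy as np

def expected_profit(price, stock, wholesale, slope, samples):
    # Assumed linear price-dependent demand with random shocks,
    # truncated at zero; unsold perishables are discarded as waste,
    # so overstocking is penalized through the wholesale cost.
    demand = np.maximum(samples - slope * price, 0.0)
    sales = np.minimum(demand, stock)
    return float(np.mean(price * sales) - wholesale * stock)

rng = np.random.default_rng(7)
samples = rng.normal(100.0, 10.0, 5000)   # simulated demand shocks

# Grid search over joint price / order-quantity decisions
grid = [(p, s) for p in np.arange(1.0, 6.01, 0.5)
               for s in range(20, 101, 10)]
best_price, best_stock = max(
    grid, key=lambda ps: expected_profit(ps[0], ps[1], 0.8, 8.0, samples))
```

In the actual study the demand side comes from the structural discrete-continuous choice estimates rather than an assumed linear form, but the optimization logic, searching jointly over prices and stocking levels against simulated demand, is the same in spirit.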

Collectively, the studies in this dissertation provide useful modeling ideas, decision tools, insights, and guidance for firms to utilize vast sales and operations data to devise more effective business strategies.

Relevance:

100.00%

Publisher:

Abstract:

Visualization and interpretation of geological observations into a cohesive geological model are essential to the Earth sciences and related fields. Various emerging technologies offer approaches to multi-scale visualization of heterogeneous data, providing new opportunities that facilitate model development and interpretation; these include increased accessibility of 3D scanning technology, global connectivity, and Web-based interactive platforms. The geological sciences and geological engineering disciplines are adopting these technologies as volumes of data and physical samples greatly increase. However, a standardized and universally agreed-upon workflow has yet to be developed. In this thesis, the 3D scanning workflow is presented as the foundation for a virtual geological database. This database provides an augmented level of tangibility to students and researchers who have little or no access to locations that are remote or inaccessible. A Web-GIS platform was used jointly with customized widgets developed throughout the course of this research to aid in visualizing hand-sized (meso-scale) geological samples within a geologic and geospatial context. This context is provided by a macro-scale GIS interface in which geophysical and geodetic images and data are visualized. Specifically, an interactive interface was developed that allows simultaneous visualization to improve the understanding of geological trends and relationships. These tools allow rapid data access and global sharing, and facilitate comprehension of geological models using multi-scale heterogeneous observations.

Relevance:

100.00%

Publisher:

Abstract:

This study investigates topology optimization of energy-absorbing structures in which material damage is accounted for in the optimization process. The objective is to design the lightest structures that are able to absorb the required mechanical energy. A structural continuity constraint check is introduced that detects when no feasible load path remains in the finite element model, usually as a result of large-scale fracture. This ensures that designs do not fail when loaded under the conditions prescribed in the design requirements. The continuity check is automated and requires no intervention from the analyst once the optimization process is initiated; consequently, the optimization algorithm proceeds towards an energy-absorbing structure with the minimum structural mass that is not susceptible to global structural failure. A method is also introduced to determine when the optimization process should halt: it identifies when the optimization has plateaued and is no longer likely to yield improved designs if continued for further iterations. This gives the designer a rational way to decide how long to run the optimization and to avoid wasting computational resources on unnecessary iterations. A case study demonstrates the use of the method.
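
A continuity check of this kind reduces to a graph connectivity question: does any path of surviving elements still join the loaded nodes to the supported nodes? A minimal sketch (assuming elements are given as tuples of node IDs; the mesh representation is illustrative, not the study's actual implementation):

```python
from collections import deque

def load_path_exists(elements, load_nodes, support_nodes):
    """Return True if any load node still connects to a support node
    through surviving (non-failed) elements. False means the continuity
    constraint is violated: the design has fractured globally."""
    adj = {}
    for elem in elements:                 # each element = tuple of node ids
        for a in elem:
            for b in elem:
                if a != b:
                    adj.setdefault(a, set()).add(b)
    seen, queue = set(load_nodes), deque(load_nodes)
    while queue:                          # breadth-first search from loads
        node = queue.popleft()
        if node in support_nodes:
            return True
        for nxt in adj.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False
```

During optimization, failed elements are simply omitted from `elements` before the check, so a design is rejected as soon as fracture severs every load path.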

Relevance:

100.00%

Publisher:

Abstract:

Amphibian skin secretions are unique sources of bioactive molecules, particularly bioactive peptides. In this study, the skin secretion of the white-lipped tree frog (Litoria infrafrenata) was obtained to identify peptides with putative therapeutic potential. Using skin secretion-derived mRNA, a cDNA library was constructed, a frenatin gene was cloned, and its encoded peptides were deduced and confirmed by RP-HPLC, MALDI-TOF and MS/MS. The deduced peptides were identified as frenatin 4.1 (GFLEKLKTGAKDFASAFVNSIKGT) and a post-translationally modified peptide, frenatin 4.2 (GFLEKLKTGAKDFASAFVNSIK-NH2). Antimicrobial activity of the peptides was assessed by determining their minimum inhibitory concentrations (MICs) against standard model microorganisms. Through structure–activity relationship studies, analogues of the two peptides were designed, resulting in the synthesis of frenatin 4.1a (GFLEKLKKGAKDFASALVNSIKGT) and frenatin 4.2a (GFLLKLKLGAKLFASAFVNSIK-NH2). Both analogues exhibited improved antimicrobial activity, especially frenatin 4.2a, which displayed significantly enhanced broad-spectrum antimicrobial efficacy. The peptide modifications applied in this study may provide new ideas for generating leads for the design of antimicrobial peptides with therapeutic applications.
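
One property the analogue design visibly changes is the peptides' net positive charge, which is generally associated with antimicrobial activity. A crude back-of-the-envelope calculation on the sequences quoted above (counting K/R as +1 and D/E as -1 at neutral pH, with simple terminal charges; this ignores histidine, pKa shifts, and everything the paper actually measured):

```python
def net_charge(seq, amidated=False):
    # Crude net charge at neutral pH: +1 for K/R side chains, -1 for D/E,
    # +1 for the N-terminus, -1 for a free C-terminus (0 if amidated).
    side = sum(+1 if aa in "KR" else -1 if aa in "DE" else 0 for aa in seq)
    return side + 1 + (0 if amidated else -1)

frenatin_41  = "GFLEKLKTGAKDFASAFVNSIKGT"
frenatin_41a = "GFLEKLKKGAKDFASALVNSIKGT"
frenatin_42  = "GFLEKLKTGAKDFASAFVNSIK"   # C-terminally amidated
frenatin_42a = "GFLLKLKLGAKLFASAFVNSIK"   # C-terminally amidated
```

By this rough measure both analogues carry a higher net positive charge than their parent peptides (frenatin 4.2a in particular loses both acidic residues), consistent with the reported gain in antimicrobial potency.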

Relevance:

100.00%

Publisher:

Abstract:

In the aluminum industry, calcined petroleum coke is the main component of the anode. A decline in petroleum coke quality has been observed as its impurity concentration has increased. This is very important for aluminum smelters because these impurities, in addition to reducing anode performance, contaminate the metal produced. Petroleum coke is also a source of fossil carbon, and its consumption during the electrolysis process produces CO2, a greenhouse gas well known for its role in global warming and climate change. Charcoal is available and produced worldwide in large quantities. It could be an attractive alternative to petroleum coke in the fabrication of the carbon anodes used in electrolysis cells for aluminum production. However, since it does not meet the criteria for anode manufacturing, its use represents a major challenge. Its main known disadvantages are its high porosity, disordered structure, and high mineral content. In addition, its density and electrical conductivity have been reported to be lower than those of petroleum coke. The objective of this work is to explore the effect of heat treatment on the properties of charcoal in order to find those that come closest to the specifications required for anode production. The structural evolution of charcoal calcined at high temperature was followed using various techniques. Its mineral content was reduced by treatments with hydrochloric acid at different concentrations. Finally, different combinations of these two treatments, calcination and acid leaching, were tested in order to find the best treatment conditions.

Relevance:

100.00%

Publisher:

Abstract:

Excess nutrient loads carried by streams and rivers are a great concern for environmental resource managers. In agricultural regions, excess loads are transported downstream to receiving water bodies, potentially causing algal blooms, which can lead to numerous ecological problems. To better understand nutrient load transport, and to develop appropriate water management plans, it is important to have accurate estimates of annual nutrient loads. This study used a Monte Carlo sub-sampling method and error-corrected statistical models to estimate annual nitrate-N loads from two watersheds in central Illinois. The performance of three load estimation methods (the seven-parameter log-linear model, the ratio estimator, and the flow-weighted averaging estimator) applied at one-, two-, four-, six-, and eight-week sampling frequencies was compared. Five error correction techniques (the existing composite method and four new techniques developed in this study) were applied to each combination of sampling frequency and load estimation method. On average, the most accurate error correction technique (proportional rectangular) produced load estimates 15% and 30% more accurate than the most accurate uncorrected load estimation method (the ratio estimator) for the two watersheds. Using error correction methods, it is possible to design more cost-effective monitoring plans by achieving the same load estimation accuracy with fewer observations. Finally, the optimum combinations of monitoring threshold and sampling frequency that minimize the number of samples required to achieve specified levels of accuracy in load estimation were determined. For one- to three-week sampling frequencies, combined threshold/fixed-interval monitoring approaches produced the best outcomes, while fixed-interval-only approaches produced the most accurate results for four- to eight-week sampling frequencies.
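
The ratio estimator mentioned above, the best-performing uncorrected method in this study, can be sketched in a few lines: the mean of the sampled instantaneous loads is divided by the mean of the sampled flows, and the ratio is scaled by the total annual discharge. The synthetic data below (biweekly sampling, roughly constant nitrate-N concentration, units omitted for brevity) are invented for illustration only.

```python
import numpy as np

def ratio_estimator_load(q_daily, c_sampled, q_sampled):
    # Ratio estimator: mean sampled load over mean sampled flow,
    # scaled by the total annual discharge.
    c = np.asarray(c_sampled, float)
    q = np.asarray(q_sampled, float)
    return float(np.mean(c * q) / np.mean(q) * np.sum(q_daily))

# Synthetic year: daily flows, with concentration sampled every two weeks
rng = np.random.default_rng(3)
q_daily = rng.lognormal(2.0, 0.5, 365)                     # daily discharge
sample_days = np.arange(0, 365, 14)                        # biweekly visits
c_sampled = 5.0 + rng.normal(0.0, 0.2, sample_days.size)   # nitrate-N conc.
annual_load = ratio_estimator_load(q_daily, c_sampled, q_daily[sample_days])
```

When concentration is exactly constant, the estimator collapses to concentration times total discharge, which is a handy sanity check; the bias-correction techniques compared in the study adjust estimates like this one for the systematic errors introduced by sparse sampling.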

Relevance:

100.00%

Publisher:

Abstract:

Includes index.

Relevance:

100.00%

Publisher:

Abstract:

Intelligent agents offer a new and exciting way of understanding the world of work. Agent-Based Simulation (ABS), one way of using intelligent agents, carries great potential for advancing our understanding of management practices and how they link to retail performance. We have developed simulation models based on research by a multi-disciplinary team of economists, work psychologists and computer scientists. We will discuss our experiences of implementing these concepts in work with a well-known retail department store. There is no doubt that management practices are linked to the performance of an organisation (Reynolds et al., 2005; Wall & Wood, 2005). Best practices have been developed, but when it comes to the actual application of these guidelines, considerable ambiguity remains regarding their effectiveness within particular contexts (Siebers et al., forthcoming a). Most Operational Research (OR) methods can only be used as analysis tools once management practices have been implemented; they are often not very useful for answering speculative ‘what-if’ questions, particularly when one is interested in the development of the system over time rather than just its state at a certain point in time. Simulation can be used to analyse the operation of dynamic and stochastic systems. ABS is particularly useful when complex interactions between system entities exist, such as autonomous decision making or negotiation. In an ABS model the researcher explicitly describes the decision process of simulated actors at the micro level; structures emerge at the macro level as a result of the actions of the agents and their interactions with each other and the environment. We will show how ABS experiments can deal with testing and optimising management practices such as training, empowerment or teamwork. Hence, questions such as “will staff setting their own break times improve performance?” can be investigated.
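
The break-time question can be illustrated with a deliberately tiny ABS sketch (not the authors' model; the demand profile, service rates, and break options are all invented). Each staff agent takes one break per shift: an imposed break falls at noon, while a self-chosen break avoids the midday rush, and a macro-level outcome (customers served) emerges from those micro-level choices.

```python
import random

class StaffAgent:
    # Minimal sketch of a shop-floor agent with one break per shift.
    def __init__(self, chooses_own_break, rng):
        # Self-chosen breaks avoid the midday rush; imposed breaks are at noon.
        self.break_hour = rng.choice([10, 14]) if chooses_own_break else 12

def simulate_day(chooses_own_break, staff=5, seed=42):
    rng = random.Random(seed)
    agents = [StaffAgent(chooses_own_break, rng) for _ in range(staff)]
    served = 0
    for hour in range(9, 17):                     # 09:00-17:00 shift
        demand = 20 if hour == 12 else 10         # midday rush
        on_duty = sum(1 for a in agents if a.break_hour != hour)
        served += min(demand, 3 * on_duty)        # each agent serves up to 3/hour
    return served
```

Even in this toy version, letting agents schedule their own breaks keeps the floor staffed during the rush and raises the number of customers served, which is exactly the kind of 'what-if' comparison the abstract argues ABS is suited for.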

Relevance:

100.00%

Publisher:

Abstract:

Executing a cloud or aerosol physical-properties retrieval algorithm on controlled synthetic data is an important step in retrieval algorithm development. Synthetic data can help answer questions about the sensitivity and performance of the algorithm, or aid in determining how an existing retrieval algorithm may perform with a planned sensor. Synthetic data can also help in solving issues that may have surfaced in the retrieval results. Synthetic data become very important when other validation methods, such as field campaigns, are of limited scope: these tend to be of relatively short duration and are often costly, and ground stations have limited spatial coverage, while synthetic data can cover large spatial and temporal scales and a wide variety of conditions at low cost. In this work I develop an advanced cloud and aerosol retrieval simulator for the MODIS instrument, known as the Multi-sensor Cloud and Aerosol Retrieval Simulator (MCARS). In close collaboration with the modeling community, I have seamlessly combined the GEOS-5 global climate model and the DISORT radiative transfer code, widely used by the remote sensing community, with observations from the MODIS instrument to create the simulator. With the MCARS simulator it was then possible to resolve the long-standing low bias of MODIS aerosol optical depth retrievals for smoke aerosols: the MODIS aerosol retrieval did not account for the effects of humidity on smoke aerosols. The simulator also revealed a previously unrecognized issue, namely that the value of the fine mode fraction could create a linear dependence between retrieved aerosol optical depth and land surface reflectance. MCARS provided the ability to examine aerosol retrievals against “ground truth” for hundreds of thousands of simultaneous samples over an area covered by only three AERONET ground stations.
Findings from MCARS are already being used to improve the performance of operational MODIS aerosol properties retrieval algorithms. The modeling community will use the MCARS data to create new parameterizations for aerosol properties as a function of properties of the atmospheric column and gain the ability to correct any assimilated retrieval data that may display similar dependencies in comparisons with ground measurements.
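
The kind of diagnostic a simulator enables, regressing retrieval error against a suspected driver when "ground truth" is known for every sample, can be sketched as follows. This is an illustration of the diagnostic logic only, not MCARS itself: the synthetic "flawed retrieval" below has an error deliberately constructed to grow linearly with surface reflectance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
true_aod = rng.uniform(0.05, 1.0, n)                 # simulator "ground truth"
surface_reflectance = rng.uniform(0.05, 0.3, n)

# Hypothetical flawed retrieval whose error grows with surface reflectance
retrieved_aod = (true_aod
                 + 0.8 * (surface_reflectance - 0.15)
                 + rng.normal(0.0, 0.01, n))         # small retrieval noise

# Regress the retrieval error on surface reflectance to expose the dependence
slope, intercept = np.polyfit(surface_reflectance,
                              retrieved_aod - true_aod, 1)
```

With ground stations, a dependence like this would be hidden in a handful of collocations; with simulated truth for every pixel, the slope is estimated from thousands of samples and the artifact stands out immediately.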

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this project is to develop a three-dimensional block model of a garnet deposit in Alder Gulch, Madison County, Montana. Garnets occur in the Precambrian metamorphic Red Wash gneiss and similar rocks in the vicinity. This project models the percentage of garnet in the Section 25 deposit using the Surpac software. Data available for this work are drillhole, trench and grab sample data obtained from previous exploration of the deposit. Creation of the block model involves validating the data, creating composites of assayed garnet percentages, and conducting basic statistics on the composites using Surpac statistical tools. Variogram analysis will be conducted on the composites to quantify the continuity of the garnet mineralization. A three-dimensional block model will then be created and filled with estimates of garnet percentage using different methods of reserve estimation, and the results will be compared.
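
One of the simplest estimation methods likely to appear in such a comparison is inverse distance weighting (IDW), which grades a block from nearby composites weighted by the inverse of their distance raised to a power. A minimal sketch (illustrative only; the actual work uses Surpac's built-in estimators, and the coordinates and grades below are invented):

```python
import numpy as np

def idw_estimate(block_center, coords, values, power=2.0):
    # Inverse-distance-weighted garnet-percentage estimate for one block
    # centroid, from composite sample coordinates and assayed grades.
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(coords - np.asarray(block_center, float), axis=1)
    if np.any(d < 1e-9):                  # a sample sits on the block centroid
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))
```

Unlike kriging, IDW ignores the directional continuity that the variogram analysis quantifies, which is exactly why comparing the two families of estimates on the same composites is informative.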

Relevance:

100.00%

Publisher:

Abstract:

Amid the trend of rising health expenditure in developed economies, changing healthcare delivery models is an important point of action for service regulators seeking to contain this trend. Such change is mostly induced either by financial incentives or by regulatory tools issued by regulators and targeting service providers and patients. This creates a tripartite interaction between service regulators, professionals, and patients that manifests a multi-principal agent relationship, in which professionals are agents to two principals: regulators and patients. This thesis is concerned with this multi-principal agent relationship in healthcare and investigates the determinants of (non-)compliance with regulatory tools in light of the tripartite relationship. In addition, the thesis provides insights into the different institutional, economic, and regulatory settings that govern the multi-principal agent relationship in healthcare in different countries. Furthermore, it develops and empirically tests a conceptual framework of the possible determinants of physicians' (non-)compliance with regulatory tools issued by the regulator. The main findings are as follows. First, in a multi-principal agent setting, using financial incentives to align the objectives of professionals and the regulator is important but is not the only solution; the heterogeneity of the financial incentives provided to professionals in different health markets shows there is no one-size-fits-all incentive model for influencing clinical decisions. Second, soft-law tools such as clinical practice guidelines (CPGs) are important for mitigating the problems of the multi-principal agent setting in health markets, as they reduce information asymmetries while preserving the autonomy of professionals. Third, CPGs are complex and heterogeneous, and so are the determinants of (non-)compliance with them. Fourth, CPGs work, but under conditions: factors such as intra-professional competition between service providers or practitioners can lead to non-compliance if CPGs are likely to reduce the professional's utility. Finally, different degrees of soft-law mandate have different effects on providers' compliance: generally, the stronger the mandate, the stronger the compliance, yet even with a strong mandate, drivers such as intra-professional competition and co-management of patients by different professionals affect (non-)compliance.

Relevance:

100.00%

Publisher:

Abstract:

Low glycemic index and low glycemic load diets have been associated with a reduced risk of chronic diseases, which has generated growing interest in applying these measures to nutritional assessment and guidance. However, the use of published glycemic index and glycemic load data is limited by the variety of plant foods and the ways they are processed. Because in vivo assays are costly, laborious, invasive, and require a considerable experimental period, in vitro methods have been developed that estimate the glycemic index of foods from the rate of carbohydrate digestion in a practical, simple, and economical way. This paper presents the use of an in vitro marker, the hydrolysis index, to estimate the glycemic index and glycemic load, the approach most used by Brazilian researchers, with a view to its application by nutrition professionals. The calculations and interpretations for estimating the glycemic index and glycemic load are presented through a practical example with some Brazilian foods and with amaranth grain subjected to different types of processing. In the absence of data on the glycemic response of the food of interest, the values of the in vitro marker can be used to estimate the glycemic index and glycemic load of foods. However, this marker should not be used indiscriminately, since it takes into account only the intrinsic food factors that influence the utilization of the available carbohydrates.
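
The calculation chain from the in vitro marker to the glycemic load can be sketched as follows. The regression coefficients are an assumption for illustration (one widely cited equation attributed to Goñi and colleagues, GI = 39.71 + 0.549·HI); the paper itself should be consulted for the equation it actually applies.

```python
def estimated_gi(hydrolysis_index):
    # Estimated glycemic index from the in vitro hydrolysis index (HI),
    # using an assumed published regression: GI = 39.71 + 0.549 * HI.
    return 39.71 + 0.549 * hydrolysis_index

def glycemic_load(gi, available_carbs_g):
    # Glycemic load of a portion: GI x available carbohydrate (g) / 100.
    return gi * available_carbs_g / 100.0
```

For example, a food whose hydrolysis index is 100 (i.e., digested as fast as the reference) gets an estimated GI of about 94.6 under this regression, and a 50 g available-carbohydrate portion of it a glycemic load of about 47.3.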