949 results for empirical methods


Relevance: 60.00%

Abstract:

Linkage and association studies are major analytical tools in the search for susceptibility genes for complex diseases. With the availability of large collections of single nucleotide polymorphisms (SNPs), rapid progress in high-throughput genotyping technologies, and the ambitious goals of the International HapMap Project, genetic markers covering the whole genome will be available for genome-wide linkage and association studies. To avoid inflating the type I error rate in genome-wide linkage and association studies, the significance level must be adjusted for the multiple independent linkage and/or association tests performed, which has led to suggested genome-wide significance cut-offs as low as 5 × 10⁻⁷. Almost no linkage and/or association study can meet such a stringent threshold with standard statistical methods, so new test statistics with higher power are urgently needed. This dissertation proposes and explores a class of novel test statistics that can be used with both population-based and family-based genetic data by employing a completely new strategy: nonlinear transformations of the sample means are used to construct test statistics for linkage and association studies. Extensive simulation studies illustrate the properties of the nonlinear test statistics. Power calculations are performed using both analytical and empirical methods. Finally, real data sets are analysed with the nonlinear test statistics. The results show that the nonlinear test statistics have correct type I error rates, and most of the studied nonlinear test statistics have higher power than the standard chi-square test. This dissertation introduces a new way to design test statistics with high power and may open new avenues for mapping susceptibility genes for complex diseases.
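To make the strategy concrete, here is a minimal sketch, with entirely hypothetical genotype data, of how a statistic built on a nonlinear transformation of the sample means can be compared against the standard chi-square allele test; the arcsine-square-root transform and the permutation-based null used below are illustrative choices, not the dissertation's actual statistics.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(42)

# Hypothetical case-control allele data at one SNP (two alleles per person, 1 = risk allele).
n_case, n_ctrl = 500, 500
alleles_case = rng.binomial(1, 0.32, 2 * n_case)
alleles_ctrl = rng.binomial(1, 0.25, 2 * n_ctrl)

# Standard chi-square test on the 2x2 allele-count table.
table = [[alleles_case.sum(), 2 * n_case - alleles_case.sum()],
         [alleles_ctrl.sum(), 2 * n_ctrl - alleles_ctrl.sum()]]
chi2, p_chi2, dof, expected = chi2_contingency(table)

# A statistic built on a nonlinear transformation of the sample means (arcsine square
# root, chosen purely for illustration), with its null distribution obtained
# empirically by permuting the case/control labels.
def nonlinear_stat(a, b):
    return abs(np.arcsin(np.sqrt(a.mean())) - np.arcsin(np.sqrt(b.mean())))

observed = nonlinear_stat(alleles_case, alleles_ctrl)
pooled = np.concatenate([alleles_case, alleles_ctrl])
perm = np.array([nonlinear_stat(*np.split(rng.permutation(pooled), [2 * n_case]))
                 for _ in range(2000)])
p_perm = (1 + (perm >= observed).sum()) / (1 + len(perm))

print(f"chi-square p = {p_chi2:.3g}, nonlinear-statistic permutation p = {p_perm:.3g}")
```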

Relevance: 60.00%

Abstract:

Twenty-nine surface samples from the Portuguese shelf, recovered offshore from the mouths of the Ave, Douro, Lis and Mira rivers, were analysed by ICP-OES for selected major and trace elements after total dissolution. Organic carbon, carbonate content and grain size were also determined. Five evaluation tools were applied in order to compare the three study areas and to evaluate sediment geochemistry and other compositional variability in the acquired samples: (1) empirical methods based on comparison with standard reference criteria, e.g. the NOAA sediment quality guidelines; (2) normalisation ratios using a grain-size proxy element; (3) the "Gradient Method", plotting contaminant vs. organic matter or Al; (4) definition of a regional geochemical baseline from a compiled database; and (5) enrichment factors. The evaluation of element and component associations indicates differences related both to the onshore drainage areas and to the environmental shelf setting. Despite the considerable variability in total metal contents indicated by our results, the sediment metal composition is largely of natural origin. The metal enrichments observed in the Mira area are associated with the drainage of mineralised areas rich in Cu, Pb, Zn, Fe and Mn. The near absence of human impact on shelf sediments, despite their proximity to highly industrialised urban areas such as the Ave-Douro and Lis areas, is attributed to effective trapping in the estuaries and coastal zones, as well as to dilution with less contaminated shelf sediments and to the removal of contaminants with the fine fractions through grain-size sorting. The character of the contaminated sediments transported to these shelf areas is further influenced by grain-size sorting and by dilution with less contaminated marine sediments. The results obtained by the different methods complement each other and allow more specific interpretations.
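As an illustration of tool (5), the short sketch below computes enrichment factors normalised to Al, the usual grain-size proxy; the background ratios, sample concentrations and the screening threshold of 2 are hypothetical placeholders, not values from this study.

```python
# Enrichment factor:  EF = (Me / Al)_sample / (Me / Al)_background
# Al serves as the grain-size / aluminosilicate proxy element.

background = {"Cu": 45.0, "Pb": 20.0, "Zn": 95.0}   # mg/kg in a reference material (assumed)
background_al = 8.0                                  # % Al in the reference material (assumed)

sample = {"Cu": 60.0, "Pb": 55.0, "Zn": 120.0}       # mg/kg measured in one sample (assumed)
sample_al = 6.5                                      # % Al in that sample (assumed)

for metal, conc in sample.items():
    ef = (conc / sample_al) / (background[metal] / background_al)
    flag = "enriched" if ef > 2 else "near background"   # 2 is a common screening threshold
    print(f"{metal}: EF = {ef:.1f} ({flag})")
```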

Relevance: 60.00%

Abstract:

This paper is concerned with the study of the berth system in port terminals. Its main objective is to present the available management methodologies, which include empirical, analytical and simulation methods. The comparison shows that these three methods are not independent but complementary: each has advantages and limitations that depend on the type of study performed.
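As a sketch of the analytical approach mentioned above, the snippet below treats a berth group as an M/M/c queue and returns occupancy, the probability that an arriving ship must wait, and the mean waiting time (Erlang C); the arrival rate, service rate and number of berths are hypothetical.

```python
import math

def berth_queue(arrival_rate, service_rate, berths):
    """M/M/c queue indicators for a berth group (Erlang C formula)."""
    a = arrival_rate / service_rate                # offered traffic (berths busy on average)
    rho = a / berths                               # berth occupancy
    if rho >= 1:
        raise ValueError("Unstable system: occupancy >= 1")
    summation = sum(a**k / math.factorial(k) for k in range(berths))
    tail = a**berths / (math.factorial(berths) * (1 - rho))
    p_wait = tail / (summation + tail)             # probability an arriving ship waits
    mean_wait = p_wait / (berths * service_rate - arrival_rate)  # mean wait in queue
    return rho, p_wait, mean_wait

# Hypothetical terminal: 2.5 arrivals/day, 1 ship served per berth per day, 4 berths.
rho, p_wait, wq = berth_queue(2.5, 1.0, 4)
print(f"occupancy = {rho:.2f}, P(wait) = {p_wait:.2f}, mean wait = {wq:.2f} days")
```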

Relevance: 60.00%

Abstract:

The heterogeneity of the geological medium introduces a high degree of uncertainty into underground construction projects, which must be properly managed in order to reduce the associated risks, which are mainly geotechnical. Among the main problems facing modern rock mechanics in underground construction are rock squeezing in tunnels and the failure of coal pillars. Their occurrence is well known to cause serious cost and safety problems, so their study has traditionally been linked to predicting their occurrence. Existing solutions to these problems include analytical and numerical methods. These methodologies can represent the real geotechnical behaviour with a high degree of fidelity; however, they can only be used when a sufficient geotechnical characterisation is available, and therefore a detailed definition of the parameters that feed the complex constitutive models and failure criteria these phenomena require. Naturally, that level of definition is only reached in advanced project stages, or even during construction itself, when the model parameters can be properly calibrated; this limits their use in the early stages, which is precisely when prediction is most valuable. Empirical methods, on the other hand, provide solutions to these complex problems in a simple way, with few parameters and, given their observational basis, with high reliability when applied under boundary conditions similar to the original ones. The simplicity and small number of parameters allow these methodologies to be used from the preliminary phases of a project, since the required information is usually easy and inexpensive to obtain; prediction can therefore be incorporated from the very beginning of the design process, anticipating the risk at its origin.

This thesis presents a new empirical methodology for predicting the occurrence of squeezing and the failure of coal pillars, based on an extensive compilation of real tunnel and mine case histories in which both phenomena were evaluated. This information, gathered from reputable bibliographic references, constitutes one of the most extensive databases on these phenomena compiled to date, which is in itself a significant contribution to the state of the art. Using this information and the theory of statistical classifiers, a linear classifier of the logistic regression type has been fitted to the databases. It predicts the occurrence of both phenomena in terms of probability and therefore weights the uncertainty associated with the heterogeneity of the geological medium. This probabilistic treatment is the real added value of the thesis and the main advantage of the proposed solution over other empirical methodologies, and it makes the classifier a very attractive tool for geotechnical risk assessment and decision making.

As a practical validation exercise, the developed solution has been embedded in a cost-benefit model for optimising the design of the pillars of a "virtual" longwall mine. The ability of the classifier to quantify the probability of design failure, together with an appropriate quantification of the consequences of that failure, allows a risk law to be defined and incorporated into the balance of costs and benefits; by iteratively resizing the pillar system and the mine layout itself, the model maximises the economic result of the mining project under pre-established, acceptable safety conditions.
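A minimal sketch of the kind of classifier the thesis describes: a logistic regression fitted to case histories that returns a probability of coal pillar failure, which can then feed a risk term in a cost-benefit loop. The two features and all the numbers below are hypothetical placeholders, not the thesis database.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical case histories: [pillar width/height ratio, pillar stress / coal strength];
# label 1 = failed, 0 = stable. Illustrative values only.
X = np.array([[1.2, 0.90], [1.5, 0.80], [2.0, 0.60], [2.5, 0.50],
              [1.0, 1.10], [3.0, 0.40], [1.8, 0.70], [2.8, 0.45]])
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])

clf = LogisticRegression().fit(X, y)

# Probability of failure for a candidate design (w/h = 2.2, stress/strength = 0.55).
p_fail = clf.predict_proba([[2.2, 0.55]])[0, 1]
print(f"estimated probability of pillar failure: {p_fail:.2f}")

# In a cost-benefit model this probability would be multiplied by the cost of failure
# to give a risk term, and the pillar dimensions iterated to maximise net benefit.
```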

Relevance: 60.00%

Abstract:

This paper provides an overview of an ongoing research project, "A Polytechnical Bilingual Dictionary of Metaphors: Spanish-English/English-Spanish", carried out by the UPM consolidated research group "DISCYT" (Estudios Cognitivos del Discurso Científico-Técnico). A detailed explanation of the method adopted to identify key metaphors collected from the different subject areas is included. Drawing on recognised empirical methods (Pragglejaz 2007, Cameron 2007, Steen 2007), the examples have been examined according to the main tenets of conceptual metaphor and conceptual integration theory (Deignan 2005, Gibbs 2008, Lakoff 1993, Lakoff & Johnson 1999, Steen 2007, Fauconnier & Turner 2008). The forthcoming dictionary comprises metaphors from more than ten scientific and technical areas, such as Aeronautical engineering, Agronomy, Architecture, Biotechnology, Civil engineering, Geology and Mining, Mechanical engineering, Nanotechnology, Naval and Maritime engineering, Sports and Telecommunications. In this paper we focus on examples taken from civil engineering, materials engineering and naval engineering. Representative cases are analysed from several points of view (multimodal metaphor, linguistic information strategies and translation into the target language), highlighting cross-linguistic variations between Spanish and English.

Relevance: 60.00%

Abstract:

We define a capacity reserve model to dimension passenger car service facilities according to the demographic distribution of the area to be serviced, drawing on analogies with hospital emergency rooms. Service facilities are usually designed using empirical methods, but customers arrive under uncertain conditions not included in the original estimations, so there is a gap between customers' real demand and the service capacity. Our research establishes a valid methodology and addresses the absence of recent research and the lack of statistical techniques in the sector, integrating demand uncertainty into a single model built in stages that combines ARIMA forecasting, queuing theory and Monte Carlo simulation to optimise service capacity and occupancy, minimising the implicit cost of the capacity that must be reserved to serve unexpected customers. The model has proved to be a useful tool for optimal decision making under uncertainty: it predicts the cost implicit in the reserve capacity needed to serve unexpected demand and defines a set of new process indicators, such as capacity, occupancy and cost of capacity reserve, not studied before. These new indicators are intended to optimise the service operation and could be implemented in the information systems used by passenger car services.
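A minimal sketch, with hypothetical arrival data and service parameters, of how the three stages named above (ARIMA forecasting, queuing theory, Monte Carlo simulation) can be chained; it does not reproduce the paper's actual model or indicators.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)

# Stage 1 - ARIMA forecast of daily customer arrivals (hypothetical 200-day history).
history = 20 + 3 * rng.standard_normal(200)
lam_hat = float(ARIMA(history, order=(1, 0, 0)).fit().forecast(steps=1)[0])

# Stage 2 - queuing view: offered load, assuming each service bay completes
# mu = 4 jobs per day on average.
mu = 4.0
base_bays = int(np.ceil(lam_hat / mu))      # capacity sized on the forecast alone

# Stage 3 - Monte Carlo: how often does random daily demand exceed that capacity,
# i.e. how much demand would need reserve capacity?
n_days, overflow = 10_000, 0
for _ in range(n_days):
    arrivals = rng.poisson(lam_hat)
    workload = rng.exponential(1.0 / mu, arrivals).sum()   # bay-days of work that day
    if workload > base_bays:
        overflow += 1

print(f"forecast arrivals/day = {lam_hat:.1f}, base bays = {base_bays}, "
      f"P(capacity exceeded) = {overflow / n_days:.3f}")
```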

Relevance: 60.00%

Abstract:

Biofilm treatments were among the first biological treatments applied to wastewater. They offer important advantages over suspended-growth (activated sludge) cultures; however, biofilm processes are difficult to control and equally difficult to model. The theoretical basis of biofilm behaviour began to be developed mainly from the 1980s onwards. Because the process is complex and governed by equations that are hard to solve, these conceptualisations were regarded for years as mathematical exercises rather than as design and simulation tools. Reactor designs were based on pilot-plant experience or on the empirical behaviour of particular plants; the design equations were regressions of empirical data, and their applicability was restricted to the particular conditions of the plant the data came from, so a wide variety of empirical equations existed for each type of reactor. During the 1990s, medical research turned its attention to the formation and elimination of biofilms. Thanks to new laboratory techniques that made it possible to study the interior of biofilms, and to the growing power of computers, the simulation of biofilm behaviour gained new momentum in that decade.

The development of a particular type of biofilm, granular sludge, grown under aerobic conditions while simultaneously carrying out nutrient removal, has recently been patented; this patent has received numerous international awards, such as the European Invention Award (2012), and many companies and organisations are now seeking a second patent. In 1995 it was discovered that certain bacteria can carry out a new nitrogen removal process called Anammox, which has the potential to offer major improvements in removal performance and energy consumption. Over the last ten years a series of so-called "innovative" nutrient removal treatments based on Anammox have been developed. Since Anammox bacteria cannot be established in conventional activated sludge, biofilm cultures are normally used, and research has focused on developing these innovative processes in biofilm systems, in particular granular sludge, MBBR and IFAS, in order to establish the conditions under which they can operate stably. A key issue in the development of these processes is the correct selection of the environmental and operating conditions under which some bacteria displace others within the biofilm. The design of plants based on biofilm cultures running conventional processes has normally relied on empirical and semi-empirical methods; however, the advanced selection criteria applied in the innovative nitrogen removal treatments, together with the complexity of substrate transport and biomass growth within biofilms, make modelling tools necessary in order to reach conclusions that are not self-evident.
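As a minimal illustration of why such modelling tools are needed, the sketch below solves the classic steady-state one-dimensional substrate balance in a biofilm (Fickian diffusion with Monod consumption) by finite differences; the diffusivity, kinetics and film thickness are generic textbook-style assumptions, not values from this work.

```python
import numpy as np

# Steady state in a flat biofilm:  D * d2S/dz2 = q_max * X * S / (Ks + S)
# S fixed at the liquid interface, zero flux at the substratum. Assumed parameters:
D = 2.0e-4       # m2/d  effective diffusivity
q_max = 4.0      # gS/gX/d  maximum specific uptake rate
X = 20_000.0     # gX/m3  biomass density in the film
Ks = 4.0         # gS/m3  half-saturation constant
S_bulk = 10.0    # gS/m3  bulk substrate concentration
L = 400e-6       # m  biofilm thickness

n = 101
z = np.linspace(0.0, L, n)          # z = 0 at substratum, z = L at liquid interface
dz = z[1] - z[0]
S = np.full(n, S_bulk)

for _ in range(200):                # Picard iteration: linearise Monod with previous iterate
    rate = q_max * X / (Ks + S) / D
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0], A[0, 1] = -1.0, 1.0    # zero-flux boundary at the substratum
    A[-1, -1] = 1.0
    b[-1] = S_bulk                  # fixed concentration at the interface
    for i in range(1, n - 1):
        A[i, i - 1] = A[i, i + 1] = 1.0 / dz**2
        A[i, i] = -2.0 / dz**2 - rate[i]
    S_new = np.linalg.solve(A, b)
    if np.max(np.abs(S_new - S)) < 1e-9:
        S = S_new
        break
    S = S_new

print(f"substrate at the substratum: {S[0]:.2f} g/m3 "
      f"(well below {S_bulk} g/m3 means only partial penetration)")
```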

Relevance: 60.00%

Abstract:

The objective of this research is to define a capacity reserve model, by analogy with hospital emergency rooms (ER), that can be implemented in the service sector, with a specific focus on passenger car service workshops. The research incorporates demand uncertainty into a single model, built in stages, that combines ARIMA forecasting, queuing theory and Monte Carlo simulation to define the concepts of service capacity and occupancy, which are then used to minimise the implicit cost of the capacity that must be reserved to serve customers who arrive without an appointment. Car companies usually estimate the capacity of their service facilities empirically, but customers arrive under uncertain conditions that those estimates do not take into account, so there is a gap between what customers actually demand and the capacity the service offers. Our approach defines a valid methodology for the automotive sector that addresses the general absence of recent research and the usual lack of statistical techniques in the sector. The equivalence with hospital emergency-room management has been validated throughout the research, in which new key process indicators (KPIs) are defined. As hospitals do, we apply stochastic models to dimension service facilities according to the demographic distribution of their catchment area.

The final model integrates the prediction of the cost implicit in the capacity reserved to serve unexpected demand, and it has proved to be a very useful tool for optimal decision making under uncertainty. The stochastic model has been implemented in R, together with a Monte Carlo simulation code written in Matlab that can be integrated as an additional module into the information systems (DMS) currently used in the sector, so that the new process indicators defined in the model can be exploited. The main outputs of the model are new service indicators, such as capacity, occupancy and cost of capacity reserve, never before studied in the passenger car service industry, which are intended to manage the service operation.
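To complement the queuing sketch attached to the companion English abstract above, here is a minimal Monte Carlo sketch of the trade-off the model optimises: the cost of keeping reserve bays free versus the cost of turning away unexpected walk-in customers. All rates and costs are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

walkin_rate = 3.0               # mean unexpected walk-ins per day (assumed, Poisson)
jobs_per_bay = 4                # jobs one reserved bay can absorb per day (assumed)
cost_per_reserved_bay = 250.0   # daily cost of keeping a bay free (assumed)
cost_per_lost_customer = 400.0  # margin lost when a walk-in is turned away (assumed)

def expected_daily_cost(reserved_bays, n_days=50_000):
    """Mean daily cost of a given reserve-capacity policy, estimated by simulation."""
    capacity = reserved_bays * jobs_per_bay
    walkins = rng.poisson(walkin_rate, n_days)
    lost = np.maximum(walkins - capacity, 0)
    return reserved_bays * cost_per_reserved_bay + lost.mean() * cost_per_lost_customer

costs = {r: expected_daily_cost(r) for r in range(0, 4)}
best = min(costs, key=costs.get)
print({r: round(c, 1) for r, c in costs.items()})
print(f"cheapest policy: reserve {best} bay(s)")
```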

Relevance: 60.00%

Abstract:

In light of the clinical importance of satisfaction in psychological assessments, the lack of research related to consultative assessment, and the absence of empirical methods to measure the satisfaction of referring professionals in consultative assessments, the Consultative Assessment Questionnaire (C-AQ) was developed. The measure assesses the satisfaction of the referring professional with a consultative assessment and was created using a rational-empirical approach. Using confirmed-perspective content analysis, five initial scales were developed. The measure has many important research and clinical applications related to measuring the effectiveness of consultative assessments. The C-AQ will be further refined, and validity data will be collected in a second phase of this project.

Relevance: 60.00%

Abstract:

Science has developed from rational-empirical methods and, as a consequence, represents existing phenomena without understanding their root causes. The question currently at stake is the sense of being: put simply, while dogmatic religion leads to misinterpretations, the empirical sciences contain exact rational representations of phenomena, and science has thus been able to free itself from dogmatic religion. The project for the sciences of being seeks to return reality to its essential foundations; under the framework of systems theory, this necessarily involves a search for the meaning of Reality.

Relevance: 60.00%

Abstract:

Master's thesis in Geologia do Ambiente, Riscos Geológicos e Ordenamento do Território, Universidade de Lisboa, Faculdade de Ciências, 2016

Relevance: 60.00%

Abstract:

The reliability of measurement refers to unsystematic error in observed responses. Investigating the prevalence of random error in stated willingness-to-pay (WTP) estimates is important for understanding why tests of validity in contingent valuation (CV) can fail. However, published reliability studies have tended to adopt empirical methods that have practical and conceptual limitations when applied to WTP responses. This contention is supported by a review of contingent valuation reliability studies that demonstrates important limitations of existing approaches to WTP reliability. It is argued that empirical assessments of the reliability of contingent values may be better dealt with by using multiple indicators to measure the latent WTP distribution. This latent-variable approach is demonstrated with data obtained from a WTP study of stormwater pollution abatement. Attitude variables were employed to assess the reliability of open-ended WTP (with benchmarked payment cards) for stormwater pollution abatement. The results indicated that participants' decisions to pay were reliably measured, but not the magnitude of their WTP bids. This finding highlights the need to better discern what is actually being measured in WTP studies.
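A minimal sketch of the multiple-indicator idea: several attitude items assumed to reflect the same latent construct are checked for internal consistency, using Cronbach's alpha as the simplest reliability index; the items and responses are hypothetical, and the paper's actual latent-variable model is not reproduced here.

```python
import numpy as np

def cronbach_alpha(items):
    """Internal-consistency reliability; items has shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point attitude items about stormwater pollution (rows = respondents).
responses = np.array([
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 5, 4, 5],
    [3, 3, 3, 2],
    [4, 4, 5, 4],
    [1, 2, 1, 2],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")
```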

Relevance: 60.00%

Abstract:

Aluminium (Al) is known to be neurotoxic and has been associated with the aetiology of Alzheimer's disease. To date, only desferrioxamine (DFO), a trihydroxamic acid siderophore, has been used clinically for the removal of Al from the body. However, this drug is expensive, orally inactive and associated with many side effects. These studies employed a theoretical approach, using quantum mechanics (QM) via semi-empirical molecular orbital (MO) calculations, and a practical approach using U87-MG glioblastoma cells as a model for evaluating the influence of potential chelators on the passage of aluminium into cells. Preliminary studies involving the Cambridge Structural Database (CSD) showed that Al prefers binding to bidentate ligands in a 3:1 manner, with oxygen as the exclusive donor atom. Statistically significant differences in M-O bond lengths compared with other trivalent metal ions, such as Fe3+, were established and used as an acceptance criterion for subsequent MO calculations. Of the semi-empirical methods parameterised for Al, the PM3 Hamiltonian was found to give the most reliable final optimised geometries of simple 3:1 Al complexes; consequently, the PM3 Hamiltonian was used to evaluate the heats of formation (Hf) of 3:1 complexes with more complicated ligands. No correlation exists between published stability constants and the individual parameters calculated via PM3 optimisations, although investigation of the dicarboxylates reveals a correlation of 0.961, showing promise for affinity prediction of closely related ligands. A simple and inexpensive morin spectrofluorescence assay has been developed and optimised, producing results comparable to atomic absorption spectroscopy for the quantitative analysis of Al. This assay was used in subsequent in vitro models, initially on E. coli, which indicated that Al inhibits the antimicrobial action of ciprofloxacin, a potent quinolone antibiotic. Ensuing studies using the second model, U87-MG cells, investigated the influence of chelators on the transmembrane transport of Al, identifying 1,2-diethylhydroxypyridin-4-one as the ligand with the greatest potential for chelating Al in the clinical situation. In conclusion, these studies have explored semi-empirical MO Hamiltonians and an in vitro U87-MG cell line as possible methods for predicting effective chelators of Al.
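A small sketch of the kind of correlation check mentioned for the dicarboxylates, using entirely hypothetical stability constants and PM3 heats of formation for a set of closely related ligands; none of these numbers come from the study.

```python
import numpy as np

# Hypothetical closely related dicarboxylate ligands:
# published log(stability constant) vs PM3 heat of formation of the 3:1 Al complex.
log_beta = np.array([13.0, 14.2, 15.1, 16.0, 16.8])          # illustrative
hf_pm3 = np.array([-310.0, -322.0, -331.0, -340.0, -348.0])  # kcal/mol, illustrative

r = np.corrcoef(log_beta, hf_pm3)[0, 1]
slope, intercept = np.polyfit(hf_pm3, log_beta, 1)
print(f"Pearson r = {r:.3f}")
print(f"predicted log(beta) at Hf = -335 kcal/mol: {slope * -335 + intercept:.1f}")
```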

Relevance: 60.00%

Abstract:

Over the past decade, several experienced Operational Researchers have advanced the view that the theoretical aspects of model building have raced ahead of the ability of people to use them. Consequently, the impact of Operational Research on commercial organisations and the public sector is limited, and many systems fail to achieve their anticipated benefits in full. The primary objective of this study is to examine a complex interactive Stock Control system, and identify the reasons for the differences between the theoretical expectations and the operational performance. The methodology used is to hypothesise all the possible factors which could cause a divergence between theory and practice, and to evaluate numerically the effect each of these factors has on two main control indices - Service Level and Average Stock Value. Both analytical and empirical methods are used, and simulation is employed extensively. The factors are divided into two main categories for analysis - theoretical imperfections in the model, and the usage of the system by Buyers. No evidence could be found in the literature of any previous attempts to place the differences between theory and practice in a system in quantitative perspective nor, more specifically, to study the effects of Buyer/computer interaction in a Stock Control system. The study reveals that, in general, the human factors influencing performance are of a much higher order of magnitude than the theoretical factors, thus providing objective evidence to support the original premise. The most important finding is that, by judicious intervention into an automatic stock control algorithm, it is possible for Buyers to produce results which not only attain but surpass the algorithmic predictions. However, the complexity and behavioural recalcitrance of these systems are such that an innately numerate, enquiring type of Buyer needs to be inducted to realise the performance potential of the overall man/computer system.
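A minimal simulation sketch, under assumed demand and cost parameters, of the two control indices named above (Service Level and Average Stock Value) for a simple continuous-review reorder-point policy; it illustrates the kind of simulation used in the study rather than reproducing it.

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_policy(reorder_point, order_qty, n_days=20_000,
                    mean_demand=10.0, lead_time=5, unit_cost=25.0):
    """Simulate an (s, Q) stock control policy; all parameters are hypothetical.

    Returns (service level, average stock value)."""
    stock = reorder_point + order_qty
    on_order = []                      # outstanding orders as (arrival_day, qty)
    served = demanded = 0.0
    stock_levels = []
    for day in range(n_days):
        stock += sum(q for d, q in on_order if d == day)    # receive due orders
        on_order = [(d, q) for d, q in on_order if d != day]
        demand = rng.poisson(mean_demand)                   # today's demand (lost if unmet)
        sold = min(demand, stock)
        stock -= sold
        served += sold
        demanded += demand
        position = stock + sum(q for _, q in on_order)      # inventory position
        if position <= reorder_point:
            on_order.append((day + lead_time, order_qty))
        stock_levels.append(stock)
    return served / demanded, np.mean(stock_levels) * unit_cost

sl, asv = simulate_policy(reorder_point=60, order_qty=100)
print(f"Service Level = {sl:.3f}, Average Stock Value = {asv:.0f}")
```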

Relevance: 60.00%

Abstract:

Historically, recombinant membrane protein production has been a major challenge, meaning that far fewer membrane protein structures have been published than structures of soluble proteins. However, there has been a recent, almost exponential increase in the number of membrane protein structures being deposited in the Protein Data Bank. This suggests that empirical methods are now available that can ensure the required protein supply for these difficult targets. This review focuses on methods that are available for protein production in yeast, which is an important source of recombinant eukaryotic membrane proteins. We provide an overview of approaches to optimise the expression plasmid, host cell and culture conditions, as well as the extraction and purification of functional protein for crystallisation trials in preparation for structural studies.