979 results for Best-improvement
Abstract:
Treatment of carotid artery stenosis decreases the long-term risk of stroke and may enhance cerebral blood flow. It is therefore expected to have the potential to prevent cognitive decline or even improve cognition over the long term. However, the intervention itself can cause peri-interventional cerebral infarcts, possibly resulting in a decline of cognitive performance, at least for a short time. We investigated the long-term effects of three treatment methods on cognition and emotional state one year after intervention. In this prospective observational cohort study, 58 patients with extracranial carotid artery stenosis (≥70%) underwent magnetic resonance imaging and assessment of cognition, mood and motor speed before carotid endarterectomy (n = 20), carotid stenting (n = 10) or best medical treatment (n = 28) (i.e., time-point 1 [TP1]), and at one-year follow-up (TP2). Gain scores, reflecting cognitive change after treatment, were calculated for each measure as (TP2 - TP1)/TP1. Independent of the treatment type, significant improvement in frontal lobe functions, visual memory and motor speed was found. Performance level, motor speed and mood at TP1 were negatively correlated with gain scores, with greater improvement in patients with low performance before treatment. Active therapy, whether conservative or interventional, produces significant improvement of frontal lobe functions and memory in patients with carotid artery disease, independent of treatment type. This effect was particularly pronounced in patients with low cognitive performance prior to treatment.
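The gain-score definition above is simple enough to state as a one-line computation. A minimal Python sketch; the test names and scores are illustrative, not data from the study:

    # Gain score (TP2 - TP1) / TP1 for each cognitive measure; values are illustrative.
    def gain_score(tp1, tp2):
        return (tp2 - tp1) / tp1

    baseline = {"visual_memory": 18.0, "frontal_score": 24.0}   # TP1
    follow_up = {"visual_memory": 22.0, "frontal_score": 27.0}  # TP2
    gains = {test: gain_score(baseline[test], follow_up[test]) for test in baseline}
    print(gains)  # positive values indicate higher scores at one-year follow-up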
Abstract:
OBJECTIVES Improvement of skin fibrosis is part of the natural course of diffuse cutaneous systemic sclerosis (dcSSc). Recognising those patients most likely to improve could help tailor clinical management and enrich cohorts for clinical trials. In this study, we aimed to identify predictors of improvement of skin fibrosis in patients with dcSSc. METHODS We performed a longitudinal analysis of the European Scleroderma Trials And Research (EUSTAR) registry including patients with dcSSc, fulfilling American College of Rheumatology criteria, with a baseline modified Rodnan skin score (mRSS) ≥7 and a follow-up mRSS at 12±2 months. The primary outcome was skin improvement (decrease in mRSS of >5 points and ≥25%) at 1-year follow-up. A respective increase in mRSS was considered progression. Candidate predictors of skin improvement were selected by expert opinion, and logistic regression with bootstrap validation was applied. RESULTS Of the 919 patients included, 218 (24%) improved and 95 (10%) progressed. Eleven candidate predictors of skin improvement were analysed. The final model identified high baseline mRSS and absence of tendon friction rubs as independent predictors of skin improvement. The baseline mRSS was the strongest predictor of skin improvement, independent of disease duration. An upper threshold between 18 and 25 performed best in enriching for progressors over regressors. CONCLUSIONS Patients with advanced skin fibrosis at baseline and absence of tendon friction rubs are more likely to regress in the next year than patients with milder skin fibrosis. These evidence-based data can be implemented in clinical trial design to minimise the inclusion of patients who would regress under standard of care.
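The outcome definition above (decrease in mRSS of >5 points and ≥25% for improvement, the mirrored increase for progression) can be written as a short classification rule. A minimal sketch with illustrative names and example values:

    # Classify the 1-year skin change from baseline and follow-up mRSS.
    def classify_skin_change(mrss_baseline, mrss_followup):
        delta = mrss_followup - mrss_baseline
        if delta < -5 and -delta / mrss_baseline >= 0.25:
            return "improved"    # decrease of >5 points and >=25%
        if delta > 5 and delta / mrss_baseline >= 0.25:
            return "progressed"  # mirrored increase
        return "stable"

    print(classify_skin_change(28, 18))  # -> "improved"
    print(classify_skin_change(10, 13))  # -> "stable"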
Abstract:
Since its introduction into the United States in the 1980s, crack cocaine has been a harsh epidemic that has taken its toll on countless people. This highly addictive, cheap and readily available drug of abuse has permeated many demographic sectors, mostly in low-income, less-educated, and urban communities. This epidemic of crack cocaine use in inner-city areas across the United States has been described as an expression of economic marginality and "social suffering" coupled with the local and international forces of drug market economies (Agar 2003). As crack cocaine is a derivative of cocaine, it utilizes the psychoactive component of the drug but delivers it in a much stronger, quicker, and more addictive fashion. This, coupled with its ready availability and low price, has allowed users not only to become very addicted very quickly, but also to be subject to the stringent and sometimes unequal or inconsistent punishments for possession and distribution of crack cocaine. There are many public health and social ramifications of the abuse of crack cocaine, and these epidemics appear to target low-income and minority groups. Public health issues relating to the physical, mental, and economic strain will be addressed, as well as the direct and indirect effects of the punishments that result from the disparity in penalties for cocaine and crack cocaine possession and distribution. Three new policies have recently been introduced into the United States Congress that actively address the disparity in sentencing for drug and criminal activities. They are: (1) the Powder-Crack Cocaine Penalty Equalization Act of 2009 (HR 18, 111th Cong. 2009); (2) the Drug Sentencing Reform and Cocaine Kingpin Trafficking Act of 2009 (HR 265, 111th Cong. 2009); and (3) the Justice Integrity Act of 2009 (111th Cong. 2009). Although they have only been introduced, if passed they have the potential not only to eliminate the crack-cocaine sentencing disparity, but also to enact laws that help those affected by this epidemic. The final and overarching goal of this paper is to analyze and ultimately choose the ideal policy that would not only eliminate the cocaine and crack disparity regardless of current or future state statutes, but also provide the best method of rehabilitation, prevention, and justice.
Abstract:
Public health departments play an important role in promoting and preserving the health of communities. The lack of a system to ensure their quality and accountability led to the development of a national voluntary accreditation program by the Public Health Accreditation Board (PHAB). The concept that accreditation will lead to quality improvement in public health, which will ultimately lead to healthy communities, seems intuitive but lacks a robust body of evidence. A critical review of the literature was conducted to explore whether accreditation can lead to quality improvement in public health. The articles were selected from publicly available databases using a specific set of criteria for inclusion, exclusion, and appraisal. To understand the relationship between accreditation and quality improvement, the potential strengths and limitations of the accreditation process were evaluated. Recommendations for best practices are suggested so that public health accreditation can yield maximum benefits. A logic model framework to help depict the impact of accreditation on various levels of public health outcomes is also discussed in this thesis. The literature review shows that existing accreditation programs in other industries provide limited but encouraging evidence that accreditation will improve quality and strengthen the delivery of public health services. While progress in introducing accreditation in public health can be informed by other accredited industries, the public health field has its own set of challenges. Providing incentives, creating financing strategies, and having strong leadership will allow greater access to accreditation by all public health departments. The suggested recommendations include that continuous evaluation, public participation, a systems approach, a clear vision, and dynamic standards should become hallmarks of the accreditation process. Understanding the link between accreditation, quality improvement, and health outcomes will influence the successful adoption and implementation of the public health accreditation program. This review of the literature suggests that accreditation is an important step in improving the quality of public health departments and, ultimately, the health of communities. However, accreditation should be considered part of an integrated system of tools and approaches to improve public health practice. Hence, it is a means to an end - not an end unto itself.
Abstract:
This study aims to analyze households' attitudes toward flood risk in Cotonou in order to identify whether or not they are willing to leave the flood-prone zones. In addition, attitudes toward the management of waste and dirty water are analyzed. The data used in this study were obtained from two sources: a survey implemented during March 2011 on one hundred and fifty randomly selected households living in flood-prone areas of Cotonou, and the Benin Living Standard Survey of 2006 (the part relative to Cotonou, covering 1,586 households). Climate data were also used. A multinomial probability model is used for the econometric analysis of the attitude toward flood risk, while the attitudes toward the management of waste and dirty water are analyzed with a simple logit model. The results show that 55.3% of households agreed to go elsewhere, while 44.7% refused [we are better off here (10.67%); proximity of their activities (19.33%); the best way is to build infrastructure that will protect against floods, and the family house (14.67%)]. The authorities have to rethink an alternative policy to what they have been doing, such as building socio-economic houses outside Cotonou and proposing them to the households living in the areas prone to inundation. Moreover, access to formal education has to be reinforced.
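The econometric setup described above (a multinomial model for the attitude toward flood risk and a simple logit for the waste-management attitude) could be sketched with statsmodels; the data file and covariate names below are assumptions, not variables reported in the study:

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("cotonou_households.csv")          # hypothetical survey extract
    X = sm.add_constant(df[["income", "education", "years_in_area"]])

    # Multinomial model: 0 = willing to leave, 1 = refuses, 2 = conditional (illustrative coding)
    mnl = sm.MNLogit(df["flood_attitude"], X).fit()
    print(mnl.summary())

    # Simple logit for the attitude toward waste and dirty-water management
    logit = sm.Logit(df["manages_waste"], X).fit()
    print(logit.summary())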
Abstract:
One important task in the design of an antenna is to carry out an analysis to find out the characteristics of the antenna that best fulfill the specifications fixed by the application. After that, a prototype is manufactured, and the next stage in the design process is to check whether the radiation pattern differs from the designed one. Besides the radiation pattern, other radiation parameters such as directivity, gain, impedance, beamwidth, efficiency and polarization must also be evaluated. For this purpose, accurate antenna measurement techniques are needed in order to know exactly the actual electromagnetic behavior of the antenna under test. For this reason, most measurements are performed in anechoic chambers: closed, normally shielded areas covered with electromagnetic absorbing material, which simulate free-space propagation conditions. Moreover, these facilities can be used independently of the weather conditions and allow measurements free from interference. Despite all the advantages of anechoic chambers, the results obtained from both far-field and near-field measurements are inevitably affected by errors. Thus, the main objective of this Thesis is to propose algorithms that improve the quality of the results obtained in antenna measurements by using post-processing techniques, without requiring additional measurements. First, a thorough review of the state of the art has been made in order to give a general view of the possibilities to characterize or to reduce the effects of errors in antenna measurements. Later, new methods to reduce the unwanted effects of four of the most common errors in antenna measurements are described and validated both theoretically and numerically. The basis of all of them is the same: to perform a transformation from the measurement surface to another domain where there is enough information to easily remove the contribution of the errors. The four errors analyzed are noise, reflections, truncation errors and leakage, and the tools used to suppress them are mainly source reconstruction techniques, spatial and modal filtering, and iterative algorithms to extrapolate functions. Therefore, the main idea of all the methods is to modify the classical near-field-to-far-field transformations by including additional steps with which the errors can be greatly suppressed. Moreover, the proposed methods are not computationally complex and, because they are applied in post-processing, additional measurements are not required. Noise is the most widely studied error in this Thesis, and a total of three alternatives are proposed to filter out an important noise contribution before obtaining the far-field pattern. The first one is based on modal filtering. The second alternative uses a source reconstruction technique to obtain the extreme near field, where a spatial filtering can be applied. The last one back-propagates the measured field to a surface with the same geometry as the measurement surface but closer to the AUT, and then also applies a spatial filtering. All the alternatives are analyzed in the three most common near-field systems, including comprehensive noise statistical analyses in order to deduce the signal-to-noise ratio improvement achieved in each case.
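The first (modal filtering) alternative can be illustrated for the planar case with a plane-wave-spectrum sketch: the measured field is expanded into its modes with an FFT, and the modes outside the visible region, which mostly carry noise, are discarded. This is only a minimal sketch assuming a regularly sampled planar scan; the function and parameter names are illustrative, not taken from the Thesis:

    import numpy as np

    def modal_filter(E, dx, dy, f, guard=1.0):
        """Filter a complex planar near-field sample E (2-D array) in the modal domain."""
        c = 299792458.0
        k0 = 2 * np.pi * f / c                          # free-space wavenumber
        kx = 2 * np.pi * np.fft.fftfreq(E.shape[1], d=dx)
        ky = 2 * np.pi * np.fft.fftfreq(E.shape[0], d=dy)
        KX, KY = np.meshgrid(kx, ky)

        spectrum = np.fft.fft2(E)                       # plane-wave (modal) expansion
        visible = KX**2 + KY**2 <= (guard * k0)**2      # keep propagating modes only
        spectrum[~visible] = 0.0                        # noise outside the visible region is removed

        return np.fft.ifft2(spectrum)                   # filtered field on the same plane

    E_noisy = np.random.randn(64, 64) + 1j * np.random.randn(64, 64)  # synthetic data
    E_filtered = modal_filter(E_noisy, dx=0.01, dy=0.01, f=10e9)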
The method to suppress reflections in antenna measurements is also based on a source reconstruction technique, and the main idea is to reconstruct the field over a surface larger than the antenna aperture in order to be able to identify and later suppress the virtual sources related to the reflected waves. The truncation error present in the results obtained from planar, cylindrical and partial spherical near-field measurements is the third error analyzed in this Thesis. The method to reduce this error is based on an iterative algorithm to extrapolate the reliable region of the far-field pattern from the knowledge of the field distribution on the AUT plane. The proper termination point of this iterative algorithm, as well as other critical aspects of the method, are also studied. The last part of this work is dedicated to the detection and suppression of the two most common leakage sources in antenna measurements. A first method estimates the leakage bias constant added by the receiver's quadrature detector to every near-field sample and then suppresses its effect on the far-field pattern. The second method can be divided into two parts: the first one finds the position of the faulty component that radiates or receives unwanted radiation, making its identification within the measurement environment and its later substitution easier; the second part is able to computationally remove the leakage effect without requiring the substitution of the faulty component.
Abstract:
The objective of this study was to propose a multi-criteria optimization and decision-making technique to solve food engineering problems. This technique was demonstrated using experimental data obtained on osmotic dehydration of carrot cubes in a sodium chloride solution. The Aggregating Functions Approach, the Adaptive Random Search Algorithm, and the Penalty Functions Approach were used in this study to compute the initial set of non-dominated or Pareto-optimal solutions. Multiple non-linear regression analysis was performed on a set of experimental data in order to obtain particular multi-objective functions (responses), namely water loss, solute gain, rehydration ratio, three different colour criteria of the rehydrated product, and sensory evaluation (organoleptic quality). Two multi-criteria decision-making approaches, the Analytic Hierarchy Process (AHP) and the Tabular Method (TM), were used simultaneously to choose the best alternative among the set of non-dominated solutions. The multi-criteria optimization and decision-making technique proposed in this study can facilitate the assessment of criteria weights, giving rise to a fairer, more consistent, and more adequate final compromise solution or food process. This technique can be useful to food scientists in research and education, as well as to engineers involved in the improvement of a variety of food engineering processes.
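The non-dominated (Pareto-optimal) filtering step that precedes the AHP/TM selection can be sketched in a few lines. A minimal sketch assuming every response has been oriented so that larger values are better; the candidate values are illustrative, not the experimental data:

    import numpy as np

    def pareto_front(scores):
        """Return indices of non-dominated rows of a (candidates x objectives) array."""
        scores = np.asarray(scores, dtype=float)
        keep = np.ones(len(scores), dtype=bool)
        for i, s in enumerate(scores):
            # i is dominated if some candidate is >= on every objective and > on at least one
            dominated_by = np.all(scores >= s, axis=1) & np.any(scores > s, axis=1)
            if dominated_by.any():
                keep[i] = False
        return np.where(keep)[0]

    # Columns (illustrative): solute gain, rehydration ratio, sensory score
    candidates = [[3.1, 4.0, 6.5],
                  [2.8, 4.4, 7.0],
                  [3.0, 3.9, 6.4]]
    print(pareto_front(candidates))  # -> [0 1]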
Abstract:
The different theoretical models related to storm wave characterization focus on determining the significant wave height of the storm peak, the mean period and, usually assuming a triangular storm shape, the storm duration. In some cases, the main direction is also considered. Nevertheless, the definition of the whole storm history, including the variation of the main random variables during the storm cycle, is not taken into consideration. The representativeness of the proposed storm models, analysed in a recent study using an empirical, time-dependent maximum energy flux function, shows that the behaviour of the different storm models is extremely dependent on the climatic characteristics of the project area. Moreover, there are no theoretical models able to adequately reproduce the storm history evolution of sea states characterized by important swell components. To overcome this shortcoming, several theoretical storm shapes are investigated taking into consideration the bases of the three best theoretical storm models: the Equivalent Magnitude Storm (EMS), the Equivalent Number of Waves Storm (ENWS) and the Equivalent Duration Storm (EDS) models. To analyse the representativeness of the new storm shape, the aforementioned maximum energy flux formulation and a wave overtopping discharge structure function are used. With the empirical energy flux formulation, the assessment of the different approaches focuses on the progressive hydraulic stability loss of the main armour layer caused by real and theoretical storms. For the overtopping structure equation, the total volume of discharge is considered. In all cases, the results obtained highlight the greater representativeness of the triangular EMS model for sea waves and the trapezoidal (nonparallel sides) EMS model for waves with a higher degree of wave development. Taking into account the increase in offshore and shallow-water wind turbines, maritime transport and deep vertical breakwaters, the maximum wave height of the whole storm history and that corresponding to each sea state belonging to its cycle's evolution is also considered. The procedure considers the information usually available for extreme wave characterization. Extrapolations of the maximum wave height of the selected storms have also been considered. The 4th-order statistics of the sea states belonging to the real and theoretical storms have been estimated to complete the statistical analysis of individual wave height.
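The triangular EMS shape highlighted above can be built from a measured storm record once an equivalence criterion is chosen. The sketch below assumes equivalence of the peak significant wave height and of the storm "magnitude" (area of the Hs history above the storm threshold); the synthetic record and names are illustrative, not the formulation used in the study:

    import numpy as np

    def triangular_ems(t, hs, hs_threshold):
        """Peak Hs and base duration of a triangular storm with the same magnitude."""
        above = hs > hs_threshold
        dt = t[1] - t[0]                                        # uniform time step (hours)
        magnitude = np.sum(hs[above] - hs_threshold) * dt       # area above the threshold
        hs_peak = hs[above].max()
        duration = 2.0 * magnitude / (hs_peak - hs_threshold)   # triangle area = 0.5 * base * height
        return hs_peak, duration

    t = np.arange(0.0, 48.0, 1.0)                       # hours
    hs = 2.0 + 4.0 * np.exp(-((t - 20.0) / 8.0) ** 2)   # synthetic Hs record (m)
    print(triangular_ems(t, hs, hs_threshold=3.0))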
Abstract:
The Daphniphyllum alkaloids are a group of highly complex polycyclic alkaloids. Examination of the structures of several members of this family of natural products led to a hypothesis about their mode of biosynthesis (depicted in Scheme SI). Based on this hypothetical biosynthetic pathway, a laboratory synthesis was designed that incorporated as a key transformation the novel one-pot conversion of dialdehyde 24 to pentacyclic unsaturated amine 25. This process turned out to be an exceptionally efficient way to construct the pentacyclic nucleus of the Daphniphyllum alkaloids. However, a purely fortuitous discovery, resulting from the accidental use of methylamine rather than ammonia, led to a great improvement in the synthesis and suggests an even more attractive possible biosynthesis.
Abstract:
Objective: To investigate whether the recently developed (statistically derived) "ASsessment in Ankylosing Spondylitis Working Group" improvement criteria (ASAS-IC) for ankylosing spondylitis (AS) reflect clinically relevant improvement according to the opinion of an expert panel. Methods: The ASAS-IC consist of four domains: physical function, spinal pain, patient global assessment, and inflammation. Scores on these four domains for 55 patients with AS, who had participated in a non-steroidal anti-inflammatory drug efficacy trial, were presented to an international expert panel (consisting of patients with AS and members of the ASAS Working Group) in a three-round Delphi exercise. The number of (non-)responders according to the ASAS-IC was compared with the final consensus of the experts. The most important domains in the opinion of the experts were identified, and also selected with discriminant analysis. A number of provisional criteria sets that best represented the consensus of the experts were defined. Using other datasets, these clinically derived criteria sets as well as the statistically derived ASAS-IC were then tested for discriminative properties and for agreement with the end-of-trial efficacy assessment by patient and doctor. Results: Forty experts completed the three Delphi rounds. The experts considered twice as many patients to be responders as the ASAS-IC (42 v 21). Overall agreement between experts and ASAS-IC was 62%. Spinal pain was considered the most important domain by most experts and was also selected as such by discriminant analysis. Provisional criteria sets with an agreement of ≥80% compared with the consensus of the experts showed high placebo response rates (27-42%), in contrast with the ASAS-IC with a predefined placebo response rate of 25%. All criteria sets and the ASAS-IC discriminated well between active and placebo treatment (χ² = 36-45; p < 0.001). Compared with the end-of-trial efficacy assessment, the provisional criteria sets showed an agreement of 71-82%, sensitivity of 67-83%, and specificity of 81-88%. The ASAS-IC showed an agreement of 70%, sensitivity of 62%, and specificity of 89%. Conclusion: The ASAS-IC are strict in defining response, are highly specific, and consequently show lower sensitivity than the clinically derived criteria sets. However, those patients who are considered responders by applying the ASAS-IC are acknowledged as such by the expert panel as well as by patients' and doctors' judgments, and are therefore likely to be true responders.
Abstract:
This sustained longitudinal study, carried out in a single local authority, investigates the implementation of a Total Quality Management (TQM) philosophy in professional local government services. At the start of this research, a large majority of what had been written about TQM was polemical and based on limited empirical evidence. This thesis seeks to provide a significant and important piece of work, making a considerable contribution to the current state of knowledge in this area. Teams from four professional services within a single local authority participated in this research, providing the main evidence on how the quality management agenda in a local authority can be successfully implemented. To supplement this rich source of data, various other sources and methods of data collection were used: 1) interviews were carried out with senior managers from within the authority; 2) customer focus groups and questionnaires were used; 3) interviews were carried out with other organisations, all of which were proponents of a TQM philosophy. A number of tools were developed to assist in gathering data: 1) the CSFs (critical success factors) benchmarking tool; 2) the Five Stages of Quality Improvement Model. A Best Practice Quality Improvement Model, arising from an analysis of the literature and the researcher's own experience, is proposed and tested. From the results a number of significant conclusions have been drawn relating to: 1) triggers for change; 2) resistance of local government professionals to change; 3) critical success factors and barriers to quality improvement in professional local government services; 4) the problems associated with participant observation and other methodological issues.
Abstract:
This chapter provides information on the use of the Performance Improvement Management software (PIM-DEA). This advanced DEA software enables users to make the best possible analysis of their data, using the latest theoretical developments in Data Envelopment Analysis (DEA). PIM-DEA gives users full capacity to assess efficiency and productivity, set targets, identify benchmarks, and much more, allowing them to truly manage the performance of organizational units. PIM-DEA is easy to use and powerful, and it has an extensive range of the most up-to-date DEA models which can handle large sets of data.
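The kind of model such software automates can be sketched directly as a small linear program. Below is a minimal input-oriented CCR (constant returns to scale) efficiency score using scipy; the data are illustrative and the formulation is the standard envelopment form, not PIM-DEA's own code:

    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, o):
        """Input-oriented CCR efficiency of unit o. X: (units x inputs), Y: (units x outputs)."""
        n = X.shape[0]
        c = np.r_[1.0, np.zeros(n)]                            # minimise theta
        A_in = np.hstack([-X[[o]].T, X.T])                     # sum_j l_j x_ij <= theta * x_io
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])   # sum_j l_j y_rj >= y_ro
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                      bounds=[(None, None)] + [(0.0, None)] * n,
                      method="highs")
        return res.fun                                         # efficiency score in (0, 1]

    X = np.array([[20.0, 300.0], [30.0, 200.0], [40.0, 100.0]])  # two inputs per unit
    Y = np.array([[100.0], [80.0], [95.0]])                      # one output per unit
    print([round(ccr_efficiency(X, Y, o), 3) for o in range(len(X))])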
Abstract:
The Mara River in East Africa is currently experiencing poor water quality and increased fluctuations in seasonal flow. This study investigated technically effective and economically viable Best Management Practices (BMPs) for adoption in the Mara River Basin of Kenya that can stop further water resources degradation. A survey of 155 farmers was conducted in the upper catchment of the Kenyan side of the river basin. Farmers provided their assessment of the BMPs that would best suit their farms in terms of water quality improvement, economic feasibility, and technical suitability. Cost data on different practices were collected from farmers and from published literature. The results indicated that erosion control structures and runoff management practices were the most suitable for adoption. The study estimated the total area that would have to be improved to restore water quality and reduce further water resources degradation. Farmers were found to incur losses from adopting new practices and would therefore require monetary support.
Abstract:
Software engineering best practices can significantly improve software development. However, the implementation of best practices requires skilled professionals, financial investment, and technical support to facilitate implementation and achieve the respective improvement. In this paper we propose a protocol for designing techniques to implement software engineering best practices. The protocol includes the identification and selection of the process to improve, the study of standards and models, and the identification of the best practices associated with the process and of the possible implementation techniques. In addition, technical design activities are defined in order to create or adapt techniques for implementing best practices for software development.
Abstract:
The aim of this work was to improve the slide culture technique for use in assessing yeast viability under different physiological conditions. Initially, the optimal conditions for slide culture of a laboratory strain (BY4741) and an industrial strain (NCYC 1214) of the yeast Saccharomyces cerevisiae were established. The best protocol was obtained using: YEPD agar with a thickness of about 2 mm; 20 μL of a suspension of 1 × 10^5 cells/mL for strain BY4741 or 5 × 10^4 cells/mL for strain NCYC 1214; a humidifying chamber with 100 μL of deionised water; and an incubation time of 24 h at 25 °C. To facilitate counting of the microcolonies, a dye (calcofluor white, CFW) was added to the YEPD agar medium. Preliminary assays in liquid YEPD containing different concentrations of CFW showed that the dye, up to 5.0 μg/L, does not inhibit yeast growth. A CFW concentration of 2.5 μg/L stained the yeast cell walls without producing cells with altered morphology, and this was the CFW concentration selected for the subsequent studies. The slide culture technique, with or without CFW, was applied to assess the viability of healthy cells (cells in the exponential growth phase), ethanol-stressed cells [cells exposed to 20% (v/v) ethanol at 25 °C for 2 h], and aged cells (cells incubated in water at 25 °C for 48 h) of the laboratory strain. The percentage of viable cells was not significantly different between the two techniques (with or without CFW) after a 24-hour incubation. Finally, the slide culture technique containing CFW was compared with two techniques commonly used in the brewing industry: short fermentation and determination of the percentage of budding cells. The results obtained with the developed slide culture technique follow a pattern similar to those obtained in the short fermentation assays and in the determination of the percentage of budding cells. The results suggest that the slide culture technique combined with CFW appears to be an easy, rapid (within 24 h), and reproducible alternative to the conventional method (plating technique) for assessing yeast cell viability. Additional work should be carried out to validate the method with industrial strains.