949 results for analytical approaches


Relevance:

40.00%

Publisher:

Abstract:

Regulatory authorities in many countries are introducing performance-based regulation in order to maintain an acceptable balance between appropriate customer service quality and costs. These regulations impose penalties (and, in some cases, rewards) that introduce a component of financial risk for an electric power utility, due to the uncertainty associated with preserving a specific level of system reliability. In Brazil, for instance, one of the reliability indices receiving special attention from the utilities is the maximum continuous interruption duration (MCID) per customer. This parameter is responsible for the majority of penalties in many electric distribution utilities. This paper describes analytical and Monte Carlo simulation approaches to evaluate probability distributions of interruption duration indices. More emphasis is given to the development of an analytical method to assess the probability distribution associated with the MCID parameter and the corresponding penalties. Case studies on a simple distribution network and on a real Brazilian distribution system are presented and discussed.
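The analytical derivation is not given in the abstract, but the Monte Carlo side of the comparison can be illustrated with a minimal sketch. The failure rate, repair-time distribution and penalty threshold below are hypothetical placeholders, not values from the paper:

```python
import random

def simulate_mcid(failure_rate=4.0, mean_repair_h=2.5, limit_h=8.0,
                  n_years=20000, seed=42):
    """Monte Carlo estimate of the distribution of the maximum
    continuous interruption duration (MCID) per customer per year.

    Illustrative model only: interruptions arrive as a Poisson process
    and each repair time is exponential; the paper's network model is
    more detailed.
    """
    rng = random.Random(seed)
    mcids = []
    for _ in range(n_years):
        # Draw interruption instants within one year via exponential
        # inter-arrival times, and an exponential duration for each.
        t, durations = 0.0, [0.0]
        while True:
            t += rng.expovariate(failure_rate)
            if t > 1.0:
                break
            durations.append(rng.expovariate(1.0 / mean_repair_h))
        mcids.append(max(durations))
    # Probability of incurring a penalty (MCID exceeds the limit)
    prob_penalty = sum(d > limit_h for d in mcids) / n_years
    return mcids, prob_penalty

mcids, p = simulate_mcid()
print(f"P(MCID > limit) ~ {p:.3f}")
```

From the sampled MCID values, the full empirical probability distribution (not just the penalty probability) can be tabulated, which is what the paper's analytical method targets directly.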

Relevance:

40.00%

Publisher:

Abstract:

This study evaluated alternatives for producing erosion susceptibility maps, considering different weight combinations for an environment's attributes according to four different points of view. The attributes considered were landform, steepness, soils, rocks, and land occupation. The alternatives considered were: (1) equal weights, the more traditional approach; (2) different weights, according to a previous study in the area; (3) different weights, based on other works in the literature; and (4) different weights, based on the analytic hierarchy process (AHP). The study area was the Prosa Basin, located in Campo Grande, Mato Grosso do Sul State, Brazil. The results showed that the assessed alternatives can be used together or in different stages of studies aimed at urban planning and decision-making on the interventions to be applied.
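Alternative (4), the analytic hierarchy process, derives attribute weights from a pairwise comparison matrix. A minimal sketch with a hypothetical judgement matrix on Saaty's 1-9 scale (the study's actual judgements are not given in the abstract):

```python
import math

attributes = ["landform", "steepness", "soils", "rocks", "land occupation"]

# Hypothetical pairwise comparisons: A[i][j] = relative importance of
# attribute i over attribute j (reciprocal matrix).
A = [
    [1,   1/2, 2,   3,   1],
    [2,   1,   3,   4,   2],
    [1/2, 1/3, 1,   2,   1/2],
    [1/3, 1/4, 1/2, 1,   1/3],
    [1,   1/2, 2,   3,   1],
]

# Approximate the principal eigenvector by the row geometric mean,
# then normalise so the weights sum to one.
geo = [math.prod(row) ** (1 / len(row)) for row in A]
total = sum(geo)
weights = {a: g / total for a, g in zip(attributes, geo)}

for a, w in weights.items():
    print(f"{a:>16}: {w:.3f}")
```

The resulting weights would then multiply the rasterised attribute layers to produce the susceptibility map.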

Relevance:

40.00%

Publisher:

Abstract:

Lipids are important components that contribute significantly to the nutritional and technological quality of foods, but they are the least stable macro-components in foods owing to their high susceptibility to oxidation. When rancidity takes place, it makes food unhealthy and unacceptable to consumers. Thus, the presence of antioxidants, naturally present in or added to foods, is required to extend the shelf life of foods. Moreover, antioxidants such as phenolic compounds play an important role in human health, enhancing the functionality of foods. The aim of this PhD project was to study lipid quality and lipid oxidation in different vegetable foods, focusing on analytical and technological aspects in order to determine the effects of lipid composition and of the addition of bioactive compounds (phenolic compounds, omega-3 fatty acids and dietary fibre) on their shelf life. In addition, the bioavailability and antioxidant effects of phenolic compounds were evaluated in humans and animals, respectively, after consumption of vegetable foods. The first section of the work focused on the impact of lipid quality on the technological behaviour of vegetable foods. To this end, cocoa butters with different melting points were evaluated by chromatographic techniques (GC, TLC); the sample with the higher melting point showed fatty acid, triglyceride and 2-monoglyceride compositions and an FT-IR profile different from genuine cocoa butter, indicating the addition of a foreign (lauric) fat not permitted by law. Regarding the lipid quality of other vegetable foods, an accelerated shelf life test (OXITEST®) was used to evaluate the oxidative stability of tarallini snacks made with different lipid matrices (sunflower oil, extra-virgin olive oil, and a blend of extra-virgin olive oil and lard). The results showed a good ability of OXITEST® to discriminate between degrees of lipid unsaturation and between different cooking times, without any fat extraction from the samples.
In the second section, the role of bioactive compounds in the shelf life of cereal-based foods was studied in different bakery products by GC, spectrophotometric methods and capillary electrophoresis. The relationships between phenolic compounds, added with the flour, and the lipid oxidation of tarallini and frollini were examined. Both products showed an increase in lipid oxidation during storage, and the antioxidant effects on lipid oxidation were not as expected. Furthermore, the influence of enrichment in polyunsaturated fatty acids on the lipid oxidation of pasta was evaluated. The results proved that LC n-3 PUFA were not significantly implicated in the onset of oxidation in spaghetti stored under daylight or under accelerated oxidation in a laboratory heater. The importance of phenolic compounds as antioxidants in humans and rats was also studied, by HPLC/MS, in the final section. For this purpose, the excretion of apigenin and apigenin glycosides was investigated in the urine of six women over 24 hours. After a single dose of steamed artichokes, both aglycone and glucuronide metabolites were recovered in the 24 h urine. Moreover, the effects of whole grain durum wheat bread and whole grain Kamut® khorasan bread were evaluated in rats. Both cereals were good sources of antioxidants, but Kamut®-fed animals had a better response to stress than durum wheat-fed animals, especially when a sourdough bread was supplied.

Relevance:

40.00%

Publisher:

Abstract:

In the past decade, several major food safety crises originated from problems with feed. Consequently, there is an urgent need for early detection of fraudulent adulteration and contamination in the feed chain. Strategies are presented for two specific cases, viz. adulteration of (i) soybean meal with melamine and other types of adulterants/contaminants and (ii) vegetable oils with mineral oil, transformer oil or other oils. These strategies comprise screening at the feed mill or port of entry with non-destructive spectroscopic methods (NIRS and Raman), followed by post-screening and confirmation in the laboratory with MS-based methods. The spectroscopic techniques are suitable for on-site and on-line applications. Currently they are suited to detecting fraudulent adulteration at relatively high levels, but not to detecting low-level contamination. The potential use of the strategies for non-targeted analysis is demonstrated.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this paper is threefold. First, we propose a systemic view of communication based on autopoiesis, the theory of living systems formulated by Maturana and Varela (1980, 1987). Second, we show the links between the underpinning assumptions of autopoiesis and the sociolinguistic approaches of Halliday (1978), Fairclough (1989, 1992, 1995) and Lemke (1994, 1995). Third, we propose a theoretical and analytical synthesis of autopoiesis and sociolinguistics for the study of organisational communication. In proposing a systemic theory for organisational communication, we argue that traditional approaches to communication, information, and the role of language in human organisations have, to date, been placed under teleological constraints by an inverted focus on organisational purpose (the generally perceived role of an organisation within society) that obscures, rather than clarifies, the role of language within human organisations. We argue that human social systems are, according to the criteria defined by Maturana and Varela, third-order, non-organismic living systems constituted in language. We further propose that sociolinguistics provides an appropriate analytical tool which is both compatible and penetrating in synthesis with the systemic framework provided by an autopoietic understanding of social organisation.

Relevance:

30.00%

Publisher:

Abstract:

Many industrial processes and systems can be modelled mathematically by a set of partial differential equations (PDEs). Finding a solution to such a PDE model is essential for system design, simulation, and process control purposes. However, major difficulties appear when solving PDEs with singularities. Traditional numerical methods, such as finite difference, finite element, and polynomial-based orthogonal collocation, not only have limitations in fully capturing the process dynamics but also demand enormous computational power, due to the large number of elements or mesh points required to accommodate sharp variations. To tackle this challenging problem, wavelet-based approaches and high-resolution methods have recently been developed, with successful applications to a fixed-bed adsorption column model. Our investigation has shown that recent advances in wavelet-based approaches and high-resolution methods have the potential to be adopted for solving more complicated dynamic system models. This chapter highlights the successful application of these new methods to solving complex models of simulated moving bed (SMB) chromatographic processes. An SMB process is a distributed parameter system and can be mathematically described by a set of partial/ordinary differential equations and algebraic equations. These equations are highly coupled, exhibit wave propagation with steep fronts, and require significant numerical effort to solve. To demonstrate the numerical computing power of the wavelet-based approaches and high-resolution methods, a single-column chromatographic process modelled by a linear Transport-Dispersive-Equilibrium model is investigated first. Numerical solutions from the upwind-1 finite difference, wavelet-collocation, and high-resolution methods are evaluated by quantitative comparison with the analytical solution for a range of Peclet numbers.
After that, the advantages of the wavelet-based approaches and high-resolution methods are further demonstrated through applications to a dynamic SMB model for an enantiomer separation process. This research has revealed that, for a PDE system with a low Peclet number, all existing numerical methods work well, but the upwind finite difference method consumes the most time for the same degree of accuracy in the numerical solution. The high-resolution method provides an accurate numerical solution for a PDE system with a medium Peclet number. The wavelet-collocation method is capable of capturing steep changes in the solution, and thus can be used for solving PDE models with high singularity. For the complex SMB system models under consideration, both the wavelet-based approaches and the high-resolution methods are good candidates in terms of computational demand and prediction accuracy on the steep front. The high-resolution methods showed better stability in achieving steady state in the specific case studied in this chapter.
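The upwind-1 discretisation used as the baseline above can be sketched for the advective-dispersive part of the single-column model. The equilibrium/adsorption terms of the full Transport-Dispersive-Equilibrium model are omitted here, and all numerical parameters are illustrative:

```python
def solve_column(n=200, L=1.0, u=1.0, Pe=100.0, t_end=0.5, c_in=1.0):
    """Explicit solution of c_t = -u*c_x + D*c_xx on a column of length L,
    with a first-order upwind difference for advection and a central
    difference for dispersion. D is set from the Peclet number Pe = u*L/D."""
    D = u * L / Pe
    dx = L / n
    # Conservative time step for the explicit scheme (CFL-style bound)
    dt = 0.4 * min(dx / u, dx * dx / (2 * D))
    c = [0.0] * (n + 1)
    t = 0.0
    while t < t_end:
        c_new = c[:]
        c_new[0] = c_in                                    # Dirichlet inlet
        for i in range(1, n):
            adv = -u * (c[i] - c[i - 1]) / dx              # upwind-1
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
            c_new[i] = c[i] + dt * (adv + disp)
        c_new[n] = c_new[n - 1]                            # zero-gradient outlet
        c, t = c_new, t + dt
    return c

profile = solve_column()
```

Sweeping `Pe` in this sketch reproduces the qualitative behaviour discussed above: the upwind scheme's numerical diffusion (of order u*dx/2) smears the concentration front, which is why fine meshes, and hence long run times, are needed at high Peclet numbers.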

Relevance:

30.00%

Publisher:

Abstract:

Three recent papers published in Chemical Engineering Journal studied the solution of a model of diffusion and nonlinear reaction using three different methods. Two of these studies obtained series solutions using specialized mathematical methods, known as the Adomian decomposition method and the homotopy analysis method. Subsequently it was shown that the solution of the same model could be written in terms of a transcendental function, Gauss's hypergeometric function. These three previous approaches focused on one particular reactive transport model, which ignored advective transport and considered one specific reaction term only. Here we generalize these previous approaches and develop an exact analytical solution for a general class of steady-state reactive transport models that incorporate (i) combined advective and diffusive transport and (ii) any sufficiently differentiable reaction term R(C). The new solution is a convergent Maclaurin series, which can be derived without any specialized mathematical methods, nor does it necessarily involve the computation of any transcendental function. Applying the Maclaurin series solution to certain case studies shows that the previously published solutions are particular cases of the more general solution outlined here. We also demonstrate the accuracy of the Maclaurin series solution by comparison with numerical solutions for particular cases.
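The flavour of the series construction can be sketched for one concrete choice of reaction term. Here R(C) = k*C**2 is an assumption chosen for illustration, and D, v and k stand for a diffusivity, a velocity and a rate constant, not values from the paper:

```python
import math

def maclaurin_coeffs(c0, c1, D, v, k, n_terms=30):
    """Maclaurin coefficients a_n of C(x) satisfying D*C'' = v*C' + R(C)
    with the illustrative choice R(C) = k*C**2 and C(0) = c0, C'(0) = c1.
    Matching powers of x gives the recurrence
        D*(n+1)*(n+2)*a_{n+2} = v*(n+1)*a_{n+1} + k*(C^2)_n,
    where (C^2)_n is a Cauchy product of the known lower coefficients."""
    a = [float(c0), float(c1)]
    for n in range(n_terms - 2):
        conv = sum(a[j] * a[n - j] for j in range(n + 1))  # (C^2)_n
        a.append((v * (n + 1) * a[n + 1] + k * conv) / (D * (n + 1) * (n + 2)))
    return a

def eval_series(a, x):
    return sum(c * x ** i for i, c in enumerate(a))

# Sanity check against the reaction-free case k = 0, where
# D*C'' = v*C' with C(0) = 0, C'(0) = 1 has solution (D/v)*(exp(v*x/D) - 1).
a = maclaurin_coeffs(0.0, 1.0, D=1.0, v=1.0, k=0.0)
print(eval_series(a, 0.5), math.exp(0.5) - 1.0)
```

Only elementary arithmetic on the coefficients is involved, illustrating the claim that no specialized method or transcendental function is required.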

Relevance:

30.00%

Publisher:

Abstract:

Our paper presents the results of a meta-analytical review of street-level drug law enforcement. We conducted a series of meta-analyses to compare and contrast the effectiveness of four types of drug law enforcement approaches: community-wide policing; geographically focused problem-oriented/partnership approaches; hotspots policing; and standard, unfocused law enforcement efforts. We examined the relative impact of these different crime control tactics on street-level drug problems as well as on associated problems such as property crime, disorder and violent crime. The results of the meta-analyses, together with examination of forest plots, reveal that problem-oriented policing and geographically focused interventions involving cooperative partnerships between police and third parties tend to be more effective at controlling drug problems than community-wide policing efforts that are unfocused and spread across a community. Moreover, both geographically focused and community-wide drug law enforcement interventions that leverage partnerships are more effective at dealing with drug problems than traditional, law-enforcement-only interventions. Our results suggest that the key to successful drug law enforcement lies in the capacity of the police to forge productive partnerships with third parties, rather than simply increasing police presence or intervention (e.g., arrests) at drug hotspots.

Relevance:

30.00%

Publisher:

Abstract:

The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for extracting information from multiple biological variables is the so-called "omics" disciplines of the biological sciences. Such variability is uncovered by the implementation of multivariable data mining techniques, which fall under two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods are limited by an n≪p constraint and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach, in which not only is the importance of each individual protein explicit, but the proteins are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
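The pre-treatment step can be illustrated with a small, self-contained sketch: PCA via the n x n Gram-matrix trick reduces p = 500 hypothetical protein variables to a single component before a simple classifier is applied. A nearest-centroid rule stands in for the SVM/PLS classifiers compared in the study, and all data are synthetic:

```python
import random

def first_pc_scores(X, iters=200, seed=1):
    """Score of each sample on the first principal component, by power
    iteration on the n x n Gram matrix of the centred data. This
    side-steps the p x p covariance matrix, which matters when p >> n."""
    n = len(X)
    means = [sum(col) / n for col in zip(*X)]
    Xc = [[v - m for v, m in zip(row, means)] for row in X]
    G = [[sum(a * b for a, b in zip(r1, r2)) for r2 in Xc] for r1 in Xc]
    rng = random.Random(seed)
    u = [rng.gauss(0.0, 1.0) for _ in range(n)]   # random start vector
    for _ in range(iters):
        u = [sum(G[i][j] * u[j] for j in range(n)) for i in range(n)]
        norm = sum(x * x for x in u) ** 0.5
        u = [x / norm for x in u]
    return u  # proportional to the PC1 score of each sample

# Synthetic "proteomic" data: n = 20 samples, p = 500 variables, with
# two classes separated along the first 50 variables (all hypothetical).
rng = random.Random(0)
X, y = [], []
for i in range(20):
    label = i % 2
    shift = 2.0 if label == 1 else -2.0
    X.append([shift + rng.gauss(0, 1) if j < 50 else rng.gauss(0, 1)
              for j in range(500)])
    y.append(label)

scores = first_pc_scores(X)
# Nearest-centroid rule on the 1-D scores.
c0 = sum(s for s, l in zip(scores, y) if l == 0) / y.count(0)
c1 = sum(s for s, l in zip(scores, y) if l == 1) / y.count(1)
pred = [0 if abs(s - c0) <= abs(s - c1) else 1 for s in scores]
accuracy = sum(int(a == b) for a, b in zip(pred, y)) / len(y)
print(f"training accuracy after PCA reduction: {accuracy:.2f}")
```

Because the class signal dominates the noise in this synthetic example, a single component separates the groups; the interpretability question raised above is about how such components map back onto individual proteins.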

Relevance:

30.00%

Publisher:

Abstract:

"The German word for experience - Erlebnis - the experience of the life, to live through something - underpins this book: making visible scholarly opportunities for richer and deeper contextualizations and examinations of the lived-world experiences of people in everyday contexts as they be, do and become." (Ross Todd, Preface). Information experience is a burgeoning area of research and still unfolding as an explicit research and practice theme. This book is therefore very timely as it distils the reflections of researchers and practitioners from various disciplines, with interests ranging across information, knowledge, user experience, design and education. They cast a fresh analytical eye on information experience, whilst approaching the idea from diverse perspectives. Information Experience brings together current thinking about the idea of information experience to help form discourse around it and establish a conceptual foundation for taking the idea forward. It therefore "provides a number of theoretical lenses for examining people's information worlds in more holistic and dynamic ways." (Todd, Preface)

Relevance:

30.00%

Publisher:

Abstract:

Existing planning theories tend to be limited in their analytical scope and often fail to account for the impact of the many interactions between the multitude of stakeholders involved in strategic planning processes. Although many theorists rejected structural–functional approaches from the 1970s onwards, this article argues that many structural–functional concepts remain relevant and useful to planning practitioners. In fact, structural–functional approaches are highly useful and practical when used as a foundation for systemic analysis of real-world, multi-layered, complex planning systems to support evidence-based governance reform. Such approaches provide a logical and systematic basis for the analysis of the wider governance of strategic planning systems, one that is grounded in systems theory and complementary to existing theories of complexity and planning. While we do not propose their use as a grand theory of planning, this article discusses how structural–functional concepts and approaches might be applied to underpin a practical analysis of the complex decision-making arrangements that drive planning practice, and to provide the evidence needed to target reform of poorly performing arrangements.

Relevance:

30.00%

Publisher:

Abstract:

Fluid bed granulation is a key pharmaceutical process which improves many of the powder properties required for tablet compression. Dry mixing, wetting and drying phases are included in the fluid bed granulation process. Granules of high quality can be obtained by understanding and controlling the critical process parameters through timely measurements. Process analytical technologies (PAT) include physical process measurements and particle size data of a fluid bed granulator, analysed in an integrated manner. Recent regulatory guidelines strongly encourage the pharmaceutical industry to apply scientific and risk management approaches to the development of a product and its manufacturing process. The aim of this study was to utilise PAT tools to increase the process understanding of fluid bed granulation and drying. Inlet air humidity levels and granulation liquid feed affect powder moisture during fluid bed granulation, and moisture influences many process, granule and tablet qualities. The approach in this thesis was to identify sources of variation that are mainly related to moisture, with the aim of determining correlations and relationships and of utilising the PAT and design space concepts for fluid bed granulation and drying. Monitoring the material behaviour in a fluidised bed has traditionally relied on the observational ability and experience of an operator. There has been a lack of good criteria for characterising material behaviour during the spraying and drying phases, even though the entire performance of the process and the end product quality depend on it. The granules were produced in an instrumented bench-scale Glatt WSG5 fluid bed granulator. The effects of inlet air humidity and granulation liquid feed on the temperature measurements at different locations of the fluid bed granulator system were determined. This revealed dynamic changes in the measurements and enabled identification of the most suitable sites for process control.
The moisture originating from the granulation liquid and the inlet air affected the temperature of the mass and the pressure difference over the granules. Moreover, the effects of inlet air humidity and granulation liquid feed rate on granule size were evaluated, and compensatory techniques were used to optimise particle size. Various end-point indication techniques for drying were compared. The ∆T method, which is based on thermodynamic principles, eliminated the effects of humidity variations and resulted in the most precise estimation of the drying end-point. The influence of fluidisation behaviour on drying end-point detection was also determined: the feasibility of the ∆T method, and thus the similarity of end-point moisture contents, was found to depend on the variation in fluidisation between manufacturing batches. A novel parameter describing the behaviour of material in a fluid bed was developed; it is calculated from the flow rate of the process air and the turbine fan speed, and it was compared to the fluidisation behaviour and the particle size results. Design space process trajectories for smooth fluidisation were determined on the basis of the fluidisation parameters. With this design space it is possible to avoid both excessive fluidisation and improper fluidisation with bed collapse. Furthermore, various process phenomena and failure modes were observed with the in-line particle size analyser; both rapid increases and decreases in granule size could be monitored in a timely manner. The fluidisation parameter and the pressure difference over the filters were also found to reflect particle size once the granules had been formed. The various physical parameters evaluated in this thesis give valuable information on fluid bed process performance and increase process understanding.

Relevance:

30.00%

Publisher:

Abstract:

Diffusion in a composite slab consisting of a large number of layers provides an ideal prototype problem for developing and analysing two-scale modelling approaches for heterogeneous media. Numerous analytical techniques have been proposed for solving the transient diffusion equation in a one-dimensional composite slab consisting of an arbitrary number of layers. Most of these approaches, however, require the solution of a complex transcendental equation arising from a matrix determinant for the eigenvalues that is difficult to solve numerically for a large number of layers. To overcome this issue, in this paper, we present a semi-analytical method based on the Laplace transform and an orthogonal eigenfunction expansion. The proposed approach uses eigenvalues local to each layer that can be obtained either explicitly, or by solving simple transcendental equations. The semi-analytical solution is applicable to both perfect and imperfect contact at the interfaces between adjacent layers and either Dirichlet, Neumann or Robin boundary conditions at the ends of the slab. The solution approach is verified for several test cases and is shown to work well for a large number of layers. The work is concluded with an application to macroscopic modelling where the solution of a fine-scale multilayered medium consisting of two hundred layers is compared against an “up-scaled” variant of the same problem involving only ten layers.
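For a single homogeneous layer with Dirichlet ends, the eigenfunction expansion that the semi-analytical method builds on reduces to the classical series below. The diffusivity, length and initial condition are illustrative, and the multilayer machinery (layer-local eigenvalues, interface conditions) is not reproduced here:

```python
import math

def slab_series(x, t, D=1.0, L=1.0, n_terms=200):
    """Classical eigenfunction-series solution of u_t = D*u_xx on (0, L)
    with u(0, t) = u(L, t) = 0 and u(x, 0) = 1; only odd modes survive
    for this initial condition. The paper's semi-analytical method
    generalises this single-layer expansion to many layers, using
    eigenvalues local to each layer."""
    total = 0.0
    for n in range(1, 2 * n_terms, 2):                 # odd modes only
        lam = n * math.pi / L
        total += (4.0 / (n * math.pi)) * math.sin(lam * x) \
                 * math.exp(-D * lam ** 2 * t)
    return total

print(slab_series(0.5, 0.05))  # mid-slab value at t = 0.05
```

In the single-layer case the eigenvalues n*pi/L are explicit; the multilayer difficulty described above is precisely that the corresponding global eigenvalues satisfy a transcendental determinant condition, which the layer-local formulation avoids.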

Relevance:

30.00%

Publisher:

Abstract:

The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols are increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes: directly, by absorbing and reflecting radiation, and indirectly, by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols continues to be a highly uncertain science. Better understanding of the effects of aerosols requires more information on aerosol chemistry. Before aerosol chemical composition can be determined by the various available analytical techniques, aerosol particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques harbor drawbacks. In this study, novel methodologies were developed for the sampling and determination of the chemical composition of atmospheric aerosols. In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted into and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, the PILS was modified and the sampling procedure optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds upstream of the PILS. Mixtures of water with alcohols were introduced to increase the solubility of the aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS). The laboriousness of LLE followed by GC-MS analysis next prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples, two compound groups thought to be key for aerosol growth.
Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed-mode anion exchange (MAX) sorbents were tested as extraction materials. MAX proved to be efficient for acids, but no tested material offered sufficient adsorption of aldehydes. Thus, PILS samples were extracted only with MAX, to guarantee good results for the organic acids determined by high-performance liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively easy, and here the on-line coupling of PILS with HPLC-MS through an SPE trap produced some interesting data on relevant acids in atmospheric aerosol samples. A completely different approach to aerosol sampling, namely differential mobility analyzer (DMA)-assisted filter sampling, was also employed in this study, to provide information on the size-dependent chemical composition of aerosols and understanding of the processes driving aerosol growth from nano-sized clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and the aerosols were collected on Teflon or quartz fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA every other 15 minutes. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-sized aerosols and always needs to be taken into account. A further aim of this study was to determine the oxidation products of β-caryophyllene (the major sesquiterpene in boreal forests) in aerosol particles. Since reference compounds are needed for verification of the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid.
All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably to aerosol growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.