992 results for analytical hierarchy processing


Relevance: 40.00%

Abstract:

In order to improve and continuously develop the quality of pharmaceutical products, the process analytical technology (PAT) framework has been adopted by the US Food and Drug Administration. One of the aims of PAT is to identify critical process parameters and their effect on the quality of the final product. Real-time analysis of the process data enables better control of the processes to obtain a high-quality product. The main purpose of this work was to monitor crucial pharmaceutical unit operations (from blending to coating) and to examine the effect of processing on solid-state transformations and physical properties. The tools used were near-infrared (NIR) and Raman spectroscopy combined with multivariate data analysis, as well as X-ray powder diffraction (XRPD) and terahertz pulsed imaging (TPI). To detect process-induced transformations in active pharmaceutical ingredients (APIs), samples were taken after blending, granulation, extrusion, spheronisation, and drying. These samples were monitored by XRPD, Raman, and NIR spectroscopy, showing hydrate formation in the case of theophylline and nitrofurantoin. For erythromycin dihydrate, formation of the isomorphic dehydrate was critical; thus, the main focus was on the drying process. NIR spectroscopy was applied in-line during a fluid-bed drying process. Multivariate data analysis (principal component analysis) enabled detection of dehydrate formation at temperatures above 45°C. Furthermore, a small-scale rotating plate device was tested to provide an insight into film coating. The process was monitored using NIR spectroscopy. A calibration model, using partial least squares regression, was set up and applied to data obtained by in-line NIR measurements of a coating drum process. The predicted coating thickness agreed with the measured coating thickness. For investigating the quality of film coatings, TPI was used to create a 3-D image of a coated tablet. With this technique it was possible to determine coating layer thickness, distribution, reproducibility, and uniformity. In addition, it was possible to localise defects in either the coating or the tablet. It can be concluded from this work that the applied techniques increased the understanding of the physico-chemical properties of drugs and drug products during and after processing. They additionally provided useful information to improve and verify the quality of pharmaceutical dosage forms.
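
To make the chemometrics workflow in this abstract concrete (principal component analysis to flag solid-state changes in NIR spectra, and partial least squares regression to calibrate coating thickness against in-line NIR measurements), here is a minimal sketch using scikit-learn; the array shapes, values and variable names are illustrative assumptions, not the study's data.

```python
# Minimal chemometrics sketch (hypothetical data shapes): PCA to flag
# process-induced solid-state changes in NIR spectra, and PLS regression
# to calibrate coating thickness against in-line NIR measurements.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
spectra = rng.normal(size=(120, 700))         # 120 NIR spectra x 700 wavelengths (placeholder)
thickness_um = rng.uniform(20, 80, size=120)  # reference coating thickness per spectrum (placeholder)

# PCA: score plots of the first components can reveal dehydrate formation
# as drying progresses (e.g. samples drifting along PC1 above ~45 °C).
scores = PCA(n_components=3).fit_transform(spectra)

# PLS calibration: predict coating thickness from the spectra.
pls = PLSRegression(n_components=5).fit(spectra, thickness_um)
predicted = pls.predict(spectra).ravel()
print(scores.shape, predicted[:5])
```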

Relevance: 40.00%

Abstract:

A conceptual information system consists of a database together with conceptual hierarchies. The management system TOSCANA visualizes arbitrary combinations of conceptual hierarchies by nested line diagrams and allows an on-line interaction with a database to analyze data conceptually. The paper describes the conception of conceptual information systems and discusses the use of their visualization techniques for on-line analytical processing (OLAP).
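
As a toy illustration of the idea of combining a conceptual hierarchy with database records for OLAP-style analysis (this is not the TOSCANA system itself), a short pandas sketch; the table and hierarchy below are invented.

```python
# Toy illustration (invented data): group database records through a
# conceptual hierarchy, as one would when rolling up in an OLAP view.
import pandas as pd

sales = pd.DataFrame({
    "product": ["apple", "pear", "screw", "nail"],
    "amount":  [10, 4, 25, 40],
})
# Conceptual hierarchy: product -> higher-level concept.
hierarchy = {"apple": "fruit", "pear": "fruit", "screw": "hardware", "nail": "hardware"}
sales["concept"] = sales["product"].map(hierarchy)

# Roll up along the hierarchy (coarser concept level); drilling down
# corresponds to grouping by the finer "product" column again.
print(sales.groupby("concept")["amount"].sum())
```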

Relevance: 40.00%

Abstract:

About ten years ago, triadic contexts were presented by Lehmann and Wille as an extension of Formal Concept Analysis. However, they have rarely been used up to now, which may be due to the rather complex structure of the resulting diagrams. In this paper, we go one step back and discuss how traditional line diagrams of standard (dyadic) concept lattices can be used for exploring and navigating triadic data. Our approach is inspired by the slice & dice paradigm of On-Line Analytical Processing (OLAP). We recall the basic ideas of OLAP and show how they may be transferred to triadic contexts. For modeling the navigation patterns a user might follow, we use the formalism of finite state machines. In order to present the benefits of our model, we show how it can be used for navigating the IT Baseline Protection Manual of the German Federal Office for Information Security.
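
A minimal sketch of the slice step that the abstract transfers from OLAP to triadic contexts: fixing one condition of a triadic context (objects × attributes × conditions, modelled here as a boolean 3-D array) yields an ordinary dyadic context that a standard line diagram could display. The context and its names are hypothetical.

```python
# Hypothetical triadic context: objects x attributes x conditions as a
# boolean 3-D array. Fixing one condition ("slicing") gives a dyadic
# context that standard (dyadic) concept lattices can visualise.
import numpy as np

objects    = ["server", "workstation", "laptop"]
attributes = ["encrypted", "patched"]
conditions = ["office", "mobile"]

triadic = np.zeros((3, 2, 2), dtype=bool)
triadic[0, :, 0] = [True, True]    # server is encrypted and patched under the "office" condition
triadic[2, 0, 1] = True            # laptop is encrypted under the "mobile" condition

def slice_condition(ctx, k):
    """Return the dyadic context obtained by fixing condition index k."""
    return ctx[:, :, k]

dyadic = slice_condition(triadic, conditions.index("office"))
for obj, row in zip(objects, dyadic):
    print(obj, [a for a, has in zip(attributes, row) if has])
```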

Relevance: 40.00%

Abstract:

Current commercial and academic OLAP tools do not process XML data that contains XLink. Aiming to overcome this issue, this paper proposes an analytical system built around LMDQL, an analytical query language. In addition, the XLDM metamodel is given to model cubes of XML documents with XLink and to deal with the syntactic, semantic and structural heterogeneities commonly found in XML documents. As current W3C query languages for navigating XML documents do not support XLink, XLPath is discussed in this article to provide navigation features for LMDQL query processing. A prototype system enabling the analytical processing of XML documents that use XLink is also detailed. This prototype includes a driver, named sql2xquery, which maps SQL queries into XQuery. To validate the proposed system, a case study and its performance evaluation are presented to analyze the impact of analytical processing on XML/XLink documents.
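
The abstract names a driver, sql2xquery, that maps SQL queries into XQuery but does not spell out its rules; the following is only a hypothetical, string-level sketch of what such a mapping might look like for one simple SELECT pattern.

```python
# Hypothetical sketch of an SQL-to-XQuery translation for one simple
# pattern (SELECT col FROM table WHERE col = 'value'). The real
# sql2xquery driver described in the paper is far more general.
import re

def sql_to_xquery(sql: str) -> str:
    m = re.match(
        r"SELECT\s+(\w+)\s+FROM\s+(\w+)\s+WHERE\s+(\w+)\s*=\s*'([^']*)'",
        sql, re.IGNORECASE)
    if not m:
        raise ValueError("unsupported SQL pattern")
    col, table, where_col, value = m.groups()
    return (f"for $r in doc('{table}.xml')//{table}/row "
            f"where $r/{where_col} = '{value}' "
            f"return $r/{col}")

print(sql_to_xquery("SELECT price FROM sales WHERE region = 'north'"))
```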

Relevance: 40.00%

Abstract:

Parallel processing is prevalent in many manufacturing and service systems. Many manufactured products are built and assembled from several components fabricated in parallel lines. An example of this manufacturing system configuration is observed at a manufacturing facility equipped to assemble and test web servers. Characteristics of a typical web server assembly line are: multiple products, job circulation, and parallel processing. The primary objective of this research was to develop analytical approximations to predict performance measures of manufacturing systems with job failures and parallel processing. The analytical formulations extend previous queueing models used in assembly manufacturing systems in that they can handle serial and various parallel processing configurations with multiple product classes, and job circulation due to random part failures. In addition, appropriate correction terms obtained via regression analysis were added to the approximations in order to minimize the error between the analytical approximation and the simulation models. Markovian and general-type manufacturing systems, with multiple product classes, job circulation due to failures, and fork-join systems to model parallel processing, were studied. In the Markovian and general cases, the approximations without correction terms performed quite well for one- and two-product problem instances. However, it was observed that the flow time error increased as the number of products and the net traffic intensity increased. Therefore, correction terms for single and fork-join stations were developed via regression analysis to deal with more than two products. The numerical comparisons showed that the approximations perform remarkably well when the correction factors were used. In general, the average flow time error was reduced from 38.19% to 5.59% in the Markovian case, and from 26.39% to 7.23% in the general case. All the equations stated in the analytical formulations were implemented as a set of MATLAB scripts. Using this set, operations managers of web server assembly lines, manufacturing systems, or other service systems with similar characteristics can estimate different system performance measures and make judicious decisions, especially in setting delivery due dates, capacity planning, and bottleneck mitigation, among others.
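
To make the idea of an analytical approximation plus a regression-fitted correction term concrete, here is a minimal sketch (not the study's actual formulation) that corrects an M/M/1 flow-time formula with a polynomial term in the traffic intensity fitted against simulation results; all numbers are placeholders.

```python
# Minimal sketch (not the study's actual model): approximate station flow
# time with an M/M/1 formula, then add a regression-based correction term
# fitted against simulation results to shrink the approximation error.
import numpy as np

def mm1_flow_time(arrival_rate, service_rate):
    """Mean time in system for an M/M/1 station (requires rho < 1)."""
    rho = arrival_rate / service_rate
    assert rho < 1, "station must be stable"
    return 1.0 / (service_rate - arrival_rate)

# Hypothetical calibration data: traffic intensities and the gap between
# simulated flow times and the analytical approximation.
rho_grid = np.array([0.3, 0.5, 0.7, 0.8, 0.9])
sim_gap  = np.array([0.02, 0.05, 0.12, 0.22, 0.45])   # placeholder values
coeffs   = np.polyfit(rho_grid, sim_gap, deg=2)        # correction term as a polynomial in rho

def corrected_flow_time(arrival_rate, service_rate):
    rho = arrival_rate / service_rate
    return mm1_flow_time(arrival_rate, service_rate) + np.polyval(coeffs, rho)

print(corrected_flow_time(arrival_rate=0.8, service_rate=1.0))
```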

Relevance: 30.00%

Abstract:

The diagnostics of mechanical components operating in transient conditions is still an open issue, in both research and industry. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics in any operating condition and noise level, is developed exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), together with the analytical approach of the Hilbert transform. The proposed tool is able to supply diagnostic information on the basis of experimental vibrations measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high-speed train traction equipment, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on spectral kurtosis or envelope analysis, which have until now been the benchmark for bearing diagnostics.
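
The EEMED tool itself is not reproduced here; as a rough sketch of two of its building blocks (empirical mode decomposition followed by a Hilbert-transform envelope, omitting the minimum entropy deconvolution step), assuming the PyEMD and SciPy packages and a synthetic signal:

```python
# Rough sketch of two EEMED building blocks (EMD + Hilbert envelope),
# omitting the minimum entropy deconvolution step. Assumes the PyEMD
# (pip install EMD-signal) and SciPy packages; the signal is synthetic.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 10_000                                   # sampling frequency [Hz]
t = np.arange(0, 1.0, 1 / fs)
fault_hits = (np.sin(2 * np.pi * 100 * t) > 0.999).astype(float)   # synthetic repetitive impacts
signal = fault_hits * np.sin(2 * np.pi * 3000 * t) + 0.2 * np.random.randn(t.size)

imfs = EMD()(signal)                          # empirical mode decomposition
envelope = np.abs(hilbert(imfs[0]))           # envelope of the first (highest-frequency) IMF
env_spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope frequency [Hz]:", freqs[np.argmax(env_spectrum)])
```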

Relevance: 30.00%

Abstract:

The signal processing techniques developed for the diagnostics of mechanical components operating in stationary conditions are often not applicable, or suffer a loss of effectiveness, when applied to signals measured in transient conditions. In this chapter, an original signal processing tool is developed exploiting data-adaptive techniques such as Empirical Mode Decomposition and Minimum Entropy Deconvolution, together with the analytical approach of the Hilbert transform. The tool has been developed to detect localized faults on bearings in the traction systems of high-speed trains, and it is more effective at detecting a fault in non-stationary conditions than signal processing tools based on envelope analysis or spectral kurtosis, which have until now been the benchmark for bearing diagnostics.

Relevance: 30.00%

Abstract:

The rectangular dielectric waveguide is the most commonly used structure in integrated optics, especially in semiconductor diode lasers. Demands for new applications such as high-speed data backplanes in integrated electronics, waveguide filters, optical multiplexers and optical switches are driving technology toward better materials and processing techniques for planar waveguide structures. The familiar infinite slab and circular waveguides are not practical for use on a substrate: the slab waveguide has no lateral confinement, and the circular fiber is not compatible with the planar processing technology used to make planar structures. The rectangular waveguide is the natural structure. In this review, we discuss several analytical methods for analyzing the mode structure of rectangular waveguides, beginning with a wave analysis based on the pioneering work of Marcatili. We study three basic techniques, with examples, to compare their performance: the analytical approach developed by Marcatili, the perturbation techniques that improve on the analytical solutions, and the effective index method.
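
A minimal sketch of the effective index method mentioned above: the symmetric-slab TE dispersion relation is solved twice, first across the core thickness and then across the width, using the intermediate effective index as the core index of the second slab. The refractive indices, dimensions and wavelength are illustrative assumptions, not values from the review.

```python
# Minimal sketch of the effective index method for a rectangular waveguide.
# Step 1: solve the symmetric-slab TE dispersion relation across the height.
# Step 2: solve it again across the width, using the step-1 effective index
# as the "core" index. All numerical values below are illustrative only.
import numpy as np
from scipy.optimize import brentq

def slab_neff(n_core, n_clad, thickness, wavelength):
    """Effective index of the fundamental TE mode of a symmetric slab."""
    k0 = 2 * np.pi / wavelength
    V = k0 * thickness * np.sqrt(n_core**2 - n_clad**2)
    # Fundamental even mode: u*tan(u) = sqrt((V/2)^2 - u^2), 0 < u < min(pi/2, V/2)
    f = lambda u: u * np.tan(u) - np.sqrt(max((V / 2)**2 - u**2, 0.0))
    u = brentq(f, 1e-9, min(np.pi / 2 - 1e-9, V / 2))
    kappa = 2 * u / thickness
    return np.sqrt(n_core**2 - (kappa / k0)**2)

wavelength = 1.55e-6                 # illustrative values
n_core, n_clad = 3.50, 3.17
height, width = 0.3e-6, 2.0e-6

n_eff_vertical = slab_neff(n_core, n_clad, height, wavelength)   # step 1: across the height
n_eff = slab_neff(n_eff_vertical, n_clad, width, wavelength)     # step 2: across the width
print(f"effective index ~ {n_eff:.4f}")
```

Marcatili's analysis and the perturbation corrections discussed in the review refine exactly this kind of separable approximation.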

Relevance: 30.00%

Abstract:

The research undertaken here was in response to a decision by a major food producer, in about 2009, to consider establishing processing tomato production in northern Australia. That decision followed a lack of water availability in the Goulburn Valley region during the extensive drought that continued until 2011. The high price of water, and the uncertainty that went with it, was important in the decision to look at sites within Queensland. This presented an opportunity to develop a tomato production model for the varieties used in the processing industry and to use this as a case study alongside rice and cotton production. Following some unsuccessful early trials and difficulties associated with the Global Financial Crisis, large-scale studies by the food producer were abandoned. This report uses the data collected prior to this decision and contrasts the use of crop modelling with simpler climatic analyses that can be undertaken to investigate the impact of climate change on production systems. Crop modelling can make a significant contribution to our understanding of the impacts of climate variability and climate change because it harnesses a detailed understanding of the physiology of the crop in a way that statistical or other analytical approaches cannot. There is a high overhead, but given that trials are being conducted for a wide range of crops for a variety of purposes (breeding, fertiliser trials, etc.), it would appear profitable to link researchers with modelling expertise to those undertaking field trials. There are few approaches more cost-effective than modelling for providing a pathway to understanding future climates and their impact on food production.

Relevance: 30.00%

Abstract:

The average daily intake of folate, one of the B vitamins, falls below recommendations among the Finnish population. Bread and cereals are the main sources of folate, with rye being the most significant single source. Processing is a prerequisite for the consumption of whole grain rye; however, little is known about the effect of processing on folates. Moreover, data on the bioavailability of endogenous cereal folates are scarce. The aim of this study was to examine the variation in, as well as the effect of fermentation, germination, and thermal processes on, folate contents in rye. Bioavailability of endogenous rye folates was investigated in a four-week human intervention study. One of the objectives throughout the work was to optimise and evaluate analytical methods for determining folate contents in cereals. Affinity chromatographic purification followed by high-performance liquid chromatography (HPLC) was a suitable method for analysing cereal products for folate vitamers, and a microbiological assay with Lactobacillus rhamnosus reliably quantified the total folate. However, HPLC gave approximately 30% lower results than the microbiological assay. The folate content of rye was high and could be further increased by targeted processing. The vitamer distribution of whole grain rye was characterised by a large proportion of formylated vitamers, followed by 5-methyltetrahydrofolate. In sourdough fermentation of rye, the studied yeasts synthesized folate, whereas lactic acid bacteria mainly depleted it. Two endogenous bacteria isolated from rye flour were found to produce folate during fermentation. Inclusion of baker's yeast in sourdough fermentation raised the folate level so that the bread could contain more folate than the flour it was made of. Germination markedly increased the folate content of rye, with particularly high folate concentrations in hypocotylar roots. Thermal treatments caused significant folate losses, but the preceding germination compensated well for the losses. In the bioavailability study, moderate amounts of endogenous folates, in the form of different rye products and orange juice incorporated in the diet, improved folate status among healthy adults. Endogenous folates from rye and orange juice showed bioavailability similar to that of folic acid from fortified white bread. In brief, it was shown that the folate content of rye can be enhanced manifold by optimising and combining food processing techniques. This offers practical means to increase the daily intake of folate in a bioavailable form.

Relevance: 30.00%

Abstract:

Reverse osmosis (RO) brine produced at a full-scale coal seam gas (CSG) water treatment facility was characterized with spectroscopic and other analytical techniques. A number of potential scalants, including silica, calcium, magnesium, sulphates and carbonates, all of which were present in dissolved and non-dissolved forms, were characterized. The presence of spherical particles with a size range of 10–1000 nm and of aggregates of 1–10 microns was confirmed by transmission electron microscopy (TEM). Those particulates contained the following elements in decreasing order: K, Si, Sr, Ca, B, Ba, Mg, P, and S. Characterization showed that nearly one-third of the total silicon in the brine was present in the particulates. Further, analysis of the RO brine suggested that supersaturation and precipitation of metal carbonates and sulphates should take place during the RO process and could be responsible for subsequently capturing silica in the solid phase. However, the precipitation of crystalline carbonates and sulphates is complex. X-ray diffraction analysis did not confirm the presence of common calcium carbonates or sulphates but instead showed the presence of a suite of complex minerals, to which amorphous silica and/or silica-rich compounds could have adhered. A filtration study showed that the majority of the siliceous particles were less than 220 nm in size, but could still potentially be captured using a low molecular weight ultrafiltration membrane.

Relevance: 30.00%

Abstract:

Multiresolution synthetic aperture radar (SAR) image formation has been proven to be beneficial in a variety of applications, such as improved imaging and target detection, as well as speckle reduction. SAR signal processing, traditionally carried out in the Fourier domain, has inherent limitations in the context of image formation at hierarchical scales. We present a generalized approach to the formation of multiresolution SAR images using the biorthogonal shift-invariant discrete wavelet transform (SIDWT) in both the range and azimuth directions. Particularly in azimuth, the inherent subband decomposition property of the wavelet packet transform is introduced to produce multiscale complex matched filtering without involving any approximations. This generalized approach also includes the formulation of multilook processing within the discrete wavelet transform (DWT) paradigm. The efficiency of the algorithm when executed in parallel to generate hierarchical-scale SAR images is shown. Analytical results and sample imagery of diffuse backscatter are presented to validate the method.
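
As a much simpler, hedged illustration of shift-invariant wavelet multiresolution applied to an already-formed complex SAR image (not the paper's formation-domain algorithm), assuming the PyWavelets package and a synthetic single-look complex image:

```python
# Simplified illustration (not the paper's algorithm): a one-level
# shift-invariant (stationary) wavelet decomposition of an already-formed
# complex SAR image using a biorthogonal wavelet, applied to the real and
# imaginary parts separately. Assumes PyWavelets; the image is synthetic.
import numpy as np
import pywt

rng = np.random.default_rng(1)
# Synthetic single-look complex image (speckle-like); dimensions must be even.
slc = (rng.normal(size=(256, 256)) + 1j * rng.normal(size=(256, 256))) / np.sqrt(2)

[(cA_re, (cH_re, cV_re, cD_re))] = pywt.swt2(slc.real, wavelet="bior4.4", level=1)
[(cA_im, (cH_im, cV_im, cD_im))] = pywt.swt2(slc.imag, wavelet="bior4.4", level=1)

# Undecimated coarse approximation: a reduced-resolution, speckle-smoothed
# view of the same scene at the original grid size (shift-invariant).
coarse = cA_re + 1j * cA_im
print(coarse.shape, np.abs(coarse).mean())
```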