980 results for Engineering problems


Abstract:

Recently, some research groups have been studying the application of spouted beds of inert particles to the production of dried herbal extracts. However, mainly due to their complex composition, several problems arise during spouted bed drying of herbal extracts, such as bed instability, product accumulation, particle agglomeration, and bed collapse. The addition of drying carriers, such as colloidal silicon dioxide, to the extractive solution can minimize these unwanted effects. The aim of this work was to study the influence of colloidal silicon dioxide addition on the performance of spouted bed drying of a hydroalcoholic extract of Bauhinia forficata Link on inert particles. The physical properties of the herbal extract and of its mixtures with colloidal silicon dioxide at several concentrations (20% to 80%, relative to solids content) were quantified by determining the surface tension, rheological properties, density, pH, and contact angles with the inert surfaces. Drying performance was evaluated through the elutriation ratio, product recovery ratio, and product accumulation. The product was characterized through the thermal degradation of bioactive compounds and the product moisture content. The results indicated that the rheological properties of the extracts and their preparations, the contact angle with the inert material, and the work of adhesion play important roles in spouted bed drying of herbal extracts. Higher concentrations of the drying carrier significantly improved spouted bed drying performance.

Abstract:

Cyclodextrins (CDs) are annular oligosaccharides containing 6-12 glucose units joined by alpha-1,4 bonds. They have a truncated-cone shape with a lipophilic cavity in which different molecules can be included, resulting in a stable inclusion complex. Cyclodextrins have been widely applied in pharmaceutical technology with the objective of increasing the solubility, stability and bioavailability of drugs in different pharmaceutical dosage forms, such as tablets. To obtain beta-CD tablets, liquid drug/beta-CD dispersions are usually submitted to drying processes such as spray-drying, freeze-drying or slow evaporation, and the dried material is then combined with a number of excipients. However, such drying processes can generate particulate materials with poor flow and compressibility, which must then be converted into granules by wetting with a granulation liquid followed by additional drying. The main objective of this work was to evaluate the preparation of tablets without the need for this additional drying step. For this purpose, an aqueous dispersion containing the acetaminophen/beta-CD complex and cornstarch was dried in a spouted bed and the resulting granules were compressed into tablets. Acetaminophen was used as a model drug due to its low water solubility, and cornstarch, which is inexpensive and widely available, was chosen as the excipient. Acetaminophen powder was added to a beta-cyclodextrin solution prepared in distilled water at 70 degrees C, and stirring was maintained until the dispersion cooled to room temperature. Cornstarch was then added and the resulting dispersion was dried in the spouted bed. This material was compressed into tablets using an Erweka Korsh EKO tablet machine. This approach allowed the tablet preparation process to be carried out in fewer steps and represents a technologically reliable strategy for producing beta-cyclodextrin inclusion complex tablets. (C) 2010 Elsevier B.V. All rights reserved.

Abstract:

One of the main objectives of the first International Junior Researcher and Engineer Workshop on Hydraulic Structures is to provide an opportunity for young researchers and engineers to present their research. But a research project is only completed when it has been published and shared with the community. Referees and peer experts play an important role in controlling research quality. While new electronic tools provide further means to disseminate research information, the quality and impact of the work remain linked to a thorough expert-review process and publication in international scientific journals and books. Importantly, unethical publishing practices are not acceptable and cheating is despicable.

Abstract:

The detection of seizures in the newborn is a critical aspect of neurological research. Current automatic detection techniques are difficult to assess due to the problems associated with acquiring and labelling newborn electroencephalogram (EEG) data. A realistic model for newborn EEG would allow confident development, assessment and comparison of these detection techniques. This paper presents a model for newborn EEG that accounts for its self-similar and non-stationary nature. The model consists of background and seizure sub-models. The newborn EEG background model is based on the short-time power spectrum with a time-varying power law. The relationship between the fractal dimension and the power law of a power spectrum is used for accurate estimation of the short-time power law exponent. The newborn EEG seizure model is based on a well-known time-frequency signal model. This model addresses all significant time-frequency characteristics of newborn EEG seizure, which include multiple components or harmonics, piecewise linear instantaneous frequency laws, and harmonic amplitude modulation. Estimates of the parameters of both models are shown to be random and are modelled using data from a total of 500 background epochs and 204 seizure epochs. The newborn EEG background and seizure models are validated against real newborn EEG data using the correlation coefficient. The results show that the output of the proposed models has a higher correlation with real newborn EEG than currently accepted models (a 10% and 38% improvement for the background and seizure models, respectively).
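As a rough illustration of the two sub-models described above (this is not the authors' implementation; the sampling rate, epoch length and the exponent range 1-3 are assumptions made only for this sketch), the following Python snippet generates a non-stationary 1/f^gamma background by spectrally shaping white noise with an exponent that drifts from epoch to epoch, and a toy seizure waveform with a piecewise linear instantaneous frequency law, a few harmonics and slow amplitude modulation.

```python
import numpy as np

def background_epoch(n_samples, fs, gamma, rng):
    """One epoch of 1/f^gamma noise, generated by spectrally shaping white noise."""
    spectrum = np.fft.rfft(rng.standard_normal(n_samples))
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    freqs[0] = freqs[1]                       # avoid dividing by zero at DC
    spectrum *= freqs ** (-gamma / 2.0)       # power spectrum proportional to f^(-gamma)
    x = np.fft.irfft(spectrum, n=n_samples)
    return x / np.std(x)

def simulated_background(n_epochs=500, epoch_len=256, fs=64.0, seed=0):
    """Concatenate epochs whose power-law exponent drifts over time (non-stationarity)."""
    rng = np.random.default_rng(seed)
    gammas = rng.uniform(1.0, 3.0, size=n_epochs)      # assumed exponent range
    return np.concatenate([background_epoch(epoch_len, fs, g, rng) for g in gammas])

def simulated_seizure(duration=10.0, fs=64.0, f_knots=(2.0, 1.0, 1.5), n_harmonics=3):
    """Toy seizure: piecewise linear instantaneous frequency, harmonics, slow AM."""
    t = np.arange(0.0, duration, 1.0 / fs)
    knot_times = np.linspace(0.0, duration, len(f_knots))
    inst_freq = np.interp(t, knot_times, f_knots)      # piecewise linear IF law (Hz)
    phase = 2.0 * np.pi * np.cumsum(inst_freq) / fs
    am = 1.0 + 0.3 * np.sin(2.0 * np.pi * 0.2 * t)     # slow harmonic amplitude modulation
    return sum((am / k) * np.cos(k * phase) for k in range(1, n_harmonics + 1))
```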

Abstract:

Market-based transmission expansion planning gives investors information on where the most cost-efficient place to invest is and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are the system planner's concern. In this paper, a hybrid probabilistic criterion, the Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economic losses during operation in a competitive market. It takes both the investors' and the planner's points of view and improves on the traditional reliability cost. By applying the EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. In turn, it enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors undertaking their investments. This index reflects the random behaviour of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), which shows how the EEL can predict the current system bottleneck under future operational conditions and how the EEL can be used as one of the planning objectives to determine future optimal plans. Monte Carlo simulation, a well-known simulation method, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
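As a rough sketch of the Monte Carlo side of this approach (this is not the paper's EEL formulation: the outage rate, demand distribution and price model below are invented purely for illustration), the snippet averages an economic loss over randomly sampled system and market states.

```python
import numpy as np

def expected_economical_loss(sample_state, loss_of_state, n_samples=10_000, seed=0):
    """Monte Carlo estimate of an expected-loss index: average the economic loss
    over randomly sampled system/market states."""
    rng = np.random.default_rng(seed)
    losses = [loss_of_state(sample_state(rng)) for _ in range(n_samples)]
    return float(np.mean(losses))

# Toy usage: a single supply point whose available capacity is random, with load
# curtailed (and priced at a random market price) whenever demand exceeds capacity.
def sample_state(rng):
    capacity = rng.choice([100.0, 0.0], p=[0.98, 0.02])   # 2% forced outage rate (assumed)
    demand = rng.normal(80.0, 10.0)                        # MW (assumed distribution)
    price = rng.lognormal(mean=3.5, sigma=0.4)             # $/MWh (assumed market price)
    return capacity, demand, price

def loss_of_state(state):
    capacity, demand, price = state
    curtailed = max(demand - capacity, 0.0)                # unserved load in this state
    return curtailed * price                               # economic loss for this state

print(expected_economical_loss(sample_state, loss_of_state))
```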

Abstract:

P-representation techniques, which have been very successful in quantum optics and in other fields, are also useful for general bosonic quantum-dynamical many-body calculations such as Bose-Einstein condensation. We introduce a representation called the gauge P representation, which greatly widens the range of tractable problems. Our treatment results in an infinite set of possible time evolution equations, depending on arbitrary gauge functions that can be optimized for a given quantum system. In some cases, previous methods can give erroneous results, due to the usual assumption of vanishing boundary conditions being invalid for those particular systems. Solutions are given to this boundary-term problem for all the cases where it is known to occur: two-photon absorption and the single-mode laser. We also provide some brief guidelines on how to apply the stochastic gauge method to other systems in general, quantify the freedom of choice in the resulting equations, and make a comparison to related recent developments.
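The sketch below is only schematic: it shows how a single weighted ("gauge"-style) stochastic trajectory can be integrated with the Euler-Maruyama method, with the drift, noise amplitude and gauge functions left as user-supplied placeholders. It does not reproduce the specific evolution equations derived in the paper.

```python
import numpy as np

def weighted_trajectory(alpha0, drift, noise_amp, gauge, dt=1e-3, n_steps=10_000, seed=0):
    """Euler-Maruyama integration of one phase-space trajectory plus its weight.
    drift(alpha), noise_amp(alpha) and gauge(alpha) are placeholder model functions;
    the gauge shifts the drift and is compensated through the stochastic weight."""
    rng = np.random.default_rng(seed)
    alpha = complex(alpha0)
    log_weight = 0.0
    for _ in range(n_steps):
        dw = rng.standard_normal() * np.sqrt(dt)
        g = gauge(alpha)
        alpha += (drift(alpha) + noise_amp(alpha) * g) * dt + noise_amp(alpha) * dw
        log_weight += g * dw - 0.5 * g * g * dt   # Ito form: d(ln W) = g dW - (g^2/2) dt
    return alpha, np.exp(log_weight)
```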

Abstract:

Bacharuddin Jusuf Habibie, even now, remains a deeply puzzling figure for scholars and commentators concerned with Indonesian affairs, Western and Indonesian alike.

Abstract:

For the last three decades, the engineering higher education and professional environments have been completely transformed by the "electronic/digital information revolution" that has included the introduction of the personal computer, the development of email and the world wide web, and broadband Internet connections at home. Herein the writer compares the performance of several digital tools with traditional library resources. While new specialised search engines and open access digital repositories may fill a gap between conventional search engines and traditional references, they should not be confused with real libraries and international scientific databases that encompass textbooks and peer-reviewed scholarly works. Absence from some Internet search listings, databases and repositories is not an indication of standing. Researchers, engineers and academics should remember these key differences when assessing the quality of bibliographic "research" based solely upon Internet searches.

Abstract:

The demand for more pixels is beginning to be met as manufacturers increase the native resolution of projector chips. Tiling several projectors still offers a solution to augment the pixel capacity of a display. However, problems of color and illumination uniformity across projectors need to be addressed as well as the computer software required to drive such devices. We present the results obtained on a desktop-size tiled projector array of three D-ILA projectors sharing a common illumination source. A short throw lens (0.8:1) on each projector yields a 21-in. diagonal for each image tile; the composite image on a 3×1 array is 3840×1024 pixels with a resolution of about 80 dpi. The system preserves desktop resolution, is compact, and can fit in a normal room or laboratory. The projectors are mounted on precision six-axis positioners, which allow pixel level alignment. A fiber optic beamsplitting system and a single set of red, green, and blue dichroic filters are the key to color and illumination uniformity. The D-ILA chips inside each projector can be adjusted separately to set or change characteristics such as contrast, brightness, or gamma curves. The projectors were then matched carefully: photometric variations were corrected, leading to a seamless image. Photometric measurements were performed to characterize the display and are reported here. This system is driven by a small PC cluster fitted with graphics cards and running Linux. It can be scaled to accommodate an array of 2×3 or 3×3 projectors, thus increasing the number of pixels of the final image. Finally, we present current uses of the display in fields such as astrophysics and archaeology (remote sensing).
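A quick arithmetic check of the quoted pixel density, assuming each tile is 3840/3 = 1280 x 1024 pixels across the stated 21-inch diagonal:

```python
import math

tile_px_w, tile_px_h = 3840 // 3, 1024       # pixels per tile (composite is a 3x1 array)
diag_in = 21.0                               # quoted tile diagonal in inches
diag_px = math.hypot(tile_px_w, tile_px_h)   # pixel count along the diagonal
print(round(diag_px / diag_in))              # ~78 dpi, consistent with "about 80 dpi"
```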

Abstract:

We give conditions on f, involving pairs of lower and upper solutions, which lead to the existence of at least three solutions of the two-point boundary value problem y'' + f(x, y, y') = 0, x ∈ [0, 1], y(0) = 0 = y(1). In the special case f(x, y, y') = f(y) ≥ 0, we give growth conditions on f and apply our general result to show the existence of three positive solutions. We give an example showing this latter result is sharp. Our results extend those of Avery and of Lakshmikantham et al.

Abstract:

We are concerned with determining the values of r for which there exist nodal solutions of the boundary value problem u'' + r a(t) f(u) = 0, 0 < t < 1, u(0) = u(1) = 0. The proof of our main result is based upon bifurcation techniques.

Abstract:

This was an early pre-Catalyst collaboration about developing reflexivity in student engineers. It was funded by (then) CUTSD.

Abstract:

For centuries, hydraulic engineers were at the forefront of science. The last forty years marked a change of perception in our society, with a focus on environmental sustainability and management, particularly in developed countries. Herein, the writer illustrates his strong belief that the future of hydraulic engineering lies in a combination of innovative engineering, research excellence and quality higher education. This drive continues a long tradition established by eminent scholars like Arthur Thomas IPPEN, John Fisher KENNEDY and Hunter ROUSE.

Abstract:

Data mining is the process of identifying valid, implicit, previously unknown, potentially useful and understandable information in large databases. It is an important step in the process of knowledge discovery in databases (Olaru & Wehenkel, 1999). In a data mining process, input data can be structured, semi-structured, or unstructured. Data can be text, categorical or numerical values. One of the important characteristics of data mining is its ability to deal with data that are large in volume, distributed, time-variant, noisy, and high-dimensional. A large number of data mining algorithms have been developed for different applications. For example, association rule mining can be useful for market basket problems, clustering algorithms can be used to discover trends in unsupervised learning problems, classification algorithms can be applied in decision-making problems, and sequential and time series mining algorithms can be used in predicting events, fault detection, and other supervised learning problems (Vapnik, 1999). Classification is among the most important tasks in data mining, particularly for applications in engineering fields. Together with regression, classification is mainly used for predictive modelling. A number of classification algorithms are now in practical use. According to Sebastiani (2002), the main classification algorithms can be categorized as: decision tree and rule-based approaches such as C4.5 (Quinlan, 1996); probability methods such as the Bayesian classifier (Lewis, 1998); on-line methods such as Winnow (Littlestone, 1988) and CVFDT (Hulten, 2001); neural network methods (Rumelhart, Hinton & Williams, 1986); example-based methods such as k-nearest neighbors (Duda & Hart, 1973); and SVM (Cortes & Vapnik, 1995). Other important techniques for classification tasks include Associative Classification (Liu et al., 1998) and Ensemble Classification (Tumer, 1996).
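As a minimal illustration of two of the classifier families listed above (not tied to any of the cited works), the sketch below trains a decision tree and a k-nearest-neighbours classifier on scikit-learn's bundled iris dataset and reports their test accuracy.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

# Small labelled dataset standing in for an engineering classification problem.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

for name, clf in [("decision tree", DecisionTreeClassifier(random_state=42)),
                  ("k-NN (k=5)", KNeighborsClassifier(n_neighbors=5))]:
    clf.fit(X_train, y_train)                        # learn from the training split
    acc = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: test accuracy = {acc:.3f}")
```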

Abstract:

The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices and has never been applied to eigenanalysis for power system small signal stability. This paper analyzes the differences between the BR and QR algorithms, comparing their performance in terms of CPU time (based on stopping criteria) and storage requirements. The BR algorithm utilizes accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce the computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space without sacrificing precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm in eigenanalysis tasks on 39-, 68-, 115-, 300-, and 600-bus systems. Experimental results suggest that the BR algorithm is a more efficient algorithm for large-scale power system small signal stability eigenanalysis.
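For context, the sketch below shows the standard QR-based baseline that the BR algorithm is compared against: a dense state matrix (a random stand-in here, not an actual power system model) is reduced to upper Hessenberg form and its eigenvalues are computed with NumPy's QR-based solver. The BR algorithm itself is not available in NumPy/SciPy.

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
A = rng.standard_normal((300, 300))   # stand-in for a small-signal stability state matrix
H = hessenberg(A)                     # reduction to upper Hessenberg form
eigvals = np.linalg.eigvals(H)        # QR-based eigenvalue computation
# Eigenvalues with positive real part would indicate small-signal instability.
print("largest real part:", eigvals.real.max())
```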