16 results for Link variables method
Abstract:
The effects of the process variables (pH of the aqueous phase, rate of addition of the organic, polymeric, drug-containing phase to the aqueous phase, organic:aqueous phase volume ratio, and aqueous phase temperature) on the entrapment of propranolol hydrochloride in ethylcellulose (N4) microspheres prepared by the solvent evaporation method were examined using a factorial design. The observed range of drug entrapment was 1.43 ± 0.02% w/w (pH 6, 25 °C, phase volume ratio 1:10, fast rate of addition) to 16.63 ± 0.92% w/w (pH 9, 33 °C, phase volume ratio 1:10, slow rate of addition), corresponding to mean entrapment efficiencies of 2.86% and 33.26%, respectively. Increased pH, increased temperature and decreased rate of addition significantly enhanced entrapment efficiency; the organic:aqueous phase volume ratio, however, did not significantly affect drug entrapment. Statistical interactions were observed between pH and rate of addition, pH and temperature, and temperature and rate of addition. The interactions involving pH are suggested to arise because increased temperature and a slow rate of addition sufficiently enhance the solubility of dichloromethane in the aqueous phase, which at pH 9, but not at pH 6, allows partial polymer precipitation before the drug partitions into the aqueous phase. The interaction between temperature and rate of addition reflects the relatively small effect of increased temperature on drug entrapment when the organic phase is added slowly. Compared with the effect of pH on drug entrapment, the contributions of the other physical factors examined were limited.
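A note on the arithmetic: the quoted entrapment efficiencies appear to be the observed entrapment expressed as a percentage of a theoretical maximum loading of 50% w/w (the figure given in the related abstract below); on that assumption,

$$
\text{efficiency} = \frac{\text{observed entrapment}}{\text{theoretical maximum}} \times 100, \qquad \frac{1.43}{50} \times 100 = 2.86, \qquad \frac{16.63}{50} \times 100 = 33.26.
$$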
Abstract:
The effects of four process factors (pH, emulsifier (gelatin) concentration, mixing, and batch) on the % w/w entrapment of propranolol hydrochloride in ethylcellulose microcapsules prepared by the solvent evaporation process were examined using a factorial design. The minimum % w/w entrapments of propranolol hydrochloride were observed when the external aqueous phase contained 1.5% w/v gelatin at pH 6.0 (0.71-0.91% w/w), whereas maximum entrapments occurred when the external aqueous phase was composed of 0.5% w/v gelatin at pH 9.0 (8.9-9.1% w/w). The theoretical maximum loading was 50% w/w. Statistical evaluation of the results by analysis of variance showed that emulsifier (gelatin) concentration and pH, but not mixing and batch, significantly affected entrapment. An interaction between pH and gelatin concentration was observed in the factorial design, attributed to the greater effect of gelatin concentration on % w/w entrapment at pH 9.0 than at pH 6.0. Maximum theoretical entrapment was achieved by increasing the pH of the external phase to 12.0, and marked increases in drug entrapment were observed whenever the pH of the external phase exceeded the pK2 of propranolol hydrochloride. It was concluded that pH, and hence ionisation, was the greatest determinant of entrapment of propranolol hydrochloride into microcapsules prepared by the solvent evaporation process.
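The role of ionisation can be made concrete with the Henderson-Hasselbalch relation for a weak base; the pKa used below is an assumed literature value for propranolol (≈9.5), not a figure taken from the abstract:

$$
f_{\text{un-ionised}} = \frac{1}{1 + 10^{\,\mathrm{p}K_a - \mathrm{pH}}}, \qquad f(\mathrm{pH}~6) \approx 0.03\%, \quad f(\mathrm{pH}~9) \approx 24\%, \quad f(\mathrm{pH}~12) \approx 99.7\%,
$$

which is consistent with the sharp gain in entrapment once the external-phase pH exceeds the drug's pK2.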
Abstract:
This article examines the relationship between the learning organisation and the implementation of curriculum innovation within schools. It also compares the extent of innovative activity undertaken by schools in the public and the private sectors. A learning organisation is characterised by long-term goals, participatory decision-making processes, collaboration with external stakeholders, effective mechanisms for the internal communication of knowledge and information, and the use of rewards for its members. These characteristics are expected to promote curriculum innovation, once a number of control factors have been taken into account. The article reports on a study carried out in 197 Greek public and private primary schools in the 1999-2000 school year. Structured interviews with school principals were used as a method of data collection. According to the statistical results, the most important determinants of the innovative activity of a school are the extent of its collaboration with other organisations (i.e. openness to society), and the implementation of development programmes for teachers and parents (i.e. communication of knowledge and information). Contrary to expectations, the existence of long-term goals, the extent of shared decision-making, and the use of teacher rewards had no impact on curriculum innovation. The study also suggests that the private sector, as such, has an additional positive effect on the implementation of curriculum innovation, once a number of human, financial, material, and management resources have been controlled for. The study concludes by making recommendations for future research that would shed more light on unexpected outcomes and would help explore the causal link between variables in the research model.
Abstract:
A method for the simulation of acoustical bores, useful in the context of sound synthesis by physical modeling of woodwind instruments, is presented. As with previously developed methods, such as digital waveguide modeling (DWM) [Smith, Comput. Music J. 16, pp. 74-91 (1992)] and the multiconvolution algorithm (MCA) [Martinez et al., J. Acoust. Soc. Am. 84, pp. 1620-1627 (1988)], the approach is based on a one-dimensional model of wave propagation in the bore. Both the DWM method and the MCA explicitly compute the transmission and reflection of wave variables that represent actual traveling pressure waves. The method presented in this report, the wave digital modeling (WDM) method, avoids the typical limitations associated with these methods by using a more general definition of the wave variables. An efficient and spatially modular discrete-time model is constructed from the digital representations of elemental bore units such as cylindrical sections, conical sections, and toneholes. Frequency-dependent phenomena, such as boundary losses, are approximated with digital filters. The stability of a simulation of a complete acoustic bore is investigated empirically. Results of the simulation of a full clarinet show very good agreement with classical transmission-line theory.
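For orientation, the sketch below illustrates the traveling-wave formulation that DWM, the MCA, and WDM all build on: a cylindrical section modeled as a pair of delay lines carrying right- and left-going pressure waves, with a crude sign-inverting reflection at the open end. It is a minimal illustration of the general idea, not the paper's WDM method; all names and parameter values are assumptions.

```python
import numpy as np

def simulate_cylinder(excitation, delay_samples=50, open_end_reflection=-0.95):
    """Propagate right- and left-going pressure waves along a toy cylinder."""
    right = np.zeros(delay_samples)  # wave travelling toward the open end
    left = np.zeros(delay_samples)   # wave travelling back to the mouthpiece
    output = np.zeros(len(excitation))
    for n, x in enumerate(excitation):
        right_in = x + left[-1]                    # injected + returning wave
        left_in = open_end_reflection * right[-1]  # partial reflection at the open end
        right = np.concatenate(([right_in], right[:-1]))  # advance the delay lines
        left = np.concatenate(([left_in], left[:-1]))
        output[n] = right[-1] + left[0]            # pressure near the open end
    return output

impulse = np.zeros(1024)
impulse[0] = 1.0
response = simulate_cylinder(impulse)  # impulse response of the toy bore
```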
Abstract:
We introduce a novel method to simulate hydrated macromolecules with a dielectric continuum representation of the surrounding solvent. In our approach, the interaction between the solvent and the molecular degrees of freedom is described by means of a polarization density free energy functional which is minimum at electrostatic equilibrium. After a pseudospectral expansion of the polarization and a discretization of the functional, we construct the equations of motion for the system based on a Car-Parrinello technique. In the limit of the adiabatic evolution of the polarization field variables, our method provides the solution of the dielectric continuum problem "on the fly," while the molecular coordinates are propagated. In this first study, we show how our dielectric continuum molecular dynamics method can be successfully applied to hydrated biomolecules, at low cost compared to free-energy simulations with explicit solvent. To our knowledge, this is the first time that stable and conservative molecular dynamics simulations of solutes have been performed for a dielectric continuum model of the solvent. © 2001 American Institute of Physics.
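As a toy illustration of the adiabatic, Car-Parrinello-style propagation described above: a "fast" auxiliary variable p (standing in for the polarization field) is given a small fictitious mass so that it tracks the minimum of a coupled energy while a "slow" coordinate q (standing in for a molecular degree of freedom) evolves. The energy function and every parameter here are illustrative assumptions, not the paper's functional.

```python
# Toy extended-Lagrangian dynamics: E = 0.5*q**2 + 2.0*(p - 0.3*q)**2
def energy_grad(q, p):
    dq = q - 1.2 * (p - 0.3 * q)   # dE/dq
    dp = 4.0 * (p - 0.3 * q)       # dE/dp
    return dq, dp

dt, m_q, m_p = 0.01, 1.0, 0.01     # small fictitious mass for the fast variable
q, p, vq, vp = 1.0, 0.0, 0.0, 0.0
for _ in range(5000):              # velocity-Verlet integration
    gq, gp = energy_grad(q, p)
    vq -= 0.5 * dt * gq / m_q
    vp -= 0.5 * dt * gp / m_p
    q += dt * vq
    p += dt * vp
    gq, gp = energy_grad(q, p)
    vq -= 0.5 * dt * gq / m_q
    vp -= 0.5 * dt * gp / m_p
# p now tracks its equilibrium value 0.3*q "on the fly" as q oscillates
```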
Abstract:
Roche tomography is a technique used for imaging the Roche-lobe-filling secondary stars in cataclysmic variables (CVs). In order to interpret Roche tomograms correctly, one must determine whether features in the reconstruction are real, or the result of statistical or systematic errors. We explore the effects of systematic errors using reconstructions of simulated data sets, and show that systematic errors result in characteristic distortions of the final reconstructions that can be identified and corrected. In addition, we present a new method of estimating statistical errors on tomographic reconstructions using a Monte Carlo bootstrapping algorithm, and show this method to be much more reliable than Monte Carlo methods which 'jiggle' the data points in accordance with the size of their error bars.
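A minimal sketch of the bootstrap idea, assuming synthetic data: the paper bootstraps full Roche-tomography reconstructions, whereas here a trivially cheap estimator stands in for the reconstruction step.

```python
import numpy as np

rng = np.random.default_rng(0)

def reconstruct(data):
    # placeholder for an expensive tomographic reconstruction
    return np.mean(data)

data = rng.normal(loc=1.0, scale=0.2, size=200)  # observed data points

# Resample the data points WITH replacement (rather than "jiggling" each
# point by its error bar) and redo the reconstruction each time.
boot = np.array([
    reconstruct(rng.choice(data, size=data.size, replace=True))
    for _ in range(1000)
])
error_estimate = boot.std()  # statistical error on the reconstruction
```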
Abstract:
The trophic link density and the stability of food webs are thought to be related, but the nature of this relation is controversial. This article introduces a method for estimating the link density from diet tables that do not cover the complete food web and do not resolve all diet items to species level. A simple formula for the error of this estimate is derived. Link density is determined as a function of a threshold diet fraction below which diet items are ignored […]
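The abstract is truncated at this point, but the thresholded counting step it implies can be sketched as follows; the diet-matrix layout and the threshold value are assumptions, and the paper's estimator and error formula are not reproduced.

```python
import numpy as np

def link_density(diet, threshold):
    """Links per species, ignoring diet items below the threshold fraction."""
    links = np.count_nonzero(diet >= threshold)
    n_species = diet.shape[0]
    return links / n_species

rng = np.random.default_rng(1)
diet = rng.dirichlet(np.ones(6), size=6)  # diet[i, j]: fraction of predator i's diet that is prey j
print(link_density(diet, threshold=0.05))
```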
Abstract:
Generally, the solid and liquid fractions (digestate) from Anaerobic Digestion (AD) energy production are considered waste. This has a negative impact on the sustainability of AD processes because of the financial outlay required to treat digestate before it is discharged into municipal water treatment plants or natural water bodies. The main aim of this research was to investigate the feasibility of producing an organic fertiliser from anaerobic digestate and limestone powders as the raw materials, employing a high-shear granulation process. A two-level factorial experimental design was used to determine the influence of the granulation process variables on the strength, resistance to attrition, and yield of the granules. The study concluded that it is technically feasible to produce organic fertiliser granules of acceptable strength and product yield. Increasing the liquid-to-solid ratio during granulation leads to increased granule strength and better product yield. Although the strength of the granules produced was lower than the typical strength of commercial synthetic fertiliser granules (about 5 to 7 MPa), this could be improved by mixing the digestate with a polymeric binder or by coating the particles after granulation. © 2012 Elsevier B.V.
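For readers unfamiliar with two-level factorial designs, the sketch below generates a full 2^3 design and estimates main effects; the factor names and response values are invented for illustration and are not the study's data.

```python
from itertools import product
import numpy as np

factors = ["liquid_to_solid_ratio", "impeller_speed", "granulation_time"]
design = np.array(list(product([-1, 1], repeat=len(factors))))  # 2^3 = 8 runs

# hypothetical measured granule strengths (MPa), ordered as generated above
strength = np.array([1.1, 1.3, 1.0, 1.4, 2.0, 2.6, 2.1, 2.8])

# main effect of each factor: mean response at +1 minus mean response at -1
for name, column in zip(factors, design.T):
    effect = strength[column == 1].mean() - strength[column == -1].mean()
    print(f"{name}: {effect:+.2f} MPa")
```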
Abstract:
Aim. This paper reports a study exploring link nurses' views and experiences regarding the development of the link nurse role in palliative care in the nursing home, and the barriers to and facilitators of its implementation.
Background. The delivery of palliative care in nursing homes is widely advocated; one approach is to develop the link nurse role to cascade good practice and training to nurses and other care staff to enhance patient care.
Method. A descriptive qualitative study was conducted with a purposive sample of 14 link nurses from 10 nursing homes in Northern Ireland during 2006. Three focus groups, composed of all Registered Nurses currently acting as link nurses in their nursing homes, participated; the data were audio recorded, fully transcribed and content analysed.
Findings. The link nurse system shows potential to enhance palliative care within nursing homes. However, link nurses experienced a number of difficulties in implementing education programmes. Facilitators of the role included external support, monthly meetings, access to a resource file and peer support among link nurses themselves. Lack of management support, a transient workforce and lack of adequate preparation for link nurses were barriers to fulfilling this role.
Conclusion. Whilst palliative care link nurses can improve care for residents in nursing homes, consideration must be given to overcoming the types of barriers identified, in order to enable the link nurse system to function effectively. © 2008 The Authors.
Abstract:
Arsenic (As) contamination of rice plants can result in high total As concentrations (t-As) in cooked rice, especially if As-contaminated water is used for cooking. This study examines two variables: (1) the cooking method (water volume and inclusion of a washing step); and (2) the rice type (atab and boiled). The cooking water and the raw atab and boiled rice contained 40 µg As l-1 and 185 and 315 µg As kg-1, respectively. In general, all cooking methods increased t-As above the levels in raw rice; however, the boiled rice type decreased in t-As by 12.7% relative to the raw grain when cooked by the traditional method, but increased by 15.9% or 23.5% when cooked by the intermediate or contemporary methods, respectively. Based on the best possible scenario (the traditional cooking method, giving the lowest level of contamination, and the atab rice type, with the lowest As content), the daily intake of t-As was estimated to be 328 µg, about twice the tolerable daily intake of 150 µg.
Abstract:
We present Roche tomograms of the K4V secondary star in the cataclysmic variable AE Aqr, reconstructed from two data sets taken 9 d apart, and measure the differential rotation of the stellar surface. The tomograms show many large, cool starspots, including a large high-latitude spot and a prominent appendage down the trailing hemisphere. We find two distinct bands of spots around 22° and 43° latitude, and estimate a spot coverage of 15.4-17 per cent of the northern hemisphere. Assuming a solar-like differential rotation law, the differential rotation of AE Aqr was measured using two different techniques; the first yields an equator-pole lap time of 269 d, and the second a lap time of 262 d. This shows that the star is not fully tidally locked, as was previously assumed for CVs, but has a co-rotation latitude of ~40°. We discuss the implications of these observations for stellar dynamo theory, as well as the impact that spot traversal across the L1 point may have on accretion rates in CVs and on some of their other observed properties. The entropy landscape technique was applied to determine the system parameters of AE Aqr. For the two independent data sets, we find M1 = 1.20 and 1.17 M⊙, M2 = 0.81 and 0.78 M⊙, and orbital inclinations of 50° to 51° at optimal systemic velocities of γ = -64.7 and -62.9 km s-1.
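For reference, the solar-like differential rotation law referred to above is conventionally written as follows (the abstract does not spell out the formula; the notation here is the standard one):

$$
\Omega(\theta) = \Omega_{\mathrm{eq}} - \Delta\Omega\,\sin^{2}\theta, \qquad \tau_{\mathrm{lap}} = \frac{2\pi}{\Delta\Omega},
$$

so an equator-pole lap time of 269 d corresponds to a shear of ΔΩ = 2π/269 ≈ 0.023 rad d-1.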
Abstract:
Virtual metrology (VM) aims to predict metrology values using sensor data from production equipment and physical metrology values of preceding samples. VM is a promising technology for the semiconductor manufacturing industry, as it can reduce the frequency of in-line metrology operations and provide supporting information for other operations such as fault detection, predictive maintenance and run-to-run control. The prediction models for VM can be drawn from a large variety of linear and nonlinear regression methods, and the selection of a proper regression method for a specific VM problem is not straightforward, especially when the candidate predictor set is high-dimensional, correlated and noisy. Using process data from a benchmark semiconductor manufacturing process, this paper evaluates the performance of four typical regression methods for VM: multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), neural networks (NN) and Gaussian process regression (GPR). It is observed that GPR performs the best among the four methods and that, remarkably, the performance of linear regression approaches that of GPR as the subset of selected input variables is increased. The observed competitiveness of high-dimensional linear regression models, which does not hold true in general, is explained in the context of extreme learning machines and functional link neural networks.
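A hedged sketch of such a four-way comparison using scikit-learn, with synthetic data standing in for the benchmark semiconductor process data (model settings are illustrative, not the paper's):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression, Lasso
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_squared_error

# synthetic stand-in for a high-dimensional, noisy predictor set
X, y = make_regression(n_samples=300, n_features=50, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "MLR": LinearRegression(),
    "LASSO": Lasso(alpha=1.0),
    "NN": MLPRegressor(hidden_layer_sizes=(50,), max_iter=2000, random_state=0),
    "GPR": GaussianProcessRegressor(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_te, model.predict(X_te)) ** 0.5
    print(f"{name}: test RMSE = {rmse:.2f}")
```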
Abstract:
We present a novel method for the light-curve characterization of Pan-STARRS1 Medium Deep Survey (PS1 MDS) extragalactic sources into stochastic variables (SVs) and burst-like (BL) transients, using multi-band image-differencing time-series data. We select detections in difference images associated with galaxy hosts using a star/galaxy catalog extracted from the deep PS1 MDS stacked images, and adopt a maximum a posteriori formulation to model their difference-flux time series in four Pan-STARRS1 photometric bands: gP1, rP1, iP1, and zP1. We use three deterministic light-curve models to fit BL transients (a Gaussian, a Gamma distribution, and an analytic supernova (SN) model) and one stochastic light-curve model, the Ornstein-Uhlenbeck process, to fit the variability that is characteristic of active galactic nuclei (AGNs). We assess the quality of fit of the models band-wise and source-wise, using their estimated leave-one-out cross-validation likelihoods and corrected Akaike information criteria. We then apply a K-means clustering algorithm to these statistics to determine the source classification in each band. The final source classification is derived as a combination of the individual filter classifications, resulting in two measures of classification quality, from the averages across the photometric filters of (1) the classifications determined from the closest K-means cluster centers, and (2) the square distances from the clustering centers in the K-means clustering spaces. For a verification set of AGNs and SNe, we show that SVs and BL transients occupy distinct regions in the plane constituted by these measures. We use our clustering method to characterize 4361 extragalactic image-difference detected sources, in the first 2.5 yr of the PS1 MDS, into 1529 BL and 2262 SV, with a purity of 95.00% for AGNs and 90.97% for SNe, based on our verification sets. We combine our light-curve classifications with their nuclear or off-nuclear host galaxy offsets to define a robust photometric sample of 1233 AGNs and 812 SNe. With these two samples, we characterize their variability and host galaxy properties, and identify simple photometric priors that would enable their real-time identification in future wide-field synoptic surveys.
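As a small aside, the stochastic model named above, the Ornstein-Uhlenbeck (OU) process used for AGN-like variability, can be simulated exactly in discrete time; the parameter values below are illustrative, not fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n, dt = 1000, 1.0                  # samples and time step (days)
tau, sigma, mu = 100.0, 0.1, 0.0   # relaxation time, noise scale, mean level

flux = np.zeros(n)
for i in range(1, n):
    # exact discrete update of the OU stochastic differential equation
    a = np.exp(-dt / tau)
    flux[i] = mu + a * (flux[i - 1] - mu) + sigma * np.sqrt(
        (1 - a**2) * tau / 2) * rng.standard_normal()
```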
Abstract:
We present a method for learning Bayesian networks from data sets containing thousands of variables without the need for structure constraints. Our approach consists of two parts. The first is a novel algorithm that effectively explores the space of possible parent sets of a node; it guides the exploration towards the most promising parent sets on the basis of an approximated score function that is computed in constant time. The second part is an improvement of an existing ordering-based algorithm for structure optimization; the new algorithm provably achieves a higher score than its original formulation. Our novel approach consistently outperforms the state of the art on very large data sets.
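A toy sketch of the ordering-based idea (given an ordering, each node independently picks its best-scoring parent set from among its predecessors); the score below is a generic BIC-style stand-in, not the paper's approximated score function, and the data are synthetic.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(3)
n_samples, n_vars = 500, 5
data = rng.integers(0, 2, size=(n_samples, n_vars))  # binary toy data

def family_score(child, parents):
    """Generic BIC-style score: log-likelihood of child given parents,
    penalized by the number of parameters (a stand-in score only)."""
    if parents:
        configs, inverse = np.unique(data[:, list(parents)], axis=0,
                                     return_inverse=True)
        inverse = inverse.reshape(-1)  # guard against version-dependent shape
        n_configs = len(configs)
    else:
        inverse, n_configs = np.zeros(n_samples, dtype=int), 1
    loglik = 0.0
    for c in range(n_configs):       # group samples by parent configuration
        child_vals = data[inverse == c, child]
        for v in (0, 1):
            k = np.count_nonzero(child_vals == v)
            if k:
                loglik += k * np.log(k / len(child_vals))
    return loglik - 0.5 * np.log(n_samples) * n_configs

order = list(range(n_vars))          # a candidate ordering of the variables
for node in order:
    predecessors = order[:order.index(node)]
    candidates = [ps for k in range(min(2, len(predecessors)) + 1)
                  for ps in combinations(predecessors, k)]
    best = max(candidates, key=lambda ps: family_score(node, ps))
    print(f"node {node}: best parent set {best}")
```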
Abstract:
A search query, being a very concise grounding of user intent, could potentially have many possible interpretations. Search engines hedge their bets by diversifying top results to cover multiple such possibilities, so that the user is likely to be satisfied whatever her intended interpretation may be. Diversified Query Expansion is the problem of diversifying query expansion suggestions, so that the user can specialize the query to better suit her intent even before perusing search results. We propose a method, Select-Link-Rank (SLR), that exploits semantic information from Wikipedia to generate diversified query expansions. SLR performs collective processing of terms and Wikipedia entities in an integrated framework, simultaneously diversifying query expansions and entity recommendations. SLR starts by selecting informative terms from the search results of the initial query, links them to Wikipedia entities, performs a diversity-conscious entity scoring, and transfers that scoring to the term space to arrive at query expansion suggestions. Through an extensive empirical analysis and user study, we show that our method outperforms the state-of-the-art diversified query expansion and diversified entity recommendation techniques.
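One generic way to realize a diversity-conscious scoring step is a greedy, MMR-style trade-off between relevance and redundancy, sketched below; this illustrates the flavor of such a step only and is not SLR's actual scoring (all inputs are synthetic).

```python
import numpy as np

def diversify(relevance, similarity, k=5, lam=0.7):
    """Greedily pick k items, trading off relevance against redundancy."""
    chosen = []
    candidates = set(range(len(relevance)))
    while candidates and len(chosen) < k:
        def mmr(i):
            # penalize items similar to anything already chosen
            redundancy = max((similarity[i][j] for j in chosen), default=0.0)
            return lam * relevance[i] - (1 - lam) * redundancy
        best = max(candidates, key=mmr)
        chosen.append(best)
        candidates.remove(best)
    return chosen

rng = np.random.default_rng(4)
rel = rng.random(10)        # relevance of 10 candidate expansions
sim = rng.random((10, 10))  # pairwise similarity (illustrative)
print(diversify(rel, sim))
```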