995 results for storage media


Relevance:

20.00%

Publisher:

Abstract:

Nearly one fourth of new medicinal molecules are biopharmaceuticals (proteins, antibodies, or nucleic acid derivatives). The administration of these compounds is not always straightforward, however, because such molecules are fragile in the GI tract, and they often exhibit poor bioavailability when administered orally. As a result, parenteral administration is commonly preferred. In addition, the shelf life of these molecules in aqueous environments is poor unless they are stored at low temperatures. An alternative approach is to bring the molecules to an anhydrous form by lyophilization, which enhances their stability during storage. Proteins usually cannot be freeze-dried by themselves, so excipients are nearly always necessary. Disaccharides are commonly used excipients in freeze-dried formulations, since they provide a rigid glassy matrix that maintains the native conformation of the protein domain. They also act as "sink" agents: they can absorb some moisture from the environment while still protecting the activity of the API itself, and thereby offer a route to a robust formulation. The aim of the present study was to investigate how four amorphous disaccharides (cellobiose, melibiose, sucrose, and trehalose) behave when brought to different relative humidity levels. First, solutions of each disaccharide were prepared, filled into scintillation vials, and freeze-dried. To obtain initial information on how moisture-induced transformations take place, the lyophilized amorphous disaccharide cakes were placed in vacuum desiccators at different relative humidity levels for a defined period, after which selected analytical methods were used to examine the transformations that had occurred. The affinity of the disaccharides to crystallization, their water sorption, and the effect of moisture on the glass transition and crystallization temperatures were studied. In addition, FT-IR microscopy was used to map the moisture distribution across a piece of lyophilized cake. The observations made during the experiments supported the data reported in a previous study: melibiose and trehalose proved superior to sucrose and cellobiose in their ability to withstand elevated humidity and temperature and to avoid crystallization at pharmaceutically relevant moisture contents. The difference was evident with every analytical method used. In addition, melibiose showed interesting anomalies during the DVS runs that were absent in the other amorphous disaccharides. Particularly interesting was an observation made with the polarized light microscope, which revealed possible small-scale crystallization that cannot be detected with XRPD. It can therefore be suggested that a robust formulation is most likely obtained by using either melibiose or trehalose as the stabilizing agent in freeze-dried biopharmaceutical formulations. On the other hand, more experiments should be conducted to establish why these disaccharides tolerate elevated humidity better than the others.
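A standard way to reason about the moisture sensitivity described above is the Gordon-Taylor equation, which predicts how sorbed water depresses the glass transition temperature Tg of an amorphous sugar matrix. The sketch below is a generic illustration, not part of the study; the dry Tg and the Gordon-Taylor constant K are assumed, literature-style values.

# Gordon-Taylor estimate of moisture-induced Tg depression. All
# constants are illustrative assumptions (Tg values in kelvin, K is
# the Gordon-Taylor constant), not measured values from the study.

def gordon_taylor_tg(w_water, tg_sugar, tg_water=136.0, k_gt=5.0):
    # Tg of a binary sugar-water mixture; w_water is the mass
    # fraction of sorbed water.
    w_sugar = 1.0 - w_water
    return (w_sugar * tg_sugar + k_gt * w_water * tg_water) / (
        w_sugar + k_gt * w_water)

# Assumed dry Tg of ~390 K for an amorphous disaccharide; even a few
# percent of sorbed water lowers Tg markedly.
for w in (0.00, 0.02, 0.05, 0.10):
    print(f"{w:.0%} water -> Tg ~ {gordon_taylor_tg(w, 390.0):.0f} K")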

Relevance:

20.00%

Publisher:

Abstract:

Six models (simulators) are formulated and developed with all possible combinations of the phase pressures and saturations as primary variables. A comparative study of the six simulators with two numerical methods, the conventional simultaneous method and a modified sequential method, is carried out. The results of the numerical models are compared with laboratory experimental results to assess the accuracy of the models, especially in heterogeneous porous media. The study shows that the simulator using the pressure and saturation of the wetting fluid (PW, SW formulation) is the best among the models tested. Many simulators with a nonwetting-phase quantity as one of the primary variables did not converge when used with the simultaneous method. Based on simulator 1 (PW, SW formulation), different solution methods, namely the simultaneous method, the modified sequential method, and an adaptive-solution modified sequential method, are compared on four test problems, including heterogeneous and randomly heterogeneous problems. The modified sequential and adaptive-solution modified sequential methods are found to halve the memory requirement, and the CPU time they need is far lower than that of the simultaneous method. The simulator with PNW and PW as the primary variables, which had convergence problems with the simultaneous method, converged with both the modified sequential and the adaptive-solution modified sequential methods. The present study indicates that the pressure-saturation formulation combined with the adaptive-solution modified sequential method is the best among the simulators and methods tested.
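A sequential scheme of the kind compared above decouples the two governing equations: pressure is solved implicitly with saturations frozen, and saturation is then updated explicitly from the resulting fluxes. The minimal 1-D sketch below illustrates this structure under simplifying assumptions (incompressible phases, zero capillary pressure, quadratic relative permeabilities); it is not the paper's formulation.

import numpy as np

# Minimal 1-D IMPES-style sequential scheme: pressure is solved
# implicitly, saturation is updated explicitly. Illustrative only.
n, dx, dt = 50, 1.0, 0.05
phi = 0.2                            # porosity
k = np.ones(n)                       # absolute permeability
mu_w, mu_n = 1.0, 2.0                # wetting/nonwetting viscosities
sw = np.full(n, 0.1)
sw[0] = 1.0                          # flooded inlet cell

for step in range(200):
    # Total mobility; arithmetic average at cell faces.
    lam = k * (sw**2 / mu_w + (1 - sw)**2 / mu_n)
    lam_f = 0.5 * (lam[:-1] + lam[1:])
    # Implicit pressure equation -d/dx(lam dp/dx) = 0 with
    # Dirichlet pressures at both ends.
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n - 1):
        t = lam_f[i] / dx**2
        A[i, i] += t; A[i + 1, i + 1] += t
        A[i, i + 1] -= t; A[i + 1, i] -= t
    A[0, :] = 0.0; A[0, 0] = 1.0; b[0] = 1.0      # inlet pressure
    A[-1, :] = 0.0; A[-1, -1] = 1.0; b[-1] = 0.0  # outlet pressure
    p = np.linalg.solve(A, b)
    # Explicit saturation update; flow is to the right, so the
    # upwind fractional flow at a face comes from the left cell.
    q = lam_f * (p[:-1] - p[1:]) / dx
    fw = (sw**2 / mu_w) / (sw**2 / mu_w + (1 - sw)**2 / mu_n)
    flux = q * fw[:-1]
    sw[1:-1] += dt / (phi * dx) * (flux[:-1] - flux[1:])
    sw = np.clip(sw, 0.0, 1.0)

print("wetting-phase saturation near the inlet:", np.round(sw[:10], 3))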

Relevance:

20.00%

Publisher:

Abstract:

In the present study, titanium nitride, which shows exceptional stability, extreme corrosion resistance, good electronic conductivity, and good adhesion behaviour, is used to support platinum particles, which are then used for methanol oxidation in an alkaline medium. The catalyst shows very good CO tolerance during the electrochemical oxidation of methanol. In situ infrared spectroelectrochemical data show the remarkable ability of TiN to decompose water at low overpotentials, leading to -OH-type functional groups on its surface, which in turn help to alleviate the carbon monoxide poisoning associated with methanol oxidation. TiN-supported catalysts are found to be very good in terms of long-term stability, exchange current density, and stable currents at low overpotentials. Supporting evidence from X-ray photoelectron spectroscopy and cyclic voltammetry clearly demonstrates the usefulness of TiN-supported Pt catalysts for efficient methanol oxidation in alkaline media.

Relevance:

20.00%

Publisher:

Abstract:

Light scattering, or the scattering and absorption of electromagnetic waves, is an important tool in all remote-sensing observations. In astronomy, the light scattered or absorbed by a distant object can be the only source of information. In Solar-system studies, light-scattering methods are employed when interpreting observations of atmosphereless bodies such as asteroids, of planetary atmospheres, and of cometary or interplanetary dust. Our Earth is constantly monitored from artificial satellites at different wavelengths. In remote sensing of the Earth, light-scattering methods are not the only source of information: there is always the possibility of making in situ measurements. Satellite-based remote sensing is, however, superior in speed and coverage, provided that the scattered signal can be reliably interpreted. The optical properties of many industrial products play a key role in their quality. Especially for products such as paint and paper, the ability to obscure the background and to reflect light is of utmost importance. High-grade papers are evaluated based on their brightness, opacity, color, and gloss. In product development, there is a need for computer-based simulation methods that could predict the optical properties and could therefore be used to optimize quality while reducing material costs. With paper, for instance, pilot experiments on an actual paper machine can be very time- and resource-consuming. The light-scattering methods presented in this thesis rigorously solve the interaction of light with materials that have wavelength-scale structures. These methods are computationally demanding, so their speed and accuracy play a key role. Different implementations of the discrete-dipole approximation are compared in the thesis, and the results provide practical guidelines for choosing a suitable code. In addition, a novel method is presented for the numerical computation of the orientation-averaged light-scattering properties of a particle, and the method is compared against existing techniques. Simulation of light scattering by various targets and the possible problems arising from the finite size of the model target are discussed in the thesis. Scattering by single particles and small clusters is considered, as well as scattering in particulate media and in continuous media with porosity or surface roughness. Various techniques for modeling the scattering media are presented, and the results are applied to optimizing the structure of paper. The same methods can, however, be applied in light-scattering studies of Solar-system regoliths or cometary dust, or in any remote-sensing problem involving light scattering in random media with wavelength-scale structures.
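For the orientation-averaging step mentioned above, a common baseline is to sample particle orientations uniformly (uniform alpha and gamma, uniform cos(beta)) and average the single-orientation results. The sketch below uses a hypothetical placeholder cross_section() in place of an actual single-orientation solver such as one DDA run per orientation.

import numpy as np

# Orientation averaging by uniform random sampling of Euler angles:
# alpha and gamma uniform on [0, 2*pi), cos(beta) uniform on [-1, 1]
# so that orientations are distributed uniformly in SO(3).
rng = np.random.default_rng(0)
n = 10_000
alpha = rng.uniform(0.0, 2 * np.pi, n)
beta = np.arccos(rng.uniform(-1.0, 1.0, n))
gamma = rng.uniform(0.0, 2 * np.pi, n)

def cross_section(alpha, beta, gamma):
    # Dummy smooth orientation dependence standing in for a real
    # solver result (e.g., an extinction cross section from DDA).
    return 1.0 + 0.3 * np.cos(beta)**2 + 0.1 * np.sin(2 * alpha) * np.sin(gamma)

avg = cross_section(alpha, beta, gamma).mean()
print(f"orientation-averaged cross section ~ {avg:.4f}")  # analytic value: 1.1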

Relevance:

20.00%

Publisher:

Abstract:

This study investigates the process of producing interactivity in a converged media environment. The study asks whether more media convergence equals more interactivity. The research object is approached through semi-structured interviews with prominent decision makers in the Finnish media. The main focus of the study is the three big traditional media, radio, television, and the printing press, and their ability to adapt to the changing environment. The study develops theoretical models for the analysis of interactive features and convergence. Case studies are formed from the interview data and evaluated against the models. As a result, the cases are plotted and compared on a four-fold table. The cases are Radio Rock, NRJ, Big Brother, Television Chat, Olivia, and Sanoma News. It is found that the theoretical models can accurately forecast the results of the case studies. The models are also able to distinguish different aspects of both interactivity and convergence, so that a case which at first glance seems not to be very interactive in the end receives the second-highest scores in the analysis. The highest scores are received by Big Brother and Sanoma News. Through the theory and the analysis of the research data, it is found that the concepts of interactivity and convergence are intimately intertwined and in many cases very hard to separate from each other. Hence the answer to the main question of this study is yes: convergence does promote interactivity and audience participation. The main theoretical background for the analysis of interactivity follows the work of Carrie Heeter, Spiro Kiousis, and Sally McMillan. Heeter's six-dimensional definition of interactivity is used as the basis for operationalizing interactivity. Actor-network theory is used as the main theoretical framework for analyzing convergence. The definition and operationalization of actor-network theory into a model of convergence follows the work of Michel Callon, Bruno Latour, and especially John Law and Felix Stalder.

Relevance:

20.00%

Publisher:

Abstract:

A computer-controlled laser writing system for optical integrated circuits and data storage is described. The system is characterized using holographic (649F) and high-resolution plates. A minimum linewidth of 2.5 μm is obtained by controlling the system parameters. We show that this system can also be used for data storage applications.

Relevance:

20.00%

Publisher:

Abstract:

Hydrogen storage in three-dimensional carbon foams is analyzed using classical grand canonical Monte Carlo simulations. The calculated storage capacities of the foams meet the material-based DOE targets and are comparable to the capacities of a bundle of well-separated open nanotubes of similar diameter. The pore sizes in the foams are optimized for the best hydrogen uptake. The capacity depends sensitively on the C–H₂ interaction potential, and therefore the results are presented for its "weak" and "strong" choices, to offer lower and upper bounds for the expected capacities. Furthermore, quantum effects on the effective C–H₂ as well as H₂–H₂ interaction potentials are considered. We find that the quantum effects noticeably change the adsorption properties of foams and must be accounted for even at room temperature.
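In a grand canonical Monte Carlo simulation like the one described above, trial insertions and deletions of molecules are accepted with probabilities fixed by the chemical potential. The sketch below shows the bare GCMC machinery for an ideal gas in a model slit pore; the reduced-unit temperature, chemical potential, and wall potential are illustrative assumptions, not the C–H₂ potentials of the paper.

import numpy as np

# Bare GCMC loop for an ideal gas adsorbing in a model slit pore.
rng = np.random.default_rng(1)
kT = 1.0                    # temperature (reduced units)
mu = -3.0                   # chemical potential
L = 10.0                    # cubic box edge; walls at z = 0 and z = L
z_act = np.exp(mu / kT)     # activity (thermal wavelength set to 1)

def u_ext(pos):
    # Attractive exponential well near each wall, depth 4 kT.
    zw = min(pos[2], L - pos[2])
    return -4.0 * np.exp(-zw)

positions, counts = [], []
for step in range(50_000):
    N = len(positions)
    if rng.random() < 0.5:  # trial insertion
        new = rng.uniform(0.0, L, 3)
        acc = z_act * L**3 / (N + 1) * np.exp(-u_ext(new) / kT)
        if rng.random() < min(1.0, acc):
            positions.append(new)
    elif N > 0:             # trial deletion
        i = rng.integers(N)
        acc = N / (z_act * L**3) * np.exp(u_ext(positions[i]) / kT)
        if rng.random() < min(1.0, acc):
            positions.pop(i)
    if step >= 25_000:      # sample after a crude burn-in
        counts.append(len(positions))

print("mean loading after burn-in:", np.mean(counts))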

Relevance:

20.00%

Publisher:

Abstract:

In the distributed storage setting that we consider, data is stored across n nodes in the network such that the data can be recovered by connecting to any subset of k nodes. Additionally, one can repair a failed node by connecting to any d nodes while downloading β units of data from each. Dimakis et al. show that the repair bandwidth dβ can be considerably reduced if each node stores slightly more than the minimum required, and they characterize the tradeoff between the amount of storage per node and the repair bandwidth. In the exact-regeneration variation, unlike functional regeneration, the replacement for a failed node is required to store data identical to that in the failed node; this greatly reduces the complexity of system maintenance. The main result of this paper is an explicit construction of codes, for all values of the system parameters, at one of the two most important and extreme points of the tradeoff, the minimum bandwidth regenerating (MBR) point, which performs optimal exact regeneration of any failed node. A second result is a non-existence proof showing that, with one possible exception, no other point on the tradeoff can be achieved under exact regeneration.
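The tradeoff referred to above is governed by the cut-set bound of Dimakis et al., B <= sum_{i=0}^{k-1} min(alpha, (d-i)*beta), where alpha is the storage per node; at the MBR point the storage equals the repair download, alpha = d*beta, giving B = beta*k*(2d - k + 1)/2. The snippet below simply evaluates these known expressions with illustrative parameter values.

# Cut-set bound of Dimakis et al. and the MBR point; the parameter
# values below are illustrative, not taken from the paper.

def cut_set_bound(k, d, alpha, beta):
    # Maximum file size B supported by an (n, k, d) regenerating code.
    return sum(min(alpha, (d - i) * beta) for i in range(k))

def mbr_point(k, d, B):
    # At MBR the per-node storage equals the repair download:
    # alpha = d*beta, with B = beta*k*(2d - k + 1)/2.
    beta = 2 * B / (k * (2 * d - k + 1))
    return d * beta, beta

alpha, beta = mbr_point(k=3, d=4, B=9.0)
print(f"MBR point: alpha = {alpha:.2f}, beta = {beta:.2f}")
print("cut-set bound at MBR:", cut_set_bound(3, 4, alpha, beta))  # recovers B = 9.0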

Relevance:

20.00%

Publisher:

Abstract:

In the distributed storage setting introduced by Dimakis et al., B units of data are stored across n nodes in the network in such a way that the data can be recovered by connecting to any k nodes. Additionally, one can repair a failed node by connecting to any d nodes while downloading at most β units of data from each node. In this paper, we introduce a flexible framework in which the data can be recovered by connecting to any number of nodes as long as the total amount of data downloaded is at least B. Similarly, regeneration of a failed node is possible if the new node connects to the network using links whose individual capacity is bounded above by β_max and whose sum capacity equals or exceeds a predetermined parameter γ. In this flexible setting, we obtain the cut-set lower bound on the repair bandwidth along with a constructive proof of the existence of codes meeting this bound for all values of the parameters. An explicit code construction is provided which is optimal in certain parameter regimes.

Relevance:

20.00%

Publisher:

Abstract:

The EEG time series has been subjected to various formalisms of analysis to extract meaningful information regarding the underlying neural events. In this paper, the linear prediction (LP) method is used for the analysis and presentation of spectral array data, for better visualisation of background EEG activity, and also for signal generation and efficient data storage and transmission of EEG. The LP method is compared with the standard Fourier method of compressed spectral array (CSA) of multichannel EEG data. The autocorrelation autoregressive (AR) technique is used to obtain the LP coefficients with a model order of 15. While the Fourier method reduces the data only by half, the LP method requires storage of just the signal variance and the LP coefficients. The signal generated using white Gaussian noise as the input to the LP filter has a high correlation coefficient of 0.97 with the original signal, making LP a useful tool for the storage and transmission of EEG. The biological significance of the Fourier and LP methods with respect to the microstructure of neuronal events in the generation of EEG is discussed.
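The LP scheme described above amounts to fitting an autoregressive model by the autocorrelation (Yule-Walker) method and resynthesizing the signal by driving the all-pole LP filter with white noise. The sketch below demonstrates this on a synthetic signal; the 10 Hz test rhythm and the sampling rate are illustrative assumptions, not EEG data from the paper.

import numpy as np
from scipy.signal import lfilter

# Fit an order-15 AR model by the autocorrelation (Yule-Walker)
# method, then regenerate a statistically similar signal by driving
# the all-pole LP filter with white Gaussian noise.
rng = np.random.default_rng(0)
fs, order = 128, 15
t = np.arange(4 * fs) / fs
x = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)

# Biased autocorrelation estimate and the Yule-Walker normal equations.
r = np.correlate(x, x, mode="full")[x.size - 1:][:order + 1] / x.size
R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
a = np.linalg.solve(R, r[1:order + 1])   # LP coefficients
var = r[0] - a @ r[1:order + 1]          # prediction-error variance

# Only `a` (15 values) and `var` need to be stored per segment --
# the compression the abstract refers to.
noise = np.sqrt(var) * rng.standard_normal(t.size)
x_synth = lfilter([1.0], np.concatenate(([1.0], -a)), noise)

spec_corr = np.corrcoef(np.abs(np.fft.rfft(x)),
                        np.abs(np.fft.rfft(x_synth)))[0, 1]
print(f"correlation of magnitude spectra: {spec_corr:.2f}")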

Relevance:

20.00%

Publisher:

Abstract:

We have modeled the rotation curves of 21 galaxies observed by Amram et al. (1992) by combining the effects of rigid rotation, gravity, and turbulence. The main motivation behind such modeling is to study the formation of coherent structures in turbulent media and to explore its role in the large-scale structures of the universe. The values of parameters such as mass, turbulent velocity, and angular velocity derived from the rotation-curve fits are in good agreement with those derived from the prevalent models.