A critical review of the properties of fusidic acid and the analytical methods for its determination
Abstract:
Fusidic acid, an antibiotic produced by the fungus Fusidium coccineum, belongs to the steroid class but has no corticosteroid effects. It is indicated for the treatment of infections caused by methicillin-resistant Staphylococcus aureus strains. The aim of this study was to review the properties of fusidic acid published in the literature to date, as well as the methods developed for its determination in biological samples and pharmaceutical formulations. From the findings, we can conclude that fusidic acid has been used for decades and is still indicated for the treatment of serious infections caused by Gram-positive microorganisms. Furthermore, it is a hypoallergenic agent with low toxicity, elicits little resistance, and shows no cross-resistance with other clinically used antibiotics. High-performance liquid chromatography has been the most widely used analytical method, since it can reduce the cost and time of analysis, making it more viable for routine quality control in the pharmaceutical industry.
Abstract:
This study evaluated a nonlinear programming Excel workbook, PPFR (http://www.fmva.unesp.br/ppfr), for determining the nutrient density that maximizes margins. Two experiments were conducted with 240 one-day-old female chicks and 240 one-day-old male chicks distributed in 48 pens (10 chicks per pen, 4 replicates) in a completely randomized design. The treatments comprised the average historical broiler price (2009-2010) increased and decreased by 25% or 50% (5 nonlinear feed formulation treatments) plus 1 linear feed formulation. Body weight gain, feed intake and feed conversion were measured at 21, 42 and 56 d of age. Chicks had ad libitum access to feed and water in floor pens with wood shavings as litter. The bio-economic Energy Conversion index [BEC = (total energy intake * weighted feed cost per kg) / (weight gain * cost per kg of live chicken)] was the more sensitive measure of bio-economic performance for broilers. By incorporating energy consumption, BEC allowed a better assessment of profitability and growth rate, rather than energy consumption alone. The BEC demonstrated that nonlinear formulation minimizes losses significantly (P<0.05), especially when market chicken prices are unfavorable. Since the energy supply is the most expensive item in a formulation, it should necessarily enter any proposed bio-economic index. There is thus a need to evaluate more accurately not only the ingredients of a ration but also the impact of nutrients, mainly the energy requirement, on the stability of a solution. This strategy improves the accuracy of decision making under uncertainty by finding post-formulation alternatives.
From the above, weight gain and feed conversion, as traditional performance indicators, cannot by themselves predict the performance of an increasingly intense and competitive economic production system. The energy concentration of the diet therefore becomes the more important decision for the feed formulator, since it directly affects the profitability of the activity through its interactions with nutrient density. Incorporating energy consumption into the formula allowed a better evaluation of profitability and gave the new index (BEC) greater sensitivity. These data show that nonlinear feed formulation is a tool that offers poultry production new opportunities for improved profitability.
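The BEC index defined above is a simple ratio and can be sketched directly. The cost, intake and weight figures below are hypothetical, for illustration only:

```python
# Bio-economic Energy Conversion (BEC), as defined in the abstract:
# BEC = (total energy intake * weighted feed cost per kg)
#       / (weight gain * cost per kg of live chicken)

def bec(total_energy_intake, feed_cost_per_kg, weight_gain_kg, live_price_per_kg):
    """Lower BEC means feed energy is converted into live weight more cheaply."""
    return (total_energy_intake * feed_cost_per_kg) / (weight_gain_kg * live_price_per_kg)

# Hypothetical scenario: the same flock under baseline and depressed prices.
baseline = bec(30.0, 0.40, 3.0, 1.20)   # chicken sells at 1.20/kg
depressed = bec(30.0, 0.40, 3.0, 0.90)  # market price down 25%
assert depressed > baseline  # a falling chicken price worsens (raises) BEC
```

Because the live-chicken price enters the denominator, the index captures exactly the unfavorable-market sensitivity the abstract describes.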
Abstract:
Most consumers consider the fat of chicken meat undesirable for a healthy diet, due to its high levels of saturated fatty acids and cholesterol. The purpose of this experiment was to investigate the influence of changes in dietary metabolizable energy level, associated with a proportional variation in nutrient density, on broiler chicken performance and on the lipid composition of the meat. Male and female Cobb 500 broilers were evaluated separately. Performance evaluation followed a completely randomized design with a 6x3 factorial arrangement: six energy levels (2,800, 2,900, 3,000, 3,100, 3,200 and 3,300 kcal/kg) and three slaughter ages (42, 49 and 56 days). Response surface methodology was used to establish a mathematical model explaining live weight, feed intake and feed conversion. Total lipids and cholesterol were determined in skinned breast meat and in thigh meat with and without skin. For the lipid composition analysis, a 3x3x2 factorial arrangement in a completely randomized design was used: three dietary metabolizable energy levels (2,800, 3,000 and 3,300 kcal/kg), three slaughter ages (42, 49 and 56 days) and two sexes. Reducing the dietary metabolizable energy down to about 3,000 kcal/kg did not affect live weight, but below this value live weight decreased. Feed intake was lower at higher dietary energy levels. Feed conversion improved in direct proportion to the increase in the energy level of the diet. The performance of all birds was within the range considered appropriate for the lineage. Breast meat had less total lipids and cholesterol than thigh meat. Thigh with skin had more than double the total lipids of skinned thigh, but the cholesterol content did not differ with removal of the skin, suggesting that cholesterol content is not associated with the subcutaneous fat. Intramuscular fat content was lower in meat from birds fed lower-energy diets.
These results may help define the most appropriate nutritional management. Despite the decrease in the birds' productive performance, restricting energy in broiler feed may be a viable alternative if consumers are willing to pay more for meat with less fat.
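The response-surface step above can be sketched minimally as a one-factor quadratic fit of live weight against dietary energy; the pure-Python least-squares solver stands in for standard statistical software, and the data points are hypothetical, not the study's:

```python
# Sketch of the response-surface idea: fit live weight as a quadratic in
# dietary energy, weight = b0 + b1*E + b2*E^2, by ordinary least squares.

def fit_quadratic(xs, ys):
    """Solve the 3x3 normal equations for the basis [1, x, x^2]."""
    powers = [sum(x ** k for x in xs) for k in range(5)]
    A = [[powers[i + j] for j in range(3)] for i in range(3)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(3)]
    for col in range(3):                      # elimination with partial pivoting
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):                       # back substitution
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, 3))) / A[r][r]
    return coef

# Hypothetical energy levels (Mcal/kg) and mean live weights (g) at 42 d.
energy = [2.8, 2.9, 3.0, 3.1, 3.2, 3.3]
weight = [2510, 2615, 2680, 2690, 2700, 2705]
b0, b1, b2 = fit_quadratic(energy, weight)
```

Expressing energy in Mcal/kg rather than kcal/kg keeps the normal equations well conditioned; the fitted curve reproduces the plateau the abstract reports near 3,000 kcal/kg.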
Validation of analytical methodology for quantification of cefazolin sodium by liquid chromatography
Abstract:
A reversed-phase high-performance liquid chromatography method was validated for the determination of cefazolin sodium in lyophilized powder for solution for injection, to be applied in quality control in the pharmaceutical industry. The method used a Zorbax Eclipse Plus C18 column (250 x 4.6 mm, 5 μm) maintained at room temperature. The mobile phase consisted of purified water:acetonitrile (60:40 v/v), adjusted to pH 8 with triethylamine. The flow rate was 0.5 mL min-1 and the effluent was monitored at 270 nm. The retention time for cefazolin sodium was 3.6 min. The method proved to be linear (r2 = 0.9999) over the concentration range of 30-80 µg mL-1. The selectivity of the method was proven through degradation studies. The method demonstrated satisfactory results for precision, accuracy, and limits of detection and quantitation. Robustness was evaluated using a Plackett-Burman fractional factorial experimental design with a matrix of 15 experiments and the statistical treatment proposed by Youden and Steiner. The proposed method is thus an advantageous option for the analysis of cefazolin sodium, contributing to improved quality control and assured therapeutic efficacy.
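The linearity figure above (r2 = 0.9999 over 30-80 µg mL-1) comes from an ordinary least-squares fit of detector response against concentration. A minimal sketch, with hypothetical peak areas standing in for the study's data:

```python
# Linearity check for an HPLC calibration curve: fit peak area vs.
# concentration by least squares and report the coefficient of
# determination r^2. The area values below are invented for illustration.

def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

conc = [30, 40, 50, 60, 70, 80]          # µg/mL, the validated range
area = [151, 198, 252, 301, 349, 402]    # hypothetical detector response
slope, intercept, r2 = linear_fit(conc, area)
# Validation guidelines typically expect r^2 close to 1 over the range.
```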
Abstract:
The pharmaceutical industry invests heavily in promoting its products, and studies suggest that these actions influence doctors' prescribing. This study therefore aimed to analyze the opinions and attitudes of doctors toward the promotional activities of pharmaceutical laboratories. To this end, questionnaires containing statements on the subject were sent to doctors in Araraquara (SP). Data analysis included tests of association using the chi-square statistic. The results indicated that physicians meet with sales representatives (98%) and consider them useful (55%), but not as a main source of updates (86%). For 62% of them, their own prescriptions are not influenced by such relationships, while 24% disagree that doctors in general are influenced and 37% that new graduates are. The majority also deny being influenced by amenities (86%) or free samples (70%), but only 38% believe their colleagues are not influenced by samples. As for the ethics of accepting these benefits, 57% considered it appropriate when patients benefit, but only 32% when they are for personal use. The results show that doctors are vulnerable to marketing influence. Mechanisms and interventions are therefore needed so that drugs are prescribed solely on criteria of effectiveness, safety, convenience and accessibility to the patient.
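The chi-square test of association used in the survey analysis can be sketched for a 2x2 table; the counts below are hypothetical, not the survey's data:

```python
# Pearson chi-square statistic for a 2x2 contingency table, comparing
# observed counts against the counts expected under independence.

def chi_square_2x2(table):
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    total = sum(row)
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row[i] * col[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# Rows: considers sales representatives useful / not; columns: reports
# being influenced / not (hypothetical counts).
observed = [[30, 25], [10, 35]]
stat = chi_square_2x2(observed)
# Compare against the critical value 3.841 (alpha = 0.05, 1 degree of freedom).
```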
Abstract:
Antimicrobials are of unquestionable importance in the control of many diseases; however, concern over the evolution of resistant microorganisms is constantly increasing. Ertapenem sodium is a β-lactam antibiotic of the carbapenem class; it has a broader activity spectrum than most other β-lactam antimicrobials and is more resistant to β-lactamase, the main resistance mechanism of many bacteria. The progress of microbial resistance to existing antibiotics is alarming, so we need to preserve the antimicrobials that still have activity against these pathogens. In this context, quality control has a key role in ensuring the correct dosage, contributing preventively to minimizing the development of resistant microorganisms. Study of the physicochemical characteristics of the drug and quantification of the active substance content are of fundamental importance for the pharmaceutical industry to ensure the quality of the product sold. This work presents a survey of the analytical methods reported in the literature for the quantification of ertapenem sodium. Ertapenem sodium can be analyzed by many types of assay, but HPLC is the most used method. This review examines the published analytical methods for the determination of ertapenem sodium in biological fluids and formulations.
Abstract:
Pharmacovigilance is responsible for monitoring the safety of medicines in normal clinical use and during clinical trials. Legal requirements for pharmacovigilance in some Latin American countries (Argentina, Brazil, Chile, Paraguay and Uruguay) were reviewed. Disparities in the legal framework among the countries are observed, those concerning marketing authorization holders being among the most evident. The active role of universities and drug information centers in pharmacovigilance seems to be a positive common point. Legal requirements regarding the pharmacovigilance of biosimilar medicines are still a point to be developed.
Abstract:
Dipteryx alata is a tree species with potential use in human and animal nutrition and in the pharmaceutical industry. Revegetation has been an alternative for the reclamation of degraded areas; however, it requires the addition of fertilizer. The objectives of this study were to evaluate the nutritional status and growth of Dipteryx alata seedlings introduced into degraded soil under recovery, amended with organic and agroindustrial residues, in comparison with seedlings collected in preserved Cerrado, and to evaluate the impact of the residues on soil chemical properties. The degraded soil received organic residue, RO (macrophytes), and agroindustrial residue, RA (ash from the burning of sugarcane bagasse), at doses of 0, 16 and 32 t ha-1 and 0, 15, 30 and 45 t ha-1, respectively. Three months after the incorporation of the residues into the degraded soil, the D. alata seedlings were introduced into the experimental area, and 12 months later they were evaluated for height, stem diameter, chlorophyll content and leaf contents of N, P, K, Ca, Mg, S, B, Cu, Fe, Mn and Zn. For comparison, the foliar concentrations of these elements were determined in Dipteryx alata seedlings collected in a preserved Cerrado area. Concomitant with the leaf collection, in the Cerrado and in the experimental area, soil was sampled (0.0-0.20 m deep) for evaluation of chemical parameters (P, OM, pH, K, Ca, Mg, H + Al, Al, Cu, Fe, Mn and Zn). Comparing the seedlings collected in the Cerrado with those from the experimental area, the leaf concentrations of N, P, K and Mg were higher in seedlings from the preserved Cerrado. Fe, Mn and Zn showed lower foliar concentrations in plants collected in the Cerrado; for Mn, the worst results occurred in the absence of macrophytes, indicating the importance of the organic residue. The foliar concentration of Ca, S and Cu was similar in...
Abstract:
The web services (WS) technology provides a comprehensive solution for representing, discovering, and invoking services in a wide variety of environments, including Service Oriented Architectures (SOA) and grid computing systems. At the core of WS technology lie a number of XML-based standards, such as the Simple Object Access Protocol (SOAP), that have successfully ensured WS extensibility, transparency, and interoperability. Nonetheless, there is an increasing demand to enhance WS performance, which is severely impaired by XML's verbosity. SOAP communications produce considerable network traffic, making them unfit for distributed, loosely coupled, and heterogeneous computing environments such as the open Internet. Also, they introduce higher latency and processing delays than other technologies, like Java RMI and CORBA. WS research has recently focused on SOAP performance enhancement. Many approaches build on the observation that SOAP message exchange usually involves highly similar messages (those created by the same implementation usually have the same structure, and those sent from a server to multiple clients tend to show similarities in structure and content). Similarity evaluation and differential encoding have thus emerged as SOAP performance enhancement techniques. The main idea is to identify the common parts of SOAP messages, to be processed only once, avoiding a large amount of overhead. Other approaches investigate nontraditional processor architectures, including micro- and macro-level parallel processing solutions, so as to further increase the processing rates of SOAP/XML software toolkits. This survey paper provides a concise, yet comprehensive review of the research efforts aimed at SOAP performance enhancement. A unified view of the problem is provided, covering almost every phase of SOAP processing, ranging over message parsing, serialization, deserialization, compression, multicasting, security evaluation, and data/instruction-level processing.
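The differential-encoding idea described above can be sketched in a few lines: against a shared reference message, transmit copy ranges plus the literal text of the changed spans only. A toy illustration using Python's stdlib `difflib` (the SOAP messages are invented):

```python
# Differential encoding of a SOAP message against a shared reference:
# "copy" ops point into the reference, "literal" ops carry only the
# text that actually differs, so near-identical messages compress well.
import difflib

def encode_diff(ref, msg):
    ops = []
    for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(a=ref, b=msg).get_opcodes():
        if tag == "equal":
            ops.append(("copy", i1, i2))        # reuse ref[i1:i2]
        else:
            ops.append(("literal", msg[j1:j2])) # transmit changed text
    return ops

def decode_diff(ref, ops):
    return "".join(ref[op[1]:op[2]] if op[0] == "copy" else op[1] for op in ops)

reference = "<soap:Envelope><soap:Body><getQuote><symbol>ACME</symbol></getQuote></soap:Body></soap:Envelope>"
message = "<soap:Envelope><soap:Body><getQuote><symbol>INIT</symbol></getQuote></soap:Body></soap:Envelope>"
delta = encode_diff(reference, message)
assert decode_diff(reference, delta) == message
```

Real differential-encoding schemes work at the XML-token level and negotiate the reference message once per connection; the principle, however, is exactly this copy-plus-literal reconstruction.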
Abstract:
Background: In the analysis of the effects of a cell treatment such as drug dosing, identifying changes in gene network structure between normal and treated cells is a key task. A possible way to identify the changes is to compare the structures of networks estimated separately from data on normal and treated cells. However, this approach usually fails to estimate accurate gene networks, due to the limited length of the time series data and measurement noise. Thus, approaches that identify changes in regulation by using the time series data from both conditions in an efficient manner are needed. Methods: We propose a new statistical approach, based on the state space representation of the vector autoregressive model, that estimates gene networks under two different conditions in order to identify changes in regulation between the conditions. In the mathematical model of our approach, hidden binary variables are introduced to indicate the presence of each regulation under each condition. The hidden binary variables enable efficient data usage: data from both conditions are used for regulations that exist in common, while for condition-specific regulations only the corresponding data are applied. The similarity of the networks under the two conditions is also automatically accounted for through the design of the potential function for the hidden binary variables. To estimate the hidden binary variables, we derive a new variational annealing method that searches for the configuration of the binary variables maximizing the marginal likelihood. Results: For the performance evaluation, we use time series data from two topologically similar synthetic networks, and confirm that our proposed approach estimates commonly existing regulations, as well as changes in regulation, with higher coverage and precision than other existing approaches in almost all the experimental settings.
For a real-data application, the proposed approach is applied to time series data from normal human lung cells and human lung cells treated by stimulating EGF receptors and dosing the anticancer drug Gefitinib. In the treated lung cells, a cancer cell condition is simulated by the stimulation of EGF receptors, but the effect should be counteracted by the selective inhibition of EGF receptors by Gefitinib. However, gene expression profiles actually differ between the conditions, and the genes related to the identified changes are considered possible off-targets of Gefitinib. Conclusions: On the synthetically generated time series data, the proposed approach identifies changes in regulation more accurately than existing methods. By applying it to the time series data on normal and treated human lung cells, candidate off-target genes of Gefitinib are found. According to published clinical information, one of these genes may be related to a factor in interstitial pneumonia, which is a known side effect of Gefitinib.
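As a rough sketch of the model backbone (a plain VAR(1) fit, not the authors' full state-space machinery with hidden binary variables), each row of the coefficient matrix can be estimated by ordinary least squares, and a nonzero entry A[i][j] is then read as a candidate regulation from gene j to gene i. The two-gene data below are synthetic and noiseless:

```python
# VAR(1) model: x_t = A @ x_{t-1} + noise. For two genes, each row of A
# solves a 2x2 least-squares system built from past/future state pairs.

def estimate_var1(series):
    """series: list of [x1, x2] states; returns the 2x2 matrix A."""
    past, future = series[:-1], series[1:]
    s = [[sum(p[i] * p[j] for p in past) for j in range(2)] for i in range(2)]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    A = []
    for i in range(2):
        b = [sum(p[j] * f[i] for p, f in zip(past, future)) for j in range(2)]
        A.append([(s[1][1] * b[0] - s[0][1] * b[1]) / det,
                  (s[0][0] * b[1] - s[1][0] * b[0]) / det])
    return A

def simulate(A, x0, steps):
    xs = [x0]
    for _ in range(steps):
        prev = xs[-1]
        xs.append([sum(A[i][j] * prev[j] for j in range(2)) for i in range(2)])
    return xs

A_true = [[0.5, 0.0], [0.3, 0.4]]   # no regulation from gene 2 to gene 1
data = simulate(A_true, [1.0, 2.0], steps=8)
A_hat = estimate_var1(data)
# With noiseless data the fit recovers A_true, and the (near-)zero entry
# A_hat[0][1] is read as "no regulation from gene 2 to gene 1".
```

The paper's contribution is precisely what this sketch lacks: sharing data across two such models where regulations coincide, via hidden indicator variables.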
Abstract:
The study aims to calculate an innovative numerical index for bit performance evaluation called the Bit Index (BI), applied to a new type of bit database named the Formation Drillability Catalogue (FDC). A dedicated research programme (developed by Eni E&P and the University of Bologna) studied a drilling model for bit performance evaluation, the BI, derived from data recorded while drilling (bit records, master logs, wireline logs, etc.) and from dull bit evaluation. The index is calculated from data collected in the FDC, a novel classification of Italian formations aimed at the geotechnical and geomechanical characterization and subdivision of formations, called the Minimum Interval (MI). The FDC was conceived and prepared at the Eni E&P Division and contains a large number of significant drilling parameters. Five wells in the FDC were identified and tested for bit performance evaluation. BI values are calculated for each bit run and compared with the cost per metre. The case study analyzes bits of the same type and diameter, run in the same formation. The BI methodology, implemented on the MI classification of the FDC, can consistently improve bit performance evaluation and helps to identify the best-performing bits. Moreover, the FDC turned out to be functional to the BI, since it discloses and organizes formation details that are not easily detectable or usable from bit records or master logs, allowing targeted bit performance evaluations. At this stage of development, the BI methodology has proved economical and reliable. The quality of bit performance analysis obtained with the BI also seems more effective than the traditional "quick look" analysis performed on bit records, or than pure cost-per-metre evaluation.
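The abstract benchmarks the BI against cost per metre. A commonly used form of the drilling cost-per-metre formula (an assumption here, since the paper's exact formula is not given) divides the bit cost plus time-based rig costs by the footage drilled:

```python
# Conventional drilling cost-per-metre formula (assumed, not from the
# paper): C = (bit_cost + rig_rate * (drilling_time + trip_time)) / metres

def cost_per_metre(bit_cost, rig_rate, drilling_h, trip_h, metres):
    return (bit_cost + rig_rate * (drilling_h + trip_h)) / metres

# Hypothetical bit run: 15,000 USD bit, 800 USD/h rig rate, 60 h on
# bottom, 8 h tripping, 450 m drilled.
c = cost_per_metre(15_000, 800, 60, 8, 450)
```

Because trip time and rig rate dominate this metric, two bit runs with equal cost per metre can still differ in drilling efficiency, which is the gap the BI is meant to close.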
Abstract:
The thesis deals with channel coding theory applied to the upper layers of the protocol stack of a communication link, and is the outcome of four years of research activity. A specific aspect of this activity has been the continuous interaction between the natural curiosity of academic blue-sky research and the system-oriented design deriving from collaboration with European industry in the framework of European funded research projects. In this dissertation, classical channel coding techniques, traditionally applied at the physical layer, find their application at upper layers, where the encoding units (symbols) are packets of bits rather than single bits; such upper layer coding techniques are therefore usually referred to as packet layer coding. The rationale behind the adoption of packet layer techniques is that physical layer channel coding is a suitable countermeasure against small-scale fading but is less efficient against large-scale fading. This is mainly due to the limited time diversity inherent in the necessity of adopting a physical layer interleaver of reasonable size, so as to avoid increasing modem complexity and the latency of all services. Packet layer techniques, thanks to their longer codeword duration (each codeword is composed of several packets of bits), have intrinsically longer protection against long fading events. Furthermore, being implemented at an upper layer, packet layer techniques have the indisputable advantages of simpler implementation (very close to a software implementation) and of selective applicability to different services, thus enabling a better match with service requirements (e.g. latency constraints).
Packet coding techniques have been widely recognized in recent communication standards as a viable and efficient coding solution: Digital Video Broadcasting standards, like DVB-H, DVB-SH and DVB-RCS mobile, and the 3GPP MBMS standard employ packet coding techniques working at layers higher than the physical one. In this framework, the aim of the research work has been the study of state-of-the-art coding techniques working at the upper layer (UL), the performance evaluation of these techniques in realistic propagation scenarios, and the design of new coding schemes for upper layer applications. After a review of the most important packet layer codes, i.e. Reed-Solomon, LDPC and fountain codes, the thesis focuses on the performance evaluation of ideal codes (i.e. Maximum Distance Separable codes) working at the UL. In particular, we analyze the performance of UL-FEC techniques in Land Mobile Satellite channels. We derive an analytical framework that is a useful system design tool, allowing the designer to foresee the performance of the upper layer decoder. We also analyze a system in which upper layer and physical layer codes work together, and we derive the optimal splitting of redundancy when a frequency non-selective, slowly varying fading channel is taken into account. The whole analysis is supported and validated through computer simulation. In the last part of the dissertation, we propose LDPC Convolutional Codes (LDPCCC) as a possible coding scheme for future UL-FEC applications. Since one of the main drawbacks of adopting packet layer codes is the large decoding latency, we introduce a latency-constrained decoder for LDPCCC (called the windowed erasure decoder). We analyze the performance of state-of-the-art LDPCCC when our decoder is adopted. Finally, we propose a design rule that allows performance and latency to be traded off.
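The simplest instance of packet-layer FEC, useful for seeing why longer codewords survive long fades, is a single XOR parity packet over a block of source packets; the Reed-Solomon, LDPC and fountain codes discussed above generalise this idea. A minimal sketch with toy packets:

```python
# Packet-layer erasure coding in its simplest form: one XOR parity packet
# protects a block of equal-length source packets, so any single packet
# lost to a fade can be rebuilt from the survivors plus the parity.

def xor_packets(packets):
    out = bytes(len(packets[0]))
    for p in packets:
        out = bytes(a ^ b for a, b in zip(out, p))
    return out

source = [b"pkt0data", b"pkt1data", b"pkt2data"]   # equal-length packets
parity = xor_packets(source)

# Packet 1 is erased during a long fade; recover it at the receiver.
recovered = xor_packets([source[0], source[2], parity])
assert recovered == source[1]
```

One parity packet tolerates one erasure per block; MDS codes achieve the same optimal trade-off (any k of n packets suffice) for arbitrary redundancy, at higher decoding cost, which is exactly the latency concern the windowed erasure decoder addresses.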