991 results for viral-campaign performance


Relevance:

100.00%

Publisher:

Abstract:

Viral marketing is a form of peer-to-peer communication in which individuals are encouraged to pass on promotional messages within their social networks. Conventional wisdom holds that the viral marketing process is both random and unmanageable. In this paper, we deconstruct the process and investigate the formation of the activated digital network as distinct from the underlying social network. We then consider the impact of the social structure of digital networks (random, scale free, and small world) and of the transmission behavior of individuals on campaign performance. Specifically, we identify alternative social network models to understand the mediating effects of the social structures of these models on viral marketing campaigns. Next, we analyse an actual viral marketing campaign and use the empirical data to develop and validate a computer simulation model for viral marketing. Finally, we conduct a number of simulation experiments to predict the spread of a viral message within different types of social network structures under different assumptions and scenarios. Our findings confirm that the social structure of digital networks plays a critical role in the spread of a viral message. Managers seeking to optimize campaign performance should consider these findings before designing and implementing viral marketing campaigns. We also demonstrate how a simulation model can be used to quantify the impact of campaign management inputs and how these insights can support managerial decision making.
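As a rough illustration of the kind of simulation experiment described here (a sketch only, not the paper's validated model), the following compares message spread across random, scale-free, and small-world networks using an independent-cascade-style process; the network sizes, seed count, and transmission probability are assumed values.

```python
# Minimal sketch (not the paper's validated model): independent-cascade-style
# spread of a viral message over three synthetic social network structures.
import random
import networkx as nx

def simulate_spread(graph, n_seeds=5, p_transmit=0.2, rng=random.Random(42)):
    """Return the fraction of nodes that eventually receive the message."""
    seeds = rng.sample(list(graph.nodes), n_seeds)
    activated = set(seeds)
    frontier = list(seeds)
    while frontier:
        new_frontier = []
        for node in frontier:
            for neighbour in graph.neighbors(node):
                if neighbour not in activated and rng.random() < p_transmit:
                    activated.add(neighbour)      # neighbour passes the message on
                    new_frontier.append(neighbour)
        frontier = new_frontier
    return len(activated) / graph.number_of_nodes()

n = 2000  # assumed network size, chosen for a quick run
networks = {
    "random (Erdos-Renyi)":         nx.erdos_renyi_graph(n, p=0.005, seed=1),
    "scale-free (Barabasi-Albert)":  nx.barabasi_albert_graph(n, m=5, seed=1),
    "small-world (Watts-Strogatz)":  nx.watts_strogatz_graph(n, k=10, p=0.1, seed=1),
}
for name, g in networks.items():
    print(f"{name}: reached {simulate_spread(g):.1%} of the network")
```

Repeated runs over many random seeds, as in the simulation experiments described above, would give distributions of reach per network structure rather than single values.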

Relevance:

100.00%

Publisher:

Abstract:

Designing effective direct mail pieces is considered a key success factor in direct marketing. However, related published empirical research is scarce while design recommendations are manifold and often conflicting. Compared with prior work, our study aims to provide more elaborate and empirically validated findings for the effects of direct mail design characteristics by analyzing 677 direct mail campaigns from non-profit organizations and financial service providers. We investigate the effects of (1) various envelope characteristics and observable cues on opening rates, and (2) characteristics of the envelope content on the keeping rates of direct mail campaigns. We show that visual design elements on the outer envelope – rather than sender-related details – are the predominant drivers of opening rates. Factors such as letter length, provision of sender information in the letter, and personalization positively influence the keeping rate. We also observe that opening and keeping rates are uncorrelated at the campaign level, implying that opening direct mail pieces is only a necessary condition for responding to offers, but not per se a driver of direct mail response.
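As a purely illustrative sketch of this kind of campaign-level analysis (the variable names and synthetic data are assumptions, not the authors' dataset or model specification), the two regressions might look like:

```python
# Hypothetical sketch of a campaign-level analysis of direct mail design effects.
# Column names and synthetic data are illustrative assumptions only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 677  # number of campaigns analysed in the study
campaigns = pd.DataFrame({
    "has_teaser_text": rng.integers(0, 2, n),
    "shows_sender_name": rng.integers(0, 2, n),
    "letter_length": rng.integers(1, 5, n),        # pages
    "is_personalized": rng.integers(0, 2, n),
})
# Synthetic response rates (random; the real effects come from the 677 campaigns).
campaigns["opening_rate"] = rng.uniform(0.3, 0.9, n)
campaigns["keeping_rate"] = rng.uniform(0.05, 0.4, n)

# (1) envelope characteristics vs. opening rate, (2) content vs. keeping rate.
opening_model = smf.ols("opening_rate ~ has_teaser_text + shows_sender_name",
                        data=campaigns).fit()
keeping_model = smf.ols("keeping_rate ~ letter_length + is_personalized",
                        data=campaigns).fit()
print(opening_model.params, keeping_model.params, sep="\n")

# The study finds opening and keeping rates uncorrelated at the campaign level.
print(campaigns[["opening_rate", "keeping_rate"]].corr())
```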

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE: We sought to evaluate the performance of the human papillomavirus high-risk DNA test in patients 30 years and older. MATERIALS AND METHODS: Screening (n=835) and diagnosis (n=518) groups were defined based on prior Papanicolaou smear results as part of a clinical trial for cervical cancer detection. We compared the Hybrid Capture II (HCII) test result with the worst histologic report. We used cervical intraepithelial neoplasia (CIN) 2/3 or worse as the reference standard for disease. We calculated sensitivities, specificities, positive and negative likelihood ratios (LR+ and LR-), receiver operating characteristic (ROC) curves, and areas under the ROC curves for the HCII test. We also considered alternative strategies, including Papanicolaou smear alone, a combination of Papanicolaou smear and the HCII test, a sequence of Papanicolaou smear followed by the HCII test, and a sequence of the HCII test followed by Papanicolaou smear. RESULTS: For the screening group, the sensitivity was 0.69 and the specificity was 0.93; the area under the ROC curve was 0.81. The LR+ and LR- were 10.24 and 0.34, respectively. For the diagnosis group, the sensitivity was 0.88 and the specificity was 0.78; the area under the ROC curve was 0.83. The LR+ and LR- were 4.06 and 0.14, respectively. Sequential testing showed little or no improvement over combination testing. CONCLUSIONS: The HCII test in the screening group had a greater LR+ for the detection of CIN 2/3 or worse. HCII testing may be an additional screening tool for cervical cancer in women 30 years and older.
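The likelihood ratios reported here follow directly from sensitivity and specificity (LR+ = sensitivity / (1 - specificity); LR- = (1 - sensitivity) / specificity); a minimal sketch using the rounded values quoted above (which will therefore not exactly reproduce the reported figures computed from the unrounded data) is:

```python
# Positive and negative likelihood ratios from sensitivity and specificity.
# Uses the rounded values quoted in the abstract, so the results will not
# exactly match the reported LR+ / LR- computed from the unrounded data.
def likelihood_ratios(sensitivity, specificity):
    lr_positive = sensitivity / (1 - specificity)
    lr_negative = (1 - sensitivity) / specificity
    return lr_positive, lr_negative

for group, sens, spec in [("screening", 0.69, 0.93), ("diagnosis", 0.88, 0.78)]:
    lr_pos, lr_neg = likelihood_ratios(sens, spec)
    print(f"{group}: LR+ = {lr_pos:.2f}, LR- = {lr_neg:.2f}")
```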

Relevance:

40.00%

Publisher:

Abstract:

A minor groove binder (MGB) TaqMan real-time PCR assay was developed for the detection of respiratory syncytial virus (RSV) in clinical specimens. Upon evaluation of the assay, notable differences were observed in the overall fluorescent response obtained from RSV-positive specimens, with some linear amplification curves deviating only slightly from baseline fluorescence. Sequencing of the probe target regions in these RSV strains revealed single-base mismatches with the MGB TaqMan probe. Overall, these results highlight the usefulness of MGB TaqMan probes for the detection of mismatches, but suggest that MGB TaqMan probes have limitations for routine screening of uncharacterised viral strains.

Relevance:

30.00%

Publisher:

Abstract:

Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now regarded as the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series of reports to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research undertaken was to relate the treatment performance of onsite sewage treatment systems to soil conditions at the site, with the emphasis being on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation of research brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks, in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. Generally, the treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality. The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantage of multi-chamber over single-chamber septic tanks is an issue that needs to be resolved in view of conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for the treatment of wastewater and the disinfection of effluent prior to disposal is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of these systems do not perform to stipulated standards, and their effluent quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. Other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems, including intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices.
In recent years the use of biofilters, and particularly of peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies. This is an issue that needs further investigation, and as such biofilters can still be considered to be at the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common option, and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that the soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferred soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent. It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This essentially is the main issue of concern, given the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. Despite all this, however, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern lies. Greywater requires similar consideration. Surface irrigation of greywater is currently being permitted in a number of local authority jurisdictions in Queensland. Considering the fact that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease.
This is an issue of concern, as greywater can be considered a weak to medium-strength sewage: it contains primary pollutants such as BOD material and nutrients and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subject to stringent guidelines. Under these circumstances the surface application of any wastewater requires careful consideration. The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions. As such their applicability is location specific. Also, the design of systems based solely on evapotranspiration is questionable. In order to ensure greater reliability, the systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest option under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics. Secondly, the mechanisms of clogging mat formation have been found to be influenced by various physical, chemical and biological processes. Biological clogging is the most common process taking place and occurs when bacterial growth or its by-products reduce the soil pore diameters. Biological clogging is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process. This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short-duration rest periods are contradictory.
It has been claimed that intermittent rest periods result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and that the intermediate by-products formed as a result of aerobic processes in fact lead to even more severe clogging. It has been further recommended that the rest periods should be much longer, in the range of about six months, which entails the provision of a second, alternating seepage bed. The other concepts that have been investigated are the design of the bed to meet the equilibrium infiltration rate that would eventuate after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface has been shown to be of only short-term benefit. Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat. It has also been found that the nature of the suspended solids is an important factor. The finer particles from extended aeration systems, when compared to those from septic tanks, will penetrate deeper into the soil and hence ultimately cause a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies. It has also been shown that effluent quality may be a factor in the case of highly permeable soils, but this may not be the case with fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by surface ponding of effluent or seepage of contaminants into the groundwater, can be very serious, as they can lead to environmental and public health impacts. Significant microbial contamination of surface water and groundwater has been attributed to septic tank effluent. There are a number of documented instances of septic-tank-related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A, because no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms. The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Based on subsurface conditions, this essentially entails a maximum allowable density of septic tanks in a given area.
Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed or if effluent runs off at the surface as a result of system failure. Soils have a finite capacity for the removal of phosphorus. Once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is important to ensure not only that the system design is based on subsurface conditions, but also that the density of these systems in a given area is treated as a critical issue. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site would determine the overall capability classification for that site, which would in turn dictate the type of effluent disposal method to be adopted.
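The link drawn above between groundwater dilution and a maximum allowable density of septic tanks can be illustrated with a simple mass balance; the sketch below uses assumed per-household nitrogen loads, recharge rates, and target nitrate concentrations purely for illustration (none of these figures come from the report).

```python
# Illustrative nitrate dilution mass balance (assumed values, not from the report):
# the groundwater recharge over a lot must dilute the nitrogen load from one
# onsite system down to the target nitrate-nitrogen concentration.
nitrogen_load_g_per_day = 30.0   # assumed N load reaching groundwater per household
target_conc_mg_per_L = 10.0      # common drinking-water nitrate-N limit
recharge_mm_per_year = 300.0     # assumed net groundwater recharge

# Recharge volume needed per day to dilute the load to the target concentration.
required_L_per_day = nitrogen_load_g_per_day * 1000.0 / target_conc_mg_per_L
required_L_per_year = required_L_per_day * 365.0

# Lot area whose recharge supplies that dilution volume (1 mm over 1 m^2 = 1 L).
required_area_m2 = required_L_per_year / recharge_mm_per_year
print(f"Minimum lot area per system: {required_area_m2:.0f} m^2 "
      f"(max density ~{10_000 / required_area_m2:.1f} systems per hectare)")
```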

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a novel 2×2 multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) testbed based on an Analog Devices AD9361 highly integrated radio frequency (RF) agile transceiver was implemented specifically for estimating and analyzing MIMO-OFDM channel capacity in vehicle-to-infrastructure (V2I) environments using the 920 MHz industrial, scientific, and medical (ISM) band. We implemented two-dimensional discrete cosine transform-based filtering to reduce the channel estimation errors and show its effectiveness on our measurement results. We also analyzed the effects of channel estimation error on the MIMO channel capacity by simulation. Three different subcarrier spacing scenarios were investigated, corresponding to the IEEE 802.11p, Long-Term Evolution (LTE), and Digital Video Broadcasting - Terrestrial (DVB-T) 2k standards. An extensive MIMO-OFDM V2I channel measurement campaign was performed in a suburban environment. Analysis of the measured MIMO channel capacity as a function of the transmitter-to-receiver (TX-RX) separation distance up to 250 m shows that, using a fixed receiver signal-to-noise ratio (SNR) criterion, the variance of the MIMO channel capacity is larger for the near-range line-of-sight (LOS) scenarios than for the long-range non-LOS cases. We observed that the largest capacity values were achieved under LOS propagation, despite the common assumption of a degenerate MIMO channel in LOS. We consider that this is due to the large angular spacing between MIMO subchannels which occurs when the receiver vehicle's rooftop antennas pass the fixed transmitter antennas at close range, causing the MIMO subchannels to be orthogonal. In addition, analysis of the effects of different subcarrier spacings on MIMO-OFDM channel capacity showed negligible differences in mean channel capacity over the subcarrier spacing range investigated. The measured channels described in this paper are available on request.
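For context (this is not the authors' processing chain), the capacity figure analysed in such measurements is typically the standard MIMO channel capacity under equal power allocation at a fixed receive SNR; a minimal sketch with synthetic 2×2 channel matrices is shown below, contrasting a rich-scattering channel with the degenerate LOS case that is commonly assumed.

```python
# Minimal sketch of 2x2 MIMO channel capacity at a fixed receive SNR, assuming
# equal power allocation across transmit antennas (synthetic H matrices, not
# the measured V2I channels).
import numpy as np

def mimo_capacity_bits_per_hz(H, snr_linear):
    """C = log2 det(I + (SNR / Nt) * H * H^H) for a normalised channel matrix H."""
    n_rx, n_tx = H.shape
    gram = np.eye(n_rx) + (snr_linear / n_tx) * (H @ H.conj().T)
    return float(np.log2(np.linalg.det(gram).real))

rng = np.random.default_rng(0)
snr = 10 ** (20 / 10)   # fixed 20 dB receive SNR criterion (assumed value)

# i.i.d. Rayleigh (rich scattering) versus a fully correlated "degenerate LOS" case.
H_nlos = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
H_los = np.array([[1.0, 1.0], [1.0, 1.0]]) / np.sqrt(2)   # rank-1, correlated subchannels

print("rich-scattering H:", round(mimo_capacity_bits_per_hz(H_nlos, snr), 2), "bit/s/Hz")
print("degenerate LOS H: ", round(mimo_capacity_bits_per_hz(H_los, snr), 2), "bit/s/Hz")
```

The rank-1 LOS matrix yields a markedly lower capacity, which is precisely the assumption the measured close-range LOS results challenge: large angular separation between subchannels restores a well-conditioned channel matrix.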

Relevance:

30.00%

Publisher:

Abstract:

The 2008 US election has been heralded as the first presidential election of the social media era, but took place at a time when social media were still in a state of comparative infancy; so much so that the most important platform was not Facebook or Twitter, but the purpose-built campaign site my.barackobama.com, which became the central vehicle for the most successful electoral fundraising campaign in American history. By 2012, the social media landscape had changed: Facebook and, to a somewhat lesser extent, Twitter are now well-established as the leading social media platforms in the United States, and were used extensively by the campaign organisations of both candidates. As third-party spaces controlled by independent commercial entities, however, their use necessarily differs from that of home-grown, party-controlled sites: from the point of view of the platform itself, a @BarackObama or @MittRomney is technically no different from any other account, except for the very high follower count and an exceptional volume of @mentions. In spite of the significant social media experience which Democrat and Republican campaign strategists had already accumulated during the 2008 campaign, therefore, the translation of such experience to the use of Facebook and Twitter in their 2012 incarnations still required a substantial amount of new work, experimentation, and evaluation. This chapter examines the Twitter strategies of the leading accounts operated by both campaign headquarters: the ‘personal’ candidate accounts @BarackObama and @MittRomney as well as @JoeBiden and @PaulRyanVP, and the campaign accounts @Obama2012 and @TeamRomney. Drawing on datasets which capture all tweets from and at these accounts during the final months of the campaign (from early September 2012 to the immediate aftermath of the election night), we reconstruct the campaigns’ approaches to using Twitter for electioneering from the quantitative and qualitative patterns of their activities, and explore the resonance which these accounts have found with the wider Twitter userbase. A particular focus of our investigation in this context will be on the tweeting styles of these accounts: the mixture of original messages, @replies, and retweets, and the level and nature of engagement with everyday Twitter followers. We will examine whether the accounts chose to respond (by @replying) to the messages of support or criticism which were directed at them, whether they retweeted any such messages (and whether there was any preferential retweeting of influential or – alternatively – demonstratively ordinary users), and/or whether they were used mainly to broadcast and disseminate prepared campaign messages. Our analysis will highlight any significant differences between the accounts we examine, trace changes in style over the course of the final campaign months, and correlate such stylistic differences with the respective electoral positioning of the candidates. Further, we examine the use of these accounts during moments of heightened attention (such as the presidential and vice-presidential debates, or in the context of controversies such as that caused by the publication of the Romney “47%” video; additional case studies may emerge over the remainder of the campaign) to explore how they were used to present or defend key talking points, and exploit or avert damage from campaign gaffes. 
A complementary analysis of the messages directed at the campaign accounts (in the form of @replies or retweets) will also provide further evidence for the extent to which these talking points were picked up and disseminated by the wider Twitter population. Finally, we also explore the use of external materials (links to articles, images, videos, and other content on the campaign sites themselves, in the mainstream media, or on other platforms) by the campaign accounts, and the resonance which these materials had with the wider follower base of these accounts. This provides an indication of the integration of Twitter into the overall campaigning process, by highlighting how the platform was used as a means of encouraging the viral spread of campaign propaganda (such as advertising materials) or of directing user attention towards favourable media coverage. By building on comprehensive, large datasets of Twitter activity (as of early October, our combined datasets comprise some 3.8 million tweets) which we process and analyse using custom-designed social media analytics tools, and by using our initial quantitative analysis to guide further qualitative evaluation of Twitter activity around these campaign accounts, we are able to provide an in-depth picture of the use of Twitter in political campaigning during the 2012 US election, yielding detailed new insights into social media use in contemporary elections. This analysis will then also be able to serve as a touchstone for the analysis of social media use in subsequent elections, in the USA as well as in other developed nations where Twitter and other social media platforms are utilised in electioneering.
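As a minimal sketch of the tweeting-style breakdown described above (the field names and classification rules are assumptions for illustration, not the authors' custom analytics tools), one might tally original messages, @replies, and retweets per account as follows:

```python
# Hypothetical sketch: classify each tweet as a retweet, an @reply, or an
# original broadcast message, then report per-account proportions.
# Field names ("text", "retweeted_status") are assumptions, not the study's schema.
from collections import Counter

def tweet_style(tweet):
    text = tweet["text"]
    if text.startswith("RT @") or tweet.get("retweeted_status"):
        return "retweet"
    if text.startswith("@"):
        return "@reply"
    return "original"

def style_profile(tweets):
    counts = Counter(tweet_style(t) for t in tweets)
    total = sum(counts.values())
    return {style: counts[style] / total for style in ("original", "@reply", "retweet")}

# tweets_by_account maps e.g. "@BarackObama" -> list of tweet dicts from a dataset.
tweets_by_account = {"@BarackObama": [], "@MittRomney": []}   # placeholder data
for account, tweets in tweets_by_account.items():
    if tweets:
        print(account, style_profile(tweets))
```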

Relevance:

30.00%

Publisher:

Abstract:

Transgenic engineering of plants is important in both basic and applied research. However, the expression of a transgene can dwindle over time as the plant's small (s)RNA-guided silencing pathways shut it down. The silencing pathways have evolved as antiviral defence mechanisms, and viruses have co-evolved viral silencing-suppressor proteins (VSPs) to block them. Therefore, VSPs have been routinely used alongside desired transgene constructs to enhance their expression in transient assays. However, constitutive, stable expression of a VSP in a plant usually causes pronounced developmental abnormalities, as its action interferes with endogenous microRNA-regulated processes, and this has largely precluded the use of VSPs as an aid to stable transgene expression. In an attempt to avoid the deleterious effects while retaining the enhancing effect, a number of different VSPs were expressed exclusively in the seeds of Arabidopsis thaliana alongside a three-step transgenic pathway for the synthesis of arachidonic acid (AA), an ω-6 long-chain polyunsaturated fatty acid. Results from independent transgenic events, maintained for four generations, showed that the VSP-AA-transformed plants were developmentally normal, apart from minor phenotypes at the cotyledon stage, and could produce 40% more AA than plants transformed with the AA transgene cassette alone. Intriguingly, a geminivirus VSP, V2, could be constitutively expressed without causing developmental defects, as it acts on the siRNA amplification step that is not part of the miRNA pathway, and gave strong transgene enhancement. These results demonstrate that VSP expression can be used to protect and enhance stable transgene performance and has significant biotechnological application.

Relevance:

30.00%

Publisher:

Abstract:

A highly sensitive and specific reverse transcription polymerase chain reaction enzyme-linked immunosorbent assay (RT-PCR-ELISA) was developed for the objective detection of the nucleoprotein (N) gene of peste des petits ruminants (PPR) virus from field outbreaks or experimentally infected sheep. Two primers (IndF and Np4) and one probe (Sp3), available or designed for the amplification/probing of the N gene of PPR virus, were chosen for labeling and use in the RT-PCR-ELISA based on the highest analytical sensitivity of detection of infective virus or N-gene-containing recombinant plasmid, higher nucleotide homology at the primer binding sites of the available N gene sequences, and the ability to amplify the PPR viral genome from different sources of samples. RT-PCR was performed with unlabeled IndF and digoxigenin-labeled Np4 primers, followed by a microplate hybridization reaction with the biotin-labeled Sp3 probe. RT-PCR-ELISA was found to be 10-fold more sensitive than conventional RT-PCR followed by agarose gel based detection of the PCR product. Based on the mean plus three standard deviations (mean + 3 S.D.) of the optical density (OD) values of 47 RT-PCR-negative samples, OD values above 0.306 were considered positive in the RT-PCR-ELISA. A total of 82 oculo-nasal swabs and tissue samples from suspected PPR cases were analyzed by RT-PCR and RT-PCR-ELISA, which revealed 54.87 and 58.54% positivity, respectively. In an experimentally infected sheep, both RT-PCR and RT-PCR-ELISA could detect the virus in oculo-nasal swabs from 6 days post-infection up to 9 days. On post-mortem examination, the PPR viral genome was detected in spleen, lymph node, lung, heart and liver. The correlation coefficients between RT-PCR-ELISA OD values and either the TCID50 of virus or the number of DNA molecules were 0.622 and 0.657, respectively. The advantages of RT-PCR-ELISA over conventional agarose gel based detection of RT-PCR products are discussed.
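The positivity threshold quoted above is of the mean-plus-three-standard-deviations type; a minimal sketch of that cut-off calculation (with placeholder OD readings rather than the 47 negative-sample values) is:

```python
# Cut-off for ELISA positivity as mean + 3 standard deviations of the optical
# density (OD) values of known-negative samples (sketch only; the 47
# negative-sample ODs from the study are not reproduced here).
import statistics

def positivity_cutoff(negative_ods, k=3):
    return statistics.mean(negative_ods) + k * statistics.stdev(negative_ods)

# Example with placeholder OD readings (illustrative values only):
negative_ods = [0.08, 0.10, 0.12, 0.09, 0.11, 0.10, 0.13, 0.09]
print(f"cut-off OD = {positivity_cutoff(negative_ods):.3f}")
```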

Relevance:

30.00%

Publisher:

Abstract:

Lyophilization, or freeze drying, is a complex multi-stage process in which the product is first frozen and then dried by sublimation. Because of this mode of drying, lyophilization is an attractive process that is particularly important for the stabilization of hydrated, labile organic components of biological origin. The process is widely employed in the pharmaceutical industry; however, from a process management perspective it should be avoided wherever possible, as it has several disadvantages: high capital cost of equipment, high energy demand, long processing times, and fragile, easily rehydrated products that need to be carefully packaged and stored. The objective of this work was to shorten the lyophilization cycle of a viral vaccine and to assess the feasibility of loading the product into the freeze dryer at sub-zero temperatures, so as to increase productivity. To this end, three experiments were carried out with lyophilization cycles 17 and 20 h shorter than the cycle used for the commercial vaccine. The experiments used the same formulation as the commercial batch and a pilot-scale freeze dryer. The modifications were made to the physical parameters of the current lyophilization cycle (temperature, pressure and time) and to the product loading temperature, with no change to the vaccine formulation or primary packaging. Samples of the experimental lyophilized product were analysed for appearance, performance, residual moisture, potency and accelerated thermostability according to the Minimum Requirements of the World Health Organization. All results were within specification and were close to, or better than, those of the original commercial batches.

Relevance:

30.00%

Publisher:

Abstract:

The synthesis of cDNA from RNA is challenging due to the inefficiency of reverse transcription (RT). To address this, a method known as RT-Bst was developed, in which sequential RT of RNA and Bst DNA polymerase amplification enrich cDNA in a single-tube reaction. Using genomic RNA from bacteriophage MS2, the yields of cDNA produced by RT alone and by RT-Bst were compared by analysis of PCR-amplified products. Using random primers, superior performance was observed when amplifying MS2 RNA following RT-Bst compared to RT alone, indicating that greater quantities of cDNA were present after RT-Bst. RT-Bst was also compared with RT alone for the relative ability to produce sufficient cDNA to amplify 8 target regions spanning the respiratory syncytial virus (RSV) genome. Six of the 8 targets were amplified consistently by PCR subsequent to RT-Bst amplification, whereas only 3 of the 8 targets could be amplified after RT alone. RSV sequences were selectively amplified using RSV-specific primers from a mixed template containing an excess of MS2 RNA in an RT-Bst reaction, without amplifying MS2 sequences. This suggests that RT-Bst can be used to amplify RNA sequences non-specifically using random primers or specifically using sequence-specific primers, and that it enhances the yield of cDNA when compared to RT alone.

Relevance:

30.00%

Publisher:

Abstract:

Fault injection is frequently used for the verification and validation of dependable systems. When targeting real-time microprocessor-based systems, the process becomes significantly more complex. This paper proposes two complementary solutions to improve real-time fault injection campaign execution, in terms of both performance and capabilities. The methodology is based on the use of the on-chip debug mechanisms present in modern electronic devices. The main objective is the injection of faults into microprocessor memory elements with minimum delay and intrusiveness. Different configurations were implemented and compared in terms of performance gain and logic overhead.
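The elementary operation behind such a campaign is forcing a single-bit upset in a memory element at a chosen instant; the sketch below shows only that bit-flip in a software simulation, whereas the paper performs it on the target's memory through the on-chip debug interface.

```python
# Software illustration of the basic fault model (a single-bit upset in a
# memory word). The paper injects such faults into the real target's memory
# elements via on-chip debug hardware; this sketch only shows the flip itself.
def inject_bit_flip(memory: bytearray, byte_offset: int, bit_index: int) -> None:
    """Flip one bit of the byte at byte_offset (bit_index 0 = least significant)."""
    memory[byte_offset] ^= (1 << bit_index)

memory_image = bytearray(b"\x00\x0f\xf0\xff")
inject_bit_flip(memory_image, byte_offset=1, bit_index=7)
print(memory_image.hex())   # 008ff0ff: bit 7 of byte 1 has been flipped
```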

Relevance:

30.00%

Publisher:

Abstract:

In this work, kriging with covariates is used to model and map the spatial distribution of salinity measurements gathered by an autonomous underwater vehicle in a sea outfall monitoring campaign, with the aim of distinguishing the effluent plume from the receiving waters and characterizing its spatial variability in the vicinity of the discharge. Four different geostatistical linear models for salinity were assumed, in which the distance to the diffuser, the west-east positioning, and the south-north positioning were used as covariates. Sample variograms were fitted with Matérn models using both weighted least squares and maximum likelihood estimation methods, as a way to detect possible discrepancies. Typically, the maximum likelihood method estimated very low ranges, which limited the kriging process. So, at least for these data sets, weighted least squares proved to be the more appropriate estimation method for variogram fitting. The kriged maps clearly show the spatial variation of salinity, and it is possible to identify the effluent plume in the area studied. The results obtained provide some guidelines for sewage monitoring when a geostatistical analysis of the data is intended. It is important to properly handle anomalous values and to adopt a sampling strategy that includes transects parallel and perpendicular to the effluent dispersion.
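The weighted least squares variogram fitting step can be sketched as follows (a Matérn model with a fixed smoothness parameter, fitted to illustrative empirical variogram values rather than to the campaign data):

```python
# Sketch of fitting a Matern semivariogram by weighted least squares.
# Lag distances, semivariances and pair counts are illustrative values only.
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import gamma, kv

NU = 1.5  # assumed fixed Matern smoothness parameter

def matern_semivariogram(h, nugget, psill, range_m):
    h = np.asarray(h, dtype=float)
    scaled = np.sqrt(2.0 * NU) * h / range_m
    safe = np.where(scaled > 0.0, scaled, 1.0)   # avoid evaluating kv at zero lag
    corr = np.where(
        scaled > 0.0,
        (2.0 ** (1.0 - NU) / gamma(NU)) * safe ** NU * kv(NU, safe),
        1.0,
    )
    return nugget + psill * (1.0 - corr)

lags = np.array([25.0, 50.0, 100.0, 150.0, 200.0, 300.0, 400.0])   # lag distance (m)
gamma_hat = np.array([0.05, 0.12, 0.25, 0.33, 0.38, 0.41, 0.42])   # semivariance
n_pairs = np.array([120.0, 240.0, 400.0, 380.0, 350.0, 300.0, 250.0])

# Weighted least squares: lags supported by more pairs receive more weight.
params, _ = curve_fit(
    matern_semivariogram, lags, gamma_hat,
    p0=[0.01, 0.4, 150.0],
    sigma=1.0 / np.sqrt(n_pairs),
    bounds=([0.0, 0.0, 1.0], [1.0, 5.0, 2000.0]),
)
print(dict(zip(["nugget", "partial sill", "range (m)"], np.round(params, 3))))
```

The fitted parameters would then feed the kriging predictor; a maximum likelihood fit of the same model could be run alongside to expose the kind of range discrepancies noted above.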

Relevance:

30.00%

Publisher:

Abstract:

This case study examines the viral marketing strategy used in Barack Obama's 2008 campaign for the Presidency of the United States. It describes the role of viral marketing as a communication strategy during the campaign.