932 results for Product Launch. Industrial Markets. Segmentation. Conjoint Analysis. Technology Push


Relevance: 40.00%

Abstract:

Animal neocentromeres are defined as ectopic centromeres that form at non-centromeric locations and lack some of the features, such as satellite DNA sequences, that normally characterize canonical centromeres. Despite this, they are stable, functional centromeres inherited through generations. The very existence of neocentromeres provides convincing evidence that centromere specification is determined by epigenetic rather than sequence-specific mechanisms. For all these reasons, we used them as simplified models to investigate the molecular mechanisms that underlie the formation and maintenance of functional centromeres. We collected human cell lines carrying neocentromeres at different positions. To investigate the regions involved at the DNA sequence level, we applied a recent technology that integrates chromatin immunoprecipitation and DNA microarrays (ChIP-on-chip), using rabbit polyclonal antibodies directed against the human centromeric proteins CENP-A and CENP-C. These DNA-binding proteins are required for kinetochore function and are targeted exclusively to functional centromeres; thus, immunoprecipitation of the DNA bound by these proteins allows the isolation of centromeric sequences, including those of neocentromeres. Neocentromeres can arise even in regions containing protein-coding genes. We further analyzed whether the increased density of scaffold attachment sites, and the correspondingly tighter chromatin, of the region involved in neocentromerization remained permissive to transcription of the genes encoded within it. Centromere repositioning is a phenomenon in which a neocentromere arises without altering the gene order and, following inactivation of the canonical centromere, becomes fixed in the population. It is a process of chromosome rearrangement that is fundamental in evolution and lies at the basis of speciation. The repeat-free region where the neocentromere initially forms progressively acquires extended arrays of tandem satellite repeats that may contribute to its functional stability. In this light, we focused our attention on the repositioned centromere of horse chromosome 11 (ECA11). ChIP-on-chip analysis was used to define the region involved, and SNP studies mapping within the region involved in neocentromerization were carried out. We were able to describe the structural polymorphism of the chromosome 11 centromeric domain in the Equus caballus population. This polymorphism was observed even between homologous chromosomes of the same cell, a finding that had never been described before. Genomic plasticity has played a fundamental role in evolution. Centromeres are not static, packaged regions of genomes. The key question that fascinates biologists is to understand how this centromere plasticity can be reconciled with the stability and maintenance of centromeric function. Starting from the epigenetic point of view that underlies centromere formation, we decided to analyze the RNA content of centromeric chromatin. RNA, as well as chemical modifications involving both histones and DNA, is a good candidate for guiding centromere formation and maintenance. Many observations suggest that transcription of centromeric DNA or of other non-coding RNAs could affect centromere formation. To date, there has been no thorough investigation addressing the identity of chromatin-associated RNAs (CARs) on a global scale. This prompted us to develop techniques to identify CARs genome-wide using high-throughput genomic platforms.
The future goal of this study will be to focus on what happens specifically inside centromeric chromatin.

Relevance: 40.00%

Abstract:

This dissertation comprises three essays on the topic of industrial organization. The first essay considers how different intellectual property systems can affect the incentives to invest in R&D when innovation is cumulative. I introduce a distinction between plain and sophisticated technological knowledge, which plays a crucial role in determining how different appropriability rules affect the incentives to innovate. I argue that the positive effect of weak intellectual property regimes on the sharing of intermediate technological knowledge vanishes when technological knowledge is sophisticated, as is likely to be the case in many high-tech industries. The second essay analyzes a two-sided market for news where advertisers may pay a media outlet to conceal negative information on the quality of their own product (paying positive to avoid negative) and/or to disclose negative information on the quality of their competitors' products (paying positive to go negative). It is shown that whether or not advertisers have negative consequences for the accuracy of media reports ultimately depends on the extent of correlation among the advertisers' products. The third essay considers the role of social learning in the diffusion of a new technology. A population of agents can choose between two risky technologies: an old one for which they know the expected outcome, and a new one for which they have only a prior. Different environments are compared. In the benchmark case, agents are isolated and can perform costly experiments to infer the quality of the new technology. In the other cases, agents are embedded in a network and can observe the outcomes of their neighbors. We observe that, in expectation, the quality of the new technology may be overestimated when information spreads through the network.
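As a rough illustration of that third setting (the ring network, the Beta-belief update rule, and all parameter values below are illustrative assumptions, not elements of the essay's model), the following Python sketch lets agents on a network update a simple belief about the new technology from the experiments they observe at neighboring nodes:

```python
import random

random.seed(0)

N = 50                   # agents on a ring network (illustrative assumption)
TRUE_Q = 0.4             # true success probability of the new technology (assumed)
OLD_Q = 0.5              # known success probability of the old technology (assumed)
PRIOR_A, PRIOR_B = 1, 1  # uniform Beta prior on the new technology's quality
ROUNDS = 30

succ = [0] * N           # successes of the new technology observed by each agent
trials = [0] * N         # trials of the new technology observed by each agent

def belief(i):
    """Posterior mean of the new technology's quality, as seen by agent i."""
    return (PRIOR_A + succ[i]) / (PRIOR_A + PRIOR_B + trials[i])

for _ in range(ROUNDS):
    # An agent keeps experimenting with the new technology only while it still
    # looks at least as good as the old one; otherwise it reverts to the old
    # technology and generates no further information about the new one.
    outcomes = {i: (1 if random.random() < TRUE_Q else 0)
                for i in range(N) if belief(i) >= OLD_Q}
    for i in range(N):
        # Agent i observes its own experiment and those of its two ring neighbours.
        for j in (i, (i - 1) % N, (i + 1) % N):
            if j in outcomes:
                succ[i] += outcomes[j]
                trials[i] += 1

avg_belief = sum(belief(i) for i in range(N)) / N
print(f"true quality: {TRUE_Q}, average final belief: {avg_belief:.3f}")
```

This sketch only reproduces the information structure (private experiments plus observation of neighbors' outcomes); it does not reproduce the essay's results.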

Relevance: 40.00%

Abstract:

In this thesis, the effects of plasma actuators based on Dielectric Barrier Discharge (DBD) technology on a two-dimensional NACA 0015 airfoil were analyzed experimentally at low Reynolds numbers. The work was carried out in partnership with the Department of Electrical Engineering of the Università di Bologna, in the wind tunnel of the Applied Aerodynamics Laboratory of the Aerospace Engineering faculty. In order to verify the effectiveness of these active flow-control devices, the analysis shows that the actuators succeed in preventing boundary layer separation only for certain combinations of angle of attack and Reynolds number. Moreover, the effect of the actuators' chordwise position was also analyzed, together with the influence of steady and unsteady operation.

Relevance: 40.00%

Abstract:

This study is focused on radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized; then results on the characterization of the plasma source and on the investigation of the nanoparticle synthesis process are presented, aiming at highlighting fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, employing a calorimetric method, is presented, while results of three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy-probe measurements to validate the temperature field predicted by the model and used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles and nanoparticle growth, are presented, with the aim of providing useful insights both for the design and optimization of the process and into the underlying physical phenomena. Indeed, precursor evaporation, one of the phases with the highest impact on the industrial feasibility of the process, is discussed; by employing models to describe particle trajectories and thermal histories, adapted from those originally developed for other plasma technologies or applications, such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized solid Si precursor in a laboratory-scale ICP system is investigated. Finally, a discussion of the role of thermo-fluid dynamic fields in nanoparticle formation is presented, as well as a study of the effect of the reaction chamber geometry on the characteristics of the produced nanoparticles and on process yield.
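As a minimal sketch of the calorimetric energy-balance idea mentioned above (the flow rates, temperatures, and input power below are illustrative assumptions, not measurements from the SIMBA project), the power coupled to the plasma gas can be estimated by subtracting the power removed by the torch and reaction-chamber cooling water from the electrical input power:

```python
# Calorimetric energy balance of an ICP torch: a minimal sketch with assumed values.
CP_WATER = 4186.0  # specific heat of water, J/(kg*K)

def cooling_power(mass_flow_kg_s, t_in_c, t_out_c):
    """Power carried away by a cooling-water circuit, in W."""
    return mass_flow_kg_s * CP_WATER * (t_out_c - t_in_c)

p_input = 15_000.0                           # electrical input power, W (assumed)
p_torch = cooling_power(0.20, 20.0, 26.0)    # torch cooling circuit (assumed readings)
p_chamber = cooling_power(0.10, 20.0, 26.0)  # reaction-chamber circuit (assumed readings)

p_gas = p_input - p_torch - p_chamber        # net power delivered to the plasma gas
efficiency = p_gas / p_input

print(f"torch losses:   {p_torch:7.0f} W")
print(f"chamber losses: {p_chamber:7.0f} W")
print(f"net to gas:     {p_gas:7.0f} W  (thermal efficiency {efficiency:.1%})")
```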

Relevance: 40.00%

Abstract:

This Doctoral Thesis unfolds into a collection of three distinct papers that share an interest in institutional theory and technology transfer. Taking into account that organizations are increasingly exposed to a multiplicity of demands and pressures, we aim to analyze what renders this situation of institutional complexity more or less difficult for organizations to manage, and what makes organizations more or less successful in responding to it. The three studies offer a novel contribution both theoretically and empirically. In particular, the first paper, “The dimensions of organizational fields for understanding institutional complexity: A theoretical framework”, is a theoretical contribution that seeks to better understand the relationship between institutional complexity and fields by providing a framework. The second article, “Beyond institutional complexity: The case of different organizational successes in confronting multiple institutional logics”, is an empirical study that explores the strategies that allow organizations facing multiple logics to respond to them more successfully. The third work, “How external support may mitigate the barriers to university-industry collaboration”, is oriented towards practitioners and presents a case study about technology transfer in Italy.

Relevance: 40.00%

Abstract:

Coastal sand dunes are valuable first of all as a defense against storm waves and saltwater intrusion; moreover, these morphological elements constitute a unique transitional ecosystem between the marine and terrestrial environments. Research on dune systems has been a strong component of coastal science since the last century. Nowadays this branch has assumed even more importance for two reasons: on one hand, the emergence of new technologies, especially in remote sensing, has expanded researchers' possibilities; on the other hand, intense urbanization has strongly limited the dunes' ability to develop and has fragmented what remained from the last century. This is particularly true in the Ravenna area, where industrialization, combined with the tourist economy and intense subsidence, has left only a few residual dune ridges still active. In this work, three different foredune ridges along the Ravenna coast were studied with laser scanner technology. This research was not limited to analyzing volumetric or spatial differences, but also sought new methods and new features for monitoring this environment. Moreover, the author planned a series of tests to validate data from the Terrestrial Laser Scanner (TLS), with the additional aim of finalizing a methodology for testing 3D survey accuracy. Data acquired by the TLS were then used, on one hand, to test new applications such as the Digital Shoreline Analysis System (DSAS) and Computational Fluid Dynamics (CFD) and to assess their efficacy in this field; on the other hand, the author used TLS data to look for correlations with meteorological forcing factors linked to sea and wind (Fryberger's method), applying statistical tools such as Principal Component Analysis (PCA).
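As a minimal sketch of how such a correlation analysis might be organized (the variables and toy numbers below are assumptions for illustration, not survey or meteorological data from this work), one can assemble dune volume changes and forcing-factor indices into a matrix and inspect the leading principal components:

```python
import numpy as np

# Toy dataset: rows are survey periods, columns are variables (all values assumed).
# Columns: dune volume change (m^3/m), wind drift potential (Fryberger DP),
#          storm-wave energy index, mean sea-level anomaly (cm).
X = np.array([
    [-4.2, 120.0, 0.8,  3.1],
    [ 1.5,  80.0, 0.2, -1.0],
    [-6.8, 150.0, 1.1,  4.5],
    [ 0.4,  95.0, 0.3,  0.2],
    [-2.9, 110.0, 0.7,  2.0],
])

# Standardize each column, then compute principal components via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)

explained = s**2 / np.sum(s**2)
print("explained variance ratio:", np.round(explained, 3))
print("PC1 loadings on [dV, DP, wave, sea level]:", np.round(Vt[0], 3))
```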

Relevance: 40.00%

Abstract:

The present thesis work was performed in the frame of the ESEO (European Student Earth Orbiter) project. The activities described in this document were carried out in the Microsatellites and Space Microsystems Lab led by Professor Paolo Tortora and in ALMASpace company facilities. The thesis deals with ESEO structural analysis, at system and unit level, and with its verification: after determining the design limit loads to be applied to the spacecraft as an envelope of different launchers' load profiles, a finite element structural analysis was performed on the model of the satellite in order to ensure its capability to withstand the loads encountered during launch; all the analyses were performed according to ESA standards and using the software MSC NASTRAN SIMXPERT. Amplification factors were derived and used to determine the loads to be considered at unit level. In particular, structural analyses were carried out on the GPS unit, the payload developed for ESEO by students of the University of Bologna, and the results were used in the preparation of the GPS payload design definition file. As for the verification phase, a study of the panels and inserts to be used in the spacecraft was performed: different designs were created, exploiting methods to optimize weight and mechanical behavior. The configurations were analyzed and the results compared in order to select the final design.
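A minimal sketch of the load-envelope step described above (the launcher names, load factors, and amplification factor below are placeholders, not the values used in the thesis): for each axis, the design limit load is taken as the worst case over the candidate launchers, and unit-level loads are obtained by applying an amplification factor.

```python
# Envelope of quasi-static load factors over candidate launchers (values assumed).
# Load factors are given in g, as (axial, lateral) pairs.
launcher_loads = {
    "launcher_A": (7.5, 3.0),
    "launcher_B": (8.5, 2.5),
    "launcher_C": (6.0, 4.0),
}

# Design limit load per axis = worst case over all candidate launchers.
axial_limit = max(a for a, _ in launcher_loads.values())
lateral_limit = max(l for _, l in launcher_loads.values())

# Unit-level loads: an amplification factor accounts for dynamic amplification
# between the spacecraft interface and the unit location (factor assumed).
AMPLIFICATION = 1.4
unit_axial = AMPLIFICATION * axial_limit
unit_lateral = AMPLIFICATION * lateral_limit

print(f"system limit loads: axial {axial_limit:.1f} g, lateral {lateral_limit:.1f} g")
print(f"unit design loads:  axial {unit_axial:.1f} g, lateral {unit_lateral:.1f} g")
```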

Relevance: 40.00%

Abstract:

Over time, Twitter has become a fundamental source of news. As a step forward, researchers have tried to analyse whether tweets contain predictive power. In the financial field, a lot of past research has aimed to propose a function that takes as input all the tweets for a particular stock or index s, analyses them, and predicts the price of s. In this work, we take an alternative approach: using stock price and tweet information, we investigate the following questions. 1. Is there any relation between the volume of tweets being generated and the volume of stock being traded? 2. Is there any relation between the sentiment of the tweets and stock prices? 3. What is the structure of the graph that describes the relationships between users?
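A minimal sketch of how the first two questions could be examined (the daily series below are toy numbers and the variable names are assumptions, not this work's dataset): align daily tweet counts, average tweet sentiment, trading volume, and returns, and compute pairwise Pearson correlations.

```python
import numpy as np

# Toy daily series for one stock (all values assumed for illustration).
tweet_count = np.array([120, 340, 210, 560, 300, 150, 480])
tweet_sentiment = np.array([0.10, -0.25, 0.05, -0.40, 0.20, 0.15, -0.30])  # mean polarity
trading_volume = np.array([1.1e6, 2.3e6, 1.4e6, 3.0e6, 1.8e6, 1.2e6, 2.7e6])
daily_return = np.array([0.004, -0.012, 0.001, -0.021, 0.008, 0.006, -0.015])

def pearson(x, y):
    """Pearson correlation coefficient between two 1-D series."""
    return float(np.corrcoef(x, y)[0, 1])

print("tweets vs trading volume: ", round(pearson(tweet_count, trading_volume), 3))
print("sentiment vs daily return:", round(pearson(tweet_sentiment, daily_return), 3))
```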

Relevance: 40.00%

Abstract:

With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. LAY ABSTRACT: Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
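A minimal sketch of the ranking step such an FMEA produces (the failure modes and 1-to-10 scores below are invented examples, not entries from the proposed rating table): each failure mode receives occurrence, severity, and detectability scores, and the risk priority number (RPN) commonly used in FMEA is their product.

```python
# Toy biopharmaceutical-process FMEA entries (failure modes and scores are assumed).
# Each tuple: (failure mode, occurrence, severity, detectability), all on a 1-10 scale.
entries = [
    ("Bioreactor temperature excursion", 3, 8, 4),
    ("Filter integrity failure",         2, 9, 2),
    ("Wrong buffer concentration",       4, 6, 5),
    ("Chromatography column overload",   5, 4, 3),
]

# Risk priority number (RPN) = occurrence x severity x detectability.
ranked = sorted(entries, key=lambda e: e[1] * e[2] * e[3], reverse=True)

for mode, occ, sev, det in ranked:
    print(f"RPN {occ * sev * det:4d}  {mode}  (O={occ}, S={sev}, D={det})")
```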

Relevance: 40.00%

Abstract:

Many industrial solids processes require the production of disperse particles. In industries such as food, personal care, and pharmaceuticals, particle formation is widely used to produce solid products or to separate substances in intermediate process steps. The most important characteristics known to impact the effectiveness of a solid product are purity, size, internal structure, and morphology. These characteristics are essential to maintain optimal operation of subsequent process steps and to obtain the desired high-quality product. This thesis aims to aid in the advancement of particle production technology by (1) investigating the use of a vibrating orifice aerosol generator (VOAG) for collecting data to predict particle attributes, including morphology, size, and internal structure, as a function of processing parameters such as solvent, solution concentration, air flow rate, and initial droplet size, and (2) determining the extent to which uniform droplet evaporation can be a tool to achieve novel particle morphologies, controlled sizes, or internal structures (crystallinity and crystal form). Experimental results for succinic acid, L-serine, and L-glutamic acid suggest that particles of controlled characteristics can indeed be produced by this method. Analysis by scanning electron microscopy (SEM), nanoindentation, and X-ray diffraction (XRD) shows that various sizes, internal structures, and morphologies can be obtained using the VOAG. Furthermore, unique morphologies and unexpected internal structures were achieved for succinic acid, providing an added benefit to particle formation by this method.
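A minimal sketch of the mass balance that links initial droplet size and solution concentration to final particle size in uniform-droplet evaporation (a simple textbook relation assuming complete solvent evaporation and a fully dense spherical particle; the numbers below are illustrative, not measurements from this work):

```python
# Predicted dry-particle diameter from an evaporating solution droplet.
# Mass balance: (pi/6) d_p^3 * rho_p = (pi/6) d_0^3 * C  =>  d_p = d_0 * (C / rho_p)**(1/3)

def particle_diameter_um(droplet_diameter_um, solute_conc_g_per_ml, particle_density_g_per_ml):
    """Dry-particle diameter, assuming complete evaporation and a fully dense sphere."""
    return droplet_diameter_um * (solute_conc_g_per_ml / particle_density_g_per_ml) ** (1.0 / 3.0)

# Illustrative values: a 50 micron droplet of succinic acid solution.
d0 = 50.0    # initial droplet diameter, microns (assumed)
conc = 0.02  # solute concentration, g per mL of solution (assumed)
rho = 1.56   # solid density of succinic acid, g/mL (approximate literature value)

print(f"predicted particle diameter: {particle_diameter_um(d0, conc, rho):.1f} um")
```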

Relevance: 40.00%

Abstract:

Motivation: Array CGH technologies enable the simultaneous measurement of DNA copy number for thousands of sites on a genome. We developed the circular binary segmentation (CBS) algorithm to divide the genome into regions of equal copy number (Olshen {\it et~al}, 2004). The algorithm tests for change-points using a maximal $t$-statistic with a permutation reference distribution to obtain the corresponding $p$-value. The number of computations required for the maximal test statistic is $O(N^2)$, where $N$ is the number of markers. This makes the full permutation approach computationally prohibitive for the newer arrays that contain tens of thousands of markers and highlights the need for a faster algorithm. Results: We present a hybrid approach to obtain the $p$-value of the test statistic in linear time. We also introduce a rule for stopping early when there is strong evidence for the presence of a change. We show through simulations that the hybrid approach provides a substantial gain in speed with only a negligible loss in accuracy and that the stopping rule further increases speed. We also present the analysis of array CGH data from a breast cancer cell line to show the impact of the new approaches on the analysis of real data. Availability: An R (R Development Core Team, 2006) version of the CBS algorithm has been implemented in the ``DNAcopy'' package of the Bioconductor project (Gentleman {\it et~al}, 2004). The proposed hybrid method for the $p$-value is available in version 1.2.1 or higher, and the stopping rule for declaring a change early is available in version 1.5.1 or higher.
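A minimal sketch of the permutation reference distribution for a maximal change-point statistic (a simplified Python illustration, not the DNAcopy implementation: it uses a single-change-point mean-difference statistic on a toy copy-number profile rather than the exact CBS statistic over all segment pairs):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy log-ratio profile: a copy-number gain between markers 40 and 70 (values assumed).
x = rng.normal(0.0, 0.2, 100)
x[40:70] += 0.5

def max_stat(y):
    """Maximal standardized mean-difference over all single change-points (simplified)."""
    n = len(y)
    best = 0.0
    for i in range(5, n - 5):  # require at least 5 markers on each side
        left, right = y[:i], y[i:]
        se = np.sqrt(y.var(ddof=1) * (1 / len(left) + 1 / len(right)))
        best = max(best, abs(left.mean() - right.mean()) / se)
    return best

observed = max_stat(x)

# Permutation reference distribution: shuffle marker order and recompute the statistic.
B = 200
perm = np.array([max_stat(rng.permutation(x)) for _ in range(B)])
p_value = (np.sum(perm >= observed) + 1) / (B + 1)

print(f"observed max statistic: {observed:.2f}, permutation p-value: {p_value:.3f}")
```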

Relevance: 40.00%

Abstract:

Metals price risk management is a key issue related to financial risk in metal markets because of the uncertainty of commodity price fluctuations, exchange rates and interest rate changes, and the resulting large price risk for both metals producers and consumers. Thus, it is taken into account by all participants in metal markets, including metals producers, consumers, merchants, banks, investment funds, speculators and traders. Managing price risk provides stable income for both metals producers and consumers, and so increases the chance that a firm will invest in attractive projects. The purpose of this research is to evaluate risk management strategies in the copper market. The main tools and strategies of price risk management are hedging and other derivatives such as futures contracts, swaps and options contracts. Hedging is a transaction designed to reduce or eliminate price risk. Derivatives are financial instruments whose returns are derived from other financial instruments, and they are commonly used for managing financial risks. Although derivatives have been around in some form for centuries, their growth has accelerated rapidly during the last 20 years. Nowadays, they are widely used by financial institutions, corporations, professional investors, and individuals. This project is focused on the over-the-counter (OTC) market and its products, such as exotic options, particularly Asian options. The first part of the project is a description of basic derivatives and risk management strategies. In addition, this part discusses basic concepts of spot and futures (forward) markets, the benefits and costs of risk management, and the risks and rewards of positions in the derivative markets. The second part considers the valuation of commodity derivatives. In this part, the options pricing model DerivaGem is applied to Asian call and put options on London Metal Exchange (LME) copper, because it is important to understand how Asian options are valued and to compare theoretical values of the options with their observed market values. Predicting future trends of copper prices is important and would be essential to manage market price risk successfully. Therefore, the third part is a discussion of econometric commodity models. Based on this literature review, the fourth part of the project reports the construction and testing of an econometric model designed to forecast the monthly average price of copper on the LME. More specifically, this part aims at showing how LME copper prices can be explained by means of a simultaneous equation structural model (two-stage least squares regression) connecting supply and demand variables. A simultaneous econometric model for the copper industry is built:

$$
\begin{cases}
Q_t^D = e^{-5.0485}\, P_{t-1}^{-0.1868}\, GDP_t^{1.7151}\, e^{0.0158\, IP_t} \\
Q_t^S = e^{-3.0785}\, P_{t-1}^{0.5960}\, T_t^{0.1408}\, P_{OIL(t)}^{-0.1559}\, USDI_t^{1.2432}\, LIBOR_{t-6}^{-0.0561} \\
Q_t^D = Q_t^S
\end{cases}
$$

$$
P_{t-1}^{CU} = e^{-2.5165}\, GDP_t^{2.1910}\, e^{0.0202\, IP_t}\, T_t^{-0.1799}\, P_{OIL(t)}^{0.1991}\, USDI_t^{-1.5881}\, LIBOR_{t-6}^{0.0717}
$$

where $Q_t^D$ and $Q_t^S$ are world demand for and supply of copper at time $t$, respectively. $P_{t-1}$ is the lagged price of copper, which is the focus of the analysis in this part. $GDP_t$ is world gross domestic product at time $t$, which represents aggregate economic activity. In addition, industrial production should be considered here, so global industrial production growth, denoted $IP_t$, is included in the model.
$T_t$ is the time variable, which is a useful proxy for technological change. A proxy variable for the cost of energy in producing copper is the price of oil at time $t$, denoted $P_{OIL(t)}$. $USDI_t$ is the U.S. dollar index at time $t$, which is an important variable for explaining copper supply and copper prices. Finally, $LIBOR_{t-6}$ is the 6-month-lagged one-year London Interbank Offered Rate. Although the model could be applied to other base metal industries, omitted exogenous variables, such as the price of a substitute or a combined variable related to substitute prices, have not been considered in this study. Based on this econometric model and using a Monte Carlo simulation analysis, the probabilities that the monthly average copper prices in 2006 and 2007 will be greater than specific option strike prices are estimated. The final part evaluates risk management strategies, including options strategies, metal swaps and simple options, in relation to the simulation results. Basic options strategies such as bull spreads, bear spreads and butterfly spreads, created using both call and put options for 2006 and 2007, are evaluated. Consequently, each risk management strategy in 2006 and 2007 is analyzed based on the daily data and the price prediction model. As a result, applications stemming from this project include valuing Asian options, developing a copper price prediction model, forecasting and planning, and decision making for price risk management in the copper market.
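As a minimal sketch of the kind of Monte Carlo exercise described above (the price level, drift, volatility, and strike are placeholder values, not estimates from the project, and a simple lognormal price process stands in for the econometric model), the probability that the average copper price over a horizon exceeds a strike, and the corresponding average-price (Asian) call payoff, can be estimated as follows:

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder assumptions for illustration only (not project estimates).
S0 = 6000.0      # current copper price, USD/tonne
mu = 0.05        # annual drift of the price process
sigma = 0.25     # annual volatility
strike = 6500.0  # option strike, USD/tonne
months = 12      # monthly averaging horizon
n_paths = 100_000
dt = 1.0 / 12.0

# Simulate monthly prices under a lognormal (geometric Brownian motion) process.
z = rng.standard_normal((n_paths, months))
log_steps = (mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
prices = S0 * np.exp(np.cumsum(log_steps, axis=1))
avg_price = prices.mean(axis=1)

prob_above_strike = np.mean(avg_price > strike)
asian_call_payoff = np.maximum(avg_price - strike, 0.0)
expected_payoff = asian_call_payoff.mean()  # undiscounted, for simplicity

print(f"P(average price > strike): {prob_above_strike:.3f}")
print(f"expected Asian call payoff: {expected_payoff:.1f} USD/tonne (undiscounted)")
```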

Relevance: 40.00%

Abstract:

The demand for consumer goods in the developing world continues to rise as populations and economies grow. As designers, manufacturers, and consumers look for ways to address this growing demand, many are considering the possibilities of 3D printing. Due to 3D printing's flexibility and relative mobility, it is speculated that 3D printing could help to meet the growing demands of the developing world. While the merits and challenges of distributed manufacturing with 3D printing have been presented, little work has been done to determine the types of products that would be appropriate for such manufacturing. Inspired by the author's two years of Peace Corps service in Tanzania and the need for specialty equipment for various projects during that time, an in-depth literature search is undertaken to better understand and summarize the process and capabilities of 3D printing. Human-centered design considerations are developed to focus on the product desirability, technical feasibility, and financial viability of using 3D printing within Tanzania. Beginning with what Tanzanian consumers desire, many concerns arise regarding the feasibility of creating products of sufficient strength and quality for the demands of developing-world consumers. It is only after these concerns are addressed that the viability of products can be evaluated from an economic perspective. The larger impacts of a product beyond its use are vital in determining how it will affect the social, economic, and environmental well-being of a developing nation such as Tanzania. Thus, technology-specific criteria are necessary for assessing and quantifying the broader impacts that a 3D-printed product can have within its ecosystem, and appropriate criteria are developed for this purpose. Both sets of criteria are then demonstrated and tested while evaluating the desirability, feasibility, viability, and sustainability of printing a piece of equipment required for the author's Peace Corps service: a set of Vernier calipers. Required by science educators throughout the country, specialty equipment such as calipers initially appears to be an ideal candidate for 3D printing, though ultimately the printing of calipers is not recommended due to current restrictions of the technology. By examining the specific challenges and opportunities of the products 3D printing can produce, it can be better determined what place 3D printing will have in manufacturing for the developing world. Furthermore, the considerations outlined in this paper could be adapted for other manufacturing technologies and regions of the world, as human-centered design and sustainability will be critical in determining how to supply the developing world with the consumer goods it demands.