987 results for Product variety
Abstract:
Quinoa (Chenopodium quinoa) is a seed crop native to the Andes that can be used in a variety of food products in a similar manner to cereals. Unlike most plants, quinoa contains protein with a balanced amino acid profile. This makes it an interesting raw material for, e.g., dairy product substitutes, a growing market in Europe and the U.S. Quinoa can, however, have unpleasant off-flavours when processed into formulated products. One means of improving the palatability is seed germination. In addition, the increased activities of hydrolytic enzymes can have a beneficial influence in food processing. In this thesis, the germination pattern of quinoa was studied, and the influence of quinoa malt was evaluated in a model product. Additionally, to explore its potential for dairy-type products, quinoa protein was isolated from an embryo-enriched milling fraction of non-germinated quinoa and tested for functional and gelation properties. Quinoa seeds imbibed water very rapidly, and most seeds showed radicle protrusion after 8-9 h. The α-amylase activity in the starchy perisperm was very low and started to increase only after 24 hours of germination. Proteolytic activity was very high in dry ungerminated seeds and increased slightly over 24 h; a significant fraction of this activity was located in the micropylar endosperm. The incorporation of germinated quinoa in gluten-free bread had no significant effect on the baking properties, owing to the low α-amylase activity. Upon acidification with glucono-δ-lactone, quinoa milk formed a structured gel. The gelation behaviour was further studied using a quinoa protein isolate (QPI) extracted from an embryo-enriched milling fraction. QPI required a heat-denaturation step to form gel structures. The heating pH influenced the properties drastically: heating at pH 10.5 led to a dramatic increase in solubility and emulsifying properties and, upon acidification, the formation of a fine-structured gel with a high storage modulus (G'). Protein heated at pH 8.5 differed very little from the unheated protein in terms of functional properties and formed only a randomly aggregated coagulum with a low G'. Further study of changes over the course of heating showed that the mechanism of heat denaturation and aggregation indeed varied considerably with pH. The large difference in gelation behaviour may be related to the nature of the aggregates formed during heating. To conclude, germination for increased enzyme activities may not be feasible, but the structure-forming properties of quinoa protein could possibly be exploited in dairy-type products.
Abstract:
The primary aim of this thesis is to analyse legal and governance issues in the use of environmental non-product-related process and production methods (NPR-PPMs), particularly those aiming to promote sustainable practices or to protect natural resources. NPR-PPMs have traditionally been thought of as being incompatible with the rules of the World Trade Organization (WTO). However, the issue remains untouched by WTO adjudicatory bodies. One can suggest that WTO adjudicatory bodies may want to leave this issue to the Members, but the analysis of the case law also seems to indicate that the question of the legality of NPR-PPMs has not been brought 'as such' in dispute settlement. This thesis advances the argument that, despite the fact that the legal status of NPR-PPMs remains unsettled, over the last decades adjudicatory bodies have been scrutinising environmental measures based on NPR-PPMs simply as another expression of the regulatory autonomy of the Members. Though NPR-PPMs are regulatory choices associated with a wide range of environmental concerns, trade disputes giving rise to questions about the legality of process-based measures have mainly concerned the protection of marine wildlife (i.e., fishing techniques threatening or affecting animal species). This thesis argues that environmental objectives articulated as NPR-PPMs can indeed qualify as legitimate objectives under both the GATT and the TBT Agreement. However, an important challenge to their compatibility with WTO law relates to aspects associated with arbitrary or unjustifiable discrimination, and procedural issues play an important role in the assessment of discrimination. This thesis also elucidates other important dimensions of the issue from the perspective of global governance. One of the arguments advanced in this thesis is that a comprehensive analysis of environmental NPR-PPMs should consider not only their role in what are regarded as trade barriers (governmental and market-driven), but also their significance for global objectives such as the transition towards a green economy and sustainable patterns of consumption and production.
Abstract:
This paper describes a methodology for detecting anomalies from sequentially observed and potentially noisy data. The proposed approach consists of two main elements: 1) filtering, or assigning a belief or likelihood to each successive measurement based upon our ability to predict it from previous noisy observations; and 2) hedging, or flagging potential anomalies by comparing the current belief against a time-varying and data-adaptive threshold. The threshold is adjusted based on the available feedback from an end user. Our algorithms, which combine universal prediction with recent work on online convex programming, do not require computing posterior distributions given all current observations and involve simple primal-dual parameter updates. At the heart of the proposed approach lie exponential-family models, which can be used in a wide variety of contexts and applications and which yield methods that achieve sublinear per-round regret against both static and slowly varying product distributions with marginals drawn from the same exponential family. Moreover, the regret against static distributions coincides with the minimax value of the corresponding online strongly convex game. We also prove bounds on the number of mistakes made during the hedging step relative to the best offline choice of the threshold with access to all estimated beliefs and feedback signals. We validate the theory on synthetic data drawn from a time-varying distribution over high-dimensional binary vectors, as well as on the Enron email dataset.
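To make the filtering/hedging loop concrete, the following is a minimal sketch for a univariate Gaussian model (one member of the exponential family). The class name, learning rates, and update rules are illustrative assumptions, not the paper's actual algorithm.

import numpy as np

class GaussianAnomalyHedger:
    """Sketch of sequential filtering plus hedging with a data-adaptive threshold."""

    def __init__(self, lr_mean=0.1, lr_threshold=0.05, init_threshold=3.0):
        self.mu = 0.0                  # running mean of the predictive Gaussian
        self.var = 1.0                 # fixed variance kept for simplicity (assumption)
        self.threshold = init_threshold
        self.lr_mean = lr_mean
        self.lr_threshold = lr_threshold

    def step(self, x):
        # Filtering: score the new observation under the current predictive model.
        nll = 0.5 * ((x - self.mu) ** 2) / self.var + 0.5 * np.log(2 * np.pi * self.var)
        # Hedging: flag an anomaly when the surprise exceeds the adaptive threshold.
        flag = nll > self.threshold
        # Online update of the predictive mean (simple stochastic-gradient-style step).
        self.mu += self.lr_mean * (x - self.mu)
        return nll, flag

    def feedback(self, flag, is_anomaly):
        # Adjust the threshold from end-user feedback: raise it after a false alarm,
        # lower it after a missed anomaly (additive, primal-dual-flavoured update).
        if flag and not is_anomaly:
            self.threshold += self.lr_threshold
        elif not flag and is_anomaly:
            self.threshold -= self.lr_threshold

In use, step() would be called on each incoming measurement and feedback() only on the rounds for which the end user provides a label.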
Abstract:
Some luxury goods manufacturers offer limited editions of their products, whereas others market multiple product lines. Researchers have found that reference groups shape consumer evaluations of these product categories. Yet little empirical research has examined how reference groups affect the product line decisions of firms. Indeed, in a field setting it is quite a challenge to isolate reference group effects from contextual effects and correlated effects. In this paper, we propose a parsimonious model that allows us to study how reference groups influence firm behavior and that lends itself to experimental analysis. With the aid of the model, we investigate the behavior of consumers in a laboratory setting where we can focus on the reference group effects after controlling for the contextual and correlated effects. The experimental results show that, in the presence of strong reference group effects, limited editions and multiple products can help improve firms' profits. Furthermore, the trends in the purchase decisions of our participants point to the possibility that they engage in close to two steps of thinking at the outset of the game and then learn through reinforcement mechanisms.
Abstract:
The growth and proliferation of invasive bacteria in engineered systems pose an ongoing problem. While there are a variety of physical and chemical processes to remove and inactivate bacterial pathogens, there are many situations in which these tools are no longer effective or appropriate for the treatment of a microbial target. For example, certain strains of bacteria are becoming resistant to commonly used disinfectants, such as chlorine and UV. Additionally, the overuse of antibiotics has contributed to the spread of antibiotic resistance, and there is concern that wastewater treatment processes are contributing to the spread of antibiotic-resistant bacteria.
Due to the continually evolving nature of bacteria, it is difficult to develop methods for universal bacterial control in a wide range of engineered systems, as many of our treatment processes are static in nature. Moreover, invasive bacteria are present in many natural and engineered systems where the application of broad-acting disinfectants is impractical, because their use may inhibit the original desired bioprocesses. Therefore, to better control the growth of treatment-resistant bacteria and to address limitations of the current disinfection processes, novel tools that are both specific and adaptable need to be developed and characterized.
In this dissertation, two possible biological disinfection processes were investigated for controlling invasive bacteria in engineered systems. First, antisense gene silencing, the sequence-specific use of oligonucleotides to silence gene expression, was investigated. This work was followed by an investigation of bacteriophages (phages), viruses that specifically infect bacteria, in engineered systems.
For the antisense gene silencing work, a computational approach was used to quantify the number of off-targets and to determine the effects of off-targets in prokaryotic organisms. For the organisms of
Regarding the work with phages, the disinfection rates of bacteria in the presence of phages were determined. The disinfection rates of
In addition to determining disinfection rates, the long-term bacterial growth inhibition potential was determined for a variety of phages with both Gram-negative and Gram-positive bacteria. It was determined that, on average, phages can be used to inhibit bacterial growth for up to 24 h, and that this effect was concentration-dependent for various phages at specific time points. Additionally, it was found that a phage cocktail was no more effective at inhibiting bacterial growth over the long term than the best-performing phage in isolation.
Finally, for an industrial application, the use of phages to inhibit invasive
In conclusion, this dissertation improved the current methods for designing antisense gene silencing targets for prokaryotic organisms, and characterized phages from an engineering perspective. First, the current design strategy for antisense targets in prokaryotic organisms was improved through the development of an algorithm that minimized the number of off-targets. For the phage work, a framework was developed to predict the disinfection rates in terms of the initial phage and bacterial concentrations. In addition, the long-term bacterial growth inhibition potential of multiple phages was determined for several bacteria. In regard to the phage application, phages were shown to protect both final product yields and yeast concentrations during fermentation. Taken together, this work suggests that the rational design of phage treatment is possible and further work is needed to expand on this foundation.
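The abstract does not give the functional form of the rate-prediction framework. Purely as an illustration of predicting disinfection from initial phage and bacterial concentrations, the sketch below assumes simple first-order kinetics with an apparent rate constant proportional to the initial multiplicity of infection; the function name, coefficient, and functional form are all assumptions, not the dissertation's fitted model.

import numpy as np

def disinfection_curve(n0_bacteria, p0_phage, k_per_moi=0.05, t_hours=np.linspace(0, 24, 49)):
    """Illustrative first-order disinfection model.

    N(t) = N0 * exp(-k * t), with k assumed proportional to the initial
    multiplicity of infection (MOI = phage / bacteria).
    """
    moi = p0_phage / n0_bacteria
    k = k_per_moi * moi                       # apparent rate constant, 1/h (assumed)
    survivors = n0_bacteria * np.exp(-k * t_hours)
    log_reduction = np.log10(n0_bacteria / survivors)
    return t_hours, survivors, log_reduction

# Example: 1e6 CFU/mL bacteria challenged with 1e7 PFU/mL phage (MOI = 10).
t, n, lr = disinfection_curve(1e6, 1e7)
print(f"Log10 reduction after 24 h: {lr[-1]:.2f}")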
Abstract:
The central product of the DRAMA (Dynamic Re-Allocation of Meshes for parallel Finite Element Applications) project is a library comprising a variety of tools for dynamic re-partitioning of unstructured Finite Element (FE) applications. The input to the DRAMA library is the computational mesh, and corresponding costs, partitioned into sub-domains. The core library functions then perform a parallel computation of a mesh re-allocation that will re-balance the costs based on the DRAMA cost model. We discuss the basic features of this cost model, which allows a general approach to load identification, modelling and imbalance minimisation. Results from crash simulations are presented that show the necessity for multi-phase/multi-constraint partitioning components.
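As a rough illustration of load identification and imbalance minimisation, the sketch below sums per-element costs for each sub-domain and computes an imbalance ratio. The simplified cost model and the function names are assumptions, not the DRAMA library's API.

def subdomain_costs(element_costs, partition):
    """Sum per-element computational costs for each sub-domain.

    element_costs: per-element cost estimates (e.g. calibrated timings)
    partition:     sub-domain id assigned to each element
    """
    costs = {}
    for cost, part in zip(element_costs, partition):
        costs[part] = costs.get(part, 0.0) + cost
    return costs

def imbalance(costs):
    """Imbalance ratio: maximum sub-domain cost over the average (1.0 is perfect balance)."""
    values = list(costs.values())
    return max(values) / (sum(values) / len(values))

# A re-partitioner would move elements between sub-domains until imbalance(costs)
# falls below a user tolerance, while also limiting the volume of migrated data.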
Abstract:
Reliability of electronic parts is a major concern for many manufacturers, since early failures in the field can cost an enormous amount to repair - in many cases far more than the original cost of the product. A great deal of effort is expended by manufacturers to determine the failure rates for a process or the fraction of parts that will fail in a given period of time. It is widely recognized that the traditional approach to reliability predictions for electronic systems is not suitable for today's products. This approach, based on statistical methods only, does not address the physics governing the failure mechanisms in electronic systems. This paper discusses virtual prototyping technologies that can predict the physics taking place and relate it to the appropriate failure mechanisms. Simulation results illustrate the effect of temperature on the assembly process of an electronic package and on the lifetime of a flip-chip package.
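The paper's virtual prototyping relies on detailed finite-element simulation; purely to illustrate how a physics quantity (plastic strain accumulated during thermal cycling) maps onto a lifetime estimate, here is a minimal Coffin-Manson-style calculation with assumed fatigue constants, not the paper's actual model.

def cycles_to_failure(delta_eps_plastic, coeff=0.5, exponent=2.0):
    """Coffin-Manson style estimate: Nf = coeff * delta_eps_plastic ** (-exponent).

    delta_eps_plastic: plastic strain range per thermal cycle (e.g. from FE simulation)
    coeff, exponent:   material fatigue constants (illustrative assumed values)
    """
    return coeff * delta_eps_plastic ** (-exponent)

# Example: a flip-chip solder joint accumulating 1% plastic strain per cycle.
print(f"Estimated cycles to failure: {cycles_to_failure(0.01):.0f}")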
Abstract:
The electronics industry is developing rapidly, and with it the problems associated with the cooling of microelectronic equipment. Thermal engineers now find it necessary to consider the complex area of equipment cooling at some level. This continually growing industry also faces heightened pressure from consumers for electronic product miniaturization, which in itself increases the demand for accurate thermal management predictions to assure product reliability. Computational fluid dynamics (CFD) is considered a powerful and almost essential tool for the design, development and optimization of engineering applications, and it is now widely used within the electronics packaging design community to thermally characterize the performance of both the electronic component and the system environment. This paper discusses CFD results for a large variety of investigated turbulence models. Comparison against experimental data illustrates the predictive accuracy of currently used models and highlights the growing demand for greater mathematical modelling accuracy with regard to thermal characterization. A newly formulated low-Reynolds-number (i.e. transitional) turbulence model is also proposed, with emphasis on hybrid techniques.
Abstract:
Based on the IMP research tradition, this paper regards relationships and networks as key issues in the product development and supply management agenda. Within business networks, co-development can only be analysed when emphasis is placed on interdependences and interactive relationships. Co-development usually implies close relationships that allow companies to rely on each other's resources. Close relationships imply interdependences, which may improve companies' technical and product development. By looking at the actual interactions - between a UK company and its Chinese suppliers - that led to an innovative solution and a successful product launch, evolving relationship patterns are identified and analysed in a case study. Both the literature review and the case study findings highlight the importance of the 'guanxi' concept (meaning interpersonal relationships in Mandarin) when analysing business-to-business networks in China. Hence, it is suggested that guanxi-based thinking and acting should be incorporated into the interaction model when considering business networking that embraces China. 'Guanxi' broadens the validity of the interaction model in terms of geographical proximity and deepens its theoretical base. The case study provides valuable insights for supply management in a product development context in China. In practice, the main point of interest is that Chinese suppliers are important 'resource' providers as well as 'network' providers. Hence, it is suggested that guanxi practice should be reflected in theoretical developments.
Abstract:
Western manufacturing companies are developing innovative ways of delivering value that compete with the low-cost paradigm. One such strategy is to deliver not only products, but systems that are closely aligned with the customer value proposition. These systems comprise integrated products and services and are referred to as Product-Service Systems (PSS). A key challenge in PSS is supporting the design activity. In one sense, PSS design is a further extension of concurrent engineering that requires front-end input from the additional downstream sources of product service and maintenance. However, simply developing products and service packages is not sufficient: the new design challenge is the integrated system. This paper describes the development of a PSS data structure that can support this integrated design activity. The data structure is implemented in a knowledge base using the Protégé knowledge base editor.
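The paper implements the data structure as a knowledge base in Protégé; the fragment below is only a schematic Python analogue showing how products, services, and their integration into a PSS might be related. The class and field names are assumptions for illustration, not the paper's ontology.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Product:
    name: str
    components: List[str] = field(default_factory=list)    # physical bill of materials

@dataclass
class Service:
    name: str
    activities: List[str] = field(default_factory=list)    # e.g. maintenance, monitoring

@dataclass
class ProductServiceSystem:
    value_proposition: str             # the customer outcome the integrated system delivers
    products: List[Product] = field(default_factory=list)
    services: List[Service] = field(default_factory=list)

# Example: a compressed-air PSS sold as guaranteed availability rather than as hardware.
pss = ProductServiceSystem(
    value_proposition="Guaranteed compressed-air availability",
    products=[Product("Air compressor", ["motor", "compressor unit"])],
    services=[Service("Preventive maintenance", ["inspection", "part replacement"])],
)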
Abstract:
This paper presents novel collaboration methods implemented using a centralized client/server product development integration architecture, and a decentralized peer-to-peer network for smaller and larger companies using open-source solutions. The product development integration architecture has been developed for the integration of disparate technologies and software systems for the benefit of collaborative work teams in design and manufacturing. This will facilitate the communication of early design and product development within a distributed and collaborative environment. The novelty of this work is the introduction of an 'out-of-box' concept, which provides a standard framework and deploys it using a proprietary state-of-the-art product lifecycle management (PLM) system. The term 'out-of-box' means modifying the product development and business processes to suit the technologies rather than vice versa. The key business benefits of adopting such an approach are a rapidly reconfigurable network and minimal requirements for software customization, which helps avoid system instability.
Abstract:
Today, the key to commercial success in manufacturing is the timely development of new products that are not only functionally fit for purpose but offer high performance and quality throughout their entire lifecycle. In principle, this demands the introduction of a fully developed and optimised product from the outset. To accomplish this, manufacturing companies must leverage existing knowledge in their current technical, manufacturing and service capabilities. This is especially true in the field of tolerance selection and application, the subject area of this research. Tolerance knowledge must be readily available and deployed as an integral part of the product development process. This paper describes a methodology and framework, currently under development in a UK manufacturer, to achieve this objective.