947 results for Design Principles


Relevance: 30.00%

Abstract:

Developmental evaluation (DE) is an evaluation approach that aims to support the development of an innovation (Patton, 1994, 2011). This aim is achieved by supporting clients’ information needs through evaluative inquiry as they work to develop and refine the innovation. While core concepts and principles are beginning to be articulated and refined, challenges remain as to how to focus a developmental evaluation beyond the knowledge frameworks most immediate to clients in order to support innovation development. Anchoring a DE in knowledge frameworks other than those of the clients might direct attention to issues not yet obvious to clients but which, if attended to, might further the goal of supporting innovation development. Drawing concepts and practices from the field of design may be one avenue by which to inform developmental evaluation in achieving its aim. Through a case study methodology, this research seeks to understand the nuances of operationalizing the guiding principles of DE and to investigate the utility, feasibility, and consequences of integrating design concepts and practices into developmental evaluation (design-informed developmental evaluation, “DI-DE”). It does so by documenting the efforts of a design-informed developmental evaluator and a task force of educators and researchers in a Faculty of Education as they work to develop a graduate-level education program. A systematic review of the purposeful efforts made to introduce DI-DE thinking into task force deliberations, and an analysis of the responses to and consequences of those efforts, shed light on what it meant to practice DI-DE. As a whole, this research on evaluation is intended to further contemporary thinking about the closely coupled relationship between program development and evaluation in complex and dynamic environments.

Relevance: 30.00%

Abstract:

The importance of political parties for contemporary representative democracies is beyond dispute. Despite their significance for state-level democracy, political parties continue to be regarded as oligarchical and to be criticised for their internal practices. For this reason, intra-party democracy (IPD) warrants in-depth analysis. This thesis investigates IPD in Turkey, primarily from the perspective of participatory democracy, with the purpose of suggesting reforms to the Turkish Political Parties Law (TPPL). Turkish political parties and Turkish party regulation provide an interesting case because there is a significant difference between mature democracies and Turkey regarding IPD regulation. IPD in established democracies has always been regarded as a private concern of parties and has been left unregulated. IPD in Turkey, by contrast, is provided for both by the constitution and by the TPPL. Although IPD is a constitutional and legal requirement in Turkey, political parties in fact display a high level of non-democratic administration. The main reason is that the TPPL only pays lip service to the idea of IPD and requires no specific measures apart from establishing a party congress with a representative form of democracy. By establishing and holding party congresses, political parties are perceived as conforming to the requirements of IPD under the law. In addition, the contested nature of democracy as a concept has impeded the creation of efficacious legal principles. Thus, the existing party law fails to tackle the lack of IPD within political parties and, for this reason, is in need of reform. Furthermore, almost every Turkish party’s own constitution highlights the importance of IPD and promises it. However, such declared commitments alone, especially in countries where the democratic culture is weak, are unlikely to make much difference in practice. Accordingly, external regulation is necessary to ensure the protection of the rights and interests of party members with regard to their participation in intra-party decision-making processes. Nevertheless, in spite of a general consensus in favour of reforming the TPPL, there is no consensus as to what kind of reforms should be adopted. This thesis proposes that reforming the TPPL in line with an approach based on participatory democracy could provide better IPD within Turkish political parties, citing as evidence comparative case studies of participatory practices for policy-making, leadership selection and candidate selection in mature democracies. This thesis also analyses membership registration and the effect of state funding on IPD, both of which are highly problematic in Turkey and represent impediments to the flourishing of IPD.

Relevance: 30.00%

Abstract:

In this thesis, novel analog-to-digital and digital-to-analog generalized time-interleaved variable bandpass sigma-delta modulators are designed, analysed, evaluated and implemented that are suitable for high-performance data conversion across a broad spectrum of applications. These generalized time-interleaved variable bandpass sigma-delta modulators can perform noise-shaping for any centre frequency from DC to Nyquist. The proposed topologies are well suited to Butterworth, Chebyshev, inverse-Chebyshev and elliptical filters, and give designers the flexibility to specify the centre frequency and bandwidth as well as the passband and stopband attenuation parameters. The application of the time-interleaving approach, in combination with these bandpass loop-filters, not only overcomes the limitations associated with conventional and mid-band resonator-based bandpass sigma-delta modulators, but also offers an elegant means of increasing the conversion bandwidth, thereby relaxing the need for faster or higher-order sigma-delta modulators. A step-by-step design technique has been developed for the design of time-interleaved variable bandpass sigma-delta modulators. Using this technique, an assortment of lower- and higher-order single- and multi-path generalized A/D variable bandpass sigma-delta modulators were designed, evaluated and compared in terms of their signal-to-noise ratios, hardware complexity, stability, tonality and sensitivity for ideal and non-ideal topologies. Extensive behavioural-level simulations verified that one of the proposed topologies not only used fewer coefficients but also exhibited greater robustness to non-idealities. Furthermore, second-, fourth- and sixth-order single- and multi-path digital variable bandpass sigma-delta modulators were designed using this technique. The mathematical modelling and evaluation of tones caused by the finite wordlengths of these digital multi-path sigma-delta modulators, when excited by sinusoidal input signals, are also derived from first principles and verified using simulation and experimental results. The fourth-order digital variable bandpass sigma-delta modulator topologies were implemented in VHDL and synthesized on a Xilinx® Spartan™-3 Development Kit using fixed-point arithmetic. Circuit outputs were taken via the RS232 connection provided on the FPGA board and evaluated using MATLAB routines developed by the author, which also included the decimation process. The experiments undertaken by the author further validated the design methodology presented in this work. In addition, a novel tunable and reconfigurable second-order variable bandpass sigma-delta modulator has been designed and evaluated at the behavioural level. This topology offers a flexible set of choices for designers and can operate in either single- or dual-mode, enabling multi-band implementations on a single digital variable bandpass sigma-delta modulator. This work is also supported by a novel user-friendly design and evaluation tool, developed in MATLAB/Simulink, that can speed up the design, evaluation and comparison of analog and digital single-stage and time-interleaved variable bandpass sigma-delta modulators. The tool enables the user to specify the conversion type, topology, loop-filter type, path number and oversampling ratio.
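As a rough behavioural-level illustration of the kind of simulation described above, the following sketch (Python with NumPy; the thesis itself used MATLAB/Simulink and VHDL) simulates a plain second-order lowpass sigma-delta modulator and estimates its in-band SNR. The gains, input amplitude and SNR estimator are illustrative assumptions, not the thesis topologies, which generalize the loop filter to time-interleaved variable bandpass forms.

```python
import numpy as np

# Behavioural sketch of a plain second-order lowpass sigma-delta
# modulator with a 1-bit quantizer (illustrative gains; the thesis
# topologies replace the integrators with variable bandpass
# resonator-based loop filters).
def sdm2(u, g1=0.5, g2=0.5):
    x1 = x2 = 0.0
    v = np.empty_like(u)
    for i, un in enumerate(u):
        y = 1.0 if x2 >= 0 else -1.0   # 1-bit quantizer decision
        v[i] = y
        x1 += g1 * (un - y)            # first integrator
        x2 += g2 * (x1 - y)            # second integrator
    return v

osr, n = 128, 1 << 16                  # oversampling ratio, samples
f_sig = 0.5 / (2 * osr)                # test tone inside the band
u = 0.5 * np.sin(2 * np.pi * f_sig * np.arange(n))
v = sdm2(u)

# In-band SNR estimate from the windowed output spectrum
spec = np.abs(np.fft.rfft(v * np.hanning(n))) ** 2
band = n // (2 * osr)                  # bins up to fs / (2 * OSR)
k = int(round(f_sig * n))
sig = spec[k - 2:k + 3].sum()          # main lobe of the tone
noise = spec[1:band].sum() - sig
print("in-band SNR ~ %.1f dB" % (10 * np.log10(sig / noise)))
```

Replacing the two integrators with resonators tuned to the desired centre frequency is what turns such a lowpass prototype into a bandpass noise shaper.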

Relevance: 30.00%

Abstract:

It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologists with access to sophisticated computers and software that can often perform complex calculations previously the responsibility of engineers. Not surprisingly, this development raises serious questions about the future role of engineers and the education needed to address these changes in technology, as well as emerging priorities ranging from societal to environmental challenges. In response to these challenges, a new design module was created for undergraduate engineering students to design and build temporary shelters for a wide variety of end users, from refugees to the homeless and children. Even though the module provided guidance on principles of design thinking and methods for observing users' needs through field studies, the students found it difficult to respond to the needs of specific end users and instead focused on purely technical issues.

Relevance: 30.00%

Abstract:

The paper describes the design and implementation of a novel low-cost virtual rugby decision-making interactive for use in a visitor centre. Original laboratory-based experimental work on decision making in rugby, using a virtual reality headset [1], is adapted for use in a public visitor centre, with consideration given to usability, cost, practicality, and health and safety. The movement of professional rugby players was captured and animated within a virtually recreated stadium. Users then interact with these virtual representations via a low-cost sensor (Microsoft Kinect) and attempt to block them. Retaining the principles of perception and action, egocentric viewpoint, immersion, sense of presence, representative design and game design, the system delivers an engaging and effective interactive that illustrates the underlying scientific principles of deceptive movement. User testing highlighted the need for usability, system robustness, fair and accurate scoring, an appropriate level of difficulty and enjoyment.
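The paper does not publish its implementation, but the core interaction test for such an exhibit can be sketched as follows (Python with NumPy; the joint layout and threshold are invented assumptions): a block is scored when a tracked hand comes within reach of the virtual player.

```python
import numpy as np

# Hypothetical sketch of the core interaction test for a Kinect-style
# blocking exhibit: a block succeeds if any tracked hand joint is
# within `reach` metres of the virtual runner at the moment he passes.
# Joint positions and the threshold are illustrative assumptions.
def is_block(hand_positions, runner_position, reach=0.35):
    """hand_positions: (n_joints, 3) tracked hand joints in metres;
    runner_position: (3,) position of the virtual player's torso."""
    d = np.linalg.norm(hand_positions - runner_position, axis=1)
    return bool((d < reach).any())

hands = np.array([[0.1, 1.2, 2.0], [-0.2, 1.0, 2.1]])  # left, right hand
runner = np.array([0.0, 1.1, 2.2])
print(is_block(hands, runner))  # True: a hand is within 0.35 m
```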

Relevance: 30.00%

Abstract:

This paper aims to crystallize recent research performed at the University of Worcester investigating the feasibility of using the commercial game engine ‘Unreal Tournament 2004’ (UT2004) to produce ‘Educational Immersive Environments’ (EIEs) suitable for education and training. Our research has been supported by the UK Higher Education Academy. We discuss both practical and theoretical aspects of EIEs. The practical aspects include the production of EIEs to support high school physics education, the education of architects, and the learning of literacy by primary school children. This research is based on the development of our novel instructional medium, ‘UnrealPowerPoint’. Our fundamental guiding principles are, first, that pedagogy must inform technology and, second, that both teachers and pupils should be empowered to produce educational materials. Our work is informed by current educational theories such as constructivism, experiential learning and socio-cultural approaches, as well as by elements of instructional design and game principles.

Relevance: 30.00%

Abstract:

Products and services explicitly intended to influence users’ behaviour are increasingly being proposed to reduce environmental impact and for other areas of social benefit. Designing such interventions often involves adopting and adapting principles from other contexts where behaviour change has been studied. The ‘design pattern’ form, used in software engineering and HCI, and originally developed in architecture, offers benefits for this transposition process. This article introduces the Design with Intent toolkit, an idea generation method using a design pattern form to help designers address sustainable behaviour problems. The article also reports on exploratory workshops in which participants used the toolkit to generate concepts for redesigning everyday products—kettles, curtains, printers and bathroom sinks/taps—to reduce the environmental impact of use. The concepts are discussed, along with observations of how the toolkit was used by participants, suggesting usability improvements to incorporate in future versions.
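To make the pattern form concrete, here is a minimal sketch (Python; the field names are assumptions loosely echoing the toolkit's lens-based grouping, not its published schema) of how a behaviour-change design pattern might be recorded for idea generation:

```python
from dataclasses import dataclass

# Illustrative record for a behaviour-change design pattern, loosely
# in the architecture/HCI pattern style (name, problem, solution,
# examples). Field names are assumptions, not the DwI toolkit schema.
@dataclass
class DesignPattern:
    name: str
    lens: str          # grouping, e.g. "errorproofing", "persuasive"
    problem: str       # the behaviour problem the pattern addresses
    solution: str      # the design move that influences behaviour
    examples: tuple    # existing products embodying the pattern

kettle_feedback = DesignPattern(
    name="Real-time feedback",
    lens="persuasive",
    problem="Users overfill kettles, wasting energy on every boil",
    solution="Show the energy or cost of the chosen fill level before use",
    examples=("smart meters", "eco-driving displays"),
)
print(kettle_feedback.name, "->", kettle_feedback.solution)
```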

Relevance: 30.00%

Abstract:

Variability management is one of the major challenges in software product line adoption, since variability needs to be efficiently managed at various levels of the software product line development process (e.g., requirement analysis, design, implementation, etc.). One of the main challenges within variability management is the handling and effective visualization of large-scale (industry-size) models, which in many projects can number in the thousands of variability points, along with the dependency relationships that exist among them. These have raised many concerns regarding the scalability of current variability management tools and techniques and their lack of industrial adoption. To address the scalability issues, this work employed a combination of quantitative and qualitative research methods to identify the reasons behind the limited scalability of existing variability management tools and techniques. In addition to producing a comprehensive catalogue of existing tools, the outcome from this stage helped identify the major limitations of existing tools. Based on the findings, a novel approach to managing variability was created that employs two main principles to support scalability, as sketched below. First, the separation-of-concerns principle was employed by creating multiple views of variability models to alleviate information overload. Second, hyperbolic trees were used to visualise the models (compared with the Euclidean-space trees traditionally used). The result was an approach that can represent models encompassing hundreds of variability points and complex relationships. These concepts were demonstrated by implementing them in an existing variability management tool and using it to model a real-life product line with over a thousand variability points. Finally, in order to assess the work, an evaluation framework was designed based on established usability assessment best practices and standards. The framework was then used in several case studies to benchmark the performance of this work against other existing tools.
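A minimal sketch of the separation-of-concerns idea, assuming a toy feature-model representation (all names hypothetical): the same underlying variability model is rendered as several smaller, concern-specific views instead of one monolithic tree.

```python
# Minimal sketch of per-concern views over one variability model.
# The model and concern tags are hypothetical; real SPL tools attach
# far richer metadata (cardinalities, cross-tree constraints, etc.).
model = {
    "engine":      {"parent": "car", "concerns": {"hardware"}},
    "gearbox":     {"parent": "car", "concerns": {"hardware"}},
    "cruise_ctrl": {"parent": "car", "concerns": {"software"}},
    "lane_assist": {"parent": "car", "concerns": {"software", "safety"}},
    "airbag":      {"parent": "car", "concerns": {"hardware", "safety"}},
}

def view(model, concern):
    """Return only the variability points relevant to one concern."""
    return [f for f, meta in model.items() if concern in meta["concerns"]]

print(view(model, "safety"))   # ['lane_assist', 'airbag']
```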

Relevance: 30.00%

Abstract:

Once the preserve of university academics and research laboratories with high-powered and expensive computers, the power of sophisticated mathematical fire models has now arrived on the desktop of the fire safety engineer. It is a revolution made possible by parallel advances in PC technology and fire modelling software. But while the tools have proliferated, there has not been a corresponding transfer of knowledge and understanding of the discipline from expert to general user. It is a serious shortfall of which the lack of suitable engineering courses dealing with the subject is symptomatic, if not the cause. The computational vehicles to run the models and an understanding of fire dynamics are not enough to exploit these sophisticated tools. Too often, they become 'black boxes' producing magic answers in exciting three-dimensional colour graphics and client-satisfying 'virtual reality' imagery. As well as a fundamental understanding of the physics and chemistry of fire, the fire safety engineer must have at least a rudimentary understanding of the theoretical basis supporting fire models in order to appreciate their limitations and capabilities. The five-day short course, "Principles and Practice of Fire Modelling", run by the University of Greenwich, attempts to bridge the divide between the expert and the general user, providing participants with the expertise they need to understand the results of mathematical fire modelling. The course and the associated textbook, "Mathematical Modelling of Fire Phenomena", are aimed at students and professionals with wide and varied backgrounds; they offer a friendly guide through the unfamiliar terrain of mathematical modelling. Concepts and techniques are introduced and demonstrated in seminars, and those attending also gain experience in using the methods during hands-on tutorial and workshop sessions. On completion of this short course, participants should:

- be familiar with the concept of zone and field modelling;
- be familiar with zone and field model assumptions;
- have an understanding of the capabilities and limitations of modelling software packages for zone and field modelling;
- be able to select and use the most appropriate mathematical software and demonstrate its use in compartment fire applications; and
- be able to interpret model predictions.

The result is that the fire safety engineer is empowered to realise the full value of mathematical models in predicting fire development and determining the consequences of fire under a variety of conditions. This in turn enables him or her to design and implement safety measures which can potentially control, or at the very least reduce, the impact of fire.
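As a toy illustration of the zone-modelling idea named in the learning outcomes, the sketch below (plain Python; all parameters are invented, and real zone models solve far more complete conservation equations for layer height, vent flows and species) integrates a single energy balance for a well-mixed hot gas layer:

```python
# Toy single-zone energy balance: the hot gas layer is assumed well
# mixed at one temperature T, heated by the fire (Q) and losing heat
# to the walls. Real zone models also track layer height, mass flows
# through vents, species concentrations, etc. All values are invented.
Q = 500e3              # heat release rate into the layer, W
h, A = 15.0, 60.0      # wall heat transfer coeff (W/m^2 K) and area (m^2)
m, cp = 120.0, 1005.0  # layer gas mass (kg) and specific heat (J/kg K)
T_amb, dt = 293.0, 0.1 # ambient temperature (K) and time step (s)

T = T_amb
for _ in range(int(600 / dt)):         # 10 minutes of fire
    dTdt = (Q - h * A * (T - T_amb)) / (m * cp)
    T += dTdt * dt
print("hot layer temperature after 10 min: %.0f K" % T)
```

A field (CFD) model, by contrast, would resolve the same energy balance on thousands of computational cells rather than one well-mixed zone.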

Relevance: 30.00%

Abstract:

This design-research thesis suggests improving North East Street's performance by applying Complete Streets, Green Street, Place Making and Context Sensitive Solutions principles and practices. Heavily used by a variety of users, often in conflict with one another, University of Maryland Campus Drive would benefit from a major planning and design amelioration to meet the increasing demands of serving as a city main street. The goal of this thesis project is to prioritize the benefits for pedestrians in the right-of-way and improve the pedestrian experience. This goal also responds to the recent North East Street Extension Phase I economic renaissance. The goal of this design-research thesis will be achieved by focusing on four aspects. First, the plans and designs will suggest building mixed-use blocks, increasing the diversity of street economic types and the convenience of people's daily lives. Second, the plans and designs will propose bike lanes, separate driving lanes from sidewalks and bike lanes with street tree planters, and narrow driving lanes to reduce vehicular traffic volume and speed, thereby reducing pedestrian and vehicle conflicts. Third, the plans and designs will introduce bioswales, living walls and raingardens to treat and reuse rainwater. Finally, the plans and designs will seek to preserve local culture and history by adding murals and a farmers market. The outcome of the design-research thesis project is expected to serve as an example of implementing Complete Streets, Green Street, Place Making and Context Sensitive Solutions principles and practices in an urban landscape where transportation, environmental and social needs interact with one another.

Relevance: 30.00%

Abstract:

This thesis presents an investigation of endoscopic optical coherence tomography (OCT). As a noninvasive imaging modality, OCT has emerged as an increasingly important diagnostic tool for many clinical applications. Despite its many merits, such as high resolution and depth resolvability, a major limitation is the relatively shallow penetration depth in tissue (about 2∼3 mm), mainly due to tissue scattering and absorption. To overcome this limitation, many different endoscopic OCT systems have been developed. By utilizing a minimally invasive endoscope, the OCT probing beam can be brought into the close vicinity of the tissue of interest, bypassing the scattering of intervening tissues, so that it can collect the reflected light signal from the desired depth and provide a clear image of the physiological structure of the region, which cannot be resolved by traditional OCT. In this thesis, three endoscope designs are studied. While they rely on vastly different principles, they all converge on solving this long-standing problem.

A hand-held endoscope with manual scanning is explored first. When a user holds a hand-held endoscope to examine samples, the movement of the device provides a natural scan. We proposed and implemented an optical tracking system to estimate and record the trajectory of the device. By registering each OCT axial scan with the spatial information obtained from the tracking system, one can use this system to simply 'paint' a desired volume and obtain an arbitrary scanning pattern by manually waving the endoscope over the region of interest. The accuracy of the tracking system was measured to be about 10 microns, which is comparable to the lateral resolution of most OCT systems. Phantom and biological samples were manually scanned, and the reconstructed images verified the method.
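A minimal sketch of the registration step described above (names and grid parameters are invented, and a real implementation must also handle probe orientation): each tracked A-scan is accumulated into a common voxel grid at its recorded (x, y) position.

```python
import numpy as np

# Illustrative registration of manually scanned OCT A-scans into a
# regular volume, assuming the tracker reports an (x, y) position for
# each axial scan. Grid size and pitch are invented for the sketch.
nx, ny, nz, pitch = 200, 200, 512, 10e-6   # voxels and 10 um pitch

def reconstruct(positions, ascans):
    """positions: (n, 2) tracked xy in metres; ascans: (n, nz)."""
    vol = np.zeros((nx, ny, nz))
    hits = np.zeros((nx, ny, 1))
    ij = np.round(positions / pitch).astype(int)
    for (i, j), a in zip(ij, ascans):
        if 0 <= i < nx and 0 <= j < ny:
            vol[i, j] += a            # accumulate overlapping scans
            hits[i, j] += 1
    return vol / np.maximum(hits, 1)  # average where revisited

pos = np.random.rand(5000, 2) * nx * pitch   # fake freehand trajectory
scans = np.random.rand(5000, nz)             # fake A-scan intensities
volume = reconstruct(pos, scans)
```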

Next, we investigated a mechanical way to steer the beam in an OCT endoscope, termed paired-angle-rotation scanning (PARS). This concept was proposed by my colleague, and we further developed the technology by enhancing the longevity of the device, reducing the diameter of the probe, and shrinking the form factor of the hand-piece. Several families of probes with various optical performances have been designed and fabricated. They have been applied to different applications, including collector channel examination for glaucoma stent implantation and vitreous remnant detection during live animal vitrectomy.

Lastly, a novel non-moving scanning method was devised. This approach is based on the electro-optic (EO) effect of a KTN crystal. With Ohmic contact at the electrodes, the KTN crystal can exhibit a special mode of the EO effect, termed the space-charge-controlled electro-optic effect, in which carrier electrons are injected into the material via the Ohmic contact. By applying a high voltage across the material, a linear phase profile can be built up in this mode, which in turn deflects the light beam passing through. We constructed a relay telescope to adapt the KTN deflector to a bench-top OCT scanning system. One of the major technical challenges for this system is the strong chromatic dispersion of the KTN crystal within the wavelength band of the OCT system. We investigated its impact on the acquired OCT images and proposed a new approach to estimate and compensate for the actual dispersion. Compared with traditional methods, the new method is more computationally efficient and accurate. Several biological samples were scanned with this KTN-based system, and the acquired images demonstrated the feasibility of using the system in an endoscopy setting. Above all, my research aims to provide solutions for implementing an OCT endoscope. As the technology evolves from manual, to mechanical, to electrical approaches, different solutions are presented. Since each has its own advantages and disadvantages, one has to determine the actual requirements and select the best fit for a specific application.
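The thesis's specific estimator is not reproduced here, but the sketch below shows the standard numerical form that OCT dispersion compensation usually takes (Python with NumPy; the polynomial phase model, sharpness metric and brute-force search are generic assumptions): the complex spectral fringe is multiplied by a correction phase, and the coefficients that maximize A-scan sharpness are kept.

```python
import numpy as np

# Generic numerical dispersion compensation for FD-OCT (a sketch of
# the standard technique, not the thesis's specific estimator): apply
# a polynomial spectral phase and keep the coefficients that minimize
# the entropy (i.e. maximize the sharpness) of the A-scan.
def compensate(fringe, k, a2, a3):
    dk = k - k.mean()
    return fringe * np.exp(-1j * (a2 * dk**2 + a3 * dk**3))

def entropy(ascan):
    p = np.abs(ascan) ** 2
    p /= p.sum()
    return -(p * np.log(p + 1e-12)).sum()   # lower = sharper image

def estimate(fringe, k, a2_grid, a3_grid):
    best = min((entropy(np.fft.ifft(compensate(fringe, k, a2, a3))), a2, a3)
               for a2 in a2_grid for a3 in a3_grid)
    return best[1], best[2]

# Synthetic check: a single reflector with known quadratic dispersion
k = np.linspace(-1.0, 1.0, 2048)
fringe = np.exp(1j * (2 * np.pi * 300 * k + 40.0 * k**2))
a2, a3 = estimate(fringe, k, np.linspace(0, 80, 81), [0.0])
print("estimated a2:", a2)   # recovers the planted value of 40
```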

Relevance: 30.00%

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past five years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single-cell functional proteomics, focusing on the development of single-cell barcode chips (SCBCs), with applications in fundamental and translational cancer research.

The microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells is discussed first; it is the prototype for subsequent proteomic microchips with more sophisticated designs for preclinical cancer research or clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode), with between a few hundred and ten thousand microchambers included within a single microchip. Functional proteomics assays at single-cell resolution yield unique pieces of information that significantly shape thinking in cancer research. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.

The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We demonstrate this point by applying SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).

The cancer cell population is highly heterogeneous, with high-amplitude fluctuations at the single-cell level, which in turn grant robustness to the entire population. The concept of a stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics. Thus, tools derived from that field can be applied, using fluctuations to determine the nature of signaling networks. In the second part of the thesis, we focus on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays, coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics, yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
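A minimal sketch of the statistical-mechanics step, assuming the usual linear-response reading of a quantitative Le Chatelier principle (the shift in mean protein levels under a small perturbation is predicted by the measured covariance matrix); the data and perturbation below are invented placeholders:

```python
import numpy as np

# Hedged sketch of a quantitative Le Chatelier prediction: in linear
# response, a small perturbation lam (conjugate to each protein)
# shifts the mean protein levels by cov @ lam. The data here are
# random placeholders; in the thesis the covariance comes from
# single-cell SCBC measurements.
rng = np.random.default_rng(0)
levels = rng.lognormal(mean=2.0, sigma=0.5, size=(500, 6))  # cells x proteins
cov = np.cov(levels, rowvar=False)

lam = np.zeros(6)
lam[2] = 0.05                    # small perturbation on protein #2
predicted_shift = cov @ lam      # predicted change in each mean level
print(np.round(predicted_shift, 3))
```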

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipating therapy resistance and identifying effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to the perturbation of a targeted inhibitor. Strongly coupled protein-protein interactions constitute most signaling cascades. A physical analogy for such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be achieved by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e., independent signaling modes). By doing so, two independent signaling modes, one associated with mTOR signaling and a second associated with ERK/Src signaling, were resolved, which in turn allowed us to anticipate resistance and to design combination therapies that are effective, as well as to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models, and all predictions were borne out.
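The decomposition itself is ordinary matrix diagonalization; a minimal sketch (Python with NumPy, with placeholder data standing in for the phosphoprotein panel):

```python
import numpy as np

# Diagonalizing a protein-protein covariance matrix into independent
# "signaling modes": each eigenvector is a linear combination of
# proteins that fluctuates independently of the others (the analogue
# of a normal vibrational mode). Placeholder data, not thesis data.
rng = np.random.default_rng(1)
cells = rng.normal(size=(400, 5))        # 400 cells x 5 phosphoproteins
cells[:, 1] += 0.9 * cells[:, 0]         # couple proteins 0 and 1
cells[:, 4] += 0.8 * cells[:, 3]         # couple proteins 3 and 4

C = np.cov(cells, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(C)     # modes, strongest last
for v, w in zip(eigvals[::-1][:2], eigvecs.T[::-1][:2]):
    print("mode variance %.2f, protein weights %s" % (v, np.round(w, 2)))
```

The two planted couplings surface as the two strongest modes, the analogue of resolving the mTOR- and ERK/Src-associated modes described above.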

In the last part, some preliminary results on the clinical translation of single-cell proteomics chips are presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending the current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could potentially yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of clinical translation are presented, and our solutions to address them are discussed as well. A clinical case study then follows, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor are presented to demonstrate the general protocol and workflow of the proposed clinical studies.

Relevance: 30.00%

Abstract:

The overarching theme of this thesis is mesoscale optical and optoelectronic design of photovoltaic and photoelectrochemical devices. In a photovoltaic device, light absorption and charge carrier transport are coupled together on the mesoscale, and in a photoelectrochemical device, light absorption, charge carrier transport, catalysis, and solution species transport are all coupled together on the mesoscale. The work discussed herein demonstrates that simulation-based mesoscale optical and optoelectronic modeling can lead to detailed understanding of the operation and performance of these complex mesostructured devices, serve as a powerful tool for device optimization, and efficiently guide device design and experimental fabrication efforts. In-depth studies of two mesoscale wire-based device designs illustrate these principles—(i) an optoelectronic study of a tandem Si|WO3 microwire photoelectrochemical device, and (ii) an optical study of III-V nanowire arrays.

The study of the monolithic, tandem Si|WO3 microwire photoelectrochemical device begins with the development of an optoelectronic model and its validation against experiment. This study capitalizes on the synergy between experiment and simulation to demonstrate the model’s predictive power for extractable device voltage and light-limited current density. The developed model is then used to understand the limiting factors of the device and to optimize its optoelectronic performance. The results of this work reveal that high-fidelity modeling can facilitate unequivocal identification of limiting phenomena, such as parasitic absorption via excitation of a surface plasmon-polariton mode, as well as rapid design optimization, achieving over a 300% enhancement in optoelectronic performance relative to a nominal design for this device architecture, which would be time-consuming and challenging to achieve via experiment.
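One of the predicted quantities, the light-limited current density, is conceptually simple to compute once a simulated absorption spectrum is in hand; the sketch below uses crude placeholder spectra (a real calculation integrates the device's full-wave absorption results against the AM1.5G photon flux):

```python
import numpy as np

# Light-limited photocurrent density from an absorption spectrum:
# J = q * integral( A(lambda) * photon_flux(lambda) d lambda ).
# The spectra below are crude placeholders, not the Si|WO3 data.
q = 1.602e-19                                   # elementary charge, C
wl = np.linspace(300e-9, 1100e-9, 500)          # wavelengths, m
flux = 4e27 * np.ones_like(wl)                  # photons / (m^2 s m), flat stand-in
absorption = np.clip(1.2 - wl / 1e-6, 0, 1)     # fake A(lambda), decreasing

J = q * np.trapz(absorption * flux, wl)         # A / m^2
print("light-limited J ~ %.1f mA/cm^2" % (J / 10))
```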

The work on III-V nanowire arrays also starts as a collaboration of experiment and simulation aimed at gaining understanding of unprecedented, experimentally observed absorption enhancements in sparse arrays of vertically-oriented GaAs nanowires. To explain this resonant absorption in periodic arrays of high index semiconductor nanowires, a unified framework that combines a leaky waveguide theory perspective and that of photonic crystals supporting Bloch modes is developed in the context of silicon, using both analytic theory and electromagnetic simulations. This detailed theoretical understanding is then applied to a simulation-based optimization of light absorption in sparse arrays of GaAs nanowires. Near-unity absorption in sparse, 5% fill fraction arrays is demonstrated via tapering of nanowires and multiple wire radii in a single array. Finally, experimental efforts are presented towards fabrication of the optimized array geometries. A hybrid self-catalyzed and selective area MOCVD growth method is used to establish morphology control of GaP nanowire arrays. Similarly, morphology and pattern control of nanowires is demonstrated with ICP-RIE of InP. Optical characterization of the InP nanowire arrays gives proof of principle that tapering and multiple wire radii can lead to near-unity absorption in sparse arrays of InP nanowires.

Relevance: 30.00%

Abstract:

Cemeterial units are places of everyday social practices and worship, and the tomb is where nostalgia can be externalized and the memory of the deceased revered. In Western societies we find a category of artifacts meant to evoke the memory of, or to honor, the dead. In this paper we discuss three examples of such products, which enabled a reflection on the concepts that gave rise to their forms and on the prospect of fitting them into a new "material culture", in that they may have created a break with the traditional system of codes and standards shared by societies and its manifestations in the physical creation of this category of products. This work offers a reflection on product design; what probably makes it special is the field in which it is located: the design of products for post mortem memory. Usually made of granite or marble, these products take the form of a plate or tablet, an open book or a rolled sheet. On one side they carry a photograph of the person being honored, together with inscriptions. The design thinking inherent in this work weighs, on one side, the intricate set of emotions that this type of product can generate, and on the other, more tangible components concerning form, function and the object's interactions with users and with use environments. In defining the problem, differentiation, added value and durability were regarded as mandatory requirements and key objectives. The first two should manifest themselves in the various components and attributes of the product. The aesthetic and material/structural durability of the product necessarily implied the introduction of qualifying terms and quantitative weights, which positively influenced the generation and evaluation of concepts based on a set of 10 design principles; these originated a matrix used as a tool to aid product design. The concrete definition of a target audience was equally important. At this stage, the collaboration of experts in psychology and sociology, disciplines with a particular ability to understand individuals and social phenomena respectively, was crucial. It was concluded that a product designed to honor someone post mortem should abandon the more traditional habits and customs and focus on identifying new audiences. Although at present this can be considered a niche market, it is believed that it may grow in the future, along with interest in this type of product.
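The matrix described above is, in essence, a weighted concept-scoring matrix; a minimal sketch with invented criteria, weights and scores (the thesis's 10 principles and actual weightings are not reproduced here):

```python
# Minimal weighted concept-scoring matrix of the kind described above.
# Criteria, weights and scores are invented placeholders; the thesis
# used its own set of 10 design principles and weightings.
criteria = {"differentiation": 0.4, "added value": 0.35, "durability": 0.25}
concepts = {
    "open book":    {"differentiation": 3, "added value": 4, "durability": 5},
    "rolled sheet": {"differentiation": 5, "added value": 3, "durability": 4},
    "flat tablet":  {"differentiation": 2, "added value": 3, "durability": 5},
}

scores = {
    name: sum(criteria[c] * score for c, score in ratings.items())
    for name, ratings in concepts.items()
}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {s:.2f}")   # highest weighted score ranks first
```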