13 results for Product Launch. Industrial Markets. Segmentation. Conjoint Analysis. Technology Push
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The objective of this study was to evaluate the push-out bond strength of resin-reinforced fiberglass posts bonded with five ionomer cements. The cement-dentin interface was also inspected by means of SEM. Fifty human canines were chosen after a rigorous screening process, endodontically treated, and divided randomly into five groups (n = 3) according to the cement tested: Group I – Ionoseal (VOCO), Group II – Fuji I (GC), Group III – Fuji II Improved (GC), Group IV – Rely X Luting 2 (3M ESPE), Group V – Ketac Cem (3M ESPE). The post space was prepared to receive a fiberglass post, which was tried in before the cementation process. No dentin or post surface pretreatment was carried out. After post bonding, all roots were cross-sectioned to obtain three thin slices (1 mm) from three regions of the tooth (cervical, middle and apical). A universal testing machine was used to carry out the push-out test at a cross-head speed of 0.5 mm/min. All failed specimens were observed under an optical microscope to identify the failure mode, and representative specimens from each group were inspected under SEM. The data were analyzed by the Kolmogorov-Smirnov and Levene's tests, by two-way ANOVA, and by Tukey's post hoc test at a significance level of 5%. The images were compared to determine the most frequent failure types at the different levels. SEM inspection showed that all cements filled the space between post and dentin; however, imperfections such as bubbles and voids were noticed in all groups to some extent. The push-out bond strength of Ketac Cem was significantly higher than that of Ionoseal (P = 0.02). There were no statistically significant differences among the other cements.
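The statistical pipeline described in this abstract (normality and variance checks, ANOVA, Tukey's post hoc test) can be sketched as follows. This is a minimal illustration on synthetic data: the group means, sample sizes, and a one-way simplification (the study itself used a two-way design with root level as the second factor) are all assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical push-out bond strength values (MPa) for the five cement
# groups -- synthetic, for illustration only, not the study's data.
groups = {
    "Ionoseal":  rng.normal(4.0, 1.0, 10),
    "Fuji I":    rng.normal(5.5, 1.0, 10),
    "Fuji II":   rng.normal(5.8, 1.0, 10),
    "Rely X":    rng.normal(6.0, 1.0, 10),
    "Ketac Cem": rng.normal(7.5, 1.0, 10),
}
samples = list(groups.values())

# Normality of the pooled, centred data (Kolmogorov-Smirnov test).
pooled = np.concatenate([g - g.mean() for g in samples])
ks_stat, ks_p = stats.kstest(pooled / pooled.std(ddof=1), "norm")

# Homogeneity of variances (Levene's test).
lev_stat, lev_p = stats.levene(*samples)

# One-way ANOVA across cements (two-way in the actual study).
f_stat, anova_p = stats.f_oneway(*samples)

# Tukey's post hoc pairwise comparisons.
tukey = stats.tukey_hsd(*samples)
print(f"KS p={ks_p:.3f}, Levene p={lev_p:.3f}, ANOVA p={anova_p:.4g}")
```

With clearly separated group means, the ANOVA rejects equality and the Tukey result object reports which pairwise differences drive it.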
Abstract:
Little is known about the situational contexts in which individuals consume processed sources of dietary sugars. This study aimed to describe the situational contexts associated with the consumption of sweetened food and drink products in a Catholic Middle Eastern Canadian community. A two-stage exploratory sequential mixed-method design was employed with a rationale of triangulation. In stage 1 (n = 62), items and themes describing the situational contexts of sweetened food and drink product consumption were identified from semi-structured interviews and were used to develop the content for the Situational Context Instrument for Sweetened Product Consumption (SCISPC). Face validity, readability and cultural relevance of the instrument were assessed. In stage 2 (n = 192), a cross-sectional study was conducted and exploratory factor analysis was used to examine the structure of themes that emerged from the qualitative analysis as a means of furthering construct validation. The SCISPC's reliability and its predictive validity for the daily consumption of sweetened products were also assessed. In stage 1, six themes and 40 items describing the situational contexts of sweetened product consumption emerged from the qualitative analysis and were used to construct the first draft of the SCISPC. In stage 2, factor analysis enabled the clarification and/or expansion of the instrument's initial thematic structure. The revised SCISPC has seven factors and 31 items describing the situational contexts of sweetened product consumption. Initial validation of the instrument indicated it has excellent internal consistency and adequate test-retest reliability. Two factors of the SCISPC had predictive validity for the daily consumption of total sugar from sweetened products (Snacking and Energy demands), while the other factors (Socialization, Indulgence, Constraints, Visual Stimuli and Emotional needs) were associated rather with occasional consumption of these products.
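The exploratory factor analysis step used in stage 2 can be sketched as below. Everything here is an assumption for illustration: synthetic responses from two latent factors and only 10 items, whereas the real SCISPC has 31 items loading on seven factors.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)

# Synthetic questionnaire data standing in for SCISPC responses:
# 192 respondents x 10 items generated from two latent factors
# (illustrative only -- the real instrument has 31 items / 7 factors).
n, items = 192, 10
latent = rng.normal(size=(n, 2))
loadings = rng.uniform(0.4, 0.9, size=(2, items))
X = latent @ loadings + rng.normal(scale=0.5, size=(n, items))

# Varimax-rotated factor solution, as is common in instrument validation.
fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(X)

# Items loading most strongly on each rotated factor suggest the themes.
for k, row in enumerate(fa.components_):
    top = np.argsort(np.abs(row))[::-1][:3]
    print(f"factor {k}: top items {top.tolist()}")
```

In a real validation, the retained factor count would be chosen from eigenvalues or fit indices rather than fixed in advance.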
Abstract:
This article investigates the effect of product market liberalisation on employment, allowing for interactions between policies and institutions in product and labour markets. Using panel data for OECD countries over the period 1980-2002, we present evidence that product market deregulation is more effective at the margin when labour market regulation is high. The data also suggest that product market liberalisation may promote employment-enhancing labour market reforms.
Abstract:
It is well known that control systems are the core of electronic differential systems (EDSs) in electric vehicles (EVs) and hybrid electric vehicles (HEVs). However, conventional closed-loop control architectures do not completely match the needed ability to reject noises/disturbances, especially regarding the input acceleration signal coming from the driver's commands, which makes the EDS in this case ineffective. For this reason, in this paper a novel EDS control architecture is proposed, offering a new approach for the traction system that can be used with a great variety of controllers (e.g., classic, artificial intelligence (AI)-based, and modern/robust theory). In addition, a modified proportional-integral-derivative (PID) controller, an AI-based neuro-fuzzy controller, and a robust optimal H-infinity controller were designed and evaluated to observe the versatility of the novel architecture. Kinematic and dynamic models of the vehicle are briefly introduced. Then, simulated and experimental results are presented and discussed. The Hybrid Electric Vehicle in Low Scale (HELVIS)-Sim simulation environment was employed for the preliminary analysis of the proposed EDS architecture. Later, the EDS itself was embedded in a dSpace 1103 high-performance interface board, so that real-time control of the rear wheels of the HELVIS platform was successfully achieved.
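One of the controllers evaluated above is a modified PID. A minimal discrete PID loop on a toy first-order wheel-speed plant is sketched below; the gains, the plant model, and the target speed are illustrative assumptions, not values from the paper or the HELVIS platform.

```python
# Minimal discrete PID sketch; gains are illustrative, not from the paper.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy first-order wheel-speed plant: w' = (u - w) / tau.
dt, tau = 0.01, 0.5
pid = PID(kp=2.0, ki=1.0, kd=0.05, dt=dt)
w = 0.0
for _ in range(2000):
    u = pid.step(10.0, w)   # hypothetical target: 10 rad/s
    w += dt * (u - w) / tau
print(f"final speed: {w:.3f} rad/s")
```

An actual EDS would run one such loop per driven wheel, with the differential logic shaping each wheel's setpoint from the steering and acceleration inputs.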
Abstract:
The growing demand for industrial products is imposing an increasingly intense level of competitiveness on industrial operations. Meanwhile, the convergence of information technology (IT) and automation technology (AT) is showing itself to be a tool of great potential for the modernization and improvement of industrial plants. However, for this technology to fully achieve its potential, several obstacles need to be overcome, including the demonstration of the reasoning behind the estimates of benefits, investments and risks used to plan the implementation of corporate technology solutions. This article focuses on the evolutionary development of the planning and adoption processes for IT & AT convergence. It proposes the incorporation of IT & AT convergence practices into Lean Thinking/Six Sigma, via the method used for planning the convergence of technological activities, known as the Smarter Operation Transformation (SOT) methodology. This article illustrates the SOT methodology through its application in a Brazilian company in the consumer goods sector. This application shows that IT & AT convergence is possible with low investment, reducing the risk of not achieving the targets of key indicators.
Abstract:
The performance of an anaerobic sequencing-batch biofilm reactor (ASBBR, laboratory scale, 14 L) containing biomass immobilized on coal was evaluated for the removal of elevated concentrations of sulfate (between 200 and 3,000 mg SO4²⁻·L⁻¹) from industrial wastewater effluents. The ASBBR was shown to be efficient for removal of organic material (between 90% and 45%) and sulfate (between 95% and 85%). The microbiota adhering to the support medium was analyzed by amplified ribosomal DNA restriction analysis (ARDRA). The ARDRA profiles for the Bacteria and Archaea domains proved to be sensitive for the determination of microbial diversity and were consistent with the physical-chemical monitoring analysis of the reactor. At 3,000 mg SO4²⁻·L⁻¹, there was a reduction in the microbial diversity of both domains and also in the removal efficiencies of organic material and sulfate.
Abstract:
The microstructural behavior of industrial standardized cocoa butter samples and cocoa butter samples from three different Brazilian states is compared. The cocoa butters were characterized by their microstructural patterns, crystallization kinetics and polymorphic habits. The evaluation of these parameters aided in establishing relationships between the chemical compositions and crystallization behavior of the samples, as well as differentiating them in terms of technological and industrial potential for use in tropical regions.
Abstract:
The occupational exposure limits of different risk factors for the development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown. The industrial ergonomist's role is further complicated because the potential risk factors that may contribute to the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to compare predictions from the neural network-based model proposed by Zurada, Karwowski & Marras (1997) with those from a linear discriminant analysis model for classifying industrial jobs according to their potential risk of low back disorders due to workplace design. The results showed that the discriminant analysis-based model is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
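The discriminant-analysis approach compared above can be sketched as follows. The feature set (five workplace risk factors), the group means, and the sample sizes are hypothetical stand-ins, not the data used by Zurada, Karwowski & Marras or this paper.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Synthetic job data: five workplace risk factors per job (e.g. lift rate,
# load moment, trunk angles) -- feature choice and values are hypothetical.
n = 200
low  = rng.normal(loc=[2, 20, 10, 15, 5],  scale=3, size=(n, 5))
high = rng.normal(loc=[6, 60, 30, 45, 15], scale=3, size=(n, 5))
X = np.vstack([low, high])
y = np.array([0] * n + [1] * n)   # 0 = low risk, 1 = high risk of LBDs

# Fit the linear discriminant and classify jobs into risk groups.
lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
acc = lda.score(X, y)
print(f"training accuracy: {acc:.2f}")
```

Unlike a trained neural network, the fitted discriminant is a single linear rule, which is part of why the paper finds it cheaper to apply to future data.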
Abstract:
A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC(max). The output of GC(max) coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC(max) is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst case scenario, the GC(max) algorithm runs in linear time with respect to the variable M = |C| + |Z|, where |C| is the image scene size and |Z| is the size of the allowable range, Z, of the associated weight/affinity function. For most implementations, Z is identical to the set of allowable image intensity values, and its size can be treated as small with respect to |C|, meaning that O(M) = O(|C|). In such a situation, GC(max) runs in linear time with respect to the image size |C|. We show that the output of GC(max) constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the ℓ∞ norm ‖F(P)‖∞ of the map F(P) that associates, with every element e from the boundary of an object P, its weight w(e). This formulation brings IRFC algorithms into the realm of the graph cut energy minimizers, with energy functions ‖F(P)‖q for q ∈ [1,∞]. Of these, the best known minimization problem is for the energy ‖F(P)‖1, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that the minimization problem for ‖F(P)‖q, q ∈ [1,∞), is identical to that for ‖F(P)‖1 when the original weight function w is replaced by w^q. Thus, any algorithm GC(sum) solving the ‖F(P)‖1 minimization problem also solves the one for ‖F(P)‖q with q ∈ [1,∞), so just two algorithms, GC(sum) and GC(max), are enough to solve all ‖F(P)‖q-minimization problems. We also show that, for any fixed weight assignment, the solutions of the ‖F(P)‖q-minimization problems converge to a solution of the ‖F(P)‖∞-minimization problem (the fact that ‖F(P)‖∞ = lim q→∞ ‖F(P)‖q is not enough to deduce that). An experimental comparison of the performance of the GC(max) and GC(sum) algorithms is included. This concentrates on comparing the actual (as opposed to provable worst-scenario) running times, as well as the influence of the choice of the seeds on the output.
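The max-norm objective behind GC(max)/IRFC rests on "path strength = weakest link": a node's connectivity to a seed set is the best achievable minimum affinity over all connecting paths. The sketch below computes that bottleneck connectivity map with a Dijkstra-style sweep on a toy graph; it is a simplified illustration of the underlying idea, not the paper's GC(max) implementation (which additionally exploits the small affinity range Z to achieve linear time).

```python
import heapq

def max_min_connectivity(affinity, seeds):
    """For each node, the best achievable minimum affinity over all
    paths from the seed set -- the connectivity map that relative fuzzy
    connectedness compares between object and background seeds.
    affinity: dict node -> list of (neighbor, weight in [0, 1])."""
    conn = {v: 0.0 for v in affinity}
    heap = []
    for s in seeds:
        conn[s] = 1.0
        heapq.heappush(heap, (-1.0, s))
    while heap:
        neg, u = heapq.heappop(heap)
        cu = -neg
        if cu < conn[u]:
            continue  # stale heap entry
        for v, w in affinity[u]:
            cand = min(cu, w)   # path strength = weakest link on the path
            if cand > conn[v]:
                conn[v] = cand
                heapq.heappush(heap, (-cand, v))
    return conn

# Toy 4-node graph; edge weights play the role of affinities.
g = {
    "a": [("b", 0.9), ("c", 0.2)],
    "b": [("a", 0.9), ("d", 0.7)],
    "c": [("a", 0.2), ("d", 0.4)],
    "d": [("b", 0.7), ("c", 0.4)],
}
conn = max_min_connectivity(g, seeds={"a"})
print(conn)  # node "d" reaches 0.7 via b, better than 0.2 via c
```

Running the same computation from background seeds and labeling each node by whichever seed set wins is the "relative" comparison that IRFC, and hence GC(max), formalizes.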
Abstract:
In this study, the lichenized fungus species Canoparmelia texana was used as a passive biomonitor of atmospheric pollution in the industrial city of São Mateus do Sul, PR, Brazil. Lichen samples collected from tree bark were cleaned, freeze-dried and analyzed by neutron activation analysis. Comparisons were made between the element concentrations obtained in lichens from this city and those from a clean area of Atlantic Forest in Intervales Park, SP. The high concentrations of the elements As, Ca, Co, Cr, Fe, Hf, Sb, and Th found in the lichens could be attributed to the emissions from a ceramics plant and an oil shale plant.
Abstract:
The concept of industrial clustering has been studied in-depth by policy makers and researchers from many fields, mainly due to the competitive advantages it may bring to regional economies. Companies often take part in collaborative initiatives with local partners while also taking advantage of knowledge spillovers to benefit from locating in a cluster. Thus, Knowledge Management (KM) and Performance Management (PM) have become relevant topics for policy makers and cluster associations when undertaking collaborative initiatives. Taking this into account, this paper aims to explore the interplay between both topics using a case study conducted in a collaborative network formed within a cluster. The results show that KM should be acknowledged as a formal area of cluster management so that PM practices can support knowledge-oriented initiatives and therefore make better use of the new knowledge created. Furthermore, tacit and explicit knowledge resulting from PM practices needs to be stored and disseminated throughout the cluster as a way of improving managerial practices and regional strategic direction. Knowledge Management Research & Practice (2012) 10, 368-379. doi:10.1057/kmrp.2012.23
Abstract:
Dynamic texture is a recent field of investigation that has received growing attention from the computer vision community in recent years. These patterns are moving textures in which the concept of self-similarity for static textures is extended to the spatiotemporal domain. In this paper, we propose a novel approach for dynamic texture representation that can be used for both texture analysis and segmentation. In this method, deterministic partially self-avoiding walks are performed in three orthogonal planes of the video in order to combine appearance and motion features. We validate our method on three applications of dynamic texture that present interesting challenges: recognition, clustering and segmentation. Experimental results on these applications indicate that the proposed method improves dynamic texture representation compared to the state of the art.
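A deterministic partially self-avoiding walk (often called a tourist walk) moves at each step to the unvisited-in-memory neighbor whose value is closest to the current one. The sketch below runs such a walk on a toy 1-D "texture"; the data, the memory size, and the 1-D simplification are assumptions for illustration, whereas the paper runs these walks in the three orthogonal space-time planes of a video.

```python
def tourist_walk(values, neighbors, start, mu, steps=100):
    """Deterministic partially self-avoiding walk: from the current node,
    always move to the neighbor with the closest value that was not
    visited in the last mu steps. Returns the visited sequence."""
    path = [start]
    for _ in range(steps):
        cur = path[-1]
        recent = set(path[-mu:])  # memory window of size mu
        options = [n for n in neighbors[cur] if n not in recent]
        if not options:
            break  # walk is trapped: it ends here
        nxt = min(options, key=lambda n: abs(values[n] - values[cur]))
        path.append(nxt)
    return path

# Toy 1-D "texture": node i has an intensity; neighbors are i-1 / i+1.
vals = [3, 5, 4, 9, 8, 8, 2, 7]
nbrs = {i: [j for j in (i - 1, i + 1) if 0 <= j < len(vals)]
        for i in range(len(vals))}
walk = tourist_walk(vals, nbrs, start=0, mu=2, steps=20)
print(walk)
```

Texture descriptors are then typically built from statistics of many such walks (e.g. transient and attractor lengths over all starting points and memory sizes), rather than from a single trajectory.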