42 results for directed technology adoption
Abstract:
Biopulping fundamentals, technology, and mechanisms are reviewed in this article. Mill evaluation of Eucalyptus grandis wood chips biotreated by Ceriporiopsis subvermispora in a 50-tonne pilot plant demonstrated that equivalent energy savings can be obtained in lab- and mill-scale biopulping. Some drawbacks have been observed, including limited improvements in pulp strength and contamination of the chip pile with opportunistic fungi. The use of pre-cultured wood chips as inoculum seed for the biotreatment process minimized contamination problems related to the use of blended mycelium and corn-steep liquor in the inoculation step. Alkaline washing restored part of the brightness in biopulps, and marketable brightness values were obtained by one-stage bleaching with 5% H2O2 when bio-TMP pulps were under evaluation. In the current scenario, the understanding of biopulping mechanisms has gained renewed attention because more resistant and competitive fungal species could be selected on the basis of a function-directed screening project. A series of studies aimed at elucidating structural changes in lignin during wood biodegradation by C. subvermispora indicated that lignin depolymerization occurs during the initial stages of wood biotreatment. Aromatic hydroxyls did not increase with the cleavage of aryl-ether linkages, suggesting that the ether-cleavage products remain as quinone-type structures. On the other hand, cellulose is more resistant to attack by C. subvermispora. MnP-initiated lipid peroxidation reactions have been proposed to explain the degradation of non-phenolic lignin substructures by C. subvermispora, while the lack of cellobiohydrolases and the occurrence of systems able to suppress Fenton's reaction in the cultures explain the inefficient cellulose degradation by this biopulping fungus. (C) 2007 Elsevier Inc. All rights reserved.
Abstract:
Many authors point out that the front-end of new product development (NPD) is a critical success factor in the NPD process and that numerous companies face difficulties in carrying it out appropriately. Therefore, it is important to develop new theories and proposals that support the effective implementation of this earliest phase of NPD. This paper presents a new method to support the development of front-end activities based on integrating technology roadmapping (TRM) and project portfolio management (PPM). This new method, called the ITP Method, was implemented at a small Brazilian high-tech company in the nanotechnology industry to explore the integration proposal. The case study demonstrated that the ITP Method provides a systematic procedure for the fuzzy front-end and integrates innovation perspectives into a single roadmap, which allows for a better alignment of business efforts and communication of product innovation goals. Furthermore, the results indicated that the method may also improve quality, functional integration and strategy alignment. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
This paper presents a proposal for a Quality Management System for a generic GNSS surveying company as an alternative for management and service quality improvements. As a result of the increased demand for GNSS measurements, a large number of new or restructured companies were established to operate in that market. Considering that GNSS surveying is a new process, some changes must be made to adapt the old surveying techniques and old-fashioned management to the new reality. This requires a new management model based on a well-described sequence of procedures aiming at Total Quality Management for the company. The proposed Quality Management System was based on the requirements of the Quality System ISO 9000:2000, applied to the whole company and focused on the productive process of GNSS surveying work.
Abstract:
This work presents a case study on technology assessment for power quality improvement devices. A system compatibility test protocol for power quality mitigation devices was developed in order to evaluate the functionality of three-phase voltage restoration devices. To validate this test protocol, the micro-DVR, a reduced-power development platform for DVR (dynamic voltage restorer) devices, was tested, and the results are discussed based on voltage disturbance standards. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
A green ceramic tape micro-heat exchanger was developed using Low Temperature Co-fired Ceramics (LTCC) technology. The device was designed using Computer-Aided Design software, and simulations were made using a Computational Fluid Dynamics package (COMSOL Multiphysics) to evaluate the homogeneity of fluid distribution in the microchannels. Four geometries were proposed and simulated in two and three dimensions to show that geometric details directly affect the velocity distribution in the micro-heat exchanger channels. The simulation results were quite useful for the design of the microfluidic device. The micro-heat exchanger was then constructed using the LTCC technology; it is composed of five thermal exchange plates in a cross-flow arrangement and two connecting plates, with all plates stacked to form a device with external dimensions of 26 × 26 × 6 mm³.
Abstract:
The increasing adoption of information systems in healthcare has led to a scenario where patient information security is increasingly regarded as a critical issue. Allowing patient information to be placed in jeopardy may lead to irreparable physical, moral, and social damage to the patient, potentially shaking the credibility of the healthcare institution. Medical images play a crucial role in this context, given their importance in diagnosis, treatment, and research. It is therefore vital to take measures to prevent tampering and to determine image provenance, which demands security mechanisms that assure information integrity and authenticity. A number of works in this field follow two major approaches: the use of metadata and the use of watermarking. However, both approaches still have limitations that must be properly addressed. This paper presents a new method that uses cryptographic means to improve the trustworthiness of medical images, providing a stronger link between the image and the information on its integrity and authenticity, without compromising image quality for the end user. The use of Digital Imaging and Communications in Medicine (DICOM) structures is also an advantage for ease of development and deployment.
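The abstract does not detail the paper's cryptographic scheme, but the general idea of binding integrity and authenticity information to an image without altering its pixels can be sketched as follows; this is a minimal, hypothetical illustration (the key handling and the HMAC construction are assumptions, not the authors' method):

```python
import hashlib
import hmac

# Assumption for illustration: a key held by the healthcare institution.
SECRET_KEY = b"institution-private-key"

def sign_image(pixel_data: bytes) -> str:
    """Produce an authenticity tag to be stored alongside the image metadata.

    The pixel data itself is untouched, so image quality is preserved;
    only a keyed hash of it is recorded.
    """
    digest = hashlib.sha256(pixel_data).digest()
    return hmac.new(SECRET_KEY, digest, hashlib.sha256).hexdigest()

def verify_image(pixel_data: bytes, tag: str) -> bool:
    """Check that the image has not been tampered with since signing."""
    return hmac.compare_digest(sign_image(pixel_data), tag)

image = b"\x00\x01\x02\x03"   # stand-in for DICOM pixel data
tag = sign_image(image)
print(verify_image(image, tag))            # intact image verifies
print(verify_image(image + b"x", tag))     # any modification is detected
```

Storing such a tag in a standard metadata structure is what makes the link between image and integrity information stronger than plain, unauthenticated metadata.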
Abstract:
Functional magnetic resonance imaging (fMRI) has become an important tool in Neuroscience due to its noninvasive and high-spatial-resolution properties compared to other methods like PET or EEG. Characterization of neural connectivity has been the aim of several cognitive studies, as the interactions among cortical areas lie at the heart of many brain dysfunctions and mental disorders. Several methods, such as correlation analysis, structural equation modeling, and dynamic causal models, have been proposed to quantify connectivity strength. An important concept related to connectivity modeling is Granger causality, which is one of the most popular definitions for the measure of directional dependence between time series. In this article, we propose the application of partial directed coherence (PDC) to the connectivity analysis of multisubject fMRI data using a multivariate bootstrap. PDC is a frequency-domain counterpart of Granger causality and has become a very prominent tool in EEG studies. The achieved frequency decomposition of connectivity is useful in separating interactions between neural modules from those originating in scanner noise, breathing, and heartbeat. A real fMRI dataset of six subjects executing a language-processing protocol was used for the connectivity analysis. Hum Brain Mapp 30:452-461, 2009. (C) 2007 Wiley-Liss, Inc.
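The PDC computation described above can be sketched in a few lines: fit a vector autoregressive (VAR) model, transform its coefficients to the frequency domain, and normalize column-wise. The sketch below uses simulated two-channel data with an assumed unidirectional coupling and omits the multivariate bootstrap used in the paper for multi-subject inference; all simulation parameters are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1) in which channel 0 drives channel 1.
A_true = np.array([[0.5, 0.0],
                   [0.4, 0.5]])
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.standard_normal(2)

# Fit the VAR(1) coefficient matrix by ordinary least squares.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

def pdc(A, freqs):
    """PDC_ij(f): normalized directed influence of channel j on channel i."""
    out = []
    for f in freqs:
        # Abar(f) = I - A1 * exp(-i 2 pi f) for a VAR of order p = 1.
        Abar = np.eye(2) - A * np.exp(-2j * np.pi * f)
        out.append(np.abs(Abar) / np.linalg.norm(Abar, axis=0))  # column-normalized
    return np.array(out)                                         # shape (F, 2, 2)

P = pdc(A_hat, np.linspace(0.01, 0.5, 50))
print(P.mean(axis=0))  # entry (1,0), the 0->1 influence, dominates entry (0,1)
```

The frequency argument is what lets PDC separate genuine inter-regional interactions from physiological rhythms (breathing, heartbeat) that concentrate in known frequency bands.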
Isolation and analysis of bioactive isoflavonoids and chalcone from a new type of Brazilian propolis
Abstract:
Activity-directed fractionation and purification processes were employed to identify isoflavonoids with antioxidant and antimicrobial activities from Brazilian red propolis. Crude propolis was extracted with ethanol (80%, v/v) and fractionated by a liquid-liquid extraction technique using hexane and chloroform. Since the chloroform fraction showed strong antioxidant and antimicrobial activities, its components were purified and isolated using various chromatographic techniques. Comparing our spectral data (UV, NMR, and mass spectrometry) with values found in the literature, we identified two bioactive isoflavonoids (vestitol and neovestitol), together with one chalcone (isoliquiritigenin). Vestitol presented higher antioxidant activity against beta-carotene consumption than neovestitol. The antimicrobial activity of these three compounds against Staphylococcus aureus, Streptococcus mutans, and Actinomyces naeslundii was evaluated, and isoliquiritigenin was the most active, with the lowest MIC, ranging from 15.6 to 62.5 µg/mL. Our results showed that Brazilian red propolis contains biologically active isoflavonoids that may be used as mild antioxidants and antimicrobials for food preservation. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment, and relatively low complexity explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the lower assessment costs observed when these technologies are applied to large-scale studies, explain the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the number of laser pulses per square meter, and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp. with precision equal or superior to conventional methods. The statistics of interest were volume, basal area, mean height, and mean height of dominant trees. The ALS flight for data assessment covered two strips of approximately 2 × 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment, the following statistics were calculated: overall mean height, standard error, five percentiles (heights below which 10%, 30%, 50%, 70%, and 90% of the ALS points above ground level are found), and the density of points above ground level in each percentile. The ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area, and volume. Conventional forest inventory sample plots provided the ground-truth data.
For volume, an exploratory assessment involving different combinations of ALS statistics allowed for the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R² = 0.88, MQE% = 0.0004); the 10% and 90% percentiles to estimate mean height (R² = 0.94, MQE% = 0.0003); the 90% percentile to estimate dominant height (R² = 0.96, MQE% = 0.0003); the 10% percentile and the mean height of ALS points to estimate basal area (R² = 0.92, MQE% = 0.0016); and age together with the 30% and 90% percentiles to estimate volume (R² = 0.95, MQE% = 0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90% percentile, the modified Clutter model using age, the mean height of ALS points, and the 70% percentile, and the modified Buckman model using age, the mean height of ALS points, and the 10% percentile.
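The two steps described above (summarizing the point cloud into height percentiles, then regressing a stand variable on age and a percentile) can be sketched as follows. The simulated plot data, the coefficient values, and the exact functional form (ln V = b0 + b1/age + b2·P90, a Schumacher-type model) are assumptions for illustration only, not the paper's fitted models:

```python
import numpy as np

rng = np.random.default_rng(1)

def als_percentiles(heights, probs=(10, 30, 50, 70, 90)):
    """Heights below which the given percentages of above-ground returns fall."""
    return np.percentile(heights, probs)

# One plot's cloud of return heights (simulated), summarized into percentiles.
cloud = rng.gamma(9.0, 1.5, 500)
print(np.round(als_percentiles(cloud), 1))

# Simulate plot-level data: older stands have taller canopies and more volume.
n_plots = 40
age = rng.uniform(2.0, 7.0, n_plots)                   # stand age (years)
p90 = 3.0 * age + rng.normal(0.0, 0.5, n_plots)        # 90% height percentile (m)
volume = np.exp(1.0 - 2.0 / age + 0.08 * p90
                + rng.normal(0.0, 0.02, n_plots))      # m^3/ha, with noise

# Fit ln(V) = b0 + b1*(1/age) + b2*P90 by ordinary least squares.
X = np.column_stack([np.ones(n_plots), 1.0 / age, p90])
coef, *_ = np.linalg.lstsq(X, np.log(volume), rcond=None)
print(coef)  # fitted [b0, b1, b2]; b2 should be near the simulated 0.08
```

Linearizing the model through the logarithm is what allows an ordinary least-squares fit; the R² and MQE% figures reported above would be computed from the residuals of fits like this one against the conventional inventory plots.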
Abstract:
Precision agriculture (PA) technologies are being applied to crops in Brazil and are important to ensure Brazil's position in agricultural production. However, no studies are available at present to indicate the extent to which PA technologies are being used in the country. Therefore, the main objective of this research was to investigate how the sugar-ethanol industry in São Paulo state, which produces 60% of the domestic sugarcane, is adopting and using these techniques. For this purpose, primary data were used, obtained from a questionnaire sent to all companies operating in the sugar-ethanol industry in the region. The aim was to determine to what extent these companies are adopting and using PA technologies, and also to promote a more in-depth discussion of the topic within the sugar-ethanol industry. Information was obtained on the features of the companies, the sources of information they use for adopting these technologies, the impacts on these companies, and the obstacles hindering adoption. The main conclusions of this research suggest that companies that adopt and use PA practices reap benefits such as managerial improvements, higher yields, lower costs, minimized environmental impacts, and improvements in sugarcane quality.
Abstract:
By applying a directed evolution methodology, specific enzymatic characteristics can be enhanced, but selecting mutants of interest from a large mutant bank requires high-throughput screening and facile selection. To facilitate such primary screening of enhanced clones, an expression system was tested that uses a green fluorescent protein (GFP) tag from Aequorea victoria linked to the enzyme of interest. As GFP's fluorescence is readily measured, and as there is a 1:1 molar correlation between the target protein and GFP, the proposed concept was to determine whether GFP could facilitate primary screening of error-prone PCR (EPP) clones. For this purpose, a thermostable beta-glucosidase (BglA) from Fervidobacterium sp. was used as a model enzyme. A vector expressing the chimeric protein BglA-GFP-6XHis was constructed, and the fusion protein was purified and characterized. When compared to the native proteins, the components of the fusion displayed modified characteristics, such as enhanced GFP thermostability and a higher BglA optimum temperature. Clones carrying mutant BglA proteins obtained by EPP were screened based on the BglA/GFP activity ratio. Purified tagged enzymes from selected clones displayed modified substrate specificity.
Abstract:
Imaging Spectroscopy (IS) is a promising tool for studying soil properties in large spatial domains. Going from point to image spectrometry is not only a journey from micro to macro scales, but also a long stage where problems such as low signal-to-noise levels, atmospheric contamination, large data sets, the BRDF effect, and more are often encountered. In this paper we provide an up-to-date overview of some of the case studies that have used IS technology for soil science applications. Besides a brief discussion of the advantages and disadvantages of IS for studying soils, the following cases are comprehensively discussed: soil degradation (salinity, erosion, and deposition), soil mapping and classification, soil genesis and formation, soil contamination, soil water content, and soil swelling. We review these case studies and suggest that the IS data be provided to end-users as real reflectance, not as raw data, and with better signal-to-noise ratios than presently exist. This is because converting the raw data into reflectance is a complicated stage that requires experience, knowledge, and specific infrastructures not available to many users, whereas quantitative spectral models require good-quality data. These limitations serve as a barrier that impedes potential end-users, inhibiting researchers from trying this technique for their needs. The paper ends with a general call to the soil science audience to extend the utilization of the IS technique, and it provides some ideas on how to propel this technology forward to enable its widespread adoption in order to achieve a breakthrough in the field of soil science and remote sensing. (C) 2009 Elsevier Inc. All rights reserved.
Abstract:
Background/purpose: The continuous advancement in cosmetic science has led to an increasing demand for the development of non-invasive, reliable scientific techniques directed toward claim substantiation, which is of utmost relevance to obtaining data on the efficacy and safety of cosmetic products. Methods: In this work, we used the optical coherence tomography (OCT) technique to produce in vitro transversal section images of human hair. We also compared the OCT signal before and after chemical treatment with an 18% w/w ammonium thioglycolate solution. Results: In our samples of standard Afro-ethnic hair, the mean diameter of the medulla was 29 ± 7 µm and the hair diameter was 122 ± 16 µm. A three-dimensional (3D) image was constructed from 601 cross-sectional images (slices). Each slice was taken in steps of 6.0 µm at eight frames per second, and the entire 3D image was constructed in 60 s. Conclusion: Using the A-scan protocol, it was possible to identify the principal structures: the cuticle, cortex, and medulla. After chemical treatment, it was not possible to identify the main structures of the hair fiber due to index matching promoted by the deleterious action of the chemical agent.
Abstract:
In this study, 20 Brazilian public schools were assessed regarding the implementation of good manufacturing practices and sanitation standard operating procedures. We used a checklist comprising 10 parts (facilities and installations, water supply, equipment and tools, pest control, waste management, personal hygiene, sanitation, storage, documentation, and training), totaling 69 questions. The cost of implementing modifications to correct the nonconformities found was also determined, so that prioritization decisions could be based on technical data. The average nonconformity percentage at the schools concerning the prerequisite program was 36%: 66% of the schools had inadequate facilities and installations, 65% inadequate waste management, 44% inadequate documentation, and 35% inadequate water supply and sanitation. The initial estimated cost of the changes was U.S.$24,438, with monthly investments of 1.55% of the initially invested value. This would result in a U.S.$0.015 increase in the cost of each meal served to recover the investment within a year. We therefore conclude that such modifications are economically feasible and should be considered among the technical requirements when prerequisite program implementation priorities are established.
Abstract:
Corporate portals, enabled by information and communication technology tools, provide the integration of heterogeneous data from internal information systems, made available for access and sharing by the interested community. They can be considered an important instrument for evaluating explicit knowledge in the organization, since they allow faster and safer information exchanges, enabling a healthy collaborative environment. In the specific case of major Brazilian universities, corporate portals assume a fundamental role, as they offer an enormous variety and amount of information and knowledge owing to the multiplicity of their activities. This study aims to point out important aspects of the explicit knowledge expressed by the surveyed universities through an analysis of the content offered in their corporate portals. This is an exploratory study based on direct observation of the contents of the corporate portals of two public universities and three private ones. A comparative analysis of the contents of these portals was carried out; it can be useful to evaluate their use as a factor for optimizing the explicit knowledge generated in the university. As a result, important differences could be verified in the composition and content of the corporate portals of the public universities compared to the private institutions. The main differences concern the kinds of services offered and the destination of the information, which target different audiences.
It could also be concluded that the private universities surveyed focus on processes related to student services, support for their courses, and the dissemination of information to the public interested in joining the institution, whereas the public universities analyzed prioritize more specific information directed at the dissemination of research developed internally or with institutional objectives.