958 results for Bowker Collection Analysis Tool


Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Forest Science - FCA

Relevance:

100.00%

Publisher:

Abstract:

This paper, the result of a bibliographic review, documentary research, and an interview with a public relations (PR) professional, presents a scenario analysis tool and the way the public relations professional can use it. It points out some of the abilities and strategic skills that can actually enable this professional to prospect, build, and analyze scenarios from the perspective of the relationship with the organization's publics. The scenario analysis tool is a strategic aid to decision-making, used to analyze conjunctures, that can support organizations' actions and activities by considering possible future results; in other words, the consequences and developments caused by a given organizational attitude or stance. At the end of the paper, we present a proposed strategic plan for Public Relations that illustrates a suggested use of the tool.

Relevance:

100.00%

Publisher:

Abstract:

Graduate Program in Teaching for Basic Education - FC

Relevance:

100.00%

Publisher:

Abstract:

In the current economic scenario, the incessant search for improvements in production quality and for cost reduction is essential. Intense competition and technological innovation make customers ever more demanding and drive the search for multiple sources of improvement in production. This work aimed to use the overall desirability function to optimize a machining experiment involving multiple responses. Cylindrical turning is one of the most common metal-cutting processes and involves several factors; here, the best combination of input factors is analyzed, with surface roughness (Ra) and cutting length (Lc) as the response variables, both important measures of process efficiency and product quality. The method is a case study, since it examines a tool well covered in the literature. The data analyzed come from the doctoral thesis of Ricardo Penteado, which applied metaheuristics combined with different weighting methods to the optimization of a multi-response turning process; the desirability analysis tool was then applied to these data. Joint optimization by desirability proposed the following combination of input variables: cutting speed at 90 m/min (level -1), feed rate at 0.12 mm/rev (level -1), machining depth at 1.6 mm (level 1), the TP2500 insert (level -1), abundant cutting fluid (level 1), and laminated material (level 1), to maximize the cutting length (Lc) and minimize the roughness (Ra)
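The joint optimization described above relies on the desirability approach (in the Derringer-Suich style): each response is mapped onto a 0-1 desirability and the geometric mean of the individual desirabilities is maximized. A minimal sketch, with purely hypothetical response values and bounds rather than the thesis data:

```python
import math

def desirability_smaller(y, low, high, weight=1.0):
    """Desirability for a smaller-is-better response (e.g. roughness Ra)."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return ((high - y) / (high - low)) ** weight

def desirability_larger(y, low, high, weight=1.0):
    """Desirability for a larger-is-better response (e.g. cutting length Lc)."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** weight

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities: the score to maximize."""
    return math.prod(ds) ** (1.0 / len(ds))

# Hypothetical responses for one combination of input levels:
d_ra = desirability_smaller(1.2, low=0.5, high=3.0)  # Ra = 1.2 um, bounds 0.5-3.0
d_lc = desirability_larger(800, low=200, high=1200)  # Lc = 800 m, bounds 200-1200
D = overall_desirability([d_ra, d_lc])
```

The optimizer then searches the input-factor levels for the combination with the highest overall score D.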

Relevance:

100.00%

Publisher:

Abstract:

The machining process is of great importance in the economy. Many machining parameters have been studied to maximize results in terms of cost and tool lifetime (reduced cutting-tool wear, improved surface finish, among others). The objective of this study is to evaluate the wear of a ceramic tool in the machining of aluminum alloy 6005 A. Analyzing cutting-tool wear is very important because of its large impact on the final finish of the workpiece as a whole. The evaluation took place in two stages. First, a detailed literature study of the whole machining process was carried out, in which the study of chip (swarf) formation and classification was among the most important steps. The second stage consisted of machining the 6005 A aluminum workpiece with a ceramic cutting tool made of aluminum oxide and magnesium oxide with silicon carbide impregnation. The swarf generated in this process was then photographed with a Zeiss optical microscope and analyzed for size and shape. From this comparison it was concluded that the chips generated are shear chips, shaped like tangles, fragments, and connected arcs, which classifies the material as of medium machining difficulty. Through the image analysis tool it was concluded that the parameters producing the lowest wear were Vc = 500 m/min, f = 0.10 mm/rev and ap = 0.5 mm
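The size measurement step mentioned above can be illustrated, in a very simplified form, by thresholding a grayscale micrograph to estimate the area occupied by chip material. This is only a hypothetical sketch on synthetic data, not the actual Zeiss/image-analysis workflow used in the study:

```python
import numpy as np

def chip_area_fraction(gray, threshold=128):
    """Fraction of the frame occupied by chip material, assuming the
    chips image darker than the background (simple global threshold)."""
    mask = gray < threshold
    return mask.sum() / mask.size

# Synthetic 8-bit micrograph: one dark rectangular "chip" on a bright field.
img = np.full((100, 100), 220, dtype=np.uint8)
img[40:60, 30:70] = 50          # 20 x 40 pixel fragment
frac = chip_area_fraction(img)  # 800 dark pixels out of 10000
```

A real pipeline would additionally label connected components and measure each fragment's shape before classifying the chip type.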

Relevance:

100.00%

Publisher:

Abstract:

Abstract Background The CHD7 (Chromodomain Helicase DNA binding protein 7) gene encodes a member of the chromodomain family of ATP-dependent chromatin remodeling enzymes. Mutations in the CHD7 gene are found in individuals with CHARGE, a syndrome characterized by multiple birth malformations in several tissues. CHD7 was identified as a binding partner of the PBAF complex (Polybromo and BRG Associated Factor containing complex), playing a central role in the transcriptional reprogramming process associated with the formation of the multipotent migratory neural crest, a transient cell population involved in the genesis of various tissues. CHD7 is a large gene containing 38 annotated exons and spanning 200 kb of genomic sequence. Although genes with such a number of exons are expected to have several alternative transcripts, there is very little evidence of alternative transcripts associated with CHD7 to date, indicating that alternative splicing of this gene is poorly characterized. Findings Here, we report the cloning and characterization, by experimental and computational studies, of a novel alternative transcript of human CHD7 (named CHD7 CRA_e), which lacks most of its coding exons. By overexpressing the CHD7 CRA_e alternative transcript, we confirmed that it is translated into a protein isoform lacking most of the domains displayed by the canonical isoform. Expression of the CHD7 CRA_e transcript was detected in normal liver, in addition to the DU145 human prostate carcinoma cell line from which it was originally isolated. Conclusions Our findings indicate that the splicing event associated with the CHD7 CRA_e alternative transcript is functional. The characterization of the novel CHD7 CRA_e isoform presented here not only sets the basis for more detailed functional studies of this isoform, but also contributes to the alternative splicing annotation of the CHD7 gene and to the design of future functional studies aimed at elucidating the molecular functions of its gene products.

Relevance:

100.00%

Publisher:

Abstract:

Inflation rate estimates are of fundamental importance to managers, since investment decisions are closely tied to them. However, inflationary behavior tends to be nonlinear and even chaotic, making accurate estimation difficult. This characteristic of the phenomenon can make the simpler forecasting models accessible to small organizations imprecise, since many of them require extensive data manipulation and/or specialized software. This article aims to evaluate, through formal statistical analysis, the effectiveness of artificial neural networks (ANN) in forecasting inflation within the reality of small organizations. ANNs are suitable tools for measuring inflationary phenomena, since they are approximations of polynomial functions capable of handling nonlinear phenomena. For this process, three basic Multi-Layer Perceptron artificial neural network models were selected, all of which can be implemented in open-source spreadsheets. The three models were tested on a set of independent variables suggested by Bresser-Pereira and Nakano (1984), with lags of one, six, and twelve months. Wilcoxon tests, the coefficient of determination R², and the models' mean percentage error were used. The data set was divided in two: one group was used to train the artificial neural networks, while the other was used to verify the models' predictive power and their capacity for generalization. The study concluded that certain artificial neural network models have a reasonable ability to predict inflation in the short term and constitute a reasonable alternative for this type of measurement.
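The setup described above (a Multi-Layer Perceptron predicting inflation from lagged values, with a train/test split) can be sketched in a few lines. This is an illustrative toy: it uses a synthetic series rather than the Bresser-Pereira and Nakano variables, and a deliberately simple one-hidden-layer network trained by batch gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a monthly inflation series; the study itself uses
# explanatory variables suggested by Bresser-Pereira and Nakano (1984).
series = 0.6 + 0.5 * np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.normal(0, 0.05, 120)

def make_lagged(y, lag):
    """Predict y[i] from the previous `lag` observations."""
    X = np.array([y[i - lag:i] for i in range(lag, len(y))])
    return X, y[lag:]

X, t = make_lagged(series, lag=12)
Xtr, ttr, Xte, tte = X[:90], t[:90], X[90:], t[90:]  # train / hold-out split

# One hidden layer of tanh units, trained by batch gradient descent.
W1 = rng.normal(0, 0.3, (12, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.3, 8); b2 = 0.0
for _ in range(2000):
    h = np.tanh(Xtr @ W1 + b1)
    err = h @ W2 + b2 - ttr
    gh = np.outer(err, W2) * (1 - h ** 2)      # backprop through tanh
    W2 -= 0.05 * h.T @ err / len(ttr); b2 -= 0.05 * err.mean()
    W1 -= 0.05 * Xtr.T @ gh / len(ttr); b1 -= 0.05 * gh.mean(axis=0)

pred = np.tanh(Xte @ W1 + b1) @ W2 + b2
mse = np.mean((pred - tte) ** 2)
baseline = np.mean((ttr.mean() - tte) ** 2)    # constant-forecast benchmark
```

The hold-out comparison against a constant forecast mirrors the article's check of predictive power versus generalization.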

Relevance:

100.00%

Publisher:

Abstract:

The development of the digital electronics market is founded on the continuous reduction of transistor size, to reduce area, power, and cost and to increase the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage is becoming more uncertain as transistor sizes scale down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of threshold and supply voltages, increasing power density and creating local thermal issues, such as hot spots, thermal runaway, and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperature and frequent thermal cycles, speeds up wear-out processes. These effects are no longer addressable at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) the implementation of new analysis algorithms able to predict the thermal behavior of the system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of devices by acting on tunable parameters, such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB), and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library, which has been integrated in a cycle-accurate NoC simulator and in an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design. Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor. As a result, we confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we discovered the superior robustness of low-swing links to systematic process variation; they show a good response to compensation techniques such as ASV and ABB. Hence low-swing is a good alternative to standard CMOS communication for power, speed, reliability, and manufacturability. In summary, my work demonstrates the advantage of integrating a statistical process variation analysis tool in the first stages of the design flow.
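The core idea of a statistical process variation analysis, addressing both random and systematic components, can be illustrated with a Monte Carlo sketch: sample per-device random threshold-voltage variation plus a die-level systematic shift, and propagate both through a delay model. The alpha-power-law model and all parameter values below are generic textbook-style assumptions, not the thesis tool:

```python
import random
import statistics

def gate_delay(vth, vdd=1.0, k=1.0, alpha=1.3):
    """Alpha-power-law delay model: delay rises steeply as Vth approaches Vdd."""
    return k * vdd / (vdd - vth) ** alpha

def monte_carlo_delay(n=10000, vth_nom=0.30, sigma_rand=0.02,
                      sys_shift=0.01, seed=1):
    """Sample per-device random Vth variation on top of a die-level
    systematic shift, and summarize the resulting delay distribution."""
    rng = random.Random(seed)
    delays = [gate_delay(vth_nom + sys_shift + rng.gauss(0.0, sigma_rand))
              for _ in range(n)]
    return statistics.mean(delays), statistics.stdev(delays)

mean_d, sd_d = monte_carlo_delay()
nominal_d = gate_delay(0.30)   # delay with no variation at all
```

Because the delay model is convex in Vth, the mean delay under variation exceeds the nominal-corner delay, which is exactly why corner-only analysis underestimates the impact of variability.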

Relevance:

100.00%

Publisher:

Abstract:

Throughout this research, the whole life cycle of a building is analyzed, with a special focus on the issues that most commonly affect the construction sector nowadays, such as safety. In fact, the goal is to enhance the management of the entire construction process in order to reduce the risk of accidents. The contemporary trend is to research new tools capable of reducing, or even eliminating, the most common mistakes that usually lead to safety risks. That is one of the main reasons why new technologies and tools have been introduced in the field. The one we focus on is the so-called BIM: Building Information Modeling. The term BIM refers to a broader and more complex analysis tool than a simple 3D modeling software. Through BIM technologies we are able to generate a multi-dimensional 3D model that contains all the information about the project. This innovative approach aims at a better understanding and control of the project by taking the entire life cycle into consideration, resulting in a faster and more sustainable way of managing it. Furthermore, BIM software allows all the information to be shared across the different aspects of the project and among the different participants involved, thus improving cooperation and communication. In addition, BIM software provides smart tools that simulate and visualize the process in advance, thus preventing issues that might not have been taken into consideration during the design process. This leads to higher chances of avoiding risks, delays, and cost increases. Using a hospital case study, we apply this approach to the completion of a safety plan, with a special focus on the construction phase.

Relevance:

100.00%

Publisher:

Abstract:

In any terminological study, candidate term extraction is a very time-consuming task. Corpus analysis tools have automated some processes, allowing the detection of relevant data within the texts and facilitating term candidate selection as well. Nevertheless, these tools are normally not specific to terminology research; therefore, the units that are automatically extracted need manual evaluation. Over the last few years some software products have been specifically developed for automatic term extraction. They are based on corpus analysis, but use linguistic and statistical information to filter data more precisely. As a result, the time needed for manual evaluation is reduced. In this framework, we tried to understand if and how these new tools can really be an advantage. In order to develop our project, we simulated a terminology study: we chose a domain (i.e. the legal framework for medicinal products for human use) and compiled a corpus from which we extracted terms and phraseologisms using AntConc, a corpus analysis tool. Afterwards, we compared our list with the lists extracted automatically by three different tools (TermoStat Web, TaaS and Sketch Engine) in order to evaluate their performance. In the first chapter we describe some principles of terminology and phraseology in languages for special purposes and show the advantages offered by corpus linguistics. In the second chapter we illustrate some of the main concepts of the selected domain, as well as some of the main features of legal texts. In the third chapter we describe automatic term extraction and the main criteria for evaluating it; moreover, we introduce the term-extraction tools used for this project. In the fourth chapter we describe our research method and, in the fifth chapter, we present our results and draw some preliminary conclusions on the performance and usefulness of term-extraction tools.
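A minimal illustration of the kind of linguistically filtered candidate extraction described above (not the actual algorithms of TermoStat Web, TaaS or Sketch Engine): count word n-grams and discard candidates that begin or end with a stopword, a common filter for surfacing term-like units. The toy corpus below is invented:

```python
import re
from collections import Counter

STOPWORDS = {"the", "of", "and", "for", "in", "to", "a", "on", "is", "by", "with"}

def candidate_terms(text, max_len=3):
    """Count 1- to max_len-grams, keeping only candidates that neither
    start nor end with a stopword (a simple linguistic filter)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    counts = Counter()
    for n in range(1, max_len + 1):
        for i in range(len(tokens) - n + 1):
            gram = tokens[i:i + n]
            if gram[0] in STOPWORDS or gram[-1] in STOPWORDS:
                continue
            counts[" ".join(gram)] += 1
    return counts

corpus = ("the marketing authorisation of medicinal products requires evaluation "
          "and the marketing authorisation holder must report adverse reactions")
top = candidate_terms(corpus).most_common(5)
```

Dedicated term extractors add statistical association measures (and often part-of-speech patterns) on top of such counts, which is precisely what reduces the manual evaluation effort discussed in the abstract.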

Relevance:

100.00%

Publisher:

Abstract:

In the Dominican Republic, economic growth in the past twenty years has not yielded sufficient improvement in access to drinking water services, especially in rural areas where 1.5 million people do not have access to an improved water source (WHO, 2006). Worldwide, strategic development planning in the rural water sector has focused on participatory processes and the use of demand filters to ensure that service levels match community commitment to post-project operation and maintenance. However, studies have concluded that an alarmingly high percentage of drinking water systems (20-50%) do not provide service at the design levels and/or fail altogether (up to 90%): BNWP (2009), Annis (2006), and Reents (2003). The World Bank, USAID, NGOs, and private consultants have invested significant resources in an effort to determine what components make up an "enabling environment" for sustainable community management of rural water systems (RWS). Research has identified an array of critical factors, internal and external to the community, that affect the long-term sustainability of water services. Different frameworks have been proposed to better understand the linkages between individual factors and sustainability of service. This research proposes a Sustainability Analysis Tool to evaluate the sustainability of RWS, adapted from previous relevant work in the field to reflect the realities of the Dominican Republic. It can be used as a diagnostic tool for government entities and development organizations to characterize the needs of specific communities and identify weaknesses in existing training regimes or support mechanisms. The framework uses eight indicators in three categories (Organization/Management, Financial Administration, and Technical Service). Nineteen independent variables are measured, resulting in a score of sustainability likely (SL), possible (SP), or unlikely (SU) for each of the eight indicators. Thresholds are based upon benchmarks from the DR and around the world, primary data collected during the research, and the author's 32 months of field experience. A final sustainability score is calculated using weighting factors for each indicator, derived from Lockwood (2003). The framework was tested on a statistically representative, geographically stratified random sample of 61 water systems built in the DR through initiatives of the National Institute of Potable Water (INAPA) and the Peace Corps. The results show that 23% of the sampled systems are likely to be sustainable in the long term, 59% are possibly sustainable, and for 18% it is unlikely that the community will be able to overcome any significant challenge. Communities scored as unlikely to be sustainable perform poorly in participation, financial durability, and governance, while the highest scores were for system function and repair service. The Sustainability Analysis Tool results are corroborated by INAPA and Peace Corps reports, evaluations, and database information, as well as field observations and primary data collected during the surveys. Future research will analyze the nature and magnitude of the relationships between key factors and the sustainability score defined by the tool. Factors include: gender participation, legal status of water committees, plumber/operator remuneration, demand responsiveness, post-construction support methodologies, and project design criteria.
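The indicator-and-weights aggregation described above can be sketched as a simple weighted scoring function. The weights, cutoffs, and indicator order below are illustrative stand-ins, not the thresholds derived from Lockwood (2003) or the study's benchmarks:

```python
def sustainability_score(indicator_scores, weights):
    """Aggregate eight per-indicator ratings (SL/SP/SU) into one verdict
    using per-indicator weights.  Weights and cutoffs are hypothetical."""
    points = {"SL": 2, "SP": 1, "SU": 0}
    total = sum(points[s] * w for s, w in zip(indicator_scores, weights))
    ratio = total / (2 * sum(weights))   # normalize to 0..1
    if ratio >= 0.75:
        return "sustainability likely"
    if ratio >= 0.40:
        return "sustainability possible"
    return "sustainability unlikely"

# Eight indicators, e.g. organization, participation, tariffs, bookkeeping,
# system function, repair service, water quality, governance (hypothetical).
weights = [1.0, 1.5, 1.5, 1.0, 2.0, 1.5, 1.0, 1.0]
verdict = sustainability_score(["SL", "SP", "SL", "SP", "SL", "SL", "SP", "SU"],
                               weights)
```

The point of the weighting is that a community strong in technical service but weak in governance can still land in the "possible" band rather than being averaged up to "likely".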

Relevance:

100.00%

Publisher:

Abstract:

This study develops an automated analysis tool by combining total internal reflection fluorescence microscopy (TIRFM), an evanescent-wave microscopic imaging technique used to capture time-sequential images, with corresponding Matlab image processing code that identifies the movements of single individual particles. The developed code enables us to examine two-dimensional hindered tangential Brownian motion of nanoparticles with sub-pixel (nanoscale) resolution. The measured mean square displacements of the nanoparticles are compared with theoretical predictions to estimate particle diameters and fluid viscosity using a nonlinear regression technique. These estimates are checked against the diameters and viscosities given by the manufacturers to validate the analysis tool. The nanoparticles used in these experiments are yellow-green polystyrene fluorescent nanospheres (200 nm, 500 nm, and 1000 nm in nominal diameter; 505 nm excitation and 515 nm emission wavelengths). The solutions used are de-ionized (DI) water, 10% d-glucose, and 10% glycerol. Mean square displacements obtained near the surface show significant deviation from theoretical predictions, which is attributed to DLVO forces in that region, but they conform to theoretical predictions from ~125 nm onwards. The proposed automated analysis tool can be powerfully employed in bio-application fields requiring single-protein (DNA and/or vesicle) tracking, drug delivery, and cytotoxicity studies, unlike traditional measurement techniques that require fixing the cells. Furthermore, this tool can also be usefully applied in the microfluidic areas of non-invasive thermometry, particle tracking velocimetry (PTV), and non-invasive viscometry.
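The core computation, mean square displacement followed by a Stokes-Einstein inversion to recover the particle diameter, can be sketched in Python rather than Matlab. The check below uses simulated free 2-D Brownian motion and a plain linear fit, ignoring the near-wall DLVO and hindrance effects discussed above:

```python
import numpy as np

def mean_square_displacement(track, max_lag):
    """MSD(tau) of a 2-D trajectory (n_frames x 2), averaged over start times."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = track[lag:] - track[:-lag]
        msd[lag - 1] = (disp ** 2).sum(axis=1).mean()
    return msd

def estimate_diameter(msd, dt, viscosity, temperature=298.15):
    """Fit MSD = 4*D*tau, then invert Stokes-Einstein: d = kT / (3*pi*eta*D)."""
    k_B = 1.380649e-23
    lags = dt * np.arange(1, len(msd) + 1)
    D = np.polyfit(lags, msd, 1)[0] / 4.0   # slope/4 for 2-D diffusion
    return k_B * temperature / (3.0 * np.pi * viscosity * D)

# Synthetic check: free Brownian motion of a 500 nm sphere in water.
rng = np.random.default_rng(3)
dt, eta, d_true = 0.01, 1.0e-3, 500e-9     # s, Pa*s, m
D_true = 1.380649e-23 * 298.15 / (3.0 * np.pi * eta * d_true)
steps = rng.normal(0.0, np.sqrt(2.0 * D_true * dt), size=(20000, 2))
d_est = estimate_diameter(
    mean_square_displacement(np.cumsum(steps, axis=0), 10), dt, eta)
```

Near a wall, hindered diffusion would make the fitted D depend on the particle-wall gap, which is why the study compares the measured MSD against height-dependent theory instead of the free-diffusion line used here.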