960 results for new methods
Abstract:
The evolution of commodity computing led to the possibility of efficiently using interconnected machines to solve computationally intensive tasks that were previously solvable only on expensive supercomputers. This, however, required new methods for process scheduling and distribution that consider network latency, communication cost, heterogeneous environments and distributed-computing constraints. Efficient distribution of processes over such environments requires an adequate scheduling strategy, as the cost of inefficient process allocation is unacceptably high. Therefore, knowledge and prediction of application behavior are essential to perform effective scheduling. In this paper, we overview the evolution of scheduling approaches, focusing on distributed environments. We also evaluate current approaches for process behavior extraction and prediction, aiming at selecting an adequate technique for online prediction of application execution. Based on this evaluation, we propose a novel model for application behavior prediction, considering the chaotic properties of such behavior and the automatic detection of critical execution points. The proposed model is applied and evaluated for process scheduling in cluster and grid computing environments. The obtained results demonstrate that prediction of process behavior is essential for efficient scheduling in large-scale and heterogeneous distributed environments, outperforming conventional scheduling policies by a factor of 10, and even more in some cases. Furthermore, the proposed approach proves to be efficient for online prediction due to its low computational cost and good precision. (C) 2009 Elsevier B.V. All rights reserved.
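The abstract does not specify its prediction model, but the general idea of forecasting a process metric from its own past, exploiting the chaotic (deterministic) structure it mentions, can be sketched with a time-delay embedding and a nearest-neighbor "analogue" forecast. All function names and parameters below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

def embed(series, dim=3, tau=1):
    """Time-delay embedding: map a scalar series into dim-dimensional
    state vectors (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return np.array([series[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

def predict_next(series, dim=3, tau=1):
    """Forecast the next value: find the past embedded state closest to
    the current one and reuse that state's observed successor
    (the classical nearest-neighbor analogue method)."""
    vecs = embed(series, dim, tau)
    query = vecs[-1]                       # current state
    dists = np.linalg.norm(vecs[:-1] - query, axis=1)
    nearest = int(np.argmin(dists))        # most similar past state
    successor = nearest + (dim - 1) * tau + 1
    return series[successor]
```

On a strictly periodic trace such as `[0, 1, 2, 0, 1, 2, ...]` this recovers the cycle exactly; real execution traces would require choosing `dim` and `tau` carefully (e.g. via false nearest neighbors and mutual information), and its low cost per prediction is what makes this family of methods attractive for online scheduling.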
Abstract:
Fixation of CO2 is one of the highest priorities of the scientific community dedicated to reducing global warming. In this work, we propose new methods for the fixation of CO2 using the guanidine bases tetramethylguanidine (TMG) and 1,3,4,6,7,8-hexahydro-2H-pyrimido[1,2-a]pyrimidine (TBD). In order to understand the reactions occurring during the CO2 fixation and release processes, we employed several experimental methods, including solution and solid-state NMR, FTIR, and coupled TGA-FTIR. Quantum mechanical NMR calculations were also carried out. Based on the results obtained, we concluded that CO2 fixation with both the TMG and TBD guanidines is a kinetically reversible process, and the corresponding fixation products proved to be useful as transcarboxylating compounds. Furthermore, thermal release of CO2 from the TBD fixation product was found to be an interesting process for CO2 capture and isolation purposes. (C) 2008 Elsevier Ltd. All rights reserved.
Abstract:
Nowadays, noninvasive methods of diagnosis have increased due to the demands of a population that requires fast, simple and painless exams. These methods have become possible because of the growth of technology that provides the necessary means of collecting and processing signals. New methods of analysis have been developed to understand the complexity of voice signals, such as nonlinear dynamics, aimed at exploring the dynamic nature of voice signals. The purpose of this paper is to characterize healthy and pathological voice signals with the aid of relative entropy measures. The phase space reconstruction technique is also used as a way to select interesting regions of the signals. Three groups of samples were used: one from healthy individuals and the other two from people with a nodule in the vocal fold and with Reinke's edema. All of them are recordings of the sustained vowel /a/ from Brazilian Portuguese. The paper shows that nonlinear dynamical methods seem to be a suitable technique for voice signal analysis, due to the chaotic component of the human voice. Relative entropy is well suited due to its sensitivity to uncertainty, since the pathologies are characterized by an increase in signal complexity and unpredictability. The results showed that the pathological groups had higher entropy values, in accordance with the other vocal acoustic parameters presented. This suggests that these techniques may improve and complement the voice analysis methods currently available to clinicians. (C) 2008 Elsevier Inc. All rights reserved.
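The abstract does not give its exact relative-entropy formulation; as a hedged illustration, the Kullback-Leibler divergence between the amplitude histograms of two signals can be computed as below (the histogram-based estimator, bin count, and smoothing constant are assumptions of this sketch, not the paper's measure):

```python
import numpy as np

def relative_entropy(signal_a, signal_b, bins=32):
    """Kullback-Leibler divergence D(P||Q) between the amplitude
    distributions of two signals, estimated from shared-range histograms.
    Higher values indicate that signal_a's distribution diverges more
    from signal_b's."""
    lo = min(signal_a.min(), signal_b.min())
    hi = max(signal_a.max(), signal_b.max())
    p, _ = np.histogram(signal_a, bins=bins, range=(lo, hi), density=True)
    q, _ = np.histogram(signal_b, bins=bins, range=(lo, hi), density=True)
    eps = 1e-12                      # avoid log(0) and division by zero
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))
```

Comparing a signal with itself yields zero, and a broader (more unpredictable) signal against a narrow one yields a positive divergence, which is the direction of the effect the paper reports for pathological voices.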
Abstract:
The aim of this qualitative respondent investigation is to delve into the various views that teachers have concerning the "One-to-One project", as well as the use of computers as an aid in teaching. One-to-One means that teachers and students are each equipped with a laptop they can use at home and at school. This essay looks at how several factors have changed as a result. These factors are threefold: the role of the teacher, the teaching experience, and the student's learning process. In order to answer these questions, four interviews were conducted at two different high schools in southern Norrland. The theory used is the socio-cultural perspective. One result is that computers can simplify teaching in various ways: students have faster access to information, and there is a platform for further communication between teacher and student outside the classroom. However, there are also several negative aspects. One of these is that students spend time on non-school-related activities, such as interacting on social media. Results also show that, due to the "One-to-One project", the role of the teacher has gone from being structural to being interactional. The conclusions reached by the investigation are that today's schools are experiencing a paradigm shift: old teaching methods are being replaced by new methods, and an altered teaching practice has developed as a result of the presence of the computer in the classroom.
Abstract:
The purpose of this study is to investigate the most common phonetic and phonological difficulties in the teaching of Spanish as a foreign language. The study is based on the following questions: Which difficulties can teachers encounter when teaching phonetics and phonology? Which difficulties can students encounter when learning phonetics and phonology? How are phonetics and phonology taught? In order to investigate these difficulties, a questionnaire was handed out to five experienced teachers. The results of the questionnaires, together with the theory, are examined in the analysis. The outcome shows that several difficulties can be detected in both the teaching and the learning process. The results also show that the teachers mostly teach phonetics the same way, through repetition and imitation (the conductive method), and that very few think outside the box to find new methods.
Abstract:
Literacy is an invaluable asset and has allowed for communication, documentation and the spreading of ideas since the beginning of written language. With technological advancements and new possibilities to communicate, it is important to question the degree to which people's abilities to use these new methods have developed in relation to the emerging technologies. The purpose of this bachelor's thesis is to analyse the state of Dalarna University students' multimodal literacy, as well as their experience of multimodality in their education. This has led to two main research questions: What is the state of Dalarna University students' multimodal literacy? And: How have the students at Dalarna University experienced multimodality in education? The paper is based on a mixed-method study that incorporates both quantitative and qualitative aspects. The main thrust of the research is, however, a quantitative study that was conducted online and emailed to students via their program coordinators. The scope of the research is audio-visual modes, i.e. audio, video and images, while textual literacy is presumed and serves as an inspiration for the study. The study revealed that the students at Dalarna University are most skilled in image editing, while not being very literate in audio or video editing. The students seem to have had mediocre experience of creating meaning through multimodality, both in private use and in their respective educational institutions. The study also reveals that students prefer learning by means of video (rather than text or audio), yet are not able to create meaning (communicate) through it.
Abstract:
This paper presents three new methods for color detection and segmentation of road signs. The images are taken by a digital camera mounted in a car. The RGB images are converted into the IHLS color space, and the new methods are applied to extract the colors of the road signs under consideration. The methods were tested on hundreds of outdoor images in different light conditions and show high robustness. This project is part of the research taking place at Dalarna University, Sweden, in the field of ITS.
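The paper's three segmentation methods are not reproduced in the abstract, but the RGB-to-IHLS conversion step it relies on can be sketched using the improved HLS formulation commonly attributed to Hanbury and Serra (hue from an arccos of chromatic coordinates, luminance as a weighted sum, saturation as max minus min). The exact luma coefficients and any sign-color thresholds are assumptions of this sketch:

```python
import numpy as np

def rgb_to_ihls(rgb):
    """Convert RGB (floats in [0, 1], shape (..., 3)) to the Improved HLS
    space: hue in degrees, luminance, saturation."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Rec. 709 luma weights (assumed; the paper may use slightly different ones)
    luminance = 0.2126 * r + 0.7152 * g + 0.0722 * b
    # Saturation independent of brightness: max minus min channel
    saturation = rgb.max(axis=-1) - rgb.min(axis=-1)
    # Hue from chromatic coordinates; undefined for greys, where we return 0
    num = r - 0.5 * g - 0.5 * b
    den = np.sqrt(r ** 2 + g ** 2 + b ** 2 - r * g - r * b - g * b)
    ratio = np.divide(num, den, out=np.zeros_like(den), where=den > 0)
    theta = np.degrees(np.arccos(np.clip(ratio, -1.0, 1.0)))
    hue = np.where(b > g, 360.0 - theta, theta)
    return hue, luminance, saturation
```

The appeal of IHLS for this task is that hue and saturation are largely insensitive to the brightness changes that dominate outdoor imagery, so a red or blue sign can be thresholded on hue under varying light.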
Abstract:
During the 1980s and 1990s, the Brazilian federal government began to implement a new public administration policy, called "managerial", conceived around new standards of efficiency and effectiveness and strongly concerned with optimizing state administration to deliver the best results for the population. This decision was taken for three main reasons: (i) the worst fiscal crisis of recent decades; (ii) the exhaustion of state intervention in the Brazilian economy as it opened to globalization; and (iii) deeply rooted bureaucratic methods. The Brazilian state reform presented, as a diagnosis of the government's human resources area: (i) gradually rising payroll costs, allied to (ii) sharply rising inefficiency in public services, and (iii) civil servants unprepared to respond better to current citizen demands and to adopt new management methods based on professional performance and the quality of public services. We conclude that the federal government often tries to make civil servants redundant instead of adopting a real management policy that would give them better conditions to improve their performance. This paper presents a concrete proposal to improve the quality of civil servants' performance by taking advantage of information technology and of the country's democratization. We suggest that the Brazilian state reform can and should be a new path of social growth and development, not only on an economic basis.
Abstract:
Organizations, reproducing and refining the nineteenth-century formula of labor substitution, are investing in new management models and modern technologies with the goal of reducing costs, improving products and speeding up services. Feeling the need to fit into this context, the State as a whole, and especially the Judiciary of the State of Espírito Santo, has sought to adapt to these changes, since its administrative and jurisdictional model has long been outdated and in need of new approaches. However, there has been considerable resistance to administrative and technological modernization, both from a current of traditionalist jurists, who constantly promote a return to old standards and concepts, and from the scarcity of financial resources. Innovation always involves an element of uncertainty, and although human history records innovations in management models and significant technological advances, these now demand a larger share of diversified knowledge, breaking paradigms that influence the behavior of organizations and institutions in an irreversible process of dynamic growth. The relevance of this research lay in analyzing the management model and the technological innovation of the Judiciary of the State of Espírito Santo in the light of worldwide transformations, as well as their importance for jurisdictional mechanisms, whether immediate or following a period of learning combined with the acceptance of innovation. It also sought to understand the effects of these changes on the institution's culture, including possible causes of resistance.
Abstract:
Internet recruiting has been heralded in the job market as a modern tool to attract the best and brightest professionals to companies. The main objective of the present study is to analyze the Internet resources applied to the Recruitment and Selection process, so as to understand how these tools can ease recruitment with regard to cost efficiency, time spent and the adequacy of the candidates selected to fill job vacancies. The role of intermediaries in the job market is also studied, specifically recruitment consultants for executive search, considering the intensive use of Internet tools by companies that, in this new way, get in touch directly with a large pool of possible candidates. It is also investigated how these new resources for developing in-house capabilities to manage online recruiting will bring savings, better process times and higher-quality candidates. The study has an empirical section based on a case study of Companhia Distribuidora de Gás do Rio de Janeiro - CEG, in which the new Internet-based method is compared with the traditional one. The study analyses the new method's impact on the main variables present in the process.
Keywords: Labor Market, Recruitment and Selection, On-line Recruitment, Human Resource Management, Internet
Abstract:
The evolution of integrated circuit technologies demands the development of new CAD tools. The traditional development of digital circuits at the physical level is based on libraries of cells. These libraries offer a certain predictability of the electrical behavior of the design, due to the previous characterization of the cells. Besides, different versions of each cell are required so that delay and power consumption characteristics are taken into account, increasing the number of cells in a library. Automatic full-custom layout generation is an increasingly important alternative to cell-based approaches. This strategy implements transistors and connections according to patterns defined by algorithms, so it is possible to implement any logic function, avoiding the limitations of a library of cells. Analysis and estimation tools must offer predictability in automatic full-custom layouts; they must be able to work with layout estimates and to generate information related to delay, power consumption and area occupation. This work includes research into new methods of physical synthesis and the implementation of an automatic layout generator in which the cells are generated at the moment of layout synthesis. The research investigates different strategies for the placement of elements (transistors, contacts and connections) in a layout and their effects on area occupation and circuit delay. The presented layout strategy applies delay optimization through integration with a gate sizing technique, in such a way that the folding method allows individual discrete sizing of transistors. The main characteristics of the proposed strategy are: power supply lines between rows, over-the-layout routing (channel routing is not used), circuit routing performed before layout generation, and layout generation targeting delay reduction through the sizing technique.
The possibility of implementing any logic function, without the restrictions imposed by a library of cells, allows circuit synthesis with optimization of the number of transistors. This reduction in the number of transistors decreases delay and power consumption, mainly static power consumption in submicrometer circuits. Comparisons between the proposed strategy and other well-known methods are presented to validate the proposed method.
Abstract:
This paper considers the general problem of Feasible Generalized Least Squares Instrumental Variables (FGLS IV) estimation using optimal instruments. First we summarize the sufficient conditions for the FGLS IV estimator to be asymptotically equivalent to an optimal GLS IV estimator. Then we specialize to stationary dynamic systems with stationary VAR errors, and use the sufficient conditions to derive new moment conditions for these models. These moment conditions produce useful IVs from the lagged endogenous variables, despite the correlation between errors and endogenous variables. This use of the information contained in the lagged endogenous variables expands the class of IV estimators under consideration and thereby potentially improves both the asymptotic and small-sample efficiency of the optimal IV estimator in the class. Some Monte Carlo experiments compare the new methods with those of Hatanaka [1976]. For the DGP used in the Monte Carlo experiments, asymptotic efficiency is strictly improved by the new IVs, and experimental small-sample efficiency is improved as well.
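As background for readers unfamiliar with IV estimation, the basic just-identified instrumental-variables estimator (not the paper's optimal FGLS IV construction) can be sketched as follows; the simulated data-generating process in the usage example is an illustration, not the paper's DGP:

```python
import numpy as np

def iv_estimate(y, X, Z):
    """Just-identified instrumental-variables estimator:
    beta_hat = (Z'X)^{-1} Z'y, where Z has one instrument per regressor.
    With Z = X this reduces to ordinary least squares."""
    return np.linalg.solve(Z.T @ X, Z.T @ y)
```

In a simulated system where the regressor is correlated with the error term, OLS is biased but the IV estimate recovers the true coefficient, which is exactly the property that makes valid instruments from lagged endogenous variables useful.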
Abstract:
The formation of water-in-oil emulsions causes a significant increase in viscosity, which directly affects well production: it raises the pressure drop along the production line, hindering flow and reducing oil output. The presence and nature of the emulsion, and its impact on the rheology of the crude, can determine the economic and technical feasibility of the processes involved. As the water fraction increases and the temperature drops, the behavior of the emulsions becomes increasingly non-Newtonian; as a consequence, temperature and shear rate come to have a greater impact on the variation of emulsion viscosity. This study proposes new methods that take these variables into account. Experimental rheological data from 15 light crude oils were used to evaluate the performance of models from the literature and to compare them with the new methods proposed here.
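The proposed correlations are not given in the abstract; as a point of reference, the classical water-cut-only Brinkman (1952) model that such methods refine, which ignores exactly the temperature and shear-rate effects discussed above, can be sketched as (function names are illustrative):

```python
def brinkman_relative_viscosity(water_cut):
    """Classical Brinkman (1952) correlation: relative viscosity of an
    emulsion as a function of the dispersed-phase (water) volume fraction.
    Ignores temperature and shear rate, the limitation that newer
    emulsion-viscosity methods address."""
    if not 0.0 <= water_cut < 1.0:
        raise ValueError("water cut must be in [0, 1)")
    return (1.0 - water_cut) ** -2.5

def emulsion_viscosity(oil_viscosity_cp, water_cut):
    """Apparent emulsion viscosity in cP from the continuous-phase (oil)
    viscosity and the water volume fraction."""
    return oil_viscosity_cp * brinkman_relative_viscosity(water_cut)
```

For example, a 10 cP light crude at a 30% water cut gives an apparent viscosity of roughly 24 cP, illustrating how quickly emulsification inflates the pressure drop along a production line.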
Abstract:
ARAÚJO, B. G.; VALENTIM, R. A. M. Publicidade em celulares utilizando o sistema de busca de perfil. Holos, Natal, v. 1, p. 109-118, 2010. Available at:
Abstract:
The world has many types of oil with a range of density and viscosity values; these characteristics identify whether an oil is light, heavy or even ultra-heavy. The occurrence of heavy oil has increased significantly, pointing to a need for greater investment in the exploitation of deposits and therefore for new methods to recover that oil. Economic forecasts suggest that by 2025 heavy oil will be the main source of fossil energy in the world. One such method is VAPEX (vapor extraction), a recovery method proposed by Dr. Roger Butler in 1991, which consists of two horizontal wells parallel to each other, one injector and one producer; a vaporized solvent is injected in order to reduce the viscosity of the oil or bitumen, facilitating its flow to the producing well. The purpose of this study is to analyze how operational and reservoir parameters influence the VAPEX process, in terms of cumulative oil production, recovery factor, injection rate and production rate. Parameters such as injection rate, spacing between wells, type of injected solvent, vertical permeability and oil viscosity were addressed. The results showed that oil viscosity is the parameter with a statistically significant influence; the choice of heptane as the injected solvent yielded greater oil recovery than the other solvents considered; and, regarding well spacing, a greater distance between the wells produced more oil.