976 results for Iterative methods (mathematics)


Relevance:

20.00%

Publisher:

Abstract:

Successive reformulations of the teacher performance evaluation (ADD) process have shifted it from a scheme centred on teacher self-evaluation to a system of external assessment that aims at accountability and selection, but also at the improvement of practice and at professional development through supervisory follow-up. Reconciling these two perspectives has not been easy, however, and has generated perplexity and unease among evaluators and evaluated teachers alike. With this study we seek to understand how evaluators and evaluated teachers perceive the role of supervision in the teacher performance evaluation process and its contribution to teachers' professional development and to the improvement of practice. A qualitative, exploratory and descriptive study was therefore designed to capture the viewpoints of the main participants in the process. The main data collection instrument was a semi-structured interview with 4 evaluating teachers and 4 evaluated teachers from the Mathematics and Experimental Sciences Department of a school cluster (Agrupamento de Escolas). The results show that the teachers disagree with the ADD model, but that the process they experienced had positive features; the evaluators gave particular relevance to the opportunity to learn about other ways of working through classroom observation. Evaluators and evaluated teachers also mention the good relationship established between them, which contradicts the concerns they expressed when discussing the problems of the evaluation model. However, both subgroups point to the evaluators' lack of training and professional qualification. Overall, it can be concluded that most of the difficulties the teachers attributed to the ADD model were not felt during its practical implementation. On the other hand, in the opinion of the teachers involved, the process they experienced had no real impact on the improvement of practice or on their professional development.

Relevance:

20.00%

Publisher:

Abstract:

Deoxyribonucleic acid, or DNA, is the most fundamental aspect of life, but present-day scientific knowledge has merely scratched the surface of the problem posed by its decoding. While experimental methods provide insightful clues, the adoption of analysis tools supported by the formalism of mathematics will lead to a systematic and solid build-up of knowledge. This paper studies human DNA from the perspective of system dynamics. By associating entropy and the Fourier transform, several global properties of the code are revealed. Fractional-order characteristics emerge as a natural consequence of the information content. These properties constitute a small piece of the scientific knowledge that will support further efforts towards the final aim of establishing a comprehensive theory of the phenomena involved in life.
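As a purely illustrative sketch of this kind of analysis (not the authors' actual pipeline), the Python snippet below computes the Shannon entropy of a nucleotide string and the Fourier magnitude spectrum of a base indicator sequence; the toy sequence and the 0/1 indicator mapping are assumptions made for demonstration.

```python
# Hypothetical sketch: Shannon entropy and Fourier spectrum of a DNA string.
# The sequence and the binary indicator mapping are illustrative assumptions,
# not the dataset or method used in the paper.
import numpy as np
from collections import Counter

def shannon_entropy(seq):
    """Entropy (bits/symbol) of the nucleotide distribution."""
    counts = Counter(seq)
    probs = np.array([c / len(seq) for c in counts.values()])
    return -np.sum(probs * np.log2(probs))

def indicator_spectrum(seq, base):
    """Magnitude spectrum of the 0/1 indicator sequence of one base."""
    u = np.array([1.0 if s == base else 0.0 for s in seq])
    u -= u.mean()                      # remove the DC component
    return np.abs(np.fft.fft(u))

seq = "ATGCGCGTATATGCGCATATCGCGATAT"   # toy sequence
print("entropy [bits]:", shannon_entropy(seq))
print("|FFT| of A-indicator:", indicator_spectrum(seq, "A")[:5])
```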

Relevance:

20.00%

Publisher:

Abstract:

Personal memories composed of digital pictures are very popular at the moment. To retrieve these media items, annotation is required. In recent years, several approaches have been proposed to overcome the image annotation problem. This paper presents our proposals to address this problem: automatic and semi-automatic learning methods for semantic concepts. The automatic method estimates semantic concepts from visual content, context metadata and audio information. The semi-automatic method is based on results provided by a computer game. The paper describes both proposals and presents their evaluations.
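As a hedged illustration of how evidence from several modalities might be combined for one semantic concept (the weights, scores and function name below are assumptions, not the paper's implementation), a minimal late-fusion sketch in Python:

```python
# Hypothetical late-fusion sketch: combine per-modality confidence scores
# (visual content, context metadata, audio) into one concept score.
# Weights and scores are illustrative assumptions, not the paper's values.
def fuse_concept_scores(visual, context, audio, weights=(0.5, 0.3, 0.2)):
    """Weighted average of per-modality confidences in [0, 1]."""
    return sum(w * s for w, s in zip(weights, (visual, context, audio)))

# Example: scores produced by three hypothetical classifiers for "beach".
print(fuse_concept_scores(visual=0.8, context=0.6, audio=0.4))  # 0.66
```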

Relevance:

20.00%

Publisher:

Abstract:

This paper presents five different clustering methods to identify typical load profiles of medium voltage (MV) electricity consumers. These methods are intended to be used in a smart grid environment to extract useful knowledge about customers' behaviour. The obtained knowledge can be used to support a decision tool, not only for utilities but also for consumers. Load profiles can be used by utilities to identify the aspects that cause system load peaks and to enable the development of specific contracts with their customers. The framework presented throughout the paper consists of several steps, namely a data pre-processing phase, the application of clustering algorithms and the evaluation of the quality of the partition, supported by cluster validity indices. The process ends with the analysis of the discovered knowledge. To validate the proposed framework, a case study with a real database of 208 MV consumers is used.
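The clustering-plus-validation workflow can be sketched as follows. This is a generic illustration using synthetic daily profiles, K-means and the silhouette index; it does not reproduce the five methods, the validity indices or the 208-consumer database used in the paper.

```python
# Illustrative sketch of the framework steps: normalise load profiles,
# cluster them, and score the partition with a validity index.
# Synthetic data and K-means/silhouette are assumptions for demonstration.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Synthetic daily profiles (consumers x 24 hourly readings), three rough shapes.
base = np.stack([np.sin(np.linspace(0, np.pi, 24)),
                 np.ones(24),
                 np.linspace(0.2, 1.0, 24)])
profiles = np.vstack([b + 0.05 * rng.standard_normal((30, 24)) for b in base])

# Pre-processing: scale each profile by its own peak.
profiles /= profiles.max(axis=1, keepdims=True)

# Try several numbers of clusters and report a cluster validity index.
for k in range(2, 6):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(profiles)
    print(k, round(silhouette_score(profiles, labels), 3))
```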

Relevance:

20.00%

Publisher:

Abstract:

Dissertation presented to the Escola Superior de Educação de Lisboa within the scope of the Master's degree in Special Education (Ensino Especial).

Relevance:

20.00%

Publisher:

Abstract:

Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decision making relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, must be assured. This paper presents a new approach to solve the economic dispatch in smart grids. The proposed methodology for resource management involves two stages. The first uses fuzzy set theory to define the range forecast of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
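A much simplified sketch of the two stages is given below, assuming triangular fuzzy numbers for the forecasts and a greedy merit-order heuristic for the dispatch; the unit data and the heuristic itself are illustrative assumptions, not the paper's methodology.

```python
# Hypothetical two-stage sketch: (1) represent forecasts as triangular fuzzy
# numbers, (2) dispatch generation with a simple greedy (merit-order) heuristic.
# All numbers and the heuristic itself are illustrative assumptions.

def triangular(x, a, b, c):
    """Membership degree of x in the triangular fuzzy number (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Stage 1: fuzzy forecasts (min, most likely, max) for load and wind [MW].
load_fc = (90, 100, 115)
wind_fc = (10, 20, 25)
net_load = load_fc[1] - wind_fc[1]          # dispatch around the modal values

# Stage 2: greedy merit-order dispatch of dispatchable units.
units = [  # (name, capacity MW, marginal cost $/MWh) -- illustrative data
    ("hydro", 40, 20.0),
    ("ccgt", 60, 45.0),
    ("diesel", 30, 90.0),
]
remaining, schedule = net_load, {}
for name, cap, cost in sorted(units, key=lambda u: u[2]):
    p = min(cap, max(remaining, 0))
    schedule[name] = p
    remaining -= p

print("net load [MW]:", net_load)
print("schedule:", schedule)
print("membership of 95 MW in load forecast:", triangular(95, *load_fc))
```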

Relevance:

20.00%

Publisher:

Abstract:

In the context of electricity markets, transmission pricing is an important tool for achieving an efficient operation of the electricity system. The electricity market is influenced by several factors; however, the management of the transmission network is one of the most important aspects, because the network is a natural monopoly. Transmission tariffs can help to regulate the market and, for this reason, must follow strict criteria. This paper presents the following methods for tariffing the use of transmission networks by electricity market players: the Postage Stamp method; the MW-Mile method; Distribution Factors methods; the Tracing methodology; Bialek's Tracing method; and Locational Marginal Pricing. A nine-bus transmission network is used to illustrate the application of the tariff methods.
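To give a flavour of the simplest of these schemes, the sketch below computes postage-stamp and MW-mile style charges for hypothetical market players; the network cost, demands and line usages are assumed values, and the nine-bus case from the paper is not reproduced.

```python
# Illustrative sketch of two of the simplest transmission tariff schemes.
# All data (network cost, player demands, line usages) are hypothetical.

total_network_cost = 1_000_000.0            # annual cost to recover [$]

# Postage-stamp: allocate cost in proportion to each player's peak demand.
peak_mw = {"playerA": 120.0, "playerB": 60.0, "playerC": 20.0}
total_peak = sum(peak_mw.values())
postage_stamp = {p: total_network_cost * mw / total_peak
                 for p, mw in peak_mw.items()}

# MW-mile (flow-based): charge in proportion to sum(|flow caused| * length)
# over all lines, here with made-up per-player line usages.
usage_mw_mile = {"playerA": 3500.0, "playerB": 2500.0, "playerC": 1000.0}
total_usage = sum(usage_mw_mile.values())
mw_mile = {p: total_network_cost * u / total_usage
           for p, u in usage_mw_mile.items()}

print("postage-stamp:", postage_stamp)
print("MW-mile      :", mw_mile)
```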

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes two meta-heuristics (Genetic Algorithm and Evolutionary Particle Swarm Optimization) for solving a bid-based case, with 15 bids, of Ancillary Services Dispatch in an Electricity Market. A Linear Programming approach is also included for comparison purposes. A test case based on the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is used to demonstrate that meta-heuristics are suitable for solving this kind of optimization problem. Faster execution times and lower computational resource requirements are the most relevant advantages of the meta-heuristics when compared with the Linear Programming approach.
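A minimal Genetic Algorithm sketch for a toy reserve dispatch problem is shown below; the bid data, penalty term and GA parameters are assumptions chosen for illustration and do not correspond to the 15-bid case studied in the paper.

```python
# Minimal genetic-algorithm sketch for a toy reserve dispatch problem:
# pick how much to accept from each bid so that the required reserve is met
# at minimum cost. Bid data, GA parameters and the penalty are assumptions.
import random

bids = [(30, 12.0), (25, 15.0), (40, 9.0), (20, 20.0)]  # (max MW, $/MW)
required = 80.0                                          # MW of reserve needed

def cost(x):
    c = sum(q * price for q, (_, price) in zip(x, bids))
    shortfall = max(0.0, required - sum(x))
    return c + 1e4 * shortfall                           # penalise unmet reserve

def random_solution():
    return [random.uniform(0, cap) for cap, _ in bids]

def mutate(x):
    i = random.randrange(len(x))
    y = list(x)
    y[i] = min(bids[i][0], max(0.0, y[i] + random.gauss(0, 5)))
    return y

def crossover(a, b):
    return [random.choice(pair) for pair in zip(a, b)]

pop = [random_solution() for _ in range(40)]
for _ in range(200):
    pop.sort(key=cost)
    parents = pop[:10]                                   # truncation selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(30)]

best = min(pop, key=cost)
print("accepted MW per bid:", [round(q, 1) for q in best])
print("total cost:", round(cost(best), 1))
```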

Relevance:

20.00%

Publisher:

Abstract:

Electricity market players operating in a liberalized environment require access to an adequate decision support tool, allowing them to consider all the business opportunities and to take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players, so the decision support tool must include ancillary services market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case based on California Independent System Operator (CAISO) data, concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services, is included in this paper.
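In the spirit of the Linear Programming approach mentioned above, the sketch below dispatches a single reserve service at minimum cost with scipy's linprog; the bid data and the single-service formulation are assumptions, not the CAISO test case.

```python
# Illustrative linear-programming sketch for dispatching reserve bids at
# minimum cost. The bid data and single-service formulation are assumptions.
from scipy.optimize import linprog

prices = [12.0, 15.0, 9.0, 20.0]      # $/MW offered by four hypothetical bids
caps = [30.0, 25.0, 40.0, 20.0]       # MW offered by each bid
required = 80.0                        # MW of the service to be procured

# minimise sum(price_i * x_i)  s.t.  sum(x_i) >= required,  0 <= x_i <= cap_i
res = linprog(c=prices,
              A_ub=[[-1.0] * len(prices)],   # -sum(x) <= -required
              b_ub=[-required],
              bounds=list(zip([0.0] * len(caps), caps)),
              method="highs")

print("accepted MW per bid:", res.x)
print("total cost [$]:", res.fun)
```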

Relevance:

20.00%

Publisher:

Abstract:

Objectives: The purpose of this article is to identify differences between surveys that use paper questionnaires and those that use online questionnaires. The author has extensive experience with the questions that arise in the development of survey-based research, e.g. the limitations of postal and online questionnaires. Methods: Paper and online questionnaires were used in physician studies carried out in 1995 (doctors who graduated in 1982-1991), 2000 (graduated 1982-1996), 2005 (graduated 1982-2001) and 2011 (graduated 1977-2006), and in a 2000 study of 457 family doctors. The response rates were 64%, 68%, 64%, 49% and 73%, respectively. Results: The physician studies showed differences between the methods, related to the use of paper-based versus online questionnaires and to the response rate. The online survey gave a lower response rate than the postal survey. The major advantages of the online survey were the short response time, the very low financial resource needs, and the fact that the data were loaded directly into the data analysis software, saving the time and resources associated with the data entry process. Conclusions: This article helps researchers plan their study design and choose the right data collection method.

Relevance:

20.00%

Publisher:

Abstract:

Tomographic images can be degraded, in part by patient-based attenuation. The aim of this paper is to quantitatively verify the effects of the Chang and CT-based attenuation correction methods in 111In studies, through the analysis of profiles from abdominal SPECT corresponding to an organ with uniform radionuclide uptake, the left kidney.
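As background, a hedged sketch of the first-order Chang correction factor for a pixel inside a uniform circular attenuator is given below; the geometry, attenuation coefficient and angular sampling are illustrative assumptions and do not reproduce the paper's acquisition or processing protocol.

```python
# Hypothetical sketch of a first-order Chang attenuation correction factor
# for a pixel inside a uniform circular attenuator: the reconstructed value
# is multiplied by M / sum_m exp(-mu * l_m), where l_m is the path length
# from the pixel to the boundary along angle m. Geometry, mu and angular
# sampling are illustrative assumptions.
import numpy as np

def chang_factor(x, y, radius=10.0, mu=0.15, n_angles=64):
    """Correction factor averaged over n_angles projection directions."""
    p = np.array([x, y])
    angles = np.linspace(0.0, 2 * np.pi, n_angles, endpoint=False)
    atten = 0.0
    for theta in angles:
        d = np.array([np.cos(theta), np.sin(theta)])      # unit direction
        pd = p @ d
        l = -pd + np.sqrt(pd**2 + radius**2 - p @ p)      # path to boundary
        atten += np.exp(-mu * l)
    return n_angles / atten

# A central pixel needs a larger boost than one near the edge.
print("centre  :", round(chang_factor(0.0, 0.0), 2))
print("near rim:", round(chang_factor(8.0, 0.0), 2))
```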

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to present an adaptation model for PCMAT, an Adaptive Educational Hypermedia System. The adaptation of the application is based on progressive self-assessment (exercises, tasks, and so on) and applies constructivist learning theory and learning styles theory. Our objective is the creation of a better, more adequate adaptation model that takes into account the complexities of different users.
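Purely as an illustration of a rule-based adaptation step driven by self-assessment results and a learning-style preference (the thresholds, styles and resource types are assumptions, not PCMAT's actual adaptation model):

```python
# Hypothetical sketch of one adaptation step: pick the next learning resource
# from the result of a progressive self-assessment and a declared
# learning-style preference. All rules and labels are illustrative assumptions.
def next_resource(last_score, learning_style):
    """Return (difficulty, presentation) for the next activity."""
    if last_score < 0.5:
        difficulty = "remedial"
    elif last_score < 0.8:
        difficulty = "same level"
    else:
        difficulty = "advanced"
    presentation = {"visual": "interactive graphic",
                    "verbal": "worked text example"}.get(learning_style, "mixed")
    return difficulty, presentation

print(next_resource(last_score=0.4, learning_style="visual"))
# ('remedial', 'interactive graphic')
```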

Relevance:

20.00%

Publisher:

Abstract:

The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is often not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods in which the calculation of derivatives, or the verification of their existence, is not necessary: direct search methods and derivative-free methods are examples of such techniques. In this work we present a new direct search method for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
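The filter bookkeeping at the heart of such methods can be sketched as follows; the acceptance margin and the sample pairs are assumptions, and the simplex-based search proposed in the paper is not reproduced here.

```python
# Illustrative sketch of filter bookkeeping: a trial point is summarised by
# the pair (h, f) -- infeasibility measure and objective value -- and is
# accepted only if no pair already in the filter dominates it.
# The margin gamma and the sample data are assumptions.
def acceptable(point, filt, gamma=1e-5):
    """Acceptable if, against every filter entry, the point improves either
    the infeasibility h or the objective f (with a small margin)."""
    h, f = point
    return all(h < (1 - gamma) * hi or f < fi - gamma * hi for hi, fi in filt)

def add_to_filter(point, filt):
    """Insert the accepted point and drop the entries it dominates."""
    h, f = point
    kept = [(hi, fi) for hi, fi in filt if not (h <= hi and f <= fi)]
    return kept + [point]

filt = [(0.5, 10.0), (0.1, 12.0)]
trial = (0.05, 11.0)          # less infeasible than every entry -> acceptable
if acceptable(trial, filt):
    filt = add_to_filter(trial, filt)
print(filt)                   # [(0.5, 10.0), (0.05, 11.0)]
```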

Relevance:

20.00%

Publisher:

Abstract:

In this work we solve Mathematical Programs with Complementarity Constraints (MPCC) using the hyperbolic smoothing strategy. Under this approach, the complementarity condition is relaxed through the use of the hyperbolic smoothing function, which involves a positive parameter that can be decreased to zero. An iterative algorithm is implemented in the MATLAB language and tested on a set of AMPL problems from the MacMPEC database.
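A minimal sketch of the idea on a toy MPCC is given below, assuming one common hyperbolic smoothing of the min-function reformulation of complementarity and an off-the-shelf solver (SLSQP); it is not the MATLAB/AMPL implementation reported above.

```python
# Hypothetical sketch of the hyperbolic smoothing strategy on a toy MPCC:
#   min (x1-1)^2 + (x2-2)^2  s.t.  x1 >= 0, x2 >= 0, x1 * x2 = 0.
# The complementarity min(x1, x2) = 0 is replaced by the smoothed equality
#   (x1 + x2 - sqrt((x1-x2)^2 + tau^2)) / 2 = 0,  with tau driven to zero.
# The toy problem, the solver (SLSQP) and the tau schedule are assumptions.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def smoothed_complementarity(x, tau):
    return 0.5 * (x[0] + x[1] - np.sqrt((x[0] - x[1]) ** 2 + tau ** 2))

x = np.array([1.0, 1.0])                       # starting point
for tau in [1.0, 0.1, 0.01, 0.001]:            # decrease the parameter to zero
    res = minimize(objective, x, method="SLSQP",
                   bounds=[(0.0, None), (0.0, None)],
                   constraints=[{"type": "eq",
                                 "fun": smoothed_complementarity,
                                 "args": (tau,)}])
    x = res.x
    print(f"tau={tau:7.3f}  x={np.round(x, 4)}  f={res.fun:.4f}")
```

As tau decreases, the smoothed feasible set collapses onto the complementarity set, and the iterates in this toy example approach a point with one near-zero component, consistent with the relaxation idea described in the abstract.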

Relevance:

20.00%

Publisher:

Abstract:

In this work, a microwave-assisted extraction (MAE) methodology was compared with several conventional extraction methods (Soxhlet, Bligh & Dyer, modified Bligh & Dyer, Folch, modified Folch, Hara & Radin, Roese-Gottlieb) for the quantification of the total lipid content of three fish species: horse mackerel (Trachurus trachurus), chub mackerel (Scomber japonicus), and sardine (Sardina pilchardus). The influence of species, extraction method and frozen storage time (ranging from fresh to 9 months of freezing) on total lipid content was analysed in detail. The efficiencies of the MAE, Bligh & Dyer, Folch, modified Folch and Hara & Radin methods were the highest and, although not statistically different from one another, they differed in variability, with MAE showing the highest repeatability (CV = 0.034). The Roese-Gottlieb, Soxhlet and modified Bligh & Dyer methods performed poorly in terms of both efficiency and repeatability (CV between 0.13 and 0.18).