1000 results for neo-Kohlbergian approach based on the DIT


Relevância:

100.00%

Publicador:

Resumo:

Renewable energy sources (RES) have unique characteristics that grant them preference in energy and environmental policies. However, since renewable resources are barely controllable and sometimes unpredictable, integrating high shares of renewables into power systems poses several challenges. To mitigate this problem, this paper presents a decision-making methodology for renewable investments. The model computes the optimal renewable generation mix from the available technologies (hydro, wind and photovoltaic) that integrates a given share of renewable sources while minimizing residual demand variability, thereby stabilizing thermal power generation. The model also includes a spatial optimization of wind farms to identify the best distribution of wind capacity. The methodology is applied to the Portuguese power system.
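The core of the methodology, picking a mix of hydro, wind and photovoltaic capacity that minimizes the variance of the residual demand for a given renewable share, can be sketched as a small optimization. The profiles, the target share, and the coarse grid search below are illustrative assumptions, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 24  # synthetic hourly profiles for one day (illustrative only)
demand = 100 + 20 * np.sin(np.linspace(0, 2 * np.pi, T))
hydro = np.full(T, 1.0)                                   # flat, dispatch-like
wind = np.clip(1.0 + 0.5 * rng.standard_normal(T), 0.05, None)  # variable
pv = np.clip(np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, T)), 0, None)  # daytime

share = 0.4  # assumed target renewable share of total demand energy
target_energy = share * demand.sum()

best = None
# coarse grid search over mix fractions (hydro, wind, pv) summing to 1
for fh in np.linspace(0, 1, 21):
    for fw in np.linspace(0, 1 - fh, 21):
        fp = 1 - fh - fw
        # scale each technology so the mix delivers the target energy
        profile = (fh * hydro / hydro.sum()
                   + fw * wind / wind.sum()
                   + fp * pv / pv.sum())
        gen = target_energy * profile
        var = (demand - gen).var()  # residual demand variability
        if best is None or var < best[0]:
            best = (var, fh, fw, fp)

var, fh, fw, fp = best
```

Any mix is at least as good as flat generation alone, so the optimal residual variance never exceeds the variance of the raw demand.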

Relevância:

100.00%

Publicador:

Resumo:

Technology plays a double role in Education: it can act as a facilitator in the teaching/learning process, and it can be the very subject of that process in Science & Engineering courses. This is especially true when students perform laboratory activities in which they interact with equipment and objects under experimentation. In this context, technology can also play a facilitator role if it allows students to perform experiments remotely, through the Internet, in a so-called weblab or remote laboratory. No doubt the Internet has been revolutionizing the educational process in many aspects, and remote laboratories are just one facet of that ongoing revolution. As with any other educational tool or resource, i) the pedagogical approach and ii) the technology used in the development of a remote laboratory can dictate its general success or its ephemeral existence. By pedagogical approach we mean the way remote experiments address the process by which students acquire experimental skills and link experimental results to theoretical concepts. With respect to technology, we discuss different specification and implementation alternatives to show how the adoption of a family of standards would positively contribute to a larger acceptance and utilization of remote laboratories, and to a wider collaboration in their development.

Relevância:

100.00%

Publicador:

Resumo:

Within the pedagogical community, Serious Games have arisen as a viable alternative to traditional course-based learning materials. Until now, they have been based strictly on software solutions. Meanwhile, research into Remote Laboratories has shown that they are a viable, low-cost solution for experimentation in an engineering context, providing uninterrupted access, low-maintenance requirements, and a heightened sense of reality when compared to simulations. This paper will propose a solution where both approaches are combined to deliver a Remote Laboratory-based Serious Game for use in engineering and school education. The platform for this system is the WebLab-Deusto Framework, already well-tested within the remote laboratory context, and based on open standards. The laboratory allows users to control a mobile robot in a labyrinth environment and take part in an interactive game where they must locate and correctly answer several questions, the subject of which can be adapted to educators' needs. It also integrates the Google Blockly graphical programming language, allowing students to learn basic programming and logic principles without needing to understand complex syntax.

Relevância:

100.00%

Publicador:

Resumo:

Master's Dissertation in Informatics Engineering

Relevância:

100.00%

Publicador:

Resumo:

A good verification strategy should bring the simulation and real operating environments closer together. In this paper we describe a system-level co-verification strategy that uses a common flow for functional simulation, timing simulation and functional debugging. This last step requires a BST (boundary-scan test) infrastructure, now widely available on commercial devices, especially on FPGAs with medium/large pin counts.

Relevância:

100.00%

Publicador:

Resumo:

In this paper we present results on the operation of a multilayered a-SiC:H heterostructure as a device for wavelength-division demultiplexing of optical signals. The device is composed of two stacked p-i-n photodiodes, both optimized for the selective collection of photogenerated carriers. Band gap engineering was used to adjust the photogeneration and recombination rate profiles of the intrinsic absorber regions of each photodiode to short- and long-wavelength absorption and carrier collection in the visible spectrum. The photocurrent signal for different input optical channels was analyzed under reverse and forward bias and under steady-state illumination. This photocurrent is used as input to a demux algorithm based on the voltage-controlled sensitivity of the device. The device operation is explained with results obtained by numerical simulation, which provide insight into the internal electric configuration of the double heterojunction. These results attribute the frequency-domain behavior of the device to a wavelength-tunable photocapacitance caused by the accumulation of space charge localized at the internal junction. A direct relation between the experimentally observed capacitive effects of the double diode and the quality of the semiconductor materials forming the internal junction is highlighted.
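The demux step described above exploits the fact that the two stacked photodiodes respond differently to each wavelength under different bias voltages. A minimal sketch of that idea, using a purely hypothetical sensitivity matrix in place of measured device data, reduces to a small linear inversion:

```python
import numpy as np

# Assumed sensitivity matrix: rows = bias conditions (reverse, forward),
# columns = optical channels (short-wavelength, long-wavelength).
# Values are illustrative, not measured device data.
S = np.array([[0.9, 0.7],
              [0.2, 0.6]])

true_channels = np.array([1.0, 0.5])   # hypothetical input intensities
photocurrent = S @ true_channels       # currents measured at the two biases

# Demux step: invert the linear response to recover the channel intensities
recovered = np.linalg.solve(S, photocurrent)
```

As long as the two bias conditions give linearly independent responses, the channel intensities are recovered exactly in this idealized linear model.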

Relevância:

100.00%

Publicador:

Resumo:

This paper introduces a new unsupervised hyperspectral unmixing method conceived for linear but highly mixed hyperspectral data sets, in which the simplex of minimum volume, usually estimated by purely geometrically based algorithms, is far away from the true simplex associated with the endmembers. The proposed method, an extension of our previous studies, resorts to a statistical framework. The abundance fraction prior is a mixture of Dirichlet densities, thus automatically enforcing the constraints on the abundance fractions imposed by the acquisition process, namely, nonnegativity and sum-to-one. A cyclic minimization algorithm is developed in which: 1) the number of Dirichlet modes is inferred based on the minimum description length principle; 2) a generalized expectation-maximization algorithm is derived to infer the model parameters; and 3) a sequence of augmented Lagrangian-based optimizations is used to compute the signatures of the endmembers. Experiments on simulated and real data are presented to show the effectiveness of the proposed algorithm in unmixing problems beyond the reach of the geometrically based state-of-the-art competitors.
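The abundance constraints the Dirichlet prior enforces (nonnegative fractions summing to one) can be illustrated with a much simpler procedure than the paper's cyclic minimization: projected gradient descent onto the probability simplex. The endmember matrix and optimizer below are illustrative assumptions, not the proposed algorithm:

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection onto the probability simplex
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / np.arange(1, len(v) + 1) > 0)[0][-1]
    theta = (1 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0)

def unmix(pixel, M, iters=500, lr=0.1):
    # Projected gradient descent for min ||M a - pixel||^2
    # subject to a >= 0 and sum(a) = 1 (the abundance constraints)
    a = np.full(M.shape[1], 1.0 / M.shape[1])
    for _ in range(iters):
        grad = M.T @ (M @ a - pixel)
        a = project_simplex(a - lr * grad)
    return a

# Tiny synthetic example: 3 endmembers in 4 bands, one highly mixed pixel
M = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5],
              [0.5, 0.5, 0.0],
              [0.2, 0.8, 0.3]])
true_a = np.array([0.3, 0.3, 0.4])
pixel = M @ true_a
a_hat = unmix(pixel, M)
```

The projection step guarantees that every iterate is a valid abundance vector, which is the same role the Dirichlet prior plays in the statistical formulation.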

Relevância:

100.00%

Publicador:

Resumo:

Master's Thesis in Informatics Engineering

Relevância:

100.00%

Publicador:

Resumo:

IEEE Electron Device Letters, vol. 29, no. 9

Relevância:

100.00%

Publicador:

Resumo:

In recent years, significant research has been carried out in the field of electrochemistry. The performance of electrical devices that depend on electrolyte processes has been described, and the physical origin of each parameter established. However, the influence of the irregularity of the electrodes has not been a subject of study, and only recently has this problem become relevant from the viewpoint of fractional calculus. This paper describes an electrolytic process from the perspective of fractional-order capacitors. In this line of thought, several experiments are developed to measure the electrical impedance of the devices. The results are analyzed through the frequency response, revealing capacitances of fractional order that can constitute an alternative to classical integer-order elements. Fractional-order electric circuits are used to model and study the performance of the electrolyte processes.
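A fractional-order capacitor (a constant-phase element) has impedance Z(jw) = 1 / (C (jw)^alpha), so its phase is a constant -alpha * 90 degrees and its log-log magnitude falls at -alpha per frequency decade. A short sketch of that frequency response, with illustrative values for C and alpha rather than the paper's fitted parameters:

```python
import numpy as np

def fractional_capacitor_impedance(freq_hz, C=1e-3, alpha=0.8):
    # Z(jw) = 1 / (C * (jw)**alpha): a constant-phase element.
    # alpha = 1 recovers an ideal capacitor; C and alpha are illustrative.
    jw = 1j * 2 * np.pi * freq_hz
    return 1.0 / (C * jw ** alpha)

freqs = np.logspace(0, 4, 5)          # 1 Hz .. 10 kHz
Z = fractional_capacitor_impedance(freqs)
phase_deg = np.degrees(np.angle(Z))   # constant phase: -alpha * 90 degrees
slopes = np.diff(np.log10(np.abs(Z))) / np.diff(np.log10(freqs))
```

In impedance measurements, a constant phase between 0 and -90 degrees across a wide band is the signature of such fractional-order behavior.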

Relevância:

100.00%

Publicador:

Resumo:

To increase the amount of logic available to users in SRAM-based FPGAs, manufacturers are using nanometric technologies to boost logic density and reduce costs, making their use more attractive. However, these technological improvements also make FPGAs particularly vulnerable to configuration memory bit-flips caused by power fluctuations, strong electromagnetic fields and radiation. This issue is particularly sensitive because of the increasing number of configuration memory cells needed to define their functionality. A short survey of the most recent publications is presented to support the options assumed in the definition of a framework for implementing circuits immune to bit-flip induction mechanisms in memory cells, based on a customized redundant infrastructure and on a detection-and-fix controller.
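The detection-and-fix idea, redundant copies of configuration data plus a controller that rewrites the copy that disagrees, can be modeled in a few lines. This is a software toy illustrating majority voting and scrubbing, not the paper's hardware framework:

```python
# Triple modular redundancy over a configuration word, with a scrubber
# that repairs the disagreeing copy by majority vote. Purely illustrative.

def majority(a, b, c):
    # Bitwise majority vote of three redundant copies
    return (a & b) | (a & c) | (b & c)

def scrub(copies):
    # Detect a corrupted copy and rewrite it with the voted value
    voted = majority(*copies)
    fixed = [voted for _ in copies]
    return voted, fixed

golden = 0b1011_0110
copies = [golden, golden ^ 0b0000_0100, golden]  # one bit-flip injected
voted, repaired = scrub(copies)
```

As long as at most one copy is corrupted between scrub cycles, the vote recovers the original word and the write-back restores full redundancy.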

Relevância:

100.00%

Publicador:

Resumo:

The new generations of SRAM-based FPGA (field programmable gate array) devices are the preferred choice for the implementation of reconfigurable computing platforms intended to accelerate processing in real-time systems. However, the vulnerability of FPGAs to hard and soft errors is a major weakness for robust configurable system design. In this paper, a novel built-in self-healing (BISH) methodology, based on run-time self-reconfiguration, is proposed. A soft microprocessor core implemented in the FPGA is responsible for the management and execution of all BISH procedures. Fault detection and diagnosis is followed by repair actions, taking advantage of the dynamic reconfiguration features offered by the new FPGA families. Meanwhile, modular redundancy ensures that the system continues to work correctly.

Relevância:

100.00%

Publicador:

Resumo:

A clinical trial involving 80 patients of both sexes, aged 15 to 55, with chronic intestinal or hepatointestinal schistosomiasis mansoni was carried out to evaluate the therapeutic efficacy of different dose regimens of praziquantel. The patients were randomly allocated into four groups with an equal number of cases and treated with one of the following dosages: 60 mg/kg for 1 day; 60 mg/kg daily for 2 days; 60 mg/kg daily for 3 days; or 30 mg/kg daily for 6 days. The assessment of parasitological cure was based on the quantitative oogram technique, using rectal mucosa biopsies taken prior to treatment as well as 1, 2, 4 and 6 months post-treatment. Concurrently, stool examinations according to the qualitative Hoffman, Pons & Janer (HPJ) method and the quantitative Kato-Katz (K-K) method were also performed. The best tolerability was observed with 30 mg/kg daily for 6 days, whereas the highest incidence of side-effects (mainly dizziness and nausea) was found with 60 mg/kg daily for 3 days. No serious adverse drug reaction occurred. The cure rates achieved were: 25% with 60 mg/kg for 1 day; 60% with 60 mg/kg daily for 2 days; 89.5% with 60 mg/kg daily for 3 days; and 90% with 30 mg/kg daily for 6 days. At the same time, there was a decrease of 64%, 73%, 87% and 84%, respectively, in the median number of viable S. mansoni ova per gram of tissue. Thus, a very clear direct dose-effect correlation could be seen. The corresponding cure rates according to stool examinations were 39%, 80%, 100% and 95% by HPJ, and 89%, 100%, 100% and 100% by K-K. This discrepancy in results among the three parasitological methods is certainly due to their unequal accuracy. In fact, when the number of viable eggs per gram of tissue fell below 5,000, the difference in the percentage of false-negative findings between HPJ (28%) and K-K (80%) became significant.
When this number dropped below 2,000, the percentage of false-negative results obtained with HPJ (49%) became significant in relation to the oogram as well. In conclusion, praziquantel proved to be a highly efficacious agent against S. mansoni infections: administered at a total dose of 180 mg/kg divided over either 3 or 6 days, it yields a 90% cure rate. Possibly, 100% could be reached by increasing the total dose to 240 mg/kg. Furthermore, it was confirmed that the quantitative oogram technique is the most reliable parasitological method for evaluating the efficacy of new drugs in schistosomiasis mansoni.

Relevância:

100.00%

Publicador:

Resumo:

The content of a Learning Object is frequently characterized by metadata from several standards, such as LOM, SCORM and QTI. Specialized domains require new application profiles that further complicate the task of editing the metadata of learning objects, since their data models are not supported by existing authoring tools. To cope with this problem we designed a metadata editor supporting multiple metadata languages, each with its own data model. It is assumed that the supported languages have an XML binding, and we use RDF to create a common metadata representation, independent of the syntax of each metadata language. The combined data model supported by the editor is defined as an ontology. Thus, the process of extending the editor to support a new metadata language is twofold: firstly, converting between the XML binding of the metadata language and RDF; secondly, extending the ontology to cover the new metadata model. In this paper we describe the general architecture of the editor, explain how a typical metadata language for learning objects is represented as an ontology, and show how this formalization captures all the data required to generate the editor's graphical user interface.
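The first extension step, converting from a metadata language's XML binding to RDF, essentially flattens the XML tree into subject-predicate-object triples. A minimal sketch of that conversion, with hypothetical element names rather than an actual LOM binding:

```python
# Flatten a hypothetical LOM-like XML fragment into RDF-style triples.
# Element names and the "lo:1" subject URI are illustrative assumptions,
# not the editor's actual binding.
import xml.etree.ElementTree as ET

xml_doc = """
<lom>
  <general>
    <title>Intro to Ontologies</title>
    <language>en</language>
  </general>
</lom>
"""

def xml_to_triples(xml_text, subject="lo:1"):
    root = ET.fromstring(xml_text)
    triples = []
    for category in root:                 # e.g. <general>
        for field in category:            # e.g. <title>
            predicate = f"{category.tag}.{field.tag}"
            triples.append((subject, predicate, field.text))
    return triples

triples = xml_to_triples(xml_doc)
```

Once every supported language maps to triples like these, the editor can operate on a single RDF representation and drive its user interface from the ontology instead of each language's syntax.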