980 results for Task-Oriented Methodology
Abstract:
Important research effort has been devoted to the topic of optimal planning of distribution systems. The non-linear nature of the system, the need to consider a large number of scenarios and the increasing necessity to deal with uncertainties make optimal planning in distribution systems a difficult task. Heuristic techniques have been proposed to deal with these issues, overcoming some of the inherent difficulties of classic methodologies. This paper considers several methodologies used to address planning problems of electrical power distribution networks, namely mixed-integer linear programming (MILP), ant colony algorithms (AC), genetic algorithms (GA), tabu search (TS), branch exchange (BE), simulated annealing (SA) and the Benders decomposition deterministic non-linear optimization technique (BD). The adequacy of these techniques to deal with uncertainties is discussed. The behaviour of each optimization technique is compared in terms of the obtained solution and of the methodology's performance. The paper presents results of the application of these optimization techniques to a real case of a 10-kV electrical distribution system with 201 nodes that feeds an urban area.
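To make the comparison above concrete, the sketch below shows the generic structure of one of the listed heuristics, simulated annealing, applied to an abstract discrete planning problem; the cost function, neighbourhood move and cooling parameters are illustrative placeholders, not the formulation used in the paper.

```python
import math
import random

def simulated_annealing(initial_state, cost, neighbour,
                        t_start=100.0, t_end=0.01, alpha=0.95, iters_per_temp=50):
    """Generic simulated annealing loop (illustrative parameters only)."""
    state, best = initial_state, initial_state
    t = t_start
    while t > t_end:
        for _ in range(iters_per_temp):
            candidate = neighbour(state)
            delta = cost(candidate) - cost(state)
            # Always accept improvements; accept worse moves with Boltzmann probability.
            if delta < 0 or random.random() < math.exp(-delta / t):
                state = candidate
                if cost(state) < cost(best):
                    best = state
        t *= alpha  # geometric cooling schedule
    return best
```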
Abstract:
A methodology based on data mining techniques to support the analysis of zonal prices in real transmission networks is proposed in this paper. The methodology uses clustering algorithms to group the buses into typical classes, each including a set of buses with similar locational marginal price (LMP) values. Two different clustering algorithms have been used to determine the LMP clusters: the two-step and K-means algorithms. In order to evaluate the quality of the partition, as well as to identify the best-performing algorithm, adequacy measurement indices are used. The paper includes a case study using an LMP database from the California ISO (CAISO) in order to identify zonal prices.
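A minimal sketch of the K-means step described above, assuming the hourly LMP observations for each bus are arranged as rows of a matrix; the synthetic data, the number of clusters and the use of scikit-learn (with the silhouette index as one possible adequacy measure) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical input: one row per bus, one column per hourly LMP observation.
lmp_by_bus = np.random.default_rng(0).normal(loc=35.0, scale=5.0, size=(50, 24))

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(lmp_by_bus)
labels = kmeans.labels_

# One possible adequacy index for the partition (the paper uses its own measures).
print("silhouette:", silhouette_score(lmp_by_bus, labels))
```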
Abstract:
Dust is a complex mixture of particles of organic and inorganic origin and different gases absorbed in aerosol droplets. In a poultry unit, dust includes dried faecal matter and urine, skin flakes, ammonia, carbon dioxide, pollens, feed and litter particles, feathers, grain mites, fungi spores, bacteria, viruses and their constituents. Dust particles vary in size, and differentiation between particle size fractions is important in health studies in order to quantify penetration within the respiratory system. A descriptive study was developed in order to assess exposure to particles in a poultry unit during different operations, namely routine examination and floor turnover. Direct-reading equipment was used (Lighthouse, model 3016 IAQ). Particle measurement was performed in 5 different sizes (PM0.5, PM1.0, PM2.5, PM5.0 and PM10). The chemical composition of poultry litter was also determined by neutron activation analysis. Normally, the litter of poultry pavilions is turned over weekly, and it was during this operation that the highest exposure to particles was observed. In all the tasks considered, PM5.0 and PM10 were the size fractions with the highest concentration values, with PM10 showing the highest values and PM0.5 the lowest. The chemical element with the highest concentration was Mg (5.7E6 mg.kg-1), followed by K (1.5E4 mg.kg-1), Ca (4.8E3 mg.kg-1), Na (1.7E3 mg.kg-1), Fe (2.1E2 mg.kg-1) and Zn (4.2E1 mg.kg-1). This high presence of particles in the respirable range (<5–7 μm) means that poultry dust particles can penetrate into the gas exchange region of the lung. Larger particles (PM10) presented concentrations ranging from 5.3E5 to 3.0E6 mg/m3.
Abstract:
The management of energy resources for islanded operation is of crucial importance for the successful use of renewable energy sources. A Virtual Power Producer (VPP) can optimally operate the resources taking into account maintenance, operation and load control, considering all the involved costs. This paper presents a methodology to formulate and solve the problem of determining the optimal resource allocation, applied to a real case study at Budapest Tech. The problem is formulated as a mixed-integer linear programming (MILP) model and solved by a deterministic CPLEX-based optimization technique implemented in the General Algebraic Modeling System (GAMS). The problem has also been solved by Evolutionary Particle Swarm Optimization (EPSO). The obtained results are presented and compared.
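As a hedged illustration of the kind of MILP formulation mentioned above, the sketch below dispatches a few hypothetical resources to meet a load at minimum cost using PuLP; the resource names, bounds and costs are invented for illustration, and the paper's actual model is written in GAMS and solved with CPLEX, with a much richer cost and constraint structure.

```python
# Illustrative MILP: commit and dispatch three hypothetical resources to meet a load.
import pulp

# resource -> (min output, max output, cost per kWh), all values invented
resources = {"pv": (0.0, 5.0, 0.02), "wind": (0.0, 8.0, 0.03), "storage": (0.0, 4.0, 0.10)}
load = 10.0  # kW, hypothetical

model = pulp.LpProblem("resource_allocation", pulp.LpMinimize)
p = {r: pulp.LpVariable(f"p_{r}", lowBound=lo, upBound=hi) for r, (lo, hi, _) in resources.items()}
u = {r: pulp.LpVariable(f"u_{r}", cat="Binary") for r in resources}

model += pulp.lpSum(cost * p[r] for r, (_, _, cost) in resources.items())  # operation cost
for r, (_, hi, _) in resources.items():
    model += p[r] <= hi * u[r]           # a resource can only produce if committed
model += pulp.lpSum(p.values()) == load  # supply the load exactly

model.solve(pulp.PULP_CBC_CMD(msg=False))
print({r: p[r].value() for r in resources})
```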
Abstract:
This paper presents a new and efficient methodology for distribution network reconfiguration integrated with optimal power flow (OPF), based on a Benders decomposition approach. The objective is to minimize power losses while balancing load among feeders, subject to the following constraints: capacity limits of branches, minimum and maximum power limits of substations or distributed generators, minimum deviation of bus voltages, and radial optimal operation of the network. The Generalized Benders decomposition algorithm is applied to solve the problem. The formulation is decomposed into two stages: the first is the Master problem, formulated as a mixed-integer non-linear programming problem, which determines the radial topology of the distribution network. The second is the Slave problem, formulated as a non-linear programming problem, which checks the feasibility of the Master problem solution by means of an OPF and provides the information needed to formulate the linear Benders cuts that connect the two problems. The model is programmed in GAMS. The effectiveness of the proposal is demonstrated through two examples extracted from the literature.
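The two-stage structure described above can be summarised by the following skeleton; solve_master, solve_opf_slave and make_benders_cut are hypothetical placeholders standing in for the MINLP master, the OPF slave and the cut construction that the paper implements in GAMS, so this is a structural sketch rather than a runnable routine.

```python
def benders_reconfiguration(network, max_iters=50, tol=1e-4):
    """Structural sketch of a Generalized Benders loop (placeholder helpers)."""
    cuts = []
    upper, lower = float("inf"), -float("inf")
    topology = None
    for _ in range(max_iters):
        # Master: choose a radial topology subject to the accumulated cuts; gives a lower bound.
        topology, lower = solve_master(network, cuts)
        # Slave: run an OPF for that topology; gives losses, duals and a feasibility flag.
        losses, duals, feasible = solve_opf_slave(network, topology)
        if feasible:
            upper = min(upper, losses)  # a feasible slave solution is a valid upper bound
        # Add an optimality or feasibility cut built from the slave's dual information.
        cuts.append(make_benders_cut(topology, duals, feasible))
        if upper - lower <= tol:        # bounds have converged
            break
    return topology
```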
Abstract:
OBJECTIVE: To examine the effects of the length and timing of nighttime naps on performance and physiological functions, an experimental study was carried out under simulated night-shift schedules. METHODS: Six students were recruited for this study, which was composed of 5 experiments. Each experiment involved 3 consecutive days with one night shift (22:00-8:00) followed by daytime sleep and night sleep. The experiments had 5 conditions in which the length and timing of naps were manipulated: 0:00-1:00 (E60), 0:00-2:00 (E120), 4:00-5:00 (L60), 4:00-6:00 (L120), and no nap (No-nap). During the night shifts, participants underwent performance tests. A questionnaire on subjective fatigue and a critical flicker fusion frequency test were administered after the performance tests. Heart rate variability and rectal temperature were recorded continuously during the experiments. Polysomnography was also recorded during the naps. RESULTS: Sleep latency was shorter and sleep efficiency was higher for the naps in L60 and L120 than in E60 and E120. Slow-wave sleep in the naps in E120 and L120 was longer than in E60 and L60. The mean reaction time in L60 became longer after the nap, but became faster in E60 and E120. Earlier naps served to counteract the decrement in performance and physiological functions during the night shifts. Performance was somewhat improved by taking a 2-hour nap later in the shift, but deteriorated after a one-hour nap. CONCLUSIONS: Naps in the latter half of the night shift were superior to earlier naps in terms of sleep quality. However, performance declined after a 1-hour nap taken later in the night shift, due to sleep inertia. This study suggests that the appropriate timing of a short nap, such as a 60-min nap during the night shift, must be carefully considered.
Abstract:
Distributed generation, unlike centralized electrical generation, aims to generate electrical energy on a small scale as near as possible to load centres, interchanging electric power with the network. This work presents a probabilistic methodology conceived to assist electric system planning engineers in the selection of distributed generation locations, taking into account the hourly load changes, or the daily load cycle. The hourly load centres, for each of the different hourly load scenarios, are calculated deterministically. These location points, properly weighted according to their load magnitude, are used to fit the best-fitting probability distribution. This distribution is used to determine the maximum-likelihood perimeter of the area where each distributed generation source should preferably be located by the planning engineers. This takes into account, for example, the availability and cost of land lots, which are factors of special relevance in urban areas, as well as several obstacles important for the final selection of the candidate distributed generation points. The proposed methodology has been applied to a real case, assuming three different bivariate probability distributions: the Gaussian distribution, a bivariate version of Freund's exponential distribution and the Weibull probability distribution. The methodology algorithm has been programmed in MATLAB. Results are presented and discussed for the application of the methodology to a realistic case, and they demonstrate the ability of the proposed methodology to efficiently determine the best locations for distributed generation and the corresponding distribution networks.
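A minimal sketch of the Gaussian variant of the approach described above: weighted hourly load-centre coordinates are fitted with a bivariate normal, and a likelihood contour is obtained from a chi-square quantile. The data, weights and coverage level are invented for illustration; the paper also considers Freund's bivariate exponential and Weibull distributions and is programmed in MATLAB.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
centres = rng.normal(loc=[3.0, 7.0], scale=[0.4, 0.6], size=(24, 2))  # hourly load centres (x, y)
weights = rng.uniform(0.5, 1.5, size=24)                              # hourly load magnitudes

# Weighted estimates of the mean and covariance of the load-centre cloud.
mean = np.average(centres, axis=0, weights=weights)
cov = np.cov(centres.T, aweights=weights, bias=True)

# Points whose squared Mahalanobis distance is below the chi-square quantile lie
# inside the 95% likelihood perimeter (an ellipse in the Gaussian case).
d2 = np.einsum("ij,jk,ik->i", centres - mean, np.linalg.inv(cov), centres - mean)
inside = d2 <= chi2.ppf(0.95, df=2)
print(mean, inside.sum(), "of", len(centres), "points inside the 95% contour")
```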
Abstract:
Over the last decade, research in the Group Decision Making area has focused on building meeting rooms that can support the decision-making task and improve the quality of the resulting decisions. However, the emergence of the Ambient Intelligence concept contributes a new perspective, a different way of viewing traditional decision rooms. In this paper we present an overview of Smart Decision Rooms that provide intelligence to the meeting environment, and we also present LAID, an Ambient Intelligence environment oriented to support Group Decision Making, together with some of the software tools already installed in this environment.
Abstract:
Background: A common task in analyzing microarray data is to determine which genes are differentially expressed across two (or more) kinds of tissue samples or samples subjected to different experimental conditions. Several statistical methods have been proposed to accomplish this goal, generally based on measures of distance between classes. It is well known that biological samples are heterogeneous because of factors such as molecular subtypes or genetic background that are often unknown to the experimenter. For instance, in experiments involving molecular classification of tumors it is important to identify significant subtypes of cancer. Bimodal or multimodal distributions often reflect the presence of mixtures of subsamples. Consequently, there can be genes differentially expressed in sample subgroups that are missed if the usual statistical approaches are used. In this paper we propose a new graphical tool which identifies not only genes with up- and down-regulation, but also genes with differential expression in different subclasses, which are usually missed if current statistical methods are used. This tool is based on two measures of distance between samples, namely the overlapping coefficient (OVL) between two densities and the area under the receiver operating characteristic (ROC) curve. The methodology proposed here was implemented in the open-source R software. Results: The method was applied to a publicly available dataset, as well as to a simulated dataset. We compared our results with those obtained using some of the standard methods for detecting differentially expressed genes, namely the Welch t-statistic, fold change (FC), rank products (RP), average difference (AD), weighted average difference (WAD), moderated t-statistic (modT), intensity-based moderated t-statistic (ibmT), significance analysis of microarrays (samT) and area under the ROC curve (AUC). On both datasets, differentially expressed genes with bimodal or multimodal distributions were missed by the standard selection procedures. We also compared our results with (i) the area between the ROC curve and the rising diagonal (ABCR) and (ii) the test for not-proper ROC curves (TNRC). We found our methodology more comprehensive, because it detects both bimodal and multimodal distributions and allows different variances in the two samples. Another advantage of our method is that the behaviour of different kinds of differentially expressed genes can be analysed graphically. Conclusion: Our results indicate that the arrow plot represents a new, flexible and useful tool for the analysis of gene expression profiles from microarrays.
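A minimal sketch of the two distance measures the arrow plot builds on, the overlapping coefficient (OVL) between two estimated densities and the area under the ROC curve (AUC), computed for one simulated gene across two sample groups; the published tool itself is implemented in R, so this Python sketch is purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
expr_control = rng.normal(0.0, 1.0, 40)               # expression in group 1
expr_case = np.concatenate([rng.normal(-2, 0.7, 20),   # bimodal expression in group 2
                            rng.normal(2, 0.7, 20)])

# OVL: integrate the pointwise minimum of the two kernel density estimates.
grid = np.linspace(-6, 6, 1000)
f, g = gaussian_kde(expr_control)(grid), gaussian_kde(expr_case)(grid)
ovl = np.minimum(f, g).sum() * (grid[1] - grid[0])

# AUC: probability that a randomly chosen case value exceeds a control value.
labels = np.r_[np.zeros_like(expr_control), np.ones_like(expr_case)]
auc = roc_auc_score(labels, np.r_[expr_control, expr_case])

print(f"OVL = {ovl:.2f}, AUC = {auc:.2f}")
```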
Abstract:
The magnetic and electrical properties of Ni-implanted single-crystalline TiO2 rutile were studied for nominal implanted fluences between 0.5 x 10^17 cm^-2 and 2.0 x 10^17 cm^-2 at 150 keV energy, corresponding to maximum atomic concentrations between 9 at.% and 27 at.% at 65 nm depth, in order to study the formation of metallic oriented aggregates. The results indicate that the as-implanted crystals exhibit superparamagnetic behavior for the two higher fluences, which is attributed to the formation of nanosized nickel clusters with an average size related to the implanted concentration, while only paramagnetic behavior is observed for the lowest fluence. Annealing at 1073 K induces the aggregation of the implanted nickel and enhances the magnetization in all samples. The associated anisotropic behavior indicates preferred orientations of the nickel aggregates in the rutile lattice, consistent with Rutherford backscattering spectrometry/channelling results. The electrical conductivity displays anisotropic behavior, but no magnetoresistive effects were detected.
Abstract:
Bearing in mind the potential adverse health effects of ultrafine particles, it is of paramount importance to perform effective monitoring of nanosized particles in several microenvironments, which may include ambient air, indoor air and occupational environments. In fact, effective and accurate monitoring is the first step to obtaining a set of data that can later be used for subsequent evaluations such as risk assessments and epidemiological studies, and for proposing good working practices, such as containment measures, in order to reduce occupational exposure. This paper presents a useful methodology for monitoring ultrafine particles/nanoparticles in several microenvironments, using online analyzers as well as sampling systems that allow further characterization of the collected nanoparticles. This methodology was validated in three case studies presented in the paper, which address monitoring of nanosized particles in the outdoor atmosphere, during cooking operations, and in a welding workshop.
Abstract:
Master's dissertation, Pedagogical Supervision (Early Childhood Education), 23 April 2013, Universidade dos Açores.
Abstract:
Purpose: The aim of this paper is to highlight the importance of qualitative research within the scope of management scientific studies, referring to its philosophy, nature and instruments. It also confronts it with quantitative methodology, addressing their differences as well as their complementarity and synergies, with the purpose of explaining, from a more analytic point of view, the relevance of qualitative methodology in the course of authentic, real research, despite its complexity. Design/methodology/approach: Regardless of its broad application, one may attest to the scarcity of literature that focuses on qualitative research applied to the management scientific area, as opposed to the large amount that refers to quantitative research. Findings: The paper shows the influence that qualitative research has on management scientific research. Originality/value: Qualitative research assumes an important role within management research by allowing for the study and analysis of certain types of phenomena that occur inside organisations, and in respect of which quantitative studies cannot provide an answer.
Abstract:
Purpose: The aim of this paper is to promote qualitative methodology within the scientific community of management. The specific objective is to propose an empirical research process based on the case study method. This is intended to ensure rigor in the empirical research process, so that future research may follow a procedure similar to the one proposed. Design/methodology/approach: Following a qualitative methodological approach, we propose a research process that develops in four phases, each with several stages. This study analyses the preparatory and field work phases and their stages. Findings: The paper shows the influence that case studies have on the qualitative empirical research process in management. Originality/value: The case study method assumes an important role within qualitative research by allowing for the study and analysis of certain types of phenomena that occur inside organisations, and in respect of which quantitative studies cannot provide an answer.
Abstract:
A QuEChERS method for the extraction of ochratoxin A (OTA) from bread samples was evaluated. A 2^3 factorial design was used to find the optimal QuEChERS parameters (extraction time, extraction solvent volume and sample mass). Extracts were analysed by LC with fluorescence detection. The optimal extraction conditions were: 5 g of sample, 15 mL of acetonitrile and 3 min of agitation. The extraction procedure was validated by systematic recovery experiments at three levels. The recoveries obtained ranged from 94.8% (at 1.0 μg kg^-1) to 96.6% (at 3.0 μg kg^-1). The limit of quantification of the method was 0.05 μg kg^-1. The optimised procedure was applied to 20 samples of different bread types ("Carcaça", "Broa de Milho" and "Broa de Avintes") highly consumed in Portugal. None of the samples exceeded the established European legal limit of 3 μg kg^-1.
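As an illustration of the experimental design mentioned above, the sketch below enumerates the runs of a two-level, three-factor (2^3) full factorial design for the three extraction parameters; the low/high levels shown are placeholders, not the exact levels screened in the study.

```python
from itertools import product

# Hypothetical low/high levels for each of the three QuEChERS parameters.
factors = {
    "extraction_time_min": (1, 3),
    "solvent_volume_mL": (10, 15),
    "sample_mass_g": (2.5, 5.0),
}

# All 2**3 = 8 combinations of low/high levels, i.e. the full factorial runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, start=1):
    print(i, run)
```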