908 results for case-method
Abstract:
This paper addresses the problem of short-term hydro scheduling, particularly for head-dependent reservoirs in a competitive environment. We propose a new nonlinear optimization method that treats hydroelectric power generation as a function of both water discharge and head. Head-dependency is taken into account in short-term hydro scheduling in order to obtain more realistic and feasible results. The proposed method has been applied successfully to a case study based on one of the main Portuguese cascaded hydro systems, yielding a higher profit at negligible additional computation time compared with a linear optimization method that ignores head-dependency.
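The nonlinearity the abstract refers to can be pictured with the minimal sketch below; the efficiency, the head-storage relation and the figures are illustrative assumptions, not the paper's model.

```python
# Minimal sketch (illustrative assumptions, not the paper's model): hydro power
# as a function of water discharge q and head h, where the head depends on the
# reservoir storage v. With a fixed head the model is linear in q; once head(v)
# enters, power becomes a product of decision variables, hence nonlinear.

RHO_G = 1000.0 * 9.81      # water density [kg/m^3] times gravity [m/s^2]
ETA = 0.9                  # assumed overall turbine-generator efficiency

def head(v, h0=50.0, alpha=1e-6):
    """Assumed head-storage relation: head [m] grows with stored volume v [m^3]."""
    return h0 + alpha * v

def power(q, v):
    """Generated power [W] as a function of discharge q [m^3/s] and storage v [m^3]."""
    return ETA * RHO_G * q * head(v)

# Example: the same discharge yields more power when the reservoir is fuller.
print(power(q=100.0, v=1e6), power(q=100.0, v=5e6))
```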
Abstract:
Storm and tsunami deposits are generated by similar depositional mechanisms, making their discrimination hard to establish using classic sedimentologic methods. Here we propose an original approach to identify tsunami-induced deposits by combining numerical simulation and rock magnetism. To test our method, we investigate the tsunami deposit of the Boca do Rio estuary generated by the 1755 earthquake in Lisbon, which is well described in the literature. We first test the 1755 tsunami scenario using a numerical inundation model to provide physical parameters for the tsunami wave. Then we use concentration-sensitive (MS, SIRM) and grain-size-sensitive (chi(ARM), ARM, B1/2, ARM/SIRM) magnetic proxies coupled with SEM microscopy to unravel the magnetic mineralogy of the tsunami-induced deposit and its associated depositional mechanisms. In order to study the connection between the tsunami deposit and the different sedimentologic units present in the estuary, magnetic data were processed by multivariate statistical analyses. Our numerical simulation shows a large inundation of the estuary, with flow depths varying from 0.5 to 6 m and a run-up of about 7 m. Magnetic data show a dominance of paramagnetic minerals (quartz) mixed with a lesser amount of ferromagnetic minerals, namely titanomagnetite and titanohematite, both of detrital origin and reworked from the underlying units. Multivariate statistical analyses indicate a better connection between the tsunami-induced deposit and a mixture of Units C and D. All these results point to a scenario where the energy released by the tsunami wave was strong enough to overtop and erode a significant amount of sand from the littoral dune and mix it with material reworked from underlying layers at least 1 m in depth. The method tested here represents an original and promising tool to identify tsunami-induced deposits in similar embayed beach environments.
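The multivariate treatment of the magnetic data can be illustrated with the short sketch below; the abstract does not name the specific technique, so principal component analysis is shown only as one common choice, and all values are synthetic.

```python
# Minimal sketch (synthetic values; PCA assumed, since the abstract only says
# "multivariate statistical analyses"): project samples described by several
# magnetic proxies onto two principal components to see which sedimentary
# units the tsunami layer resembles.

import numpy as np
from sklearn.decomposition import PCA

# Rows: samples; columns: MS, SIRM, chi(ARM), ARM/SIRM (all hypothetical).
samples = np.array([
    [12.0, 3.1, 0.80, 0.050],   # Unit C
    [15.0, 3.6, 0.90, 0.060],   # Unit D
    [13.5, 3.4, 0.85, 0.055],   # tsunami layer
    [30.0, 9.0, 2.50, 0.200],   # littoral dune sand
])

scores = PCA(n_components=2).fit_transform(samples)
print(scores)   # samples plotting close together share a similar magnetic signature
```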
Abstract:
A descriptive study was developed in order to assess air contamination caused by fungi and particles in seven poultry units. Twenty-seven air samples of 25 litres were collected using the impaction method. Air sampling and particle concentration measurements were performed inside the pavilions and also outside the premises, the outdoor location being used as reference. Temperature and relative humidity were registered simultaneously. Regarding the fungal load in the air of the seven poultry farms, the highest value obtained was 24040 CFU/m3 and the lowest was 320 CFU/m3. Twenty-eight species/genera of fungi were identified, with Scopulariopsis brevicaulis (39.0%) the most commonly isolated species and Rhizopus sp. (30.0%) the most commonly isolated genus. Within the Aspergillus genus, Aspergillus flavus (74.5%) was the most frequently detected species. There was a significant correlation (r=0.487; p=0.014) between temperature and the level of fungal contamination (CFU/m3). Regarding contamination caused by particles, in this study the particles with larger dimensions (PM5.0 and PM10) showed the highest concentrations. There was also a significant correlation between relative humidity and the concentration of smaller particles, namely PM0.5 (r=0.438; p=0.025) and PM1.0 (r=0.537; p=0.005). Characterizing typical exposure levels to these contaminants in this specific occupational setting is required to allow a more detailed risk assessment and to set exposure limits that protect workers' health.
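The temperature-fungal load association reported above can be reproduced in form with a few lines of code; Pearson's coefficient is assumed here, and the readings are invented for illustration only.

```python
# Minimal sketch (invented readings; Pearson's r assumed, the abstract only
# reports r and p): correlating air temperature with fungal load in CFU/m3.

from scipy import stats

temperature_c = [18.5, 20.1, 22.3, 24.0, 25.6, 27.2, 28.9]        # hypothetical
fungal_cfu_m3 = [320, 1500, 4200, 8000, 12500, 18000, 24040]      # hypothetical

r, p = stats.pearsonr(temperature_c, fungal_cfu_m3)
print(f"r = {r:.3f}, p = {p:.3f}")   # a positive, significant r mirrors the reported trend
```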
Abstract:
Learning is not a spectator's sport. Students do not learn much by just sitting in class listening to their teachers, memorizing pre-packaged assignments and spitting out answers. The teaching-learning process has been a constant target of studies, particularly in Higher Education, as a consequence of the annual increase of new students. The concern with maintaining a desired quality level in the training of these students, combined with the will to widen access to all of those who finish Secondary School Education, has triggered a greater intervention from education specialists, in partnership with teachers of all Higher Education areas, in the analysis of this problem. Considering the particular case of Engineering, a rising concern with active learning strategies and forms of assessment has been witnessed. Research has demonstrated that students learn more if they are actively engaged with the material they are studying. In this presentation we describe, present and discuss the techniques and the results of the Peer Instruction method in an introductory Calculus course of an Engineering Bachelor's programme.
Abstract:
Introduction / Aims: Adopting important decisions is a specific task of the manager. An efficient manager takes these decisions through a systematic process with well-defined elements, each in a precise order. In pharmaceutical practice and business, in the supply process of pharmacies, there are situations in which medicine distributors offer a certain discount but require payment within a shorter period of time. In these cases, the analysis of the offer can be made with the help of the decision tree method, which permits identifying the decision offering the best possible result in a given situation. The aims of the research were to analyse the product offers of several different suppliers and to establish the most advantageous ways of supplying the pharmacy. Material / Methods: The general product offers of the following medical stores were studied: A&G Med, Farmanord, Farmexim, Mediplus, Montero and Relad. For medicine offers including a discount, the decision tree method was applied in order to select the most advantageous offers. The decision tree is a management method used to support sound decisions and is generally applied when one needs to evaluate decisions that involve a series of stages. The tree diagram is used to look for the most efficient means to attain a specific goal. Decision trees are probabilistic methods, useful when making decisions under risk. Results: The results of the analysis of the tree diagrams indicated that purchasing medicines with a discount (1%, 10%, 15%) and payment within a shorter time interval (120 days) is more profitable than purchasing without a discount and payment within a longer time interval (160 days). Discussion / Conclusion: Depending on the results of the tree diagram analysis, the pharmacies would purchase from the selected suppliers. The research has shown that the decision tree method is a valuable working instrument for choosing the best ways of supplying pharmacies and is very useful to specialists from the pharmaceutical field and pharmaceutical management, to medicine suppliers, to pharmacy practitioners from community pharmacies and especially to pharmacy managers and chief pharmacists.
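As a rough illustration of how the two branches of such a purchase decision can be compared, the sketch below contrasts a discounted offer with early payment against an undiscounted offer with later payment; the invoice value, discount and cost of capital are hypothetical, and the study's full decision tree is reduced here to a deterministic two-branch comparison.

```python
# Minimal sketch (hypothetical figures; a deterministic two-branch comparison,
# not the study's full decision tree): which offer has the lower net present cost?

def present_cost(invoice, discount, payment_days, annual_rate=0.10):
    """Net present cost of paying a (possibly discounted) invoice after payment_days."""
    cash_out = invoice * (1.0 - discount)
    return cash_out / (1.0 + annual_rate) ** (payment_days / 365.0)

invoice = 10_000.0
with_discount = present_cost(invoice, discount=0.10, payment_days=120)
without_discount = present_cost(invoice, discount=0.00, payment_days=160)

# The branch with the lower present cost is the preferred decision.
choice = "take the discount" if with_discount < without_discount else "defer payment"
print(f"{with_discount:.2f} vs {without_discount:.2f} -> {choice}")
```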
Abstract:
We investigate the phase behaviour of 2D mixtures of bi-functional and three-functional patchy particles and 3D mixtures of bi-functional and tetra-functional patchy particles by means of Monte Carlo simulations and Wertheim theory. We start by computing the critical points of the pure systems and then we investigate how the critical parameters change upon lowering the temperature. We extend the successive umbrella sampling method to mixtures to make it possible to extract information about the phase behaviour of the system at a fixed temperature for the whole range of densities and compositions of interest. (C) 2013 AIP Publishing LLC.
Abstract:
Formaldehyde (FA) ranks 25th in overall U.S. chemical production, with more than 5 million tons produced each year. Given its economic importance and widespread use, many people are exposed to FA occupationally. Recently, based on the correlation with nasopharyngeal cancer in humans, the International Agency for Research on Cancer (IARC) confirmed the classification of FA as a Group 1 substance. Considering the epidemiological evidence of a potential association with leukemia, the IARC has concluded that FA can cause this lymphoproliferative disorder. Our group has developed a method to assess the exposure and genotoxic effects of FA in two different occupational settings, namely FA-based resin production and pathology and anatomy laboratories. For exposure assessment we applied two different air monitoring techniques simultaneously: NIOSH Method 2541 and photo-ionization detection equipment with simultaneous video recording. Genotoxic effects were measured by the cytokinesis-blocked micronucleus assay in peripheral blood lymphocytes and by the micronucleus test in exfoliated oral cavity epithelial cells, both considered target cells. The two exposure assessment techniques show that peak exposures are still occurring in both occupational settings. There was a statistically significant increase in the mean micronucleus frequency of epithelial cells and peripheral lymphocytes of exposed individuals compared with controls. In conclusion, the exposure and genotoxicity assessment methodologies we developed allowed us to determine that these two occupational settings promote exposure to high peak FA concentrations and an increase in the mean micronucleus frequency of exposed workers. Moreover, the developed techniques showed promising results and could be used to confirm and extend the results obtained by the analytical techniques currently available.
Abstract:
In the initial stage of this work, two potentiometric methods were used to determine the salt (sodium chloride) content in bread and dough samples from several cities in the north of Portugal. A reference method (potentiometric precipitation titration) and a newly developed chloride ion-selective electrode (ISE) were applied. Both methods determine the sodium chloride content through the quantification of chloride. To evaluate the accuracy of the ISE, bread and respective dough samples were analyzed by both methods. Statistical analysis (0.05 significance level) indicated that the results of these methods did not differ significantly. Therefore, the ISE is an adequate alternative for the determination of chloride in the analyzed samples. To compare the results of these chloride-based methods with a sodium-based method, sodium was quantified in the same samples by a reference method (atomic absorption spectrometry). Significant differences between the results were found. In several cases the sodium chloride content exceeded the legal limit when the chloride-based methods were used, but not when the sodium-based method was applied. This could lead to the erroneous application of fines, and therefore the authorities should supply additional information regarding the analytical procedure for this particular control.
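The statistical comparison at the 0.05 significance level can be sketched as follows; the abstract does not state which test was used, so a paired t-test is shown only as one common choice, and the measurements are invented.

```python
# Minimal sketch (invented measurements; paired t-test assumed, the abstract
# does not name the test): comparing the reference titration and the ISE on
# the same bread samples at the 0.05 significance level.

from scipy import stats

reference_titration = [1.42, 1.38, 1.55, 1.47, 1.60, 1.51]   # NaCl, g/100 g
chloride_ise        = [1.40, 1.41, 1.53, 1.49, 1.58, 1.52]   # NaCl, g/100 g

t_stat, p_value = stats.ttest_rel(reference_titration, chloride_ise)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
if p_value > 0.05:
    print("No significant difference at the 0.05 level: the ISE agrees with the reference method.")
else:
    print("Significant difference between the methods.")
```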
Abstract:
Purpose: The aim of this paper is to promote qualitative methodology within the scientific community of management. The specific objective is to propose an empirical research process based on the case study method. The intention is to ensure rigor in the empirical research process, so that future research may follow a procedure similar to the one proposed. Design/methodology/approach: Following a qualitative methodological approach, we propose a research process that develops over four phases, each with several stages. This study analyses the preparatory and fieldwork phases and their stages. Findings: The paper shows the influence that case studies have on the qualitative empirical research process in management. Originality/value: The case study method assumes an important role within qualitative research by allowing for the study and analysis of certain types of phenomena that occur inside organisations, and in respect of which quantitative studies cannot provide an answer.
Abstract:
The handling of waste can be responsible for occupational exposure to particles and fungi. The aim of this study was to characterize exposure to particles and fungi in a composting plant. Measurements of particulate matter were performed using portable direct-reading equipment. Air samples of 50 L were collected through an impaction method at a flow rate of 140 L/min onto malt extract agar supplemented with chloramphenicol (0.05%). Surface samples were also collected. All the samples were incubated at 27°C for 5 to 7 days. Particulate matter data showed higher contamination for the PM5 and PM10 sizes. The Aspergillus genus presented the highest prevalence in air (90.6%). Aspergillus niger (32.6%), A. fumigatus (26.5%) and A. flavus (16.3%) were the most prevalent fungi in air samples, and Mucor sp. (39.2%), Aspergillus niger (30.9%) and A. fumigatus (28.7%) were the most frequently found on surfaces. The results obtained draw attention to the need for further research.
Abstract:
Organic waste is a rich substrate for microbial growth and, because of that, workers in the waste industry are at higher risk of exposure to bioaerosols. This study aimed to assess fungal contamination in two solid waste management plants. Air samples from the two plants were collected through an impaction method. Surface samples were also collected by swabbing surfaces at the same indoor sites. All collected samples were incubated at 27°C for 5 to 7 days. After laboratory processing and incubation of the collected samples, quantitative and qualitative results were obtained, with identification of the isolated fungal species. Air samples were also subjected to molecular analysis by real-time polymerase chain reaction (RT-PCR), using an impinger method, to measure DNA of the Aspergillus flavus complex and Stachybotrys chartarum. Assessment of particulate matter (PM) was also conducted with portable direct-reading equipment. Particle concentrations were measured at five different sizes (PM0.5; PM1; PM2.5; PM5; PM10). In the waste sorting plant, the three species most frequently isolated in air and on surfaces were A. niger (73.9%; 66.1%), A. fumigatus (16%; 13.8%) and A. flavus (8.7%; 14.2%). In the incineration plant, the most prevalent species detected in air samples were Penicillium sp. (62.9%), A. fumigatus (18%) and A. flavus (6%), while the species most frequently isolated from surface samples were Penicillium sp. (57.5%), A. fumigatus (22.3%) and A. niger (12.8%). Stachybotrys chartarum and other toxinogenic strains from the A. flavus complex were not detected. The most common PM sizes obtained were PM10 and PM5 (inhalable fraction). Since waste is the main internal fungal source in the analyzed settings, preventive and protective measures need to be maintained to avoid worker exposure to fungi and their metabolites.
Abstract:
The development of new products or processes involves the creation, re-creation and integration of conceptual models from the related scientific and technical domains. Particularly, in the context of collaborative networks of organisations (CNO) (e.g. a multi-partner, international project) such developments can be seriously hindered by conceptual misunderstandings and misalignments, resulting from participants with different backgrounds or organisational cultures, for example. The research described in this article addresses this problem by proposing a method and the tools to support the collaborative development of shared conceptualisations in the context of a collaborative network of organisations. The theoretical model is based on a socio-semantic perspective, while the method is inspired by the conceptual integration theory from the cognitive semantics field. The modelling environment is built upon a semantic wiki platform. The majority of the article is devoted to developing an informal ontology in the context of a European R&D project, studied using action research. The case study results validated the logical structure of the method and showed the utility of the method.
Abstract:
Graphics processing units (GPUs) can today be used for computations that go beyond graphics, and such use can attain a performance that is orders of magnitude greater than that of a normal processor. The software executing on a graphics processor is composed of a set of (often thousands of) threads which operate on different parts of the data and thereby jointly compute a result that is delivered to another thread executing on the main processor. Hence the response time of a thread executing on the main processor depends on the finishing time of the threads executing on the GPU. Therefore, we present a simple method for calculating an upper bound on the finishing time of threads executing on a GPU, in particular the NVIDIA Fermi. Developing such a method is nontrivial because threads executing on a GPU share hardware resources at a very fine granularity.
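A deliberately naive bound of the kind this problem calls for is sketched below; it only assumes that thread blocks are spread over the streaming multiprocessors and that a per-block worst case is known, and it ignores the fine-grained resource sharing the paper actually analyses.

```python
# Minimal sketch (simplifying assumptions, not the paper's analysis): a naive
# upper bound on a kernel's finishing time, assuming blocks are dispatched in
# "waves" over the SMs and each block runs for at most wcet_block in isolation.
# Fine-grained sharing of warp schedulers, memory, etc. is ignored here.

import math

def naive_finishing_time_bound(num_blocks: int, num_sms: int, wcet_block: float) -> float:
    """Bound = number of block waves any single SM may execute, times the per-block worst case."""
    waves = math.ceil(num_blocks / num_sms)
    return waves * wcet_block

# Example: 480 blocks on a 16-SM Fermi-class GPU, 2.5 ms worst case per block.
print(naive_finishing_time_bound(num_blocks=480, num_sms=16, wcet_block=2.5e-3))
```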
Abstract:
Dissertation submitted in partial fulfilment of the requirements for the Degree of Master of Science in Geospatial Technologies