7 results for contribution analysis
at Instituto Politécnico do Porto, Portugal
Abstract:
The impact of effluent wastewaters from four different hospitals: a university (1456 beds), a general (350 beds), a pediatric (110 beds) and a maternity hospital (96 beds), all conveyed to the same wastewater treatment plant (WWTP), was evaluated in the receiving urban wastewaters. The occurrence of 78 pharmaceuticals belonging to several therapeutic classes was assessed in hospital effluents and WWTP wastewaters (influent and effluent), as well as the contribution of each hospital to the WWTP influent in terms of pharmaceutical load. Results indicate that pharmaceuticals are widespread pollutants in both hospital and urban wastewaters. The contribution of hospitals to the input of pharmaceuticals in urban wastewaters varies widely according to their size. The estimated total mass loadings were 306 g d⁻¹ for the university hospital, 155 g d⁻¹ for the general one, 14 g d⁻¹ for the pediatric hospital and 1.5 g d⁻¹ for the maternity hospital, showing that the largest hospitals contribute most to the total mass load of pharmaceuticals. Furthermore, analysis of the individual contributions of each therapeutic group showed that NSAIDs, analgesics and antibiotics are among the groups with the highest inputs. Removal efficiency ranged from over 90% for pharmaceuticals such as acetaminophen and ibuprofen to no removal for β-blockers and salbutamol. The total mass load of pharmaceuticals into receiving surface waters was estimated at between 5 and 14 g d⁻¹ per 1000 inhabitants. Finally, the environmental risk posed by pharmaceuticals detected in hospital and WWTP effluents was assessed by means of hazard quotients toward different trophic levels (algae, daphnids and fish). Several pharmaceuticals present in the different matrices were identified as potentially hazardous to aquatic organisms, showing that special attention should be paid to antibiotics such as ciprofloxacin, ofloxacin, sulfamethoxazole, azithromycin and clarithromycin, since their hazard quotients in the WWTP effluent revealed that they could pose an ecotoxicological risk to algae.
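As a worked illustration of the hazard-quotient screening this abstract describes: HQ = MEC / PNEC, where MEC is the measured effluent concentration and PNEC the predicted no-effect concentration for a given trophic level; HQ > 1 flags a potential ecotoxicological risk. The Python sketch below is a minimal example of the calculation; all concentration and PNEC values are illustrative placeholders, not data from the study.

# Hazard-quotient screening sketch: HQ = MEC / PNEC per trophic level.
# All numbers are illustrative placeholders, not values from the study.
mec_ug_per_l = {                      # hypothetical measured effluent concentrations
    "ciprofloxacin": 0.30,
    "sulfamethoxazole": 0.15,
}
pnec_ug_per_l = {                     # hypothetical no-effect concentrations
    "ciprofloxacin": {"algae": 0.005, "daphnids": 60.0, "fish": 100.0},
    "sulfamethoxazole": {"algae": 0.03, "daphnids": 25.0, "fish": 50.0},
}

for drug, mec in mec_ug_per_l.items():
    for organism, pnec in pnec_ug_per_l[drug].items():
        hq = mec / pnec               # hazard quotient
        print(f"{drug:18s} {organism:9s} HQ = {hq:8.3f} {'RISK' if hq > 1 else 'ok'}")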
Abstract:
On-chip debug (OCD) features are frequently available in modern microprocessors. Their contribution to shortening time-to-market justifies the industry investment in this area, where a number of competing or complementary proposals are available or under development, e.g. NEXUS, CJTAG, IJTAG. The controllability and observability features offered by OCD infrastructures constitute a valuable toolbox that can be used well beyond the debugging arena, improving the return on investment by diluting its cost across a wider spectrum of application areas. This paper discusses the use of OCD features for validating fault-tolerant architectures, and in particular the efficiency of various fault injection methods provided by enhanced OCD infrastructures. The reference data for our comparative study was captured on a workbench comprising the 32-bit Freescale MPC-565 microprocessor, an iSYSTEM IC3000 debugger (iTracePro version) and the Winidea 2005 debugging package. All enhanced OCD infrastructures were implemented in VHDL and the results were obtained by simulation within the same fault injection environment. The focus of this paper is the comparative analysis of the experimental results obtained for various OCD configurations and debugging scenarios.
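As a rough illustration of how debugger-based fault injection campaigns of the kind compared here operate: execution is halted, a bit is flipped in architectural state through the debug port, the target resumes, and the outcome is classified. The Python sketch below models only this control flow over a toy register file; it is an assumption-laden stand-in, not the MPC-565/IC3000 tooling used in the paper.

import random

def run_workload(regs):
    # Stand-in for the target program; it reads only the first two registers,
    # so faults injected elsewhere remain silent (undetected).
    return (regs[0] + regs[1]) & 0xFFFFFFFF

def inject_bit_flip(regs, reg_index, bit):
    faulty = list(regs)
    faulty[reg_index] ^= 1 << bit     # single-bit fault, as if set via the debug port
    return faulty

golden = [0x1234, 0xBEEF, 0x0042, 0xCAFE]
expected = run_workload(golden)

outcomes = {"silent": 0, "detected": 0}
for trial in range(1000):
    reg, bit = random.randrange(len(golden)), random.randrange(16)
    result = run_workload(inject_bit_flip(golden, reg, bit))
    outcomes["detected" if result != expected else "silent"] += 1
print(outcomes)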
Abstract:
The aim of this study is to optimize the heat flow through the pultrusion die assembly system in the manufacturing process of a specific glass-fiber reinforced polymer (GFRP) pultrusion profile. The control of heat flow and its distribution through the whole die assembly system is of vital importance in optimizing the actual GFRP pultrusion process. Through mathematical modeling of the heating-die process by means of a Finite Element Analysis (FEA) program, an optimal heater selection, die position and temperature control were achieved. The thermal environment within the die was critically modeled relative not only to the applied heat sources, but also to the conductive and convective losses, as well as the thermal contribution arising from the exothermic reaction of the resin matrix as it cures or polymerizes from the liquid to the solid condition. The numerical simulation was validated against thermographic measurements carried out at key points along the die during the pultrusion process.
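To make the energy balance concrete: the die temperature field is driven by conduction from the heaters, losses at the boundaries, and an exothermic source term from resin cure. Below is a minimal 1D explicit finite-difference sketch of such a model; the geometry, material properties and cure heat release are placeholder assumptions, not the parameters of the profile studied.

import numpy as np

# 1D transient heat conduction with an exothermic cure source (illustrative):
# dT/dt = alpha * d2T/dx2 + q_cure / (rho*cp), heater held at fixed T at x = 0,
# losses approximated by a fixed ambient temperature at x = L.
L, n = 0.5, 51                  # die length [m], grid points (placeholders)
dx = L / (n - 1)
alpha = 1.2e-5                  # thermal diffusivity [m^2/s] (placeholder)
rho_cp = 3.6e6                  # volumetric heat capacity [J/(m^3*K)] (placeholder)
q_cure = 2.0e4                  # exothermic heat release [W/m^3] (placeholder)
dt = 0.4 * dx**2 / alpha        # below the explicit stability limit

T = np.full(n, 25.0)            # initial temperature [degC]
for step in range(20000):
    T[0], T[-1] = 180.0, 25.0   # heater-side and ambient boundaries
    lap = (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt * (alpha * lap + q_cure / rho_cp)

print(f"peak die temperature: {T.max():.1f} degC")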
Abstract:
This paper presents the creation and development of technological schools directly linked to the business community and to public higher education. By establishing themselves as the key interface between the two sectors, they make a significant contribution, providing a greater competitive edge in the face of increasing competition in traditional markets. The development of new business strategies supported by references of excellence, quality and competitiveness also provides a good link for establishing partnerships aimed at the qualification of intermediate-level staff, bridging the technological school and technology-based higher education. We present a case study illustrating the success of Escola Tecnológica de Vale de Cambra.
Abstract:
In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed to rank compounds based on priority. As many pharmaceuticals are acids or bases, the multimedia fate model accounts for regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as a limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of freshwater ecotoxicity impact, as well as the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of output results.
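A minimal sketch of the Monte Carlo step described here, assuming lognormal input distributions (a common choice; the study's fitted distributions, fate model and parameters differ) and the usual impact structure of concentration times fate factor times effect factor:

import numpy as np

# Propagate input uncertainty to a PAF-based freshwater impact score
# (illustrative; all distribution parameters are assumptions).
rng = np.random.default_rng(42)
n = 100_000
conc = rng.lognormal(np.log(0.1), 1.0, n)     # effluent concentration [ug/L]
fate = rng.lognormal(np.log(5.0), 0.5, n)     # fate factor [days]
effect = rng.lognormal(np.log(0.02), 1.2, n)  # effect factor [PAF per ug/L]

impact = conc * fate * effect                 # impact score (arbitrary units)
lo, med, hi = np.percentile(impact, [2.5, 50, 97.5])
print(f"median {med:.3g}, 95% interval [{lo:.3g}, {hi:.3g}]")

# Apportion output variance among inputs (log-space, independent inputs);
# with these placeholder sigmas, the effect factor dominates, mirroring the
# abstract's finding that concentration and effect factor drive uncertainty.
for name, x in [("conc", conc), ("fate", fate), ("effect", effect)]:
    r = np.corrcoef(np.log(x), np.log(impact))[0, 1]
    print(f"{name}: share of log-variance ~ {r**2:.2f}")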
Abstract:
Hard real-time multiprocessor scheduling has, in recent years, seen the flourishing of semi-partitioned scheduling algorithms. This category of scheduling schemes combines elements of partitioned and global scheduling in order to achieve efficient utilization of the system’s processing resources with strong schedulability guarantees and low dispatching overheads. The sub-class of slot-based “task-splitting” scheduling algorithms, in particular, offers very good trade-offs between schedulability guarantees (in the form of high utilization bounds) and the number of preemptions/migrations involved. However, until now no unified schedulability theory existed for such algorithms; each was formulated with its own accompanying analysis. This article changes this fragmented landscape by formulating a more unified schedulability theory covering the two state-of-the-art slot-based semi-partitioned algorithms, S-EKG and NPS-F (both fixed job-priority based). The new theory is based on exact schedulability tests, thereby also overcoming many sources of pessimism in existing analyses. In turn, since schedulability testing guides the task assignment under the schemes in consideration, we also formulate an improved task assignment procedure. As the other main contribution of this article, and in response to the fact that many unrealistic assumptions present in the original theory tend to undermine the theoretical potential of such scheduling schemes, we identified and modelled into the new analysis all overheads incurred by the algorithms in consideration. The outcome is a new overhead-aware schedulability analysis that permits increased efficiency and reliability. The merits of this new theory are evaluated by an extensive set of experiments.
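To illustrate how a schedulability test guides task assignment in (semi-)partitioned schemes: tasks are placed bin-packing style, with the test vetting each tentative assignment. The Python sketch below uses a simple first-fit with a Liu-and-Layland-style utilization bound as the admission test; the S-EKG and NPS-F procedures differ (they use exact tests and split tasks that do not fit whole), but the overall structure is the same.

# First-fit task assignment guarded by a schedulability test (illustrative).
def rm_bound(n):
    # Liu & Layland utilization bound for n rate-monotonic tasks
    return n * (2.0 ** (1.0 / n) - 1.0)

def first_fit(task_utils, num_procs):
    bins = [[] for _ in range(num_procs)]
    for u in sorted(task_utils, reverse=True):      # decreasing utilization
        for b in bins:
            if sum(b) + u <= rm_bound(len(b) + 1):  # admission test
                b.append(u)
                break
        else:
            return None  # does not fit whole; a semi-partitioned scheme
                         # would split this task across processors instead
    return bins

print(first_fit([0.6, 0.5, 0.4, 0.3, 0.2], num_procs=3))
# -> [[0.6, 0.2], [0.5, 0.3], [0.4]]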
Abstract:
C3S2E '16 Proceedings of the Ninth International C* Conference on Computer Science & Software Engineering