826 results for Stack Overflow
Abstract:
Growing urban expansion and rising environmental and financial demands encourage the adoption of sustainable approaches to the management of sanitation infrastructure. The use of monitoring instruments and of mathematical modelling thus emerges as the path towards rationalising investment and optimising existing systems. In this context, dynamic modelling of urban drainage systems is relevant for controlling and reducing excess flows and pollutant discharges into receiving waters, which result from a significant increase in undue rainfall-derived inflows, from undersizing problems, or from a lack of operation and maintenance. The objective of this dissertation is the modelling, calibration and diagnosis of the Lordelo interceptor system using the Storm Water Management Model (SWMM) software, based on data collected from the Lordelo interceptor rehabilitation project prepared by Noraqua. The modelling covers the assessment of dry-weather flows and of rainfall-derived flows using the Sanitary Sewer Overflow Analysis and Planning (SSOAP) Toolbox. The dynamic simulation provided a more detailed understanding of the system, assessing its hydraulic capacity and locating the points prone to flooding. It was thus possible to test system improvement solutions that take into account the calibrated undue rainfall-derived inflows. Despite the difficulties posed by the quality of the available data, SSOAP and SWMM proved to be useful tools for detecting, diagnosing and reducing excess flows, and the procedure used can be applied to similar systems as a way of defining the best technical and economic solution for system planning, operation and rehabilitation.
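The dry-weather/wet-weather split that the abstract describes, and that tools like the SSOAP Toolbox automate, can be illustrated with a minimal sketch. This is our own toy decomposition, not SSOAP's algorithm, and all flow values are invented for demonstration: rainfall-derived inflow/infiltration (RDII) is estimated as the observed wet-weather flow minus the typical dry-weather pattern.

```python
# Toy illustration of RDII estimation: excess flow is whatever the observed
# hydrograph shows above the calibrated dry-weather pattern (values invented).

def estimate_rdii(observed, dry_weather):
    """Return the non-negative RDII component, time step by time step."""
    assert len(observed) == len(dry_weather)
    return [max(0.0, o - d) for o, d in zip(observed, dry_weather)]

dry = [10.0, 9.0, 8.0, 12.0]    # typical dry-weather flows (L/s), assumed
wet = [10.5, 15.0, 22.0, 13.0]  # flows observed during a storm (L/s), assumed
rdii = estimate_rdii(wet, dry)
print(rdii)                      # excess flow attributed to rainfall
```

In a real SSOAP workflow the dry-weather pattern itself is derived from flow monitoring data and the RDII series is then fitted with unit hydrographs; the subtraction above is only the conceptual first step.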
Abstract:
Recent embedded processor architectures containing multiple heterogeneous cores and non-coherent caches have renewed attention to Software Transactional Memory (STM) as a building block for developing parallel applications. STM promises to ease concurrent and parallel software development, but relies on the ability to abort conflicting transactions to maintain data consistency, which in turn affects the execution time of tasks carrying transactions. Because of this, the timing behaviour of the task set may not be predictable, so it is crucial to limit the execution-time overheads resulting from aborts. In this paper we formalise a FIFO-based algorithm to order the sequence of commits of concurrent transactions. We then propose and evaluate two non-preemptive and one SRP-based fully-preemptive scheduling strategies in order to avoid transaction starvation.
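The core idea of FIFO-ordered commits can be sketched in a few lines. This is a single-process illustration of the principle only, not the paper's formalisation: each transaction takes a ticket when it starts, and commits retire strictly in ticket order, so no transaction can starve behind later arrivals.

```python
import itertools

# Minimal sketch of FIFO commit ordering for transactions (illustrative only).

class FifoCommitOrder:
    def __init__(self):
        self._next_ticket = itertools.count()  # monotonically increasing tickets
        self._head = 0                         # ticket allowed to commit next

    def begin(self):
        """Register a new transaction; its ticket is its commit position."""
        return next(self._next_ticket)

    def try_commit(self, ticket):
        """Commit succeeds only when it is this transaction's turn."""
        if ticket != self._head:
            return False          # an earlier transaction has not committed yet
        self._head += 1           # retire this commit, advance the FIFO head
        return True

order = FifoCommitOrder()
t0, t1 = order.begin(), order.begin()
print(order.try_commit(t1))  # False: t0 must commit first
print(order.try_commit(t0))  # True
print(order.try_commit(t1))  # True: t1 is now at the head
```

A real STM runtime would combine this ordering with conflict detection and retry; here a blocked transaction simply polls until its ticket reaches the head.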
Abstract:
Presented at the Embed with Linux Workshop (EWiLi 2015), 4–9 October 2015, Amsterdam, Netherlands.
Abstract:
The 6LoWPAN (the light version of IPv6) and RPL (routing protocol for low-power and lossy links) protocols have become de facto standards for the Internet of Things (IoT). In this paper, we show that the two native algorithms that handle changes in network topology – the Trickle and Neighbor Discovery algorithms – behave in a reactive fashion and thus are not prepared for the dynamics inherent to node mobility. Many emerging and upcoming IoT application scenarios are expected to impose real-time and reliable mobile data collection, which is not compatible with the long message latency, high packet loss and high overhead exhibited by the native RPL/6LoWPAN protocols. To solve this problem, we integrate a proactive hand-off mechanism (dubbed smart-HOP) within RPL, which is very simple, effective and backward compatible with the standard protocol. We show that upon node mobility this add-on halves the packet loss and dramatically reduces the hand-off delay to one tenth of a second, with sub-percent overhead. The smart-HOP algorithm has been implemented and integrated in the Contiki 6LoWPAN/RPL stack (source code available online: mrpl: smart-hop within rpl, 2014) and validated through extensive simulation and experimentation.
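The proactive hand-off idea can be illustrated with a toy parent-selection rule. This sketch is ours, not the published smart-HOP state machine, and the RSSI values and hysteresis margin are assumptions: the mobile node switches parent as soon as a candidate's link quality beats the current parent's by a safety margin, instead of waiting for the route to break.

```python
# Toy proactive hand-off decision (illustrative; thresholds are assumptions).

def choose_parent(current, candidates, rssi, margin=4.0):
    """Return the parent to attach to, given RSSI readings in dBm."""
    best = max(candidates, key=lambda n: rssi[n])
    if rssi[best] > rssi[current] + margin:   # proactive hand-off
        return best
    return current                            # hysteresis avoids ping-ponging

rssi = {"A": -78.0, "B": -70.0, "C": -75.0}
print(choose_parent("A", ["B", "C"], rssi))  # 'B': clearly better than A
print(choose_parent("B", ["A", "C"], rssi))  # 'B': no candidate beats it by the margin
```

The hysteresis margin is what makes the mechanism stable: without it, a node sitting between two parents of similar quality would oscillate between them on every RSSI fluctuation.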
Abstract:
To date, the most efficient Cu(In,Ga)Se2 thin-film solar cells have been prepared using a rather complex growth process often referred to as three-stage or multi-stage. This family of processes is mainly characterized by a first step deposited with only In, Ga and Se flux to form a first layer. Cu is added in a second step until the film becomes slightly Cu-rich, after which the film is converted to its final Cu-poor composition by a third stage, again with no or very little addition of Cu. In this paper, a comparison is made between solar cells prepared with the three-stage process and with a one-stage/in-line process with the same composition, thickness, and solar cell stack. The one-stage process is easier to use at an industrial scale and does not involve Cu-rich transitions. The samples were analyzed using glow discharge optical emission spectroscopy, scanning electron microscopy, X-ray diffraction, current–voltage–temperature, capacitance–voltage, external quantum efficiency, transmission/reflection, and photoluminescence measurements. It was concluded that, in spite of differences in texturing, morphology and Ga gradient, the electrical performance of the two types of samples is quite similar, as demonstrated by the similar J–V behavior, quantum spectral response, and estimated recombination losses.
Abstract:
The growing interest in Business Intelligence (BI) stems from organisations' recognition of its importance as a powerful ally of decision-making processes. BI is a dynamic concept, one that broadens as new tools are integrated in response to emerging market needs. BI is not yet a reality in small and medium-sized enterprises, and is even unknown to many of them. It is essentially the larger companies, present in different markets and/or broader business areas, that resort to these solutions. The implementation of BI tools in organisations therefore depends on their specific characteristics, so it is essential that the information about the available platforms and their functionalities be objective and unambiguous. Only a correct choice, one that meets the needs of the business area in question, will yield data that translate into gains, strengthening the company's competitive advantage. With this purpose, this dissertation carries out a comparative analysis of the functionalities of several BI tools, intended to support the process of selecting the BI platform best suited to each organisation and/or business. BI platforms fall into two broad strands: commercial platforms, which entail acquisition costs, and those made available freely or with open code, known as open source. In this light, the question is raised whether the latter can constitute a valid option for companies with scarcer resources. First, BI technologies are implemented in a concrete organisation operating in the automotive components industry, Yazaki Saltano de Ovar Produtos Eléctricos, Ltd., established in Portugal for more than 25 years. For this company, developing solutions based on BI tools appears to be an adequate means of improving the tracking of its performance indicators.
This process was carried out on the organisation's pre-existing technology stack, Microsoft's commercial BI platform. With the aim of, on the one hand, gathering contributions that help organisations choose the most suitable BI platform and, on the other, understanding whether open-source platforms can constitute a credible alternative to commercial ones, a comparative survey of the functionalities of several open-source BI platforms was conducted. As a result of this analysis, two platforms were selected, SpagoBI and Pentaho BI, and used to assess the potential of open-source platforms as alternatives to commercial ones. Based on these platforms, the processes and procedures developed in the BI implementation project carried out at Yazaki Saltano were reproduced.
Abstract:
The search for alternatives to the current energy paradigm, characterised by the unquestionable predominance of fossil-fuel sources, is the primary motivation for this research. The energy emitted by the Sun that reaches the Earth every day exceeds by several orders of magnitude the energy our present society needs. The chimney (stack) effect is one way of harnessing that energy. This effect originates in the temperature difference between the inside and the outside of a chimney, which produces a gradient in the fluid densities between the interior and the exterior of the chimney, thereby inducing an air flow. This temperature difference arises from the exposure of the chimney's outer face to solar radiation. In the system we propose to study, air enters the chimney through small openings at its base and, on contact with the chimney's inner walls, is heated from the ambient temperature, Ta, to the internal temperature, Ti. This temperature increase makes the air inside the chimney "lighter" than the cooler outside air, causing it to rise along the interior of the chimney. This flow carries kinetic energy that can, for example, be converted into electrical energy by means of turbines. The energy conversion efficiency is higher the lower the air velocity downstream of the turbine. This technology could be installed in a decentralised manner, as with the current concentrating solar thermal and photovoltaic plants located on the outskirts of large cities, or, alternatively, it could be embedded in the urban fabric itself. The research shows that the chimney dimensions, the irradiance and the air temperature are the factors with the greatest impact on the hydraulic power generated.
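The driving pressure behind the chimney effect described above can be written with the standard stack-draft relation (a textbook form, not necessarily the exact model used in the dissertation), using the ideal-gas approximation that density varies inversely with absolute temperature:

```latex
\Delta p \;=\; g\,H\,(\rho_a - \rho_i)\;\approx\;\rho_a\, g\, H\,\frac{T_i - T_a}{T_i}
```

where $H$ is the chimney height, $\rho_a$ and $\rho_i$ are the air densities outside and inside the chimney, and $T_a$ and $T_i$ are the ambient and internal absolute temperatures from the abstract. The relation makes the abstract's conclusion plausible: the available pressure, and hence the hydraulic power, grows with both the chimney height and the solar-driven temperature difference.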
Abstract:
Minimum parking requirements are the norm for urban and suburban development in the United States (Davidson and Dolnick, 2002). The justification for parking space requirements is that overflow parking would otherwise occupy nearby street or off-street parking. Shoup (1999) and Willson (1995) provide cases where there is reason to believe that parking space requirements have forced parcel developers to supply more parking than they would in the absence of such requirements. If the effect of parking minimums is to significantly increase the land area devoted to parking, then the increase in impervious surfaces would likely cause water quality degradation, increased flooding, and decreased groundwater recharge. However, to our knowledge the existing literature does not test the effect of parking minimums on the amount of lot space devoted to parking beyond a few case studies. This paper tests the hypothesis that parking space requirements cause an oversupply of parking by examining the implicit marginal value of land allocated to parking spaces. This is an indirect test of the effects of parking requirements, similar to Glaeser and Gyourko (2003). A simple theoretical model shows that the marginal contribution of additional parking to the sale price should equal the cost of land plus the cost of parking construction. We estimate the marginal values of parking and lot area with spatial methods, using a large data set of non-residential property sales from the Los Angeles area, and find that for most property types the marginal value of parking is significantly below that of the parcel area. This evidence supports the contention that minimum parking requirements significantly increase the amount of parcel area devoted to parking.
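The equilibrium condition the abstract sketches can be written compactly (our notation, introduced for illustration):

```latex
\frac{\partial V}{\partial s} \;=\; p_L\, a_s + c_s
```

where $V$ is the property sale price, $s$ the number of parking spaces, $p_L$ the price of land per unit area, $a_s$ the land area consumed per space, and $c_s$ the construction cost per space. If parking supply is chosen freely by developers, the estimated marginal value of a space should match this right-hand side; an estimate significantly below it is the signature of binding minimum requirements forcing an oversupply, which is what the paper reports for most property types.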
Abstract:
The OutSystems Platform is used to develop, deploy, and maintain enterprise web and mobile web applications. Applications are developed through a visual domain-specific language, in an integrated development environment, and compiled to a standard stack of web technologies. At the platform's core, a compiler and a deployment service transform the visual model into a running web application. As applications grow, compilation and deployment times increase as well, impacting the developer's productivity. In the previous model, a full application was the only compilation and deployment unit: when developers published an application, even if they had only changed a very small aspect of it, the application would be fully compiled and deployed. Our goal is to reduce compilation and deployment times for the most common use case, in which the developer performs small changes to an application before compiling and deploying it. We modified the OutSystems Platform to support a new incremental compilation and deployment model that reuses previous computations as much as possible in order to improve performance. In our approach, the full application is broken down into smaller compilation and deployment units, increasing what can be cached and reused. We also observed that this finer-grained model would benefit from a parallel execution model, so we created a task-driven scheduler that executes compilation and deployment tasks in parallel. Our benchmarks show a substantial improvement in compilation and deployment times for the aforementioned development scenario.
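The caching idea at the heart of such an incremental model can be sketched briefly. This is an illustrative toy, not OutSystems' implementation: the application is split into units, each compiled artefact is keyed by a hash of the unit's source, and an unchanged hash means the cached artefact is reused, so a small edit recompiles only the units it touched.

```python
import hashlib

# Toy incremental compilation cache (illustrative; unit names are invented).

cache = {}           # content hash -> compiled artefact
compiled_units = []  # records which units were actually recompiled

def compile_unit(name, source):
    """Compile one unit, reusing the cache when the source is unchanged."""
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in cache:
        compiled_units.append(name)          # cache miss: do the real work
        cache[key] = f"compiled({source})"
    return cache[key]

app = {"screen": "ui v1", "logic": "rules v1"}
for name, src in app.items():                # first publish: everything compiles
    compile_unit(name, src)

app["screen"] = "ui v2"                      # small change to a single unit
for name, src in app.items():                # second publish: 'logic' is reused
    compile_unit(name, src)

print(compiled_units)  # ['screen', 'logic', 'screen']
```

Because independent units have no ordering constraints between them, cache misses like these are exactly the tasks a parallel scheduler can hand to separate workers, which is the second half of the approach the abstract describes.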
Abstract:
A potentially renewable and sustainable source of energy is the chemical energy associated with the solvation of salts. Mixing of two aqueous streams with different saline concentrations is spontaneous and releases energy. The global theoretically obtainable power from salinity gradient energy due to the world's river discharge into the oceans has been estimated to be in the range of 1.4–2.6 TW. Reverse electrodialysis (RED) is one of the emerging membrane-based technologies for harvesting salinity gradient energy. A common RED stack is composed of alternately arranged cation- and anion-exchange membranes, stacked between two electrodes. The compartments between the membranes are alternately fed with concentrated (e.g., sea water) and dilute (e.g., river water) saline solutions. Migration of the respective counter-ions through the membranes leads to an ionic current between the electrodes, where an appropriate redox pair converts the chemical salinity gradient energy into electrical energy. Given the importance of the need for new sources of energy for power generation, the present study aims at better understanding and solving current challenges associated with RED stack design, fluid dynamics, ionic mass transfer and long-term RED stack performance with natural saline solutions as feed waters. Chronopotentiometry was used to determine the diffusion boundary layer (DBL) thickness from diffusion relaxation data, and flow entrance effects on mass transfer were found to enable a power generation increase in RED stacks. Increasing the linear flow velocity also decreases the DBL thickness, but at the cost of a higher pressure drop. Pressure drop inside RED stacks was successfully simulated by the mathematical model developed here, which includes the contribution of several pressure drops that had not been considered until now.
The effect of each pressure drop on RED stack performance was identified and rationalized, and guidelines for the planning and/or optimization of RED stacks were derived. The design of new profiled membranes, with a chevron corrugation structure, was proposed using computational fluid dynamics (CFD) modeling. The performance of the suggested corrugation geometry was compared with existing geometries, as well as with the use of conductive and non-conductive spacers. According to the estimations, the use of chevron structures grants the highest net power density values, at the best compromise between the mass transfer coefficient and the pressure drop. Finally, long-term experiments with natural waters were performed, during which fouling occurred. For the first time, 2D fluorescence spectroscopy was used to monitor RED stack performance, with a dedicated focus on following fouling on the ion-exchange membrane surfaces. To extract relevant information from the fluorescence spectra, parallel factor analysis (PARAFAC) was performed. The information obtained was then used to predict net power density, stack electric resistance and pressure drop through multivariate statistical models based on projection to latent structures (PLS) modeling. The use in such models of 2D fluorescence data, containing information about fouling on the membrane surfaces that is hidden but extractable by PARAFAC, considerably improved the models' fit to the experimental data.
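The driving voltage of the stack described above is commonly estimated from the Nernst potential across each cell pair (a standard textbook relation, not a result specific to this thesis):

```latex
E_{\mathrm{OCV}} \;=\; N\,\alpha\,\frac{2RT}{zF}\,\ln\!\left(\frac{\gamma_c\, c_c}{\gamma_d\, c_d}\right)
```

where $N$ is the number of cell pairs, $\alpha$ the average membrane permselectivity, $R$ the gas constant, $T$ the absolute temperature, $z$ the ion valence, $F$ the Faraday constant, and $\gamma_c c_c$ and $\gamma_d c_d$ the activities of the concentrated and dilute feeds; the factor 2 accounts for the cation- and anion-exchange membrane in each pair. The net power density the thesis optimizes is what remains of the power obtainable from this potential after subtracting the ohmic, DBL and pumping (pressure-drop) losses discussed above.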
Abstract:
According to a recent Eurobarometer survey (2014), 68% of Europeans tend not to trust national governments. As the increasing alienation of citizens from politics endangers democracy and welfare, governments, practitioners and researchers look for innovative means to engage citizens in policy matters. One of the measures intended to overcome the so-called democratic deficit is the promotion of civic participation. Digital media proliferation offers a set of novel characteristics related to interactivity, ubiquitous connectivity, social networking and inclusiveness that enable new forms of societal-wide collaboration with a potential impact on leveraging participative democracy. Following this trend, e-Participation is an emerging research area that consists of the use of Information and Communication Technologies to mediate and transform the relations among citizens and governments towards increasing citizens' participation in public decision-making. However, despite the widespread efforts to implement e-Participation through research programs, new technologies and projects, exhaustive studies on the achieved outcomes reveal that it has not yet been successfully incorporated in institutional politics. Given the problems underlying e-Participation implementation, the present research suggested that, rather than project-oriented efforts, the cornerstone for successfully implementing e-Participation in public institutions as a sustainable added-value activity is systematic organisational planning, embodying the principles of open governance and open engagement. It further suggested that BPM, as a management discipline, can act as a catalyst to enable the desired transformations towards value creation throughout the policy-making cycle, including political, organisational and, ultimately, citizen value.
Following these findings, the primary objective of this research was to provide an instrumental model to foster e-Participation sustainability across Government and Public Administration towards a participatory, inclusive, collaborative and deliberative democracy. The developed artefact, consisting of an e-Participation Organisational Semantic Model (ePOSM) underpinned by a BPM-steered approach, introduces this vision. This approach to e-Participation was modelled through a semi-formal lightweight ontology stack structured in four sub-ontologies, namely e-Participation Strategy, Organisational Units, Functions and Roles. The ePOSM facilitates e-Participation sustainability by: (1) promoting a common and cross-functional understanding of the concepts underlying e-Participation implementation and of their articulation, bridging the gap between technical and non-technical users; (2) providing an organisational model which allows a centralised and consistent roll-out of strategy-driven e-Participation initiatives, supported by operational units dedicated to the execution of transformation projects and participatory processes; (3) providing a standardised organisational structure, goals, functions and roles related to e-Participation processes that enhances process-level interoperability among government agencies; (4) providing a representation usable in software development for business process automation, which allows advanced querying using a reasoner or inference engine to retrieve concrete and specific information about the e-Participation processes in place. An evaluation of the achieved outcomes, as well as a comparative analysis with existing models, suggested that this innovative approach, tackling the organisational planning dimension, can constitute a stepping stone to harness e-Participation value.
Abstract:
One of the authors (S.M.) acknowledges Direction des Relations Extérieures of Ecole Polytechnique for financial support.
Abstract:
Dissertation for the Integrated Master's degree in Civil Engineering.
Abstract:
Aughinish Alumina Limited (AAL) has an obligation, under the terms of its Integrated Pollution Control Licence (IPCL) and planning permission, to establish vegetation on the red mud stack at its plant at Aughinish, Co. Limerick. High pH and high exchangeable sodium percentage (ESP) are the main known factors limiting the establishment of vegetation on red mud. Gypsum addition has been known to help alleviate these problems in other countries. However, there is no experience or published information on red mud rehabilitation under Irish conditions. Red mud with organic and inorganic waste-derived ameliorants, as well as selected grassland species, was examined under laboratory controlled-environment conditions and in field plot trials. Also, so that rehabilitation would be economically achievable, the research used locally available waste products as the organic amendments. Screening trials found that physical constraints severely limit plant germination and growth in red mud. Gypsum addition effectively lowers pH, exchangeable sodium percentage and the availability of Al and Fe in the mud. A strong relationship between pH, ESP and Al levels was also found. Gypsum addition increased germination percentages and plant growth for all species investigated. Greenhouse trials demonstrated that organic wastes alone did not greatly improve conditions for plant growth, but when used in conjunction with gypsum, plant performance for all species investigated was significantly improved. There was a high mortality rate for grasses in non-gypsum treatments. An emerging trend of preferential iron uptake and calcium deficiency in non-gypsum treatments was found at the pot screening stage. Species also displayed manganese and magnesium deficiencies.
Abstract:
The Stanley lattice, Tamari lattice and Kreweras lattice are three remarkable orders defined on the set of Catalan objects of a given size. These lattices are ordered by inclusion: the Stanley lattice is an extension of the Tamari lattice, which is an extension of the Kreweras lattice. The Stanley order can be defined on the set of Dyck paths of size n as the relation of being above. Hence, intervals in the Stanley lattice are pairs of non-crossing Dyck paths. In a previous article, the second author defined a bijection Φ between pairs of non-crossing Dyck paths and the realizers of triangulations (or Schnyder woods). We give a simpler description of the bijection Φ. Then, we study the restriction of Φ to Tamari and Kreweras intervals. We prove that Φ induces a bijection between Tamari intervals and minimal realizers. This gives a bijection between Tamari intervals and triangulations. We also prove that Φ induces a bijection between Kreweras intervals and the (unique) realizers of stack triangulations. Thus, Φ induces a bijection between Kreweras intervals and stack triangulations, which are known to be in bijection with ternary trees.
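The objects involved are easy to enumerate for small sizes. The sketch below is our own illustration, not the paper's construction: it lists the Dyck paths of semilength 3 and counts the Stanley intervals, i.e. the pairs of non-crossing Dyck paths, using the "being above" relation on height profiles.

```python
# Enumerate Dyck paths and Stanley intervals (pairs of non-crossing paths).

def dyck_paths(n):
    """All Dyck paths of semilength n as 'U'/'D' strings."""
    paths = []
    def build(path, ups, downs):
        if ups == n and downs == n:
            paths.append("".join(path))
            return
        if ups < n:
            build(path + ["U"], ups + 1, downs)
        if downs < ups:               # prefix property: never go below 0
            build(path + ["D"], ups, downs + 1)
    build([], 0, 0)
    return paths

def heights(path):
    """Height profile of a path (+1 for 'U', -1 for 'D')."""
    h, out = 0, []
    for step in path:
        h += 1 if step == "U" else -1
        out.append(h)
    return out

def weakly_above(p, q):
    """Stanley order: p >= q when p's profile never dips below q's."""
    return all(hp >= hq for hp, hq in zip(heights(p), heights(q)))

paths = dyck_paths(3)
print(len(paths))  # 5, the Catalan number C_3
intervals = [(p, q) for p in paths for q in paths if weakly_above(p, q)]
print(len(intervals))  # 14 non-crossing pairs, i.e. Stanley intervals for n = 3
```

Restricting the pair (p, q) to Tamari or Kreweras comparabilities singles out the sub-families of intervals that the bijection Φ sends to minimal realizers and to realizers of stack triangulations, respectively.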