872 results for Hydrologic Modeling Catchment and Runoff Computations
Abstract:
The field of Semantic Web Services (SWS) has been recognized as one of the most promising areas of emergent research within the Semantic Web (SW) initiative, exhibiting extensive commercial potential and attracting significant attention from both industry and the research community. Currently, there exist several different frameworks and languages for formally describing a Web Service: OWL-S (Web Ontology Language for Services), WSMO (Web Service Modeling Ontology) and SAWSDL (Semantic Annotations for the Web Services Description Language) are the most important approaches. To the inexperienced user, choosing the appropriate paradigm for a specific SWS application may prove challenging, given the lack of clear separation between the ideas promoted by the associated research communities. In this paper, we systematically compare OWL-S, WSMO and SAWSDL from various standpoints, namely those of the service requester and provider as well as the broker-based view. The comparison is meant to help users better understand the strengths and limitations of these different approaches to formalising SWS, and to choose the most suitable solution for a given use case. © 2013 IEEE.
Abstract:
Heuristics, simulation, artificial intelligence techniques and combinations thereof have all been employed in the attempt to make computer systems adaptive, context-aware, reconfigurable and self-managing. This paper complements such efforts by exploring the possibility of achieving runtime adaptiveness using mathematically-based techniques from the area of formal methods. It is argued that formal methods @ runtime represents a feasible approach, and promising preliminary results are summarised to support this viewpoint. The survey of existing approaches to employing formal methods at runtime is accompanied by a discussion of their challenges and of the future research required to overcome them. © 2011 Springer-Verlag.
Abstract:
This paper examines the methodological aspects of climate change, particularly the aggregation of costs and benefits induced by climate change on individuals, societies, economies and the whole ecosystem. Assessing the total and/or marginal costs of environmental change is difficult because of the wide range of factors that have to be taken into account. The study therefore tries to capture the complexity of climate change cost assessment by examining several critical factors: scenarios and modeling, valuation and estimation, and equity and discounting.
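As an illustration of the discounting factor mentioned above, the following minimal Python sketch (with purely hypothetical damage figures and rates, not taken from the paper) shows how strongly the choice of discount rate drives an aggregate cost estimate:

    # Present value of a stream of annual climate damages, discounted at rate r.
    # The damage figures and rates below are illustrative assumptions only.

    def present_value(damages, r):
        """Sum of damages[t] / (1 + r)**t over the horizon."""
        return sum(d / (1.0 + r) ** t for t, d in enumerate(damages, start=1))

    damages = [100.0] * 100  # hypothetical: 100 units of damage per year for a century

    for r in (0.01, 0.05):
        print(f"discount rate {r:.0%}: PV = {present_value(damages, r):,.0f}")
    # A 1% rate yields a present value roughly three times larger than a 5% rate,
    # which is why the choice of discount rate dominates aggregate estimates.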
Abstract:
Since the Morris worm was released in 1988, Internet worms continue to be one of the top security threats. For example, the Conficker worm infected 9 to 15 million machines in early 2009 and shut down the service of some critical government and medical networks. Moreover, it constructed a massive peer-to-peer (P2P) botnet. Botnets are zombie networks controlled by attackers to carry out coordinated attacks. In recent years, botnets have become the number one threat to the Internet. The objective of this research is to characterize the spatial-temporal infection structures of Internet worms, and to apply the observations to study P2P-based botnets formed by worm infection. First, we infer the temporal characteristics of the Internet worm infection structure, i.e., the host infection time and the worm infection sequence, and thus pinpoint patient zero or the initially infected hosts. Specifically, we apply statistical estimation techniques to Darknet observations. We show analytically and empirically that our proposed estimators can significantly improve the inference accuracy. Second, we reveal two key spatial characteristics of the Internet worm infection structure, i.e., the number of children and the generation of the underlying tree topology formed by worm infection. Specifically, we apply probabilistic modeling methods and a sequential growth model. We show analytically and empirically that the number of children asymptotically has a geometric distribution with parameter 0.5, and that the generation closely follows a Poisson distribution. Finally, we evaluate bot detection strategies and the effects of user defenses in P2P-based botnets formed by worm infection. Specifically, we apply the observations on the number of children and demonstrate analytically and empirically that targeted detection focusing on the nodes with the largest number of children is an efficient way to expose bots. However, we also point out that future botnets may self-stop scanning to weaken targeted detection, without greatly slowing down the speed of worm infection. We then extend the worm spatial infection structure and show empirically that user defenses, e.g., patching or cleaning, can significantly mitigate the robustness and the effectiveness of P2P-based botnets. As a counterattack, we evaluate a simple measure that future botnets could adopt to enhance topology robustness through worm re-infection.
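The claim about the number of children can be illustrated with a small simulation. The sketch below assumes the sequential growth model is a uniform-attachment (random recursive) tree, which is our reading of the abstract rather than the thesis's exact model; under that assumption the empirical child-count distribution approaches geometric with parameter 0.5:

    # Assumption: each newly infected host is attached to a uniformly random
    # already-infected host (a random recursive tree). Empirically, the fraction
    # of nodes with k children then approaches 2**-(k+1), i.e. geometric(0.5).
    import random
    from collections import Counter

    def grow_infection_tree(n):
        children = [0] * n  # children[i] = number of hosts infected by host i
        for new in range(1, n):
            parent = random.randrange(new)  # uniform over existing infected hosts
            children[parent] += 1
        return children

    counts = Counter(grow_infection_tree(100_000))
    total = sum(counts.values())
    for k in range(5):
        print(f"P(children = {k}): observed {counts[k] / total:.3f}, "
              f"geometric(0.5) predicts {0.5 ** (k + 1):.3f}")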
Abstract:
Historic changes in water-use management in the Florida Everglades have caused the quantity of freshwater inflow to Florida Bay to decline by approximately 60% while altering its timing and spatial distribution. Two consequences have been (1) increased salinity throughout the bay, including occurrences of hypersalinity, coupled with a decrease in salinity variability, and (2) change in benthic habitat structure. Restoration goals have been proposed to return the salinity climates (salinity and its variability) of Florida Bay to more estuarine conditions through changes in upstream water management, thereby returning seagrass species cover to a more historic state. To assess the potential for meeting those goals, we used two modeling approaches and long-term monitoring data. First, we applied the hydrological mass balance model FATHOM to predict salinity climate changes in sub-basins throughout the bay in response to a broad range of freshwater inflows from the Everglades. Second, because seagrass species exhibit different sensitivities to salinity climates, we used the FATHOM-modeled salinity climates as input to a statistical discriminant function model that associates eight seagrass community types with water quality variables including salinity, salinity variability, total organic carbon, total phosphorus, nitrate, and ammonium, as well as sediment depth and light reaching the benthos. Salinity climates in the western sub-basins bordering the Gulf of Mexico were insensitive to even the largest (5-fold) modeled increases in freshwater inflow. However, the north, northeastern, and eastern sub-basins were highly sensitive to freshwater inflow and responded to comparatively small increases with decreased salinity and increased salinity variability. The discriminant function model predicted increased occurrences of Halodule wrightii communities and decreased occurrences of Thalassia testudinum communities in response to the more estuarine salinity climates. The shift in community composition represents a return to the historically observed state and suggests that restoration goals for Florida Bay can be achieved through restoration of freshwater inflow from the Everglades.
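A minimal sketch of the kind of discriminant function model described above, using synthetic (not FATHOM-derived) data: water quality predictors named after the abstract's variables are used to classify between two of the eight seagrass community types.

    # Synthetic stand-ins for two of the eight community types in the study;
    # all values are illustrative assumptions, not monitoring data.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    features = ["salinity", "salinity_var", "TOC", "TP", "nitrate",
                "ammonium", "sediment_depth", "benthic_light"]

    X_halodule = rng.normal(loc=[25, 8, 5, 0.5, 1, 2, 30, 40], scale=2, size=(50, 8))
    X_thalassia = rng.normal(loc=[38, 3, 6, 0.3, 1, 2, 40, 35], scale=2, size=(50, 8))
    X = np.vstack([X_halodule, X_thalassia])
    y = ["Halodule wrightii"] * 50 + ["Thalassia testudinum"] * 50

    lda = LinearDiscriminantAnalysis().fit(X, y)
    # Predict the community favored under a fresher, more variable salinity climate.
    print(lda.predict([[22, 10, 5, 0.4, 1, 2, 35, 38]]))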
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service-Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients will pay exactly for the performance they are actually experiencing; on the other hand, administrators will be able to maximize their total revenue by utilizing application performance models and SLAs. This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Network and Support Vector Machine, for accurately modeling the performance of virtualized applications. Moreover, we suggested and evaluated the modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing by employing the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm which maximizes the SLA-generated revenue for a data center.
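A minimal sketch of the performance modeling idea, on synthetic data with an illustrative set of resource control parameters (CPU cap, memory, I/O bandwidth): Support Vector Regression, one of the two machine learning tools the thesis found suitable, is fit to predict application response time from a VM's resource allocation.

    # Synthetic data; the feature set and response function are assumptions.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    # Resource control parameters: CPU cap (%), memory (GB), I/O bandwidth (MB/s).
    X = rng.uniform([10, 1, 10], [100, 16, 200], size=(300, 3))
    # Hypothetical response time that improves with each resource, plus noise.
    y = 500 / X[:, 0] + 80 / X[:, 1] + 300 / X[:, 2] + rng.normal(0, 0.5, 300)

    model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1)).fit(X, y)
    # Predict response time for a candidate VM size (50% CPU, 4 GB, 100 MB/s).
    print(model.predict([[50, 4, 100]]))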
Abstract:
Fire, which affects community structure and composition at all trophic levels, is an integral component of the Everglades ecosystem (Wade et al. 1980; Lockwood et al. 2003). Without fire, the Everglades as we know it today would be a much different place. This is particularly true for the short-hydroperiod marl prairies that predominate on the eastern and western flanks of Shark River Slough, Everglades National Park (Figure 1). In general, fire in a tropical or sub-tropical grassland community favors the dominance of C4 grasses over C3 species (Roscoe et al. 2000; Briggs et al. 2005). Within this pyrogenic graminoid community, too, periodic natural fires, together with a suitable hydrologic regime, maintain and advance the dominance of C4 over C3 graminoids (Sah et al. 2008), and suppress the encroachment of woody stems (Hanan et al. 2009; Hanan et al. unpublished manuscript) originating from the tree islands that, in places, dominate the landscape within this community. However, under drought conditions and elevated fuel loads, fires can spread quickly throughout the landscape, oxidizing organic soils both in the prairie and in the tree islands, and in the process lead to shifts in vegetation composition. This is particularly true when a fire immediately precedes a flood event (Herndon et al. 1991; Lodge 2005; Sah et al. 2010), or if so much soil is consumed during the fire that the hydrologic regime is permanently altered as a result of a decrease in elevation (Zaffke 1983).
Abstract:
Deforestation in the tropical Andes is affecting the ecological condition of streams, and determining how much forest should be retained is a pressing task for conservation, restoration and management strategies. We calculated and analyzed eight benthic metrics (structural, compositional and water quality indices) and a physical-chemical composite index along gradients of vegetation cover to assess the effects of deforestation on the macroinvertebrate communities and water quality of 23 streams in the southern Ecuadorian Andes. Using a geographical information system (GIS), we quantified vegetation cover at three spatial scales: the entire catchment, the riparian buffer of 30 m width extending along the entire stream length, and the local scale, defined as a stream reach of 100 m in length with a similar buffer width. Macroinvertebrate and water quality metrics had the strongest relationships with vegetation cover at the catchment and riparian scales, while vegetation cover showed no association with the macroinvertebrate metrics at the local scale. At the catchment scale, the water quality metrics indicate that the ecological condition of Andean streams is good when vegetation cover exceeds 70%. Further, macroinvertebrate community assemblages were more diverse and more strongly associated with catchments largely covered by native vegetation (>70%). Overall, our results suggest that retaining a substantial proportion of native vegetation cover within catchments, and a linkage between headwater and riparian forests, helps maintain and improve stream biodiversity and water quality in Andean streams affected by deforestation. This research also suggests that strong regulation focused on the management of riparian buffers can be successful when decision making is directed at the conservation and restoration of Andean catchments.
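A minimal sketch of the multi-scale analysis described above, on synthetic data with hypothetical variable names: vegetation cover quantified at the three spatial scales is correlated (Spearman) with a benthic metric across 23 streams.

    # Synthetic data; column names and the benthic index are illustrative only.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)
    n = 23
    cover_catchment = rng.uniform(20, 95, n)  # % native vegetation cover
    cover_riparian = np.clip(cover_catchment + rng.normal(0, 10, n), 0, 100)
    cover_local = rng.uniform(20, 95, n)      # weakly tied to the other scales
    benthic_index = 2.0 * cover_catchment + rng.normal(0, 25, n)  # hypothetical metric

    for name, cover in [("catchment", cover_catchment),
                        ("riparian", cover_riparian),
                        ("local", cover_local)]:
        rho, p = spearmanr(cover, benthic_index)
        print(f"{name:9s} scale: rho = {rho:+.2f}, p = {p:.3f}")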
Abstract:
This study proposes a new computational model, efficient and easy to apply in everyday design situations, to evaluate the interaction between masonry panels and their supporting structure. The proposed model simulates the behavior of the wall exclusively with frame finite elements, thus composing an equivalent frame. Validation was performed in two ways: first, through the analysis of several panels from generic floor plans, comparing the results of the equivalent frame model with those of a reference model that discretizes the walls with shell finite elements; and second, by comparison with the results of Rosenhaupt's experimental model. The analyses assumed linear elastic material behavior and consisted basically of evaluating vertical displacements and internal forces in the support beams, and stresses at the base of the walls. Important aspects of the wall-beam interaction were also evaluated through plane and three-dimensional modeling of walls from a real design, e.g., the presence of door and window openings in arbitrary positions, the support and connection conditions of the beams, the interference of ties between walls, and the consideration of wind action. The results demonstrated the efficiency of the proposed modeling, since the distributions of stresses and internal forces are very similar, with intensities always slightly larger than those of the reference and experimental models.
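For context on the equivalent frame idea, the sketch below assembles the standard 6x6 local stiffness matrix of a 2D Euler-Bernoulli frame element, the basic building block of such a model; this is a generic textbook formulation with illustrative properties, not the study's specific implementation.

    import numpy as np

    def frame_element_stiffness(E, A, I, L):
        """Local stiffness matrix of a 2D frame element with DOFs
        (u1, v1, theta1, u2, v2, theta2): axial, transverse, rotation per node."""
        a = E * A / L          # axial stiffness
        b = 12 * E * I / L**3  # transverse (bending) terms
        c = 6 * E * I / L**2
        d = 4 * E * I / L
        e = 2 * E * I / L
        return np.array([
            [ a,  0,  0, -a,  0,  0],
            [ 0,  b,  c,  0, -b,  c],
            [ 0,  c,  d,  0, -c,  e],
            [-a,  0,  0,  a,  0,  0],
            [ 0, -b, -c,  0,  b, -c],
            [ 0,  c,  e,  0, -c,  d],
        ])

    # Example: a 3 m masonry strip idealized as a frame member (illustrative values).
    k = frame_element_stiffness(E=2.4e9, A=0.14, I=1.1e-3, L=3.0)
    print(k.shape, k[1, 1])  # the 12EI/L^3 term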
Abstract:
Climate and air pollution, among other factors, are responsible for the increased health vulnerability of populations living in urban centers. Climate changes combined with high concentrations of atmospheric pollutants are usually associated with respiratory and cardiovascular diseases. In this sense, the main objective of this research is to model in different ways the relation between climate and health, specifically for the child and elderly populations living in São Paulo. To this end, data on meteorological variables, air pollutants, and hospitalizations and deaths from respiratory and cardiovascular diseases over an 11-year period (2000-2010) were used. Modeling via generalized estimating equations yielded the relative risk. Dynamic regression made it possible to predict the number of deaths from the atmospheric variables, and a beta-binomial-Poisson model was able to estimate the number of deaths and simulate scenarios. The results showed that the risk of hospitalization due to asthma is approximately twice as high for children exposed to high concentrations of particulate matter as for children who are not exposed. The risk of death by acute myocardial infarction in the elderly increases by 3%, 6%, 4% and 9% due to high concentrations of CO, SO2, O3 and PM10, respectively. Regarding the dynamic regression modeling, the results showed that deaths from respiratory diseases can be predicted consistently. The beta-binomial-Poisson model was able to reproduce the average number of deaths by heart insufficiency: in the region of Santo Amaro the observed value was 2.462 and the simulated value 2.508, and in the Sé region 4.308 was observed and 4.426 simulated, which allowed the generation of scenarios that may be used as parameters for decision making. With these results, it is possible to contribute to methodologies that improve the understanding of the relation between climate and health and provide support to managers in environmental planning and public health policies.
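A minimal sketch of the generalized estimating equations step, on synthetic data with assumed variable names: daily admission counts are regressed on a high-PM10 indicator with a Poisson GEE, and the exponentiated coefficient is read as a relative risk.

    # Synthetic data; the effect size, clustering unit and variable names are
    # illustrative assumptions, not the thesis's dataset.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    n = 4000  # daily records
    df = pd.DataFrame({
        "high_pm10": rng.integers(0, 2, n),  # 1 = high particulate-matter day
        "month": rng.integers(1, 13, n),     # cluster for within-group correlation
    })
    rate = np.exp(0.1 + 0.7 * df["high_pm10"])  # hypothetical true effect
    df["admissions"] = rng.poisson(rate)

    X = sm.add_constant(df[["high_pm10"]])
    model = sm.GEE(df["admissions"], X, groups=df["month"],
                   family=sm.families.Poisson(),
                   cov_struct=sm.cov_struct.Exchangeable())
    res = model.fit()
    print("relative risk on high PM10 days:", np.exp(res.params["high_pm10"]))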
Abstract:
The maintenance and evolution of software systems has become a rather critical task over recent years due to the diversity and high demand for features, devices and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite for avoiding the deterioration of their quality during their evolution. This thesis proposes an automated approach for the analysis of variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources (commits and issues) of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation, choosing the scenarios and preparing the target releases; (ii) dynamic analysis, determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis, processing and comparing the dynamic analysis results of different releases; and (iv) repository mining, identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA, a web system for academic management; (ii) ArgoUML, a UML modeling tool; and (iii) Netty, a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases (seven from each system) were analyzed, totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online form. Finally, in the last study, a regression model for performance was developed to indicate the properties of commits that are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release date and the day of the week proved to be the most relevant variables of performance-degrading commits in our model. The area under the receiver operating characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether a commit will cause degradation is 10% better than a random decision.
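A minimal sketch of the variation analysis phase (phase iii) described above, with hypothetical scenarios, timings and threshold: mean execution times of each scenario are compared between two releases, and scenarios whose change exceeds the threshold are flagged.

    # Hypothetical data and threshold; the real framework's criteria may differ.
    from statistics import mean

    def flag_variations(times_old, times_new, threshold=0.10):
        """Flag scenarios whose mean execution time changed by more than threshold."""
        flagged = {}
        for scenario, old in times_old.items():
            new = times_new[scenario]
            change = (mean(new) - mean(old)) / mean(old)
            if abs(change) > threshold:
                flagged[scenario] = f"{change:+.0%}"
        return flagged

    # Execution times (ms) from repeated runs of each scenario on two releases.
    release_a = {"encode": [12.1, 12.3, 11.9], "handshake": [40.2, 39.8, 40.5]}
    release_b = {"encode": [12.2, 12.0, 12.4], "handshake": [48.9, 49.5, 48.6]}
    print(flag_variations(release_a, release_b))  # {'handshake': '+22%'}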
Abstract:
Many buildings constructed during the middle of the 20th century were built to criteria that fall short of current requirements. Although shortcomings are possible in all aspects of the design, inadequacies in seismic design present the most pressing risk to human life. This risk has been seen in various earthquakes that have struck Italy recently, and subsequently the codes have been altered to account for this underestimated danger. Structures built before these changes remain at risk and must be retrofitted depending on their use. This report centers on the Giovanni Michelucci Institute of Mathematics at the University of Bologna and the work required to modify the building so that it can withstand 60% of the current design requirements. The goal of this particular report is to verify the previous reports written in Italian and to present an accurate analysis along with intervention suggestions for this building. The work began with an investigation of the previous sources to find out how the structure had been interpreted. After understanding the building, corrections were made where required, and the failing elements were organized graphically to show more easily where the building needed the most work. Once the critical zones were mapped, remediation techniques were tested on the top floor, and the modeling techniques and the effects of the interventions were presented to assist further work on the structure.
Abstract:
This study describes the development of a prototype to evaluate the potential of environments based on two-dimensional (2D) modeling and virtual reality (VR) as learning objects for power substations in the training environments of the operation and control center of the power utility Cemig. Initially, the modeling features and cognitive processes in 2D and VR were identified, from which it was possible to create frames that guided the preparation of a checklist, with a metric weight assigned for measuring the cognitive learning potential in the studied environments. From this content, twenty-four questions were prepared and each was assigned a weight used in the calculation of the metric; the questions were grouped into sets of similar skills and cognitive processes called categories. Two distinct environments were then developed: first, the prototype, featuring an interactive checklist and its individual results; and second, a data management environment for configuring and editing the prototype and for observing and analyzing the survey results. To validate the prototype, five professionals linked to Cemig's training area were invited to access and answer the virtual checklist. The results confirmed the validity of applying this instrument to assess the potential of 2D and VR modeling as learning objects for power substations, as well as to provide feedback to developers of virtual environments to improve the system.
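A minimal sketch of the weighted checklist metric described above, with hypothetical weights and answers: a category's score is the weighted fraction of positively answered questions.

    # Hypothetical weights and answers; the study's actual weighting may differ.
    def category_score(answers, weights):
        """answers: list of booleans; weights: matching list of metric weights."""
        total = sum(weights)
        achieved = sum(w for a, w in zip(answers, weights) if a)
        return achieved / total

    # Hypothetical category with four checklist questions.
    weights = [3, 2, 2, 1]
    answers = [True, True, False, True]
    print(f"cognitive potential score: {category_score(answers, weights):.0%}")  # 75%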
Abstract:
This paper presents a comparative study of two Soft Single Switched Quadratic Boost Converters (SSS1 and SSS2) focused on Maximum Power Point Tracking (MPPT) of a PV array using the Perturb and Observe (P&O) algorithm. The proposed converters maintain the static gain characteristics and dynamics of the original converter, with the advantage of considerably reducing switching losses and Electromagnetic Interference (EMI). The paper presents the modeling of the input-voltage quadratic boost converter; a qualitative and quantitative analysis of the soft-switching converters, defining the operating principles, main waveforms, time intervals and state variables in each operating stage, the phase planes of the resonant elements, the static voltage gain expressions, an analysis of voltage and current stresses in the semiconductors, and the operating curves from 200 W to 800 W. The design of PI, PID and PID + Notch compensators for the MPPT closed-loop system and of the resonant elements is also presented. In order to analyze the operation of a complete grid-connected photovoltaic system, a three-phase inverter was simulated using the P-Q theory of three-phase instantaneous power. Finally, simulation and experimental results are presented, with the necessary comparative analysis of the proposed converters.
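A minimal sketch of the Perturb and Observe (P&O) algorithm used for MPPT, in a generic textbook form; the step size, the toy PV curve and its maximum power point are illustrative assumptions, not the paper's parameters.

    def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
        """Return the next voltage reference given present/previous voltage and power."""
        if p - p_prev >= 0:
            # Power increased: keep perturbing in the same direction.
            return v + step if v >= v_prev else v - step
        # Power decreased: reverse the perturbation direction.
        return v - step if v >= v_prev else v + step

    # Toy PV curve with its maximum power point near 30 V (an assumption).
    def pv_power(v):
        return max(0.0, -0.8 * (v - 30.0) ** 2 + 720.0)

    v_prev, v = 20.0, 20.5
    p_prev = pv_power(v_prev)
    for _ in range(50):
        p = pv_power(v)
        v, v_prev, p_prev = perturb_and_observe(v, p, v_prev, p_prev), v, p
    print(f"converged near {v:.1f} V (true MPP at 30.0 V)")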