919 results for Storm surges
Abstract:
Internet of Things is an umbrella term for the development whereby different types of devices can be equipped with sensors and data chips connected to the internet. A growing amount of data means a growing demand for solutions that can store, track, analyse and process data. One way of meeting this demand is to use cloud-based real-time analytics services. Multi-tenant and single-tenant are two types of architectures for cloud-based real-time analytics services that can be used to address the challenges of handling the increased data volumes. These architectures differ in terms of development complexity. In this work, Azure Stream Analytics represents a multi-tenant architecture and HDInsight/Storm represents a single-tenant architecture. To compare cloud-based real-time analytics services with different architectures, we chose to use the usability criteria efficiency, effectiveness and user satisfaction. We concluded that we wanted answers to the following questions, related to the three usability criteria above: • What similarities and differences can we see in development times? • Can we identify differences in functionality? • How do developers experience the different analytics services? We used a design and creation strategy to develop two Proof of Concept prototypes and collected data using several data collection methods. The Proof of Concept prototypes comprised two artefacts, one for Azure Stream Analytics and one for HDInsight/Storm. We evaluated these by carrying out five different scenarios, each with 2-5 sub-goals. We simulated streaming data by letting an application continuously generate random data, which we analysed using the two real-time analytics services. We used observations to document how we worked with the development of the analytics services, to measure development times and to identify differences in functionality. We also used questionnaires to find out what users thought of the analytics services. We found that Azure Stream Analytics was initially more usable than HDInsight/Storm, but that the differences diminished over time. Azure Stream Analytics was easier to work with for simpler analyses, while HDInsight/Storm offered a wider range of functionality.
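The streaming input was simulated by an application that continuously generates random data. Below is a minimal sketch of such a generator, assuming a hypothetical sensor-reading schema (deviceId, temperature, timestamp); it is not the schema or the application used in the study, and in practice the output would be sent to an event hub or a Storm spout rather than printed.

```python
# Minimal sketch of an event generator for simulating streaming data.
# Field names and value ranges are illustrative assumptions.
import json
import random
import time
from datetime import datetime, timezone

def random_event():
    """Build one randomised sensor reading."""
    return {
        "deviceId": f"sensor-{random.randint(1, 10)}",
        "temperature": round(random.uniform(15.0, 35.0), 2),
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def stream_events(events_per_second=5, duration_seconds=10):
    """Continuously emit random events, mimicking the test application."""
    end = time.time() + duration_seconds
    while time.time() < end:
        print(json.dumps(random_event()))  # in practice: send to an event hub / Storm spout
        time.sleep(1.0 / events_per_second)

if __name__ == "__main__":
    stream_events()
```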
Abstract:
In the UK, urban river basins are particularly vulnerable to flash floods caused by short and intense rainfall. This paper presents potential flood resilience approaches for the highly urbanised Wortley Beck river basin, south-west of Leeds city centre. The reach of Wortley Beck is approximately 6 km long, with a contributing catchment area of 30 km2 that drains into the River Aire. Lower Wortley has experienced regular flooding over the last few years from a range of sources, including Wortley Beck and surface and ground water, affecting properties both upstream and downstream of Farnley Lake as well as Wortley Ring Road. This has serious implications for society, the environment and economic activity in the City of Leeds. The first stage of the study involves systematically incorporating Wortley Beck's landscape features on an Arc-GIS platform to identify existing green features in the region. This process also enables the exploration of potential blue-green features (green spaces, green roofs, water retention ponds and swales) at appropriate locations and their connection to existing green corridors to maximise their effectiveness. The next stage involves developing a detailed 2D urban flood inundation model for the Wortley Beck region using the CityCat model. CityCat is capable of modelling the effects of permeable/impermeable ground surfaces and buildings/roofs to generate flood depth and velocity maps at 1 m resolution for design storm events. The final stage of the study involves simulating a range of rainfall and flood event scenarios through the CityCat model with different blue-green features. The installation of hard-engineering individual property protection measures, such as water butts and flood walls, is also incorporated in the CityCat model. This enables an integrated sustainable flood resilience strategy for the region.
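The scenario stage described above amounts to running every combination of design storm and mitigation option through the inundation model. The sketch below only illustrates how such a scenario matrix might be organised; the return periods, durations, feature sets and the placeholder run function are assumptions, not CityCat's interface or the study's actual scenarios.

```python
# A minimal sketch of organising a rainfall / mitigation scenario matrix before each
# combination is run through a 2D inundation model such as CityCat.
# All values below are illustrative assumptions.
from itertools import product

return_periods_years = [30, 100, 200]
storm_durations_hours = [1, 3, 6]
mitigation_options = [
    "baseline",
    "green_roofs",
    "retention_ponds_and_swales",
    "water_butts_and_flood_walls",
]

def run_inundation_model(return_period, duration, mitigation):
    """Placeholder for invoking the external model run; returns a dummy record."""
    return {"return_period": return_period, "duration_h": duration, "mitigation": mitigation}

results = [
    run_inundation_model(rp, d, m)
    for rp, d, m in product(return_periods_years, storm_durations_hours, mitigation_options)
]
print(f"{len(results)} scenario runs prepared")
```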
Abstract:
Hydrological loss is a vital component of many hydrological models, which are used in forecasting floods and evaluating water resources for both surface and subsurface flows. Due to the complex and random nature of the rainfall-runoff process, hydrological losses are not yet fully understood. Consequently, practitioners often use representative loss values for design applications such as rainfall-runoff modelling, which has led to inaccurate quantification of water quantities in the resulting applications. The existing hydrological loss models must be revisited, and modellers should be encouraged to utilise other available data sets. This study is based on three unregulated catchments situated in the Mt. Lofty Ranges of South Australia (SA). The paper focuses on conceptual models relating initial loss (IL), continuing loss (CL) and proportional loss (PL) to rainfall characteristics (total rainfall (TR) and storm duration (D)) and antecedent wetness (AW) conditions. The paper introduces two methods that can be implemented to estimate IL as a function of TR, D and AW. The IL distribution patterns and parameters for the study catchments are determined using multivariate analysis and descriptive statistics. The possibility of generalising the methods, and the limitations of doing so, are also discussed. This study will yield improvements to existing loss models and will encourage practitioners to utilise multiple data sets to estimate losses, instead of using hypothetical or representative values to generalise real situations.
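The paper's two estimation methods are not detailed in the abstract; the sketch below only illustrates the general idea of expressing IL as a function of TR, D and AW with a multiple linear regression fitted by ordinary least squares. The synthetic event data and the linear functional form are assumptions made for illustration.

```python
# Minimal sketch: estimate initial loss (IL) from total rainfall (TR), storm duration (D)
# and antecedent wetness (AW) with a multiple linear regression on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 200
TR = rng.uniform(5, 80, n)      # total rainfall per event (mm)
D = rng.uniform(0.5, 24, n)     # storm duration (h)
AW = rng.uniform(0, 1, n)       # antecedent wetness index (0 = dry, 1 = wet)

# Synthetic "observed" IL: drier conditions and longer storms yield larger initial losses.
IL = 8 + 0.05 * TR + 0.2 * D - 6 * AW + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), TR, D, AW])      # design matrix with intercept
coef, *_ = np.linalg.lstsq(X, IL, rcond=None)     # ordinary least squares fit
print("IL ≈ {:.2f} + {:.3f}·TR + {:.3f}·D + {:.2f}·AW".format(*coef))
```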
Abstract:
Series of annual maximum wind speed data, classified by storm type (EPS or TS winds) and by direction (octants), are used to fit a model based on multiple linear regression, allowing extreme winds to be estimated within the region defined by the available meteorological stations. The correlation between wind speeds and air temperature during storms is also investigated, as well as the statistical behaviour of temperatures during strong winds.
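As a rough illustration of the regression idea, the sketch below fits annual maximum wind speeds against station coordinates for a single storm type and direction sector and evaluates the fit at an interior point. The station locations, speeds and the linear-in-position form are illustrative assumptions, not the model or data of the study.

```python
# Minimal sketch: fit annual-maximum wind speed against station coordinates for one storm
# type (e.g. TS) and one direction octant, then predict at an interior point.
import numpy as np

# (longitude, latitude) of hypothetical stations and their annual maximum TS wind speeds (m/s)
stations = np.array([[-51.2, -30.0], [-52.4, -31.8], [-50.1, -29.4], [-53.0, -30.5], [-51.8, -29.0]])
v_max = np.array([32.5, 35.1, 30.8, 36.0, 31.4])

X = np.column_stack([np.ones(len(stations)), stations])   # intercept + lon + lat
coef, *_ = np.linalg.lstsq(X, v_max, rcond=None)          # multiple linear regression

interior_point = np.array([1.0, -51.7, -30.2])            # a location inside the station network
print(f"estimated extreme wind: {interior_point @ coef:.1f} m/s")
```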
Abstract:
The Internet has taken the world by storm. It has eliminated the barriers of technology, and unlocked the doors to electronic commerce and the 'Virtual Economy'. It has given us a glimpse into the future of 'Business' itself, and it has created a bewildering variety of choices in our personal and professional lives. It has taken on a life of its own, and we are all frantically trying to keep up. Many overwhelmed companies are asking questions like: 'What should our Internet Strategy be?' or 'How do we put our business on the Internet like everybody else is doing?' or 'How do we use this thing to make money without spending any?'. These questions may seem reasonable on the surface, but they miss the point because they focus on the technologies rather than the core issues of conducting day-to-day business. The Internet can indeed offer fast returns in marketing reach, speed, direct-to-consumer sales and so on, and many companies are using it to good advantage, but the highest and best use of any such technology is to support, enhance and even re-invent the fundamentals of general business practice. When the initial excitement is over, and companies gain experience and confidence with the new business models, this larger view will begin to assert itself. Companies will then start to position their 'Internet Strategies' in the context of where the business world itself is going over time, and how they can prepare for what is to come. Until now, the business world has been very fragmented, its collective progress limited (in part) by the inability to communicate within and between companies. Now that the technical remedy seems to be at hand and standards are beginning to emerge, we are starting to see a trend toward consolidation, cooperation, and economic synergy. Companies are improving their internal business processes with Intranets, and Electronic Commerce initiatives have sprung up using EDI, the World Wide Web, E-Mail, secure credit card payments and other tools. Companies are using the Internet to talk to each other and to sell their goods and services to the end consumer. Like Berlin, the walls are coming down because they have to. Electronic 'Communities of Common Interest' are beginning to surface, with the goal of supporting and aligning similar industries (such as Government, Insurance, Transportation and Health care) or similar business functions (such as Purchasing, Payments, and Human Resources). As these communities grow and mature, their initial scope will broaden and their spheres of influence will expand. They will begin to overlap into other communities, creating a synergistic effect and reshaping the conduct of business. The business world will undergo a gradual evolution toward globalization, driven by economic imperatives and natural selection in the marketplace, and facilitated by Electronic Commerce and Internet technologies. The business world 'beyond 2000' will have a substantially different look and feel than that which we see today.
Abstract:
The objective of this study was to analyse the historical evolution of the management model adopted in the city of Rio de Janeiro, seeking to identify the situation in practice in 2013 and characterising the circumstances that led to this scenario. To this end, an investigative study of the evolution of the city of Rio de Janeiro's management models was carried out, researching the historical, administrative and political context over time. The study sought to assess the governmental positioning of the city of Rio de Janeiro in line with the specific features that marked the management models adopted, with the direct legacy of the episodes that marked its historical evolution and the indirect legacy of the modernisation of public administration in Brazil, as well as with the windows of opportunity arising from major events such as the 2014 World Cup and the 2016 Olympics. The analysis was carried out in the light of public value creation theory, specifically the ideas of Mark Moore. To that end, interviews were conducted with public managers of the Rio de Janeiro city government, and public-domain documents published in the official press and others available on the internet were analysed.
Abstract:
Latin America has recently experienced three cycles of capital inflows, the first two ending in major financial crises. The first took place between 1973 and the 1982 ‘debt-crisis’. The second took place between the 1989 ‘Brady bonds’ agreement (and the beginning of the economic reforms and financial liberalisation that followed) and the Argentinian 2001/2002 crisis, and ended up with four major crises (as well as the 1997 one in East Asia) — Mexico (1994), Brazil (1999), and two in Argentina (1995 and 2001/2). Finally, the third inflow-cycle began in 2003 as soon as international financial markets felt reassured by the surprisingly neo-liberal orientation of President Lula’s government; this cycle intensified in 2004 with the beginning of a (purely speculative) commodity price-boom, and actually strengthened after a brief interlude following the 2008 global financial crash — and at the time of writing (mid-2011) this cycle is still unfolding, although already showing considerable signs of distress. The main aim of this paper is to analyse the financial crises resulting from this second cycle (both in LA and in East Asia) from the perspective of Keynesian/ Minskyian/ Kindlebergian financial economics. I will attempt to show that no matter how diversely these newly financially liberalised Developing Countries tried to deal with the absorption problem created by the subsequent surges of inflow (and they did follow different routes), they invariably ended up in a major crisis. As a result (and despite the insistence of mainstream analysis), these financial crises took place mostly due to factors that were intrinsic (or inherent) to the workings of over-liquid and under-regulated financial markets — and as such, they were both fully deserved and fairly predictable. Furthermore, these crises point not just to major market failures, but to a systemic market failure: evidence suggests that these crises were the spontaneous outcome of actions by utility-maximising agents, freely operating in friendly (‘light-touch’) regulated, over-liquid financial markets. That is, these crises are clear examples that financial markets can be driven by buyers who take little notice of underlying values — i.e., by investors who have incentives to interpret information in a biased fashion in a systematic way. Thus, ‘fat tails’ also occurred because under these circumstances there is a high likelihood of self-made disastrous events. In other words, markets are not always right — indeed, in the case of financial markets they can be seriously wrong as a whole. Also, as the recent collapse of ‘MF Global’ indicates, the capacity of ‘utility-maximising’ agents operating in (excessively) ‘friendly-regulated’ and over-liquid financial market to learn from previous mistakes seems rather limited.
Abstract:
The production of waste from urban and industrial activities is one of the factors of environmental contamination and has attracted the attention of the scientific community with a view to its reuse. The city of Salvador/BA, with approximately 262 channels responsible for storm water runoff, produces a significant volume of sediments (dredged mud) every year through channel cleaning and clearing interventions, and therefore requires an appropriate methodology for their final disposal. This study aims to assess the influence of incorporating these tailings into clay matrices for the production of interlocking ceramic blocks, also known as ceramic pavers. All the raw materials, from the metropolitan region of Salvador (RMS), were characterised by X-ray fluorescence, X-ray diffraction, thermal analysis (TG and DTA), particle size analysis and dilatometry. Using a statistical design-of-experiments technique, a ternary diagram was defined for the study region and the formulations to be analysed. The specimens were prepared with dimensions of 60 x 20 x 5 mm by uniaxial pressing at 30 MPa and, after sintering at 900 °C, 1000 °C and 1100 °C, the following technological properties were evaluated: linear shrinkage, water absorption, apparent porosity, apparent specific mass and modulus of rupture in flexure. For uniaxial compressive strength, cylindrical specimens of Ø 50 mm were used. The standard mass (MP) was prepared with 90% by weight of clay and 10% by weight of channel sediment (SCP); no significant variations in the properties of the final product were observed. The incorporation of 10% by weight of manganese residue (PFM) and 10% by weight of ceramic waste (RCB) into the standard mass, besides adjusting the plasticity owing to the lower clay content, increased the linear firing shrinkage, due to the significant concentration of K2O forming a liquid phase at low temperature, which contributed to reduced porosity and to the mechanical strength; a maximum compressive strength of 92.5 MPa was verified. After leaching and solubilisation extract tests, the piece containing 10% PFM was classified as non-hazardous and inert material according to ABNT NBR 10004/04. The results showed the feasibility of using the SCP, RCB and PFM wastes in the clay mass, at temperatures above 900 °C, for ceramic paver production in accordance with the specifications of the technical standards; if the 10% PFM content is to be exceeded, it becomes imperative to conduct environmental impact studies.
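For reference, the technological properties listed above are computed from simple specimen measurements. The sketch below shows the standard relations for linear firing shrinkage, water absorption and three-point-bend modulus of rupture; the measured values in the example are hypothetical, not data from the study.

```python
# Standard ceramic characterisation relations; the example measurements are hypothetical.

def linear_shrinkage(length_dry_mm, length_fired_mm):
    """Linear firing shrinkage (%) from specimen length before and after sintering."""
    return 100.0 * (length_dry_mm - length_fired_mm) / length_dry_mm

def water_absorption(mass_saturated_g, mass_dry_g):
    """Water absorption (%) from saturated and dry specimen masses."""
    return 100.0 * (mass_saturated_g - mass_dry_g) / mass_dry_g

def flexural_strength(load_n, span_mm, width_mm, height_mm):
    """Three-point bending modulus of rupture (MPa) for a rectangular bar."""
    return 3.0 * load_n * span_mm / (2.0 * width_mm * height_mm ** 2)

# Hypothetical 60 x 20 x 5 mm specimen sintered at 1000 °C
print(f"shrinkage: {linear_shrinkage(60.0, 58.6):.1f} %")
print(f"absorption: {water_absorption(13.4, 12.1):.1f} %")
print(f"MOR: {flexural_strength(load_n=180, span_mm=40, width_mm=20, height_mm=5):.1f} MPa")
```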
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
In current scholarship on early childhood education, children are often conceived of as subjects of rights and as concrete, singular people, marked by specificities that schools must respect, mainly their personal wholeness, their care and attention needs, and their abilities to learn and produce culture. Within educational practices, routine is considered to play a defining role in structuring time, space and activities, as well as the actions and relations of the subjects involved. From that perspective, this research aims to analyse the routines of children from zero to two years old in the early childhood education context, in relation to their childhood specificities. Anchored in a qualitative approach, a case study was developed, following the procedures of daily routine observation and semi-structured interviews with six nursery teachers of the CMEI (Centro Municipal de Educação Infantil) in Natal-RN that served as the research field. The data analysis was based on principles of Discourse Analysis. The teachers' utterances regarding routine and its role in this setting revealed meanings related to the control/regulation of actions, their own and the students', aimed at streamlining tasks, and to learning related to the routine itself, to time and to school practices. Thus, perspectives of discipline and of teachers' exercise of power over students emerge, reducing the children's possibilities to participate. These conceptions are reflected in the daily routine of the children and their teachers. By analysing how the routine operates in the time/space/activities organisation of the CMEI, it was possible to perceive a homogenisation of actions and rhythms, not only of the group's children but of the whole institution, which often takes on a controlling character that contains/prevents children's initiative. However, it was also possible to observe that in the gaps of the routine, when it is relaxed and other spaces, times and actions are provided, children have the opportunity to experience and create different ways of acting and relating to time, materials, other children and teachers, and their specificities are thus respected. We highlight the importance of reflecting on routine in the early childhood education context, in order to understand its functions and the need for its construction to take on a multiple character that respects the plurality of situations and the singularities of children as persons.
Abstract:
In the northeastern semiarid region, the seasonality of the temporal distribution of precipitation, high-intensity storm events and inadequate management of native vegetation can promote soil erosion. Vegetation removal exposes the soil surface, reduces soil water storage capacity and can be the source of degradation processes. In this context, this study aims to analyse runoff and soil erosion processes on a 250 m2 undisturbed experimental plot with native vegetation and a 2.5% slope, using 2006 and 2007 monitoring data. The site was instrumented to monitor rainfall, overland flow runoff and erosion using a 5 m³ tank downstream of the plot. Soil erosion was monitored by collecting the transported sediment and organic matter after each event. Field infiltration experiments were carried out at 16 points randomly distributed within the plot area using a constant-head infiltrometer during the dry and rainy seasons. Infiltration data revealed high spatial and temporal variability. It was observed that, at the beginning of the rainy period, 77% of the events showed a runoff coefficient of less than 0.05. As the rainy season began, the increase in soil water produced germination of annual species. High-intensity storms resulted in runoff coefficients varying between 0.33 and 0.42. Once the annual species were established, approximately 39% of the events produced no runoff, which reflects an increase in the soil water retention capacity caused by the vegetation. A gradual runoff reduction during the rainy season emphasises the effect of the increase in vegetation density. The observed soil erosion data allowed an empirical relationship between soil loss and precipitation depth to be fitted, which was used to analyse the impact of plot installation on soil erosion. Observed soil loss in 2006 and 2007 was 230 kg/ha and 54 kg/ha, respectively.
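The form of the fitted soil loss-precipitation relationship is not given in the abstract; the sketch below only illustrates fitting such an empirical relation, assuming a power-law form and hypothetical per-event data.

```python
# Minimal sketch of fitting an empirical soil loss vs. precipitation depth relation.
# The power-law form and the event data below are illustrative assumptions.
import numpy as np

precip_mm = np.array([12.0, 18.5, 25.0, 33.0, 41.0, 55.0, 68.0])   # event precipitation depth
soil_loss_kg_ha = np.array([0.8, 1.9, 3.5, 6.2, 9.8, 18.0, 27.5])  # measured loss per event

# Fit soil_loss = a * P^b by linear regression in log-log space
b, log_a = np.polyfit(np.log(precip_mm), np.log(soil_loss_kg_ha), 1)
a = np.exp(log_a)
print(f"soil loss ≈ {a:.3f} · P^{b:.2f}  (kg/ha, P in mm)")
```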
Abstract:
The semiarid rainfall regime in northeastern Brazil is highly variable. The climate processes associated with rainfall are complex, and their effects may represent extreme situations of drought or flooding, which can have adverse effects on society and the environment. The regional economy has a significant agricultural component, which is strongly influenced by weather conditions. Maximum precipitation analysis is traditionally performed using the intensity-duration-frequency (IDF) probabilistic approach. Results from such analyses are typically used in engineering projects involving hydraulic structures such as drainage network systems and road structures. On the other hand, precipitation data analysis may require the adoption of some kind of event identification criterion. The minimum inter-event duration (IMEE) is one of the most widely used criteria. This study aims to analyse the effect of the IMEE on the obtained rain event properties. For this purpose, a nine-year precipitation time series (2002-2011) was used. These data were obtained from an automatic rain gauge station installed in an environmentally protected area, the Seridó Ecological Station. The results showed that the adopted IMEE value has an important effect on the number of events, their duration, event depth, mean rainfall rate and mean inter-event duration. Furthermore, a higher occurrence of extreme events was observed for small IMEE values. Most events showed an average rainfall intensity higher than 2 mm/h regardless of the IMEE. The storm advance coefficient was, in most cases, within the first quartile of the event, regardless of the IMEE value. Analysis using partial duration series made it possible to fit the IDF equations to local characteristics.
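As a concrete illustration of the IMEE criterion, the sketch below splits a short, hypothetical 10-minute rainfall record into events for two IMEE values; the record and time step are assumptions for illustration, not the station data used in the study.

```python
# Split a rainfall record into events separated by a minimum inter-event duration (IMEE).
def split_events(rain_mm, step_min, imee_min):
    """Group rainfall ticks into events separated by at least imee_min minutes without rain."""
    gap_steps = imee_min // step_min
    events, current, dry = [], [], gap_steps   # start "dry" so leading zeros are skipped
    for depth in rain_mm:
        if depth > 0:
            current.append(depth)
            dry = 0
        else:
            dry += 1
            if current and dry >= gap_steps:
                events.append(round(sum(current), 1))   # total event depth (mm)
                current = []
            elif current:
                current.append(0.0)
    if current:
        events.append(round(sum(current), 1))
    return events

record = [0, 1.2, 3.0, 0, 0, 0, 0, 0.4, 0.8, 0, 0, 0, 0, 0, 0, 0, 2.2, 5.1, 0]
for imee in (30, 60):
    print(f"IMEE = {imee} min -> event depths {split_events(record, step_min=10, imee_min=imee)} mm")
```

With the shorter IMEE the record splits into more, smaller events, which is exactly the sensitivity the study examines.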
Abstract:
The increased capacity to integrate transistors has made it possible to develop complete systems, with several components, on a single chip; these are called SoCs (Systems-on-Chip). However, the interconnection subsystem can limit the scalability of SoCs, as buses do, or can be an ad hoc solution, such as a bus hierarchy. Thus, the ideal interconnection subsystem for SoCs is the Network-on-Chip (NoC). NoCs allow simultaneous point-to-point channels between components and can be reused in other projects. However, NoCs can increase design complexity, chip area and dissipated power. Thus, it is necessary either to modify the way they are used or to change the development paradigm. Accordingly, a NoC-based system is proposed in which applications are described as packets and executed in each router between source and destination, without traditional processors. To execute applications regardless of the number of instructions and of the NoC dimensions, the spiral complement algorithm was developed, which finds a new destination until all instructions have been executed. The objective, therefore, is to study the feasibility of developing this system, called the IPNoSys system. In this study, a cycle-accurate tool was developed in SystemC to simulate the system executing applications, which were implemented in a packet description language also developed for this study. Through the simulation tool, several results were obtained that could be used to evaluate system performance. The methodology used to describe an application consists of transforming the high-level application into a data-flow graph that becomes one or more packets. This methodology was used in three applications: a counter, a 2D DCT (DCT-2D) and a floating-point addition (float add). The counter was used to evaluate a deadlock solution and to execute a parallel application. The DCT was used for comparison with the STORM platform. Finally, the float add was used to evaluate the efficiency of a software routine that performs an instruction not implemented in hardware. The simulation results confirm the feasibility of developing the IPNoSys system. They show that it is possible to execute applications described as packets, sequentially or in parallel, without interruptions caused by deadlock, and also that the execution time of IPNoSys is shorter than that of the STORM platform.
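To make the packet-driven execution idea more concrete, the sketch below turns a tiny data-flow graph into a packet whose payload is a sequence of instructions consumed one per router hop. The graph, operations and packet layout are illustrative assumptions only and do not reproduce the IPNoSys packet format, its description language or the spiral complement algorithm.

```python
# Illustrative sketch: a small data-flow graph packetised into an instruction sequence that
# is executed hop by hop, in the spirit of processor-less, packet-driven execution.

# Data-flow graph for r = (a + b) * c, as a list of (result, op, operands)
dataflow = [
    ("t1", "ADD", ("a", "b")),
    ("r",  "MUL", ("t1", "c")),
]

def build_packet(graph, inputs):
    """Flatten the graph into a packet: header plus one instruction word per operation."""
    return {"header": {"source": (0, 0), "instructions": len(graph)},
            "payload": [{"op": op, "dst": dst, "src": list(srcs)} for dst, op, srcs in graph],
            "operands": dict(inputs)}

def execute_in_routers(packet):
    """Each 'router hop' consumes the next instruction until the packet is exhausted."""
    values = packet["operands"]
    for hop, instr in enumerate(packet["payload"]):
        x, y = (values[s] for s in instr["src"])
        values[instr["dst"]] = x + y if instr["op"] == "ADD" else x * y
        print(f"hop {hop}: {instr['op']} -> {instr['dst']} = {values[instr['dst']]}")
    return values

execute_in_routers(build_packet(dataflow, {"a": 2, "b": 3, "c": 4}))
```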