62 results for Deluge


Relevance:

10.00%

Abstract:

Arts education research, as an interdisciplinary field, has developed in the shadows of a number of research traditions. However, amid all the methodological innovation, I believe there is one particular, distinctive and radical research strategy which arts educators have created to research the practice of arts education: namely, arts-based research. For many, and Elliot Eisner of Stanford University was among the first, arts education needed a research approach which could deal with the complex dynamics of arts education in the classroom. What was needed was ‘an approach to the conduct of educational research that was rooted in the arts and that used aesthetically crafted forms to reveal aspects of practice that mattered educationally’ (Eisner 2006: 11). While arts education researchers were crafting the principles and practices of arts-based research, fellow artist/researchers in the creative arts were addressing similar needs and fashioning their own exacting research strategies. This chapter aligns arts-based research with the complementary research practices established in creative arts studios and identifies the shared and truly radical nature of these moves. Finally, in a contemporary turn many will find surprising, I discuss how the radical aspects of these methodologies are now being held up as core elements of what is being called the fourth paradigm of scientific research, known as eScience. Could it be that the radical dynamics of arts-based research prefigured the needs of eScience researchers who are currently struggling to manage the ‘deluge of Big Data’ that is disrupting their well-established scientific methods?

Relevance:

10.00%

Abstract:

This practice-led research investigated the negotiation processes informing effective models of transcultural collaboration. In a creative project interweaving the image-based physicality of the Japanese dance form of butoh with the traditional Korean vocal style of p'ansori, a series of creative development cycles were undertaken with a team of artists from Australia and Korea, culminating in Deluge, a work of physical theatre. The development of interventions at 'sites of transcultural potential' resulted in improvements to the negotiation of interpersonal relationships and assisted in the emergence of a productive working environment in transculturally collaborative artistic practice.

Relevance:

10.00%

Abstract:

The daytime composition and relative abundance of zooplankton species were studied in three treatments of two replicate earthen ponds each, with different nutrient sources and water-replenishment regimes. Treatment A (200 m² surface area) was supplied with 900 kg ha⁻¹ pig manure only; Treatment B (200 m² surface area) with 70 kg ha⁻¹ month⁻¹ pig manure, 50 kg ha⁻¹ month⁻¹ N.P.K. (15:15:15) and 30 kg ha⁻¹ month⁻¹ urea; and Treatment C (1500 m² surface area) with 1150 kg ha⁻¹ month⁻¹ commercial-grade 40% crude-protein compounded feed. Water replenishment for Treatment A was a daily tidal deluge from the New Calabar River, while that for Treatments B and C was from a column well and occasional rain. No zooplankton species were recovered from the pig-manure-only treatment (A). Difflugia constricta and Difflugia urceolata were the only two protozoans that occurred in both Treatment B (combined fertilization) and Treatment C (compounded feed only). In contrast, Difflugia acuminata and three rotifers, Collurella uncinata, Diurella stylata and Keratella quadrata, occurred only in Treatment B. Similarly, Arcella arenaria, Arcella costata, Centropyxis aculeata, Difflugia pyriformis, Brachionus calyciflorus, Lepadella patella, Polyarthra trigla and Onchocamptus mohammedi were recovered only from Treatment C. Arcella costata was the most abundant zooplankton in the entire experiment, while Arcella arenaria was very abundant in Treatment C and Collurella uncinata in Treatment B. The inference is that combined fertilization of earthen freshwater ponds tends to be more suitable for the culture of rotifers such as Brachionus calyciflorus, popular in fish-larva nurseries, while ponds supplied with compounded feed could be used to produce protozoans where desirable.

Relevance:

10.00%

Abstract:

EXTRACT (SEE PDF FOR FULL ABSTRACT): There were many similarities between the February 1986 storm and those of December 1964 and December 1955. The 1964 storm hit hardest a little further north, and the North Coast took the brunt of that storm. December 1955 also produced higher north-coastal runoff, and it produced greater peaks in the central part of the state than the 1964 flood; it is perhaps more comparable south of the Lake Tahoe-American River area. But the real surprise this time was the volume. Four reservoirs, Folsom, Black Butte, Pardee, and Comanche, were filled completely and became surcharged (storing more water than their designed capacity). The 10-day total rainfall amounted to half the normal annual totals at many precipitation stations. The February 1986 flood is a vivid reminder of the extremes of California's climate and of the value of the extensive system of flood-control works in the state. Before the storm, especially in January, there was much concern about the dryness of the water year. Then, with the deluge, California's flood-control systems were tested. By and large the system worked, preventing untold damage and misery for most dwellers in the flatlands.

Relevance:

10.00%

Abstract:

Modern Engineering Design involves the deployment of many computational tools. Research on challenging real-world design problems is focused on developing improvements for the engineering design process through the integration and application of advanced computational search/optimization and analysis tools. Successful application of these methods generates vast quantities of data on potential optimum designs. To gain maximum value from the optimization process, designers need to visualise and interpret this information, leading to better understanding of the complex and multimodal relations between parameters, objectives and decision-making of multiple and strongly conflicting criteria. Initial work by the authors has identified that the Parallel Coordinates interactive visualisation method has considerable potential in this regard. This methodology involves significant levels of user-interaction, making the engineering designer central to the process, rather than the passive recipient of a deluge of pre-formatted information. In the present work we have applied and demonstrated this methodology in two different aerodynamic turbomachinery design cases; a detailed 3D shape design for compressor blades, and a preliminary mean-line design for the whole compressor core. The first case comprises 26 design parameters for the parameterisation of the blade geometry, and we analysed the data produced from a three-objective optimization study, thus describing a design space with 29 dimensions. The latter case comprises 45 design parameters and two objective functions, hence developing a design space with 47 dimensions. In both cases the dimensionality can be managed quite easily in Parallel Coordinates space, and most importantly, we are able to identify interesting and crucial aspects of the relationships between the design parameters and optimum level of the objective functions under consideration.
These findings guide the human designer to find answers to questions that could not even be addressed before. In this way, understanding the design leads to more intelligent decision-making and design space exploration. © 2012 AIAA.
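The parallel-coordinates view described above rests on mapping each design parameter and objective onto its own vertical axis, scaled to a common range. A minimal sketch of that per-axis scaling step follows; the function and variable names are illustrative assumptions, not taken from the paper.

```python
# Sketch: per-axis min-max normalisation, the first step in building a
# parallel-coordinates view of optimisation results (illustrative only).

def normalise_axes(designs):
    """Scale each column (design parameter or objective) to [0, 1]."""
    cols = list(zip(*designs))  # transpose: one tuple per axis
    scaled_cols = []
    for col in cols:
        lo, hi = min(col), max(col)
        span = hi - lo or 1.0  # guard against constant axes
        scaled_cols.append([(v - lo) / span for v in col])
    return [list(row) for row in zip(*scaled_cols)]  # transpose back

# Each row is one candidate design: [param1, param2, objective]
designs = [[2.0, 10.0, 1.0],
           [4.0, 30.0, 3.0],
           [3.0, 20.0, 2.0]]
print(normalise_axes(designs))
```

With the axes on a common scale, each design becomes a polyline across the axes, which is what makes a 29- or 47-dimensional design space browsable on one plot.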

Relevance:

10.00%

Abstract:

This paper describes the development of a novel metaheuristic that combines an electromagnetism-like mechanism (EM) and the great deluge algorithm (GD) for the university course timetabling problem. This well-known timetabling problem assigns lectures to specific numbers of timeslots and rooms, maximizing the overall quality of the timetable while taking various constraints into account. EM is a population-based stochastic global optimization algorithm based on the theory of physics, simulating attraction and repulsion of sample points in moving toward optimality. GD is a local search procedure that allows worse solutions to be accepted based on some given upper boundary or ‘level’. In this paper, the dynamic force calculated from the attraction-repulsion mechanism is used as a decreasing rate to update the ‘level’ within the search process. The proposed method has been applied to a range of benchmark university course timetabling test problems from the literature. Moreover, the viability of the method has been tested by comparing its results with other results reported in the literature, demonstrating that the method is able to produce improved solutions over those currently published. We believe this is due to the combination of both approaches and the ability of the resultant algorithm to keep all solutions converging throughout the search process.
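The ‘level’-based acceptance rule described above can be sketched in a few lines. This is a generic great-deluge loop for a toy minimisation problem, with an assumed linear decay schedule rather than the paper's EM-driven level update; all names are illustrative.

```python
import random

# Minimal great-deluge sketch for minimisation (illustrative only).

def great_deluge(f, x0, neighbour, level0, decay, iters, seed=0):
    rng = random.Random(seed)
    x, best = x0, x0
    level = level0  # the 'level': worse solutions are accepted while f <= level
    for _ in range(iters):
        cand = neighbour(x, rng)
        if f(cand) <= level:      # accept even non-improving moves under the level
            x = cand
            if f(x) < f(best):
                best = x
        level -= decay            # lower the 'level' (the receding flood line)
    return best

# Toy objective: minimise (x - 3)^2 starting from x = 10
best = great_deluge(
    f=lambda x: (x - 3) ** 2,
    x0=10.0,
    neighbour=lambda x, rng: x + rng.uniform(-1, 1),
    level0=50.0, decay=0.05, iters=1000)
print(best)
```

Because the level only falls, the search behaves like a random walk early on and tightens into greedy descent as the flood line overtakes the current solution quality.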

Relevance:

10.00%

Abstract:

The increasing adoption of cloud computing, social networking, mobile and big data technologies provides challenges and opportunities for both research and practice. Researchers face a deluge of data generated by social network platforms, which is further exacerbated by the co-mingling of social network platforms and the emerging Internet of Everything. While the topicality of big data and social media increases, there is a lack of conceptual tools in the literature to help researchers approach, structure and codify knowledge from social media big data in diverse subject-matter domains, many of them nontechnical disciplines. Researchers do not have a general-purpose scaffold to make sense of the data and the complex web of relationships between entities, social networks, social platforms and other third-party databases, systems and objects. This is further complicated when spatio-temporal data is introduced. Based on practical experience of working with social media datasets and the existing literature, we propose a general research framework for social media research using big data. Such a framework assists researchers in placing their contributions in an overall context, focusing their research efforts and building the body of knowledge in a given discipline area using social media data in a consistent and coherent manner.

Relevance:

10.00%

Abstract:

This paper is concerned with the application of an automated hybrid approach in addressing the university timetabling problem. The approach described is based on the nature-inspired artificial bee colony (ABC) algorithm. The ABC algorithm is a biologically-inspired optimization approach which has been widely implemented in solving a range of optimization problems in recent years, such as job shop scheduling and machine timetabling problems. Although the approach has proven to be robust across a range of problems, it is acknowledged within the literature that there currently exist a number of inefficiencies regarding its exploration and exploitation abilities. These inefficiencies can often lead to a slow convergence speed within the search process. Hence, this paper introduces a variant of the algorithm which utilizes a global-best model inspired by particle swarm optimization to enhance the global exploration ability, while hybridizing with the great deluge (GD) algorithm in order to improve the local exploitation ability. Using this approach, an effective balance between exploration and exploitation is attained. In addition, a traditional local search approach is incorporated within the GD algorithm with the aim of further enhancing the performance of the overall hybrid method. To evaluate the performance of the proposed approach, two diverse university timetabling datasets are investigated, i.e., Carter's examination timetabling and Socha's course timetabling datasets. It should be noted that the two problems have differing complexity and different solution landscapes. Experimental results demonstrate that the proposed method is capable of producing high-quality solutions across both benchmark problems, showing a good degree of generality in the approach. Moreover, the proposed method produces the best results on some instances compared with other approaches presented in the literature.
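The global-best-guided move mentioned above can be sketched as a single ABC-style candidate-generation step with an extra pull toward the best solution found so far. The coefficient ranges below are common choices in the ABC/PSO literature, not values taken from the paper, and all names are illustrative assumptions.

```python
import random

# Sketch of a global-best-guided ABC candidate move (illustrative only):
# one randomly chosen dimension is perturbed per move, ABC-style.

def gbest_candidate(x, partner, gbest, rng):
    """Generate a new food-source position from x, a random partner
    solution, and the global best solution found so far."""
    j = rng.randrange(len(x))          # ABC perturbs a single dimension
    phi = rng.uniform(-1, 1)           # attraction to/away from the partner
    psi = rng.uniform(0, 1.5)          # pull toward the global best (gbest term)
    v = list(x)
    v[j] = x[j] + phi * (x[j] - partner[j]) + psi * (gbest[j] - x[j])
    return v

rng = random.Random(42)
x, partner, gbest = [5.0, 5.0], [4.0, 6.0], [0.0, 0.0]
print(gbest_candidate(x, partner, gbest, rng))
```

The gbest term biases exploration toward promising regions, which is the intended fix for the slow convergence the abstract mentions; the GD acceptance rule then governs which candidates are kept.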

Relevance:

10.00%

Abstract:

Generating timetables for an institution is a challenging and time-consuming task due to different demands on the overall structure of the timetable. In this paper, a new hybrid method which is a combination of a great deluge and artificial bee colony algorithm (INMGD-ABC) is proposed to address the university timetabling problem. The artificial bee colony algorithm (ABC) is a population-based method that has been introduced in recent years and has proven successful in solving various optimization problems effectively. However, as with many search-based approaches, there exist weaknesses in the exploration and exploitation abilities which tend to induce slow convergence of the overall search process. Therefore, hybridization is proposed to compensate for the identified weaknesses of the ABC. Also, inspired by imperialist competitive algorithms, an assimilation policy is implemented in order to improve the global exploration ability of the ABC algorithm. In addition, the Nelder–Mead simplex search method is incorporated within the great deluge algorithm (NMGD) with the aim of enhancing the exploitation ability of the hybrid method in fine-tuning the problem search region. The proposed method is tested on two differing benchmark datasets, i.e., examination and course timetabling datasets. A statistical t-test has been conducted and shows the performance of the proposed approach to be significantly better than that of the basic ABC algorithm. Finally, the experimental results are compared against state-of-the-art methods in the literature, with results obtained that are competitive, in certain cases achieving some of the current best results in the literature.
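The Nelder–Mead simplex search incorporated into the great deluge step can be illustrated with a textbook sketch on a toy 2-D quadratic. This is the generic reflection/expansion/contraction/shrink scheme, not the paper's NMGD hybrid; all names and coefficients are the standard textbook defaults.

```python
# Minimal Nelder-Mead simplex sketch for a 2-D quadratic (illustrative).

def nelder_mead(f, simplex, iters=200, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    for _ in range(iters):
        simplex.sort(key=f)                     # best first, worst last
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[i] for p in simplex[:-1]) / (len(simplex) - 1)
                    for i in range(len(best))]
        refl = [c + alpha * (c - w) for c, w in zip(centroid, worst)]
        if f(refl) < f(best):                   # try expanding further out
            exp = [c + gamma * (r - c) for c, r in zip(centroid, refl)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):          # plain reflection
            simplex[-1] = refl
        else:                                   # contract toward the centroid
            contr = [c + rho * (w - c) for c, w in zip(centroid, worst)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:                               # shrink toward the best point
                simplex = [best] + [[b + sigma * (pi - b)
                                     for b, pi in zip(best, p)]
                                    for p in simplex[1:]]
    return min(simplex, key=f)

f = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2   # minimum at (1, -2)
x = nelder_mead(f, [[0.0, 0.0], [1.0, 0.5], [0.5, 1.0]])
print(x)
```

In the hybrid, a derivative-free local refiner of this kind fine-tunes the region around solutions the great deluge step has accepted.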

Relevance:

10.00%

Abstract:

Contains: 1. A table giving « le jour de Pasquez, depuis l'an mil IIIC. LX jusques à Pasquez l'an mil IIIIC XXIII »; chronological notes: « Nota que Paris fut fondée IXC. ans après le deluge... Led. Cesar mourut VIIC. VIII ans après la fondation de Romme et avant l'Incarnation de Nostre Seigneur LIII ans »; sayings: « En prinche loyalté, en clert humilité... En drap belle coulour, en vin bonne savour »; other sayings: « Pitié de Lombart, labour de Picart... ces VIII choses, pris bon, ne vallent pas un bouton »; verses on Du Guesclin; 2. « L'extrait des annéez de la Nativité. D'aucuns pappes de Romme, empereurs, rois de Franche et d'Engleterre, dux de Normandie, et autres princes et grans seigneurs qui ont regné, en quel temps et combien, depuis l'an premier de l'Incarnacion Jhesu Christ, qui nasquit de la vierge Marie, si comme vous pourrés voir par ordre en la manière qui s'ensieut : Premierement. L'an I. Nostre Seigneur Jhesu Christ fu né de la vierge Marie, circonsis, offert au temple et envoié par l'angle en essil en Egipte... Mil IIIC. XVIII. La feste du saint sacrement fu ordenée ou temps du pape Urban Quart »; 3. Fol. 11 to 13; 4. « Chy ensuit aucunes memoirez avenueez à Rouen y (sic) ès partiez d'icellez (sic). Premièrement, samedy XXIe jour de jung, l'an mil IIIC. LXXI l'abbé de Saint Ouein de Rouen fist lever une fourquez de souz son boys de Bihorel, et il firent pendre I. larron... »; this chronicle stops at 1424 and ends with: «... et leur fist Diex grace de ne morir »; 5. The chronicle known as the Norman chronicle: « Depuis que Godeffroy de Billon et la baronnie de France orent conquis Anthyoce et Jerusalem... » Final words: «... Or parleron d'un cas advenu à la court de l'eglise à Rouen. L'an de grace 1434, le samedi 20e jour de juin,... Et fu en ce ledit Cochon grandement dommagié, car il ne fu à la court ne ailleurs etc. »

Relevance:

10.00%

Abstract:

This title is followed by four Latin verses: « Ecce recente toga venio, liber undique mundus, Promere quos reges Francia celsa tulit ; Tempora, gesta, locos, sortem, preludia, famam, Ex multis lecta codice claudo brevi ». Below these verses one reads, it seems, « Gaguinettus », as if the author of the present abridgement, comparing himself to Gaguin, meant by this diminutive « Gaguinettus » that he is a little Gaguin, an abridgement of Gaguin. Then comes a motto: « Pour esperer mieulx prosperer ». On folio 3 v°, facing folio 4 r°, which bears the title stated above, is drawn in ink the image of a lady with a dragon at her feet. In her right hand she holds an escutcheon with the arms of France, and in her left another escutcheon bearing three tortoises, two in chief and one in point. Folios 5-7 are taken up by a dedication from the author, « Audebrand, advocat d'Aulge et de Honnefleu... à très prudent et ingenieux homme Michelot Feré, grand architecteur du Havre de Grace ». This dedication is dated « de Honnefleu, ce derrain jour de l'an 1518 ». After the preceding dedication, which ends with the motto « Par esperer mieulx prosperer », opens the « prologue du present livre », in which the author lists his authorities: Berossus, Manetho, Metasthenes, Josephus, Diodorus, Pliny, Robert Frescher, Robert Gaguin, Jean Lemaire de Belges (fol. 8-10). The treatise entitled Les Antiquités de Gaule begins (fol. 11) with: « Ante aquarum cladem... Or dit Berose que devant le deluge... » and ends (fol. 138) with: «... la fondation du Havre de Grace, que le roi François Ier a faict pour la protection de son royaulme... au cap de Caulx ». The author credits « monseigneur de Chillon, vis-admiral de France et capitaine de Honnefleu » with the direction of the works, and their execution to Michelot Feré, « homme... plain d'ingenieuse solercie, grand maistre et architecteur de l'oeuvre ». 
He closes with the wish that the construction of Le Havre be brought to a good end, «... priant à Dieu donner au bon roy et à ses amys par esperer bien prosperer ». He then adds, the treatise being divided into six books: « La fin du VIe livre des Antiquités de Gaulle et consequemment de tout nostre petit labeur sur la 4e lignée des roys de France, selon la tradition de Gaguin ». Sixteenth-century binding in stamped leather. On the front cover, around the Virgin and St John at the foot of the cross, one reads: « Christus filius Dei, vivis miserere nobis ». Beneath the cross two angels support a shield bearing the haloed lamb carrying the cross, surmounted by three fleurs-de-lis. The back cover depicts Pentecost. Legend: « Veni, sancte spiritus, reple tuorum corda ».

Relevance:

10.00%

Abstract:

The east coast of India faces at least 10 to 15 cyclones every year, of which 3 to 4 may reach the deep-depression stage. As a result, the east coast of India experiences frequent heavy damage of varying intensity due to storm surges, and it is also not unusual to experience a calamitous deluge once in a decade or so. Loss of life and damage can be minimized only if the magnitude of the surge can be predicted at least a day in advance. Therefore, an attempt is made to study the storm surges generated by the cyclones that strike the east coast of India and to suggest a method of predicting them through nomograms.

Relevance:

10.00%

Abstract:

More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed, which will nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at: http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data and computationally intensive and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole, with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, can create knowledge and skills, and can lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. 
The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals. To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. 
A coherent strategy is essential in order to establish and sustain the UK as an international leader of well-curated national data assets and computational infrastructure, which is expertly used to shape policy, support decisions, empower researchers and to roll out the results to the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation plus the skills developed will launch significant advances in research, in business, in professional practice and in government with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.

Relevance:

10.00%

Abstract:

Various applications for the purposes of event detection, localization, and monitoring can benefit from the use of wireless sensor networks (WSNs). Wireless sensor networks are generally easy to deploy, have flexible topology, and can support a diversity of tasks thanks to the large variety of sensors that can be attached to the wireless sensor nodes. To guarantee the efficient operation of such heterogeneous wireless sensor networks during their lifetime, appropriate management is necessary. Typically, there are three management tasks, namely monitoring, (re)configuration, and code updating. On the one hand, status information, such as battery state and node connectivity, of both the wireless sensor network and the sensor nodes has to be monitored. On the other hand, sensor nodes have to be (re)configured, e.g., by setting the sensing interval. Most importantly, new applications have to be deployed and bug fixes applied during the network lifetime. All management tasks have to be performed in a reliable, time- and energy-efficient manner. The ability to disseminate data from one sender to multiple receivers in a reliable, time- and energy-efficient manner is critical for the execution of the management tasks, especially for code updating. Using multicast communication in wireless sensor networks is an efficient way to handle such traffic patterns. Due to the nature of code updates, a multicast protocol has to support bulky traffic and end-to-end reliability. Further, the limited resources of wireless sensor nodes demand an energy-efficient operation of the multicast protocol. Current data dissemination schemes do not fulfil all of the above requirements. In order to close the gap, we designed the Sensor Node Overlay Multicast (SNOMC) protocol to support reliable, time-efficient and energy-efficient dissemination of data from one sender node to multiple receiver nodes. 
In contrast to other multicast transport protocols, which do not support reliability mechanisms, SNOMC supports end-to-end reliability using a NACK-based reliability mechanism. The mechanism is simple and easy to implement and can significantly reduce the number of transmissions. It is complemented by a data acknowledgement after successful reception of all data fragments by the receiver nodes. In SNOMC, three different caching strategies are integrated for efficient handling of the necessary retransmissions, namely caching on each intermediate node, caching on branching nodes, or caching only on the sender node. Moreover, an option was included to pro-actively request missing fragments. SNOMC was evaluated both in the OMNeT++ simulator and in our in-house real-world testbed, and compared to a number of common data dissemination protocols, such as Flooding, MPR, TinyCubus, PSFQ, and both UDP and TCP. The results showed that SNOMC outperforms the selected protocols in terms of transmission time, number of transmitted packets, and energy consumption. Moreover, we showed that SNOMC performs well with different underlying MAC protocols, which support different levels of reliability and energy-efficiency. Thus, SNOMC can offer a robust, high-performing solution for the efficient distribution of code updates and management information in a wireless sensor network. To address the three management tasks, in this thesis we developed the Management Architecture for Wireless Sensor Networks (MARWIS). MARWIS is specifically designed for the management of heterogeneous wireless sensor networks. A distinguishing feature of its design is the use of wireless mesh nodes as a backbone, which enables diverse communication platforms and the offloading of functionality from the sensor nodes to the mesh nodes. This hierarchical architecture allows for efficient operation of the management tasks, due to the organisation of the sensor nodes into small sub-networks, each managed by a mesh node. 
Furthermore, we developed an intuitive graphical user interface, which allows non-expert users to easily perform management tasks in the network. In contrast to other management frameworks, such as Mate, MANNA, and TinyCubus, or code dissemination protocols, such as Impala, Trickle, and Deluge, MARWIS offers an integrated solution for monitoring, configuration and code updating of sensor nodes. Integration of SNOMC into MARWIS further increases the efficiency of the management tasks. To our knowledge, our approach is the first to offer a combination of a management architecture with an efficient overlay multicast transport protocol. This combination of SNOMC and MARWIS supports reliable, time- and energy-efficient operation of a heterogeneous wireless sensor network.
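The NACK-based recovery idea described above can be illustrated with a toy simulation: the receiver reports only the fragment ids it is still missing, so the sender retransmits just those instead of the whole code image. This is a schematic sketch, not the SNOMC wire protocol; the names and the uniform loss model are assumptions.

```python
import random

# Toy sketch of NACK-based end-to-end recovery over a lossy link
# (illustrative only; not the SNOMC wire protocol).

def disseminate(n_fragments, loss_rate, rng):
    """Send fragments until the receiver has them all; count the rounds."""
    received = set()
    rounds = 0
    pending = list(range(n_fragments))      # first round: send everything
    while pending:
        rounds += 1
        for frag in pending:
            if rng.random() >= loss_rate:   # fragment survives the lossy link
                received.add(frag)
        # NACK: the receiver reports only the fragment ids still missing
        pending = [i for i in range(n_fragments) if i not in received]
    return rounds  # the receiver then sends a final data acknowledgement

rng = random.Random(1)
print(disseminate(n_fragments=20, loss_rate=0.2, rng=rng))
```

Each retransmission round shrinks roughly by the loss rate, which is why selective NACK-driven repair needs far fewer transmissions than blindly repeating the full fragment set.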