114 results for NEOTROPICAL STREAMS
Abstract:
Big Data presents many challenges related to volume, whether one is interested in studying past datasets or, even more problematically, attempting to work with live streams of data. The most obvious challenge, in a ‘noisy’ environment such as contemporary social media, is to collect the pertinent information: information for a specific study, tweets that can inform emergency services or other responders during an ongoing crisis, or information that gives an advantage to those involved in prediction markets. Often, such a process is iterative, with keywords and hashtags changing over time, and both collection and analytic methodologies need to be continually adapted to respond to this changing information. While many of the datasets collected and analyzed are preformed, that is, built around a particular keyword, hashtag, or set of authors, they still contain a large volume of information, much of which is unnecessary for the current purpose and/or potentially useful for future projects. Accordingly, this panel considers methods for separating and combining data to optimize big data research and report findings to stakeholders. The first paper considers possible coding mechanisms for incoming tweets during a crisis, taking a large stream of incoming tweets and selecting which of those need to be immediately placed in front of responders for manual filtering and possible action. The paper suggests two solutions: content analysis and user profiling. In the former case, aspects of the tweet are assigned a score to assess its likely relationship to the topic at hand and the urgency of the information, whilst the latter attempts to identify those users who are either serving as amplifiers of information or are known as an authoritative source.
Through these techniques, the information contained in a large dataset can be filtered down to match the expected capacity of emergency responders, and knowledge of the core keywords or hashtags relating to the current event is constantly refined for future data collection. The second paper is also concerned with identifying significant tweets, but in this case tweets relevant to a particular prediction market: tennis betting. As increasing numbers of professional sportsmen and sportswomen create Twitter accounts to communicate with their fans, information is being shared regarding injuries, form and emotions that has the potential to impact future results. As has already been demonstrated with leading US sports, such information is extremely valuable. Tennis, as with American Football (NFL) and Baseball (MLB), has paid subscription services which manually filter incoming news sources, including tweets, for information valuable to gamblers, gambling operators, and fantasy sports players. However, whilst such services are still niche operations, much of the value of the information is lost by the time it reaches one of these services. The paper thus considers how information could be filtered from Twitter user lists and hashtag or keyword monitoring, assessing the value of the source, the information, and the prediction markets to which it may relate. The third paper examines methods for collecting Twitter data and following changes in an ongoing, dynamic social movement, such as the Occupy Wall Street movement. It involves the development of technical infrastructure to collect the tweets and make them available for exploration and analysis. A strategy to respond to changes in the social movement is also required, or the resulting tweets will only reflect the discussions and strategies the movement used at the time the keyword list was created; in a way, keyword creation is part strategy and part art.
In this paper we describe strategies for the creation of a social media archive, specifically tweets related to the Occupy Wall Street movement, and methods for continuing to adapt data collection strategies as the movement’s presence on Twitter changes over time. We also discuss the opportunities and methods for extracting smaller slices of data from an archive of social media data to support a multitude of research projects in multiple fields of study. The common theme amongst these papers is that of constructing a dataset, filtering it for a specific purpose, and then using the resulting information to aid in future data collection. The intention is that through the papers presented, and the subsequent discussion, the panel will inform the wider research community not only on the objectives and limitations of data collection, live analytics, and filtering, but also on current and in-development methodologies that could be adopted by those working with such datasets, and how such approaches could be customized depending on the project stakeholders.
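The content-scoring-and-capacity idea from the first paper can be sketched in outline. Everything below is hypothetical: the keyword weights, the urgency terms, and the capacity threshold stand in for whatever a real deployment would learn and continually refine.

```python
# Hypothetical content-analysis filter: score incoming tweets against topic
# keywords and urgency cues, then keep only as many as the responders'
# expected capacity allows. All terms and weights are illustrative.

TOPIC_TERMS = {"flood": 2.0, "evacuate": 3.0, "bridge": 1.0}   # assumed weights
URGENT_TERMS = {"help", "trapped", "injured"}                   # assumed urgency cues

def score_tweet(text: str) -> float:
    words = text.lower().split()
    topic = sum(TOPIC_TERMS.get(w, 0.0) for w in words)
    urgency = sum(1.5 for w in words if w in URGENT_TERMS)
    return topic + urgency

def filter_for_responders(tweets, capacity):
    """Return the `capacity` highest-scoring tweets for manual review."""
    ranked = sorted(tweets, key=score_tweet, reverse=True)
    return ranked[:capacity]

stream = [
    "Traffic is slow today",
    "Help we are trapped by the flood near the bridge",
    "Evacuate now, water rising fast",
]
shortlist = filter_for_responders(stream, capacity=2)
```

In the iterative workflow the panel describes, the surviving tweets would also feed back into the keyword list for the next round of collection.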
Abstract:
Currently, the GNSS computing modes are of two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data in either the RINEX file format or as real-time data streams in the RTCM format. Very little computation is carried out by the reference station. The existing network-based processing modes, regardless of whether they are executed in real-time or post-processed modes, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters, ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for estimated parameters may also be optionally provided. In such a mode the nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models, and the distinction is how the user receiver software deals with corrections from the reference station solutions and the ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. 
With station-based solutions from three reference stations within distances of 22–103 km the user receiver positioning results, with various schemes, show an accuracy improvement of the proposed station-augmented PPP and ambiguity-fixed PPP solutions with respect to the standard float PPP solutions without station augmentation and ambiguity resolutions. Overall, the proposed reference station-based GNSS computing mode can support PPP and RTK positioning services as a simpler alternative to the existing network-based RTK or regionally augmented PPP systems.
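A toy illustration of the distributed flow described above: each reference station publishes its own station-based solution, and a user receiver selects nearby stations and combines their corrections. The field names and the inverse-distance weighting rule are illustrative assumptions, not the paper's actual estimators.

```python
# Toy sketch of reference-station-based GNSS computing: stations publish
# station-based solutions; the user receiver picks stations within range
# and blends their corrections. Numbers and the blending rule are invented.

from dataclasses import dataclass

@dataclass
class StationSolution:
    station_id: str
    distance_km: float      # distance from the user receiver
    clock_bias_m: float     # precise receiver clock (metres)
    zenith_tropo_m: float   # zenith tropospheric delay (metres)

def select_stations(solutions, max_distance_km=110.0):
    """Use all stations within range, mirroring the 22-103 km test setup."""
    return [s for s in solutions if s.distance_km <= max_distance_km]

def combine_tropo(solutions):
    """Inverse-distance-weighted troposphere correction (illustrative rule)."""
    weights = [1.0 / s.distance_km for s in solutions]
    total = sum(weights)
    return sum(w * s.zenith_tropo_m for w, s in zip(weights, solutions)) / total

stations = [
    StationSolution("A", 22.0, 0.4, 2.31),
    StationSolution("B", 65.0, -0.1, 2.36),
    StationSolution("C", 103.0, 0.2, 2.40),
]
usable = select_stations(stations)
tropo_correction = combine_tropo(usable)
```

The point of the sketch is the architecture, not the estimator: the heavy per-station processing happens once, at or near the station, and the user receiver only blends compact correction products.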
Abstract:
The Food and Nutrition stream of the Australasian Child and Adolescent Obesity Research Network (ACAORN) aims to improve the quality of dietary methodologies and the reporting of dietary intake within Australasian child obesity research (http://www.acaorn.org.au/streams/nutrition/). With 2012 marking ACAORN’s 10th anniversary, this commentary profiles a selection of child obesity nutrition research published over the last decade by Food and Nutrition Stream members. In addition, stream activities have included the development of an online selection guide to assist researchers in choosing appropriate dietary intake methodologies (http://www.acaorn.org.au/streams/nutrition/dietary-intake/index.php). The quantity and quality of research to guide effective child obesity prevention and treatment has increased substantially over the last decade. ACAORN provides a successful case study of how research networks can provide a collegial atmosphere to foster and co-ordinate research efforts in an otherwise competitive environment.
Abstract:
In this paper, we present a field trial of a pervasive system called Panorama that is aimed at supporting social awareness in work environments. Panorama is an intelligent situated display in the staff room of an academic department. It artistically represents non-critical user-generated content, such as images from holidays, conferences and other social gatherings, as well as textual messages, on its display. It also captures images and videos from different public spaces of the department and streams them onto the Panorama screen, using appropriate abstraction techniques. We studied the use of Panorama for two weeks and observed how Panorama affected staff members' social awareness and community building. We report that Panorama stimulated curiosity and learning, initiated new interactions and provided a mechanism for cherishing old memories.
Abstract:
The 48 hour game making challenge has been running since 2007. In recent years, we have not only been running a 'game jam' for the local community but we have also been exploring the way in which the event itself and the place of the event has the potential to create its own stories. The 2014 challenge is part of a series of data collection opportunities focussed on the game jam itself and the meaning making that the participants engage in about the event. We continued the data collection commenced in 2012: "Game jams are the creative festivals of the game development community and a game jam is very much an event or performance; its stories are those of subjective experience. Participants return year after year and recount personal stories from previous challenges; arrival in the 48hr location typically inspires instances of individual memory and narration more in keeping with those of a music festival or an oft frequented holiday destination. Since its inception, the 48hr has been heavily documented, from the photo-blogging of our first jam and the twitter streams of more recent events to more formal interviews and documentaries (see Anderson, 2012). We have even had our own moments of Gonzo journalism with an on-site press room one year and an ‘embedded’ journalist another year (Keogh, 2011). In the last two years of the 48hr we have started to explore ways and means to collect more abstract data during the event, that is, empirical data about movement and activity. The intent behind this form of data collection was to explore graphic and computer generated visualisations of the event, not for the purpose of formal analysis but in the service of further story telling." 
[excerpt from truna aka j.turner, Thomas & Owen, 2013] See: truna aka j.turner, Thomas & Owen (2013) Living the indie life: mapping creative teams in a 48 hour game jam and playing with data, Proceedings of the 9th Australasian Conference on Interactive Entertainment, IE'2013, September 30 to October 1, 2013, Melbourne, VIC, Australia
Abstract:
Fisheries and aquaculture are important for food security and income generation, and are critical to the long-term sustainability of many countries. Freshwater prawns have been harvested in the streams and creeks of Vanuatu; however, catches have declined in recent years due to over-exploitation. To satisfy high demand for this product, the Vanuatu government intends to establish economically viable small-scale aquaculture industries. The current project showed that wild Macrobrachium lar in Vanuatu constitute a single population for management purposes and that M. rosenbergii grows much faster than M. lar in simple pond grow-out systems, and hence is a better species for culture in Vanuatu.
Abstract:
An important responsibility of the Environment Protection Authority, Victoria, is to set objectives for levels of environmental contaminants. To support the development of environmental objectives for water quality, a need has been identified to understand the dual impacts of concentration and duration of a contaminant on biota in freshwater streams. For suspended solids contamination, information reported in the Newcombe and Jensen [North American Journal of Fisheries Management, 16(4):693--727, 1996] study of freshwater fish and the daily suspended solids data from the United States Geological Survey stream monitoring network are utilised. The study group was requested to examine the utility of both the Newcombe and Jensen data and the US data, as well as the formulation of a procedure for use by the Environment Protection Authority Victoria that takes concentration and duration of harmful episodes into account when assessing water quality. The extent to which the impact of a toxic event on fish health could be modelled deterministically was also considered. It was found that concentration and exposure duration were the main factors compounding the severity of effects of suspended solids on freshwater fish. A protocol for assessing the cumulative effect on fish health and a simple deterministic model, based on the biology of gill harm and recovery, were proposed.
References
D. W. T. Au, C. A. Pollino, R. S. S. Wu, P. K. S. Shin, S. T. F. Lau, and J. Y. M. Tang. Chronic effects of suspended solids on gill structure, osmoregulation, growth, and triiodothyronine in juvenile green grouper Epinephelus coioides. Marine Ecology Progress Series, 266:255--264, 2004.
J. C. Bezdek, S. K. Chuah, and D. Leep. Generalized k-nearest neighbor rules. Fuzzy Sets and Systems, 18:237--26, 1986.
E. T. Champagne, K. L. Bett-Garber, A. M. McClung, and C. Bergman. Sensory characteristics of diverse rice cultivars as influenced by genetic and environmental factors. Cereal Chem., 81:237--243, 2004.
S. G. Cheung and P. K. S. Shin. Size effects of suspended particles on gill damage in green-lipped mussel Perna viridis. Marine Pollution Bulletin, 51(8--12):801--810, 2005.
D. H. Evans. The fish gill: site of action and model for toxic effects of environmental pollutants. Environmental Health Perspectives, 71:44--58, 1987.
G. C. Grigg. The failure of oxygen transport in a fish at low levels of ambient oxygen. Comp. Biochem. Physiol., 29:1253--1257, 1969.
G. Holmes, A. Donkin, and I. H. Witten. Weka: a machine learning workbench. In Proceedings of the Second Australia and New Zealand Conference on Intelligent Information Systems, volume 24, pages 357--361, Brisbane, Australia, 1994. IEEE Computer Society.
D. D. Macdonald and C. P. Newcombe. Utility of the stress index for predicting suspended sediment effects: response to comments. North American Journal of Fisheries Management, 13:873--876, 1993.
C. P. Newcombe. Suspended sediment in aquatic ecosystems: ill effects as a function of concentration and duration of exposure. Technical report, British Columbia Ministry of Environment, Lands and Parks, Habitat Protection Branch, Victoria, 1994.
C. P. Newcombe and J. O. T. Jensen. Channel suspended sediment and fisheries: a synthesis for quantitative assessment of risk and impact. North American Journal of Fisheries Management, 16(4):693--727, 1996.
C. P. Newcombe and D. D. Macdonald. Effects of suspended sediments on aquatic ecosystems. North American Journal of Fisheries Management, 11(1):72--82, 1991.
K. Schmidt-Nielsen. Scaling: Why Is Animal Size So Important? Cambridge University Press, NY, 1984.
J. S. Schwartz, A. Simon, and L. Klimetz. Use of fish functional traits to associate in-stream suspended sediment transport metrics with biological impairment. Environmental Monitoring and Assessment, 179(1--4):347--369, 2011.
E. A. Shaw and J. S. Richardson. Direct and indirect effects of sediment pulse duration on stream invertebrate assemblages and rainbow trout (Oncorhynchus mykiss) growth and survival. Canadian Journal of Fisheries and Aquatic Sciences, 58:2213--2221, 2001.
P. Tiwari and H. Hasegawa. Demand for housing in Tokyo: a discrete choice analysis. Regional Studies, 38:27--42, 2004.
Y. Tramblay, A. Saint-Hilaire, T. B. M. J. Ouarda, F. Moatar, and B. Hecht. Estimation of local extreme suspended sediment concentrations in California rivers. Science of the Total Environment, 408:4221--
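The concentration-and-duration dependence at the heart of this abstract is captured by the Newcombe and Jensen (1996) severity-of-ill-effect (SEV) form, a linear function of the natural logarithms of exposure duration and suspended solids concentration. A minimal sketch follows; the coefficients a, b, c are placeholders, since the paper tabulates fitted values per taxonomic group and those should be used in practice.

```python
import math

# Severity-of-ill-effect (SEV) sketch in the form used by Newcombe and
# Jensen (1996): SEV = a + b*ln(duration) + c*ln(concentration).
# The default coefficients below are placeholders, not fitted values.

def severity(duration_h, concentration_mg_l, a=1.0, b=0.6, c=0.7):
    return a + b * math.log(duration_h) + c * math.log(concentration_mg_l)

# Severity grows with both the concentration and the duration of an episode:
short_clean = severity(duration_h=1.0, concentration_mg_l=10.0)
long_dirty = severity(duration_h=24.0, concentration_mg_l=500.0)
```

This form is what makes a concentration-times-duration protocol possible: a short, highly turbid pulse and a long, mildly turbid episode can be placed on the same severity scale.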
Abstract:
The previous chapters gave an insightful introduction into the various facets of Business Process Management. We now share a rich understanding of the essential ideas behind designing and managing processes for organizational purposes. We have also learned about the various streams of research and development that have influenced contemporary BPM. As a matter of fact, BPM has become a holistic management discipline. As such, it requires that a plethora of facets be addressed for its successful and sustainable application. This chapter provides a framework that consolidates and structures the essential factors that constitute BPM as a whole. Drawing from research in the field of maturity models, we suggest six core elements of BPM: strategic alignment, governance, methods, information technology, people, and culture. These six elements serve as the structure for this BPM Handbook.
Abstract:
In many cities around the world, surveillance by a pervasive net of CCTV cameras is a common phenomenon in an attempt to uphold safety and security across the urban environment. Video footage is being recorded and stored, and sometimes live feeds are watched in control rooms hidden from public access and view. In this study, we were inspired by Steve Mann’s original work on sousveillance (surveillance from below) to examine how a network of camera-equipped urban screens could allow the residents of Oulu in Finland to collaborate on the safekeeping of their city. An agile, rapid prototyping process led to the design, implementation and ‘in the wild’ deployment of the UbiOpticon screen application. Live video streams captured by webcams integrated at the top of 12 distributed urban screens were broadcast and displayed in a matrix arrangement on all screens. The matrix also included live video streams from two roaming mobile phone cameras. In our field study we explored the reactions of passers-by and users of this screen application, which seeks to invert Bentham’s original panopticon by allowing the watched to be watchers at the same time. In addition to the original goal of participatory sousveillance, the system’s live video feature sparked fun and novel user-led appropriations.
Abstract:
United States copyright law -- two streams of computer copyright cases form basis for 'look and feel' litigation, literary work stream and audiovisual work stream -- literary work stream focuses on structure -- audiovisual work stream addresses appearance -- case studies
Abstract:
Accurate prediction of incident duration is not only important information for a Traffic Incident Management System, but also an effective input for travel time prediction. In this paper, hazard-based prediction models are developed for both incident clearance time and arrival time. The data were obtained from the Queensland Department of Transport and Main Roads’ STREAMS Incident Management System (SIMS) for one year ending in November 2010. The best-fitting distributions are drawn for both clearance and arrival time for three types of incident: crash, stationary vehicle, and hazard. The results show that the Gamma, Log-logistic, and Weibull distributions are the best fit for crash, stationary vehicle, and hazard incidents, respectively. The significant impact factors are identified for crash clearance time and arrival time, and their quantitative influences for crash and hazard incidents are presented for both clearance and arrival. Model accuracy is analyzed at the end.
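The distribution-selection step can be illustrated with a toy log-likelihood comparison between candidate duration models. The clearance times and parameter values below are invented, and the second candidate is a simple exponential rather than the Gamma and Log-logistic alternatives the paper also fits; a real analysis would estimate all parameters from the SIMS data.

```python
import math

# Illustrative model selection for incident clearance times: score each
# candidate distribution by log-likelihood and keep the best. Data and
# parameters are hypothetical.

def weibull_loglik(xs, shape, scale):
    """Log-likelihood under a Weibull(shape, scale) density."""
    return sum(
        math.log(shape / scale)
        + (shape - 1) * math.log(x / scale)
        - (x / scale) ** shape
        for x in xs
    )

def expon_loglik(xs, rate):
    """Log-likelihood under an Exponential(rate) density."""
    return sum(math.log(rate) - rate * x for x in xs)

clearance_h = [0.5, 0.8, 1.1, 1.3, 1.6, 2.0, 2.8]  # hypothetical clearance times

candidates = {
    "weibull(k=1.8, lam=1.6)": weibull_loglik(clearance_h, 1.8, 1.6),
    "exponential(rate=0.7)": expon_loglik(clearance_h, 0.7),
}
best_fit = max(candidates, key=candidates.get)
```

With equal parameter counts the log-likelihoods can be compared directly; with differing counts an information criterion such as AIC would be the usual tiebreaker.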
Abstract:
The need for strong science, technology and innovation linkages between Higher Education Institutions (HEIs) and industries is pivotal for middle-income countries in their endeavor to enhance human capital for socioeconomic development. Currently, university-industry partnerships are at an infant stage in the Sri Lankan higher education context. Technological maturity and effective communication skills are contributing factors to an efficient graduate profile. Expanding internship programs, in particular for STEM disciplines, also provides work experience to students that strengthens the relevance of higher education programs. This study reports historical overviews and current trends in STEM education in Sri Lanka, with emphasis on recent technological and higher education curricular reforms. Data from the last 10 years were extracted from the higher education sector and Ministry of Higher Education policy portfolios. Associations and trend analyses of sector growth were compared with STEM existence, mergers and predicted augmentations. Results were summarised by STEM streams and disciplines. It was observed that STEM augmentation in the Sri Lankan higher education context is growing at a slow but steady pace. Further analysis with other sectors, in particular industry information, would be a useful and worthwhile exercise.
Abstract:
As part of a wider study to develop an ecosystem-health monitoring program for wadeable streams of south-eastern Queensland, Australia, comparisons were made regarding the accuracy, precision and relative efficiency of single-pass backpack electrofishing and multiple-pass electrofishing plus supplementary seine netting to quantify fish assemblage attributes at two spatial scales (within discrete mesohabitat units and within stream reaches consisting of multiple mesohabitat units). The results demonstrate that multiple-pass electrofishing plus seine netting provide more accurate and precise estimates of fish species richness, assemblage composition and species relative abundances in comparison to single-pass electrofishing alone, and that intensive sampling of three mesohabitat units (equivalent to a riffle-run-pool sequence) is a more efficient sampling strategy to estimate reach-scale assemblage attributes than less intensive sampling over larger spatial scales. This intensive sampling protocol was sufficiently sensitive that relatively small differences in assemblage attributes (<20%) could be detected with a high statistical power (1-β > 0.95) and that relatively few stream reaches (<4) need be sampled to accurately estimate assemblage attributes close to the true population means. The merits and potential drawbacks of the intensive sampling strategy are discussed, and it is deemed to be suitable for a range of monitoring and bioassessment objectives.
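The power and sample-size claims above can be illustrated with a back-of-envelope two-sample calculation under a normal approximation. The coefficient-of-variation values below are hypothetical and not taken from the study; they only show how a small between-reach variability drives the required number of reaches down.

```python
import math
from statistics import NormalDist

# Back-of-envelope sample size for detecting a given relative difference in
# an assemblage attribute between two groups of reaches, using the standard
# two-sample normal approximation. CV values are hypothetical.

def reaches_needed(relative_diff, cv, alpha=0.05, power=0.95):
    """Reaches per group to detect `relative_diff` when the attribute's
    coefficient of variation across reaches is `cv`."""
    z = NormalDist().inv_cdf
    za = z(1 - alpha / 2)   # two-sided significance
    zb = z(power)           # desired power (1 - beta)
    n = 2 * ((za + zb) * cv / relative_diff) ** 2
    return math.ceil(n)

# A 20% difference with a hypothetical 7% CV needs very few reaches:
n_reaches = reaches_needed(relative_diff=0.2, cv=0.07)
```

The same formula shows why halving the detectable difference roughly quadruples the number of reaches required.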
Abstract:
This article develops methods for spatially predicting daily change of dissolved oxygen (Dochange) at both sampled locations (134 freshwater sites in 2002 and 2003) and other locations of interest throughout a river network in South East Queensland, Australia. In order to deal with the relative sparseness of the monitoring locations in comparison to the number of locations where one might want to make predictions, we make a classification of the river and stream locations. We then implement optimal spatial prediction (ordinary and constrained kriging) from geostatistics. Because of their directed-tree structure, rivers and streams offer special challenges. A complete approach to spatial prediction on a river network is given, with special attention paid to environmental exceedances. The methodology is used to produce a map of Dochange predictions for 2003. Dochange is one of the variables measured as part of the Ecosystem Health Monitoring Program conducted within the Moreton Bay Waterways and Catchments Partnership.
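Ordinary kriging, one of the two predictors the article applies, can be sketched in a few lines. Note the simplifications: this toy uses Euclidean distances and an assumed exponential covariance, whereas the article pays special attention to the directed-tree structure of river networks; the site coordinates and Dochange-style values are invented.

```python
import math

# Bare-bones ordinary kriging: solve for weights that minimise prediction
# variance subject to the weights summing to one, then predict as a
# weighted sum of observations. Covariance model and data are assumed.

def cov(h, sill=1.0, corr_range=5.0):
    """Assumed exponential covariance model."""
    return sill * math.exp(-h / corr_range)

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting (no NumPy needed)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ordinary_kriging(sites, values, target):
    """Predict at `target` as a weighted sum of observed `values`."""
    n = len(sites)
    # Kriging system: site-to-site covariances plus the unbiasedness
    # constraint (a row/column of ones) forcing the weights to sum to one.
    A = [[cov(math.dist(sites[i], sites[j])) for j in range(n)] + [1.0]
         for i in range(n)]
    A.append([1.0] * n + [0.0])
    b = [cov(math.dist(s, target)) for s in sites] + [1.0]
    w = solve(A, b)[:n]
    return sum(wi * zi for wi, zi in zip(w, values)), w

# Hypothetical Dochange-style values at three monitoring sites:
sites = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
values = [0.2, -0.1, 0.4]
pred, weights = ordinary_kriging(sites, values, (1.0, 1.0))
```

On a real river network, the Euclidean `math.dist` would be replaced by a stream-distance (or flow-connected) measure with a covariance model that remains valid on a directed tree, which is exactly the special challenge the article addresses.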
Abstract:
Catchment and riparian degradation has resulted in declining ecosystem health of streams worldwide. With restoration a priority in many regions, there is an increasing interest in the scale at which land use influences stream ecosystem health. Our goal was to use a substantial data set collected as part of a monitoring program (the Southeast Queensland, Australia, Ecological Health Monitoring Program data set, collected at 116 sites over six years) to identify the spatial scale of land use, or the combination of spatial scales, that most strongly influences overall ecosystem health. In addition, we aimed to determine whether the most influential scale differed for different aspects of ecosystem health. We used linear mixed models and a Bayesian model-averaging approach to generate models for the overall aggregated ecosystem health score and for each of the five component indicators (fish, macroinvertebrates, water quality, nutrients, and ecosystem processes) that make up the score. Dense forest close to the survey site, mid-dense forest in the hydrologically active near-stream areas of the catchment, urbanization in the riparian buffer, and tree cover at the reach scale were all significant in explaining ecosystem health, suggesting an overriding influence of forest cover, particularly close to the stream. Season and antecedent rainfall were also important explanatory variables, with some land-use variables showing significant seasonal interactions. There were also differential influences of land use for each of the component indicators. Our approach is useful given that restoring general ecosystem health is the focus of many stream restoration projects; it allowed us to predict the scale and catchment position of restoration that would result in the greatest improvement of ecosystem health in the region's streams and rivers.
The models we generated suggested that good ecosystem health can be maintained in catchments where 80% of hydrologically active areas in close proximity to the stream have mid-dense forest cover and moderate health can be obtained with 60% cover.
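The model-averaging idea can be illustrated with Akaike-style weights, a common information-criterion stand-in for the Bayesian model averaging the study actually uses over its linear mixed models. The model names, AIC values, and per-model health-score predictions below are invented for illustration.

```python
import math

# Akaike-weight model averaging: each candidate model (here, a different
# spatial scale of land use) gets a weight from its AIC, and the final
# prediction averages the candidates by those weights. All numbers invented.

candidate_models = {
    "forest_near_site": 210.4,       # hypothetical AIC values
    "riparian_urbanization": 212.1,
    "reach_tree_cover": 215.8,
}
predictions = {                       # hypothetical health-score predictions
    "forest_near_site": 0.72,
    "riparian_urbanization": 0.65,
    "reach_tree_cover": 0.60,
}

best_aic = min(candidate_models.values())
raw = {name: math.exp(-0.5 * (aic - best_aic))
       for name, aic in candidate_models.items()}
total = sum(raw.values())
akaike_weights = {name: w / total for name, w in raw.items()}

# The averaged prediction lets every candidate scale contribute in
# proportion to its support, instead of committing to one scale up front.
averaged_pred = sum(akaike_weights[n] * predictions[n] for n in predictions)
```

This is why averaging suits the restoration question: uncertainty about which spatial scale matters is carried through into the prediction rather than resolved by a single arbitrary choice.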