246 results for vendors
Abstract:
Substation Automation Systems have undergone many transformational changes triggered by improvements in technologies. Prior to the digital era, it made sense to confirm that the physical wiring matched the schematic design by meticulous and laborious point-to-point testing. In this way, human errors in either the design or the construction could be identified and fixed prior to entry into service. However, even though modern secondary systems today are largely computerised, we still undertake commissioning testing using the same philosophy, as if each signal were hard wired. This is slow and tedious and does not do justice to modern computer systems and software automation. One of the major architectural advantages of the IEC 61850 standard is that it "abstracts" the definition of data and services independently of any protocol, allowing them to be mapped to any protocol that can meet the modelling and performance requirements. On this basis, any substation element can be defined using these common building blocks, which are made available at the design, configuration and operational stages of the system. The primary advantage of accessing data using this methodology rather than the traditional position-based method (such as DNP 3.0) is that generic tools can be created to manipulate data. Self-describing data contains the information that these tools need to manipulate different data types correctly. More importantly, self-describing data makes the interface between programs robust and flexible. This paper proposes that the improved data definitions and methods for dealing with this data within a tightly bound and compliant IEC 61850 Substation Automation System could completely revolutionise how such systems are tested when compared to traditional point-to-point methods. Using the outcomes of an undergraduate thesis project, we can demonstrate with some certainty that it is possible to automatically test the configuration of a protection relay by comparing the IEC 61850 configuration extracted from the relay against its SCL file, for multiple relay vendors. The software tool provides a quick and automatic check that the data sets on a particular relay are correct according to its CID file, thus ensuring that no unexpected modifications are made at any stage of the commissioning process. This tool has been implemented in a Java programming environment using an open source IEC 61850 library to facilitate the server-client association with the relay.
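To illustrate the kind of check such a tool performs, the following is a minimal Java sketch that compares the data-set members declared in a CID file against those read back from the relay. The retrieval of the two maps (via an SCL parser and an IEC 61850 client association) is assumed to happen elsewhere; all names and sample values below are hypothetical.

```java
import java.util.*;

/**
 * Minimal sketch of the configuration check described above: compare the
 * data-set members declared in the relay's CID file against those reported
 * by the live relay. The two input maps are assumed to have been populated
 * elsewhere (e.g. by an SCL parser and an IEC 61850 client association);
 * the sample values below are hypothetical placeholders.
 */
public class DataSetConsistencyCheck {

    /** Report members that appear in one source but not the other. */
    static boolean compare(Map<String, Set<String>> fromCid,
                           Map<String, Set<String>> fromRelay) {
        boolean consistent = true;
        Set<String> allDataSets = new TreeSet<>(fromCid.keySet());
        allDataSets.addAll(fromRelay.keySet());
        for (String ds : allDataSets) {
            Set<String> expected = fromCid.getOrDefault(ds, Set.of());
            Set<String> actual   = fromRelay.getOrDefault(ds, Set.of());
            Set<String> missing = new TreeSet<>(expected);
            missing.removeAll(actual);                 // declared but not on relay
            Set<String> unexpected = new TreeSet<>(actual);
            unexpected.removeAll(expected);            // on relay but not declared
            if (!missing.isEmpty() || !unexpected.isEmpty()) {
                consistent = false;
                System.out.println("Data set " + ds + ":");
                missing.forEach(m -> System.out.println("  missing on relay: " + m));
                unexpected.forEach(u -> System.out.println("  not in CID file:  " + u));
            }
        }
        return consistent;
    }

    public static void main(String[] args) {
        // Hypothetical data-set members keyed by data-set reference.
        Map<String, Set<String>> cid = Map.of(
                "LD0/LLN0$GooseDS",
                Set.of("LD0/PTOC1.Op.general", "LD0/PTRC1.Tr.general"));
        Map<String, Set<String>> relay = Map.of(
                "LD0/LLN0$GooseDS",
                Set.of("LD0/PTOC1.Op.general"));
        System.out.println(compare(cid, relay) ? "Configuration matches CID file."
                                               : "Mismatches found.");
    }
}
```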
Abstract:
Service compositions enable users to realize their complex needs as a single request. Despite intensive research, especially in the areas of business processes, web services and grids, an open and valid question is still how to manage service compositions in order to satisfy both functional and non-functional requirements as well as adapt to dynamic changes. In this paper we propose a (functional) architecture for adaptive management of QoS-aware service compositions. Compared to other existing architectures, this one offers two major advantages. Firstly, it supports various execution strategies based on dynamic selection and negotiation of services included in a service composition, contracting based on service level agreements, service enactment with flexible support for exception handling, monitoring of service level objectives, and profiling of execution data. Secondly, the architecture is built on the basis of well-known existing standards to communicate and exchange data, which significantly reduces the effort needed to integrate existing solutions and tools from different vendors. A first prototype of this architecture has been implemented within the EU-funded Adaptive Service Grid project. © 2006 Springer-Verlag.
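As a rough illustration of the functional components named above, the following Java interface sketch maps one component to each function (selection, negotiation and SLA contracting, enactment with exception handling, SLO monitoring, profiling). The interface and method names are assumptions introduced for illustration, not the project's actual API.

```java
// Illustrative skeleton only: component names and signatures are assumptions
// drawn from the functions listed in the abstract, not the project's API.
import java.util.List;

interface ServiceSelector {            // dynamic selection of candidate services
    List<String> selectCandidates(String taskDescription);
}

interface Negotiator {                 // negotiation and SLA contracting
    String contract(String serviceId); // returns an SLA identifier
}

interface Enactor {                    // enactment with exception handling
    void execute(String compositionId) throws Exception;
}

interface SloMonitor {                 // monitoring of service level objectives
    boolean withinObjectives(String slaId);
}

interface Profiler {                   // profiling of execution data
    void record(String compositionId, long durationMillis, boolean success);
}
```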
Abstract:
Big Data and predictive analytics have received significant attention from the media and academic literature throughout the past few years, and it is likely that these emerging technologies will materially impact the mining sector. This short communication argues, however, that these technological forces will probably unfold differently in the mining industry than they have in many other sectors because of significant differences in the marginal cost of data capture and storage. To this end, we offer a brief overview of what Big Data and predictive analytics are, and explain how they are bringing about changes in a broad range of sectors. We discuss the "N=all" approach to data collection being promoted by many consultants and technology vendors in the marketplace but, by considering the economic and technical realities of data acquisition and storage, we then explain why an "n ≪ all" data collection strategy probably makes more sense for the mining sector. Finally, towards shaping the industry's policies with regard to technology-related investments in this area, we conclude by putting forward a conceptual model for leveraging Big Data tools and analytical techniques that is a more appropriate fit for the mining sector.
Abstract:
Urban planning policies in Australia presuppose apartments as the new dominant housing type, but much of what the market has delivered is criticised as over-development, and as being generic, poorly designed, environmentally unsustainable and unaffordable. Policy responses to this problem typically focus on planning regulation and construction costs as the primary issues needing to be addressed in order to increase the supply of quality, affordable apartment housing. In contrast, this paper uses Ball's (1983) 'structures of provision' approach to outline the key processes informing apartment development and identifies a substantial gap in critical understanding of how apartments are developed in Australia. This reveals economic problems not typically considered by policymakers. Using mainstream economic analysis to review the market itself, the authors found that high search costs, demand risk, problems with exchange, and lack of competition present key barriers to achieving greater affordability and limit the extent to which 'speculative' developers can respond to the preferences of would-be owner-occupiers of apartments. The existing development model, which is reliant on capturing uplift in site value, suits investors seeking rental yields in the first instance and capital gains in the second, and actively encourages housing price inflation. This is exacerbated by the lack of density restrictions, such as has existed in inner Melbourne for many years, which permits greater yields on redevelopment sites. The price of land in the vicinity of such redevelopment sites is pushed up as landholders' expectations of future yield are raised. All too frequently, existing redevelopment sites go back onto the market as vendors seek to capture the uplift in site value and exit the project in a risk-free manner...
Abstract:
Reviewers' ratings have become one of the most influential parameters when making a decision to purchase or rent products or services from online vendors. The star rating system is the de facto standard for rating a product. It is regarded as one of the most visually appealing rating systems that directly interact with consumers, helping them find products they would like to purchase as well as register their views on a product. It offers a visual advantage for picking the popular or most-rated product, and any system that is not as appealing as the star system risks rejection by the online business community. This paper argues that visual advantage alone is not enough to declare the star rating system triumphant; the success of a ranking system should be measured by how effectively it helps customers make decisions that they, retrospectively, consider correct. The paper therefore suggests a novel approach of Relative Ranking within the boundaries of the star rating system to overcome a few of the star system's inherent disadvantages. © Springer Science+Business Media B.V. 2010.
Abstract:
Business processes and application functionality are becoming available as internal web services inside enterprise boundaries as well as commercial web services from enterprise solution vendors and web services marketplaces. Typically there are multiple web service providers offering services capable of fulfilling a particular functionality, although with different Quality of Service (QoS). Dynamic creation of business processes requires composing an appropriate set of web services that best suits the current need. This paper presents a novel combinatorial auction approach to QoS-aware dynamic web services composition. Such an approach would enable not only stand-alone web services but also composite web services to be part of a business process. The combinatorial auction leads to an integer programming formulation for the web services composition problem. An important feature of the model is the incorporation of service level agreements. We describe a software tool, QWESC, for QoS-aware web services composition based on the proposed approach.
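For concreteness, the block below is a generic integer programming sketch of QoS-aware composition in which one candidate service is selected per task of the process. It is an illustrative simplification, not the paper's actual combinatorial auction formulation (where bids may cover bundles of services). Here u_ij, c_ij and t_ij denote the utility, cost and response time of candidate j for task i, and C_max, T_max are SLA bounds; all symbols are assumptions introduced for illustration.

```latex
% Illustrative ILP for QoS-aware composition (simplified, per-task selection):
% binary x_{ij} = 1 iff candidate service j is chosen for task i of the process.
\begin{align}
\max_{x}\quad & \sum_{i=1}^{n}\sum_{j\in S_i} u_{ij}\,x_{ij} \\
\text{s.t.}\quad & \sum_{j\in S_i} x_{ij} = 1 \quad (i=1,\dots,n)
      && \text{one service per task}\\
& \sum_{i=1}^{n}\sum_{j\in S_i} c_{ij}\,x_{ij} \le C_{\max}
      && \text{cost budget from the SLA}\\
& \sum_{i=1}^{n}\sum_{j\in S_i} t_{ij}\,x_{ij} \le T_{\max}
      && \text{response-time bound (sequential flow)}\\
& x_{ij} \in \{0,1\}
\end{align}
```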
Abstract:
The future of civic engagement is characterised by both technological innovation and new technological user practices that are fuelled by trends towards mobile, personal devices; broadband connectivity; open data; urban interfaces; and cloud computing. These technology trends are progressing at a rapid pace and have led global technology vendors to package and sell the "Smart City" as a centralised service delivery platform predicted to optimise and enhance cities' key performance indicators, and to generate a profitable market. The top-down deployment of these large and proprietary technology platforms has helped sectors such as energy, transport, and healthcare to increase efficiencies. However, an increasing number of scholars and commentators warn of another "IT bubble" emerging. Along with some city leaders, they argue that the top-down approach does not fit the governance dynamics and values of a liberal democracy when applied across sectors. A thorough understanding is required of the socio-cultural nuances of how people work, live and play across different environments, and how they employ social media and mobile devices to interact with, engage in, and constitute public realms. Although the term "slacktivism" is sometimes used to denote a watered-down version of civic engagement and activism that is reduced to clicking a "Like" button and signing online petitions, we believe that we are far from witnessing another Biedermeier period that saw people focus on the domestic and the non-political. There is plenty of evidence to the contrary, such as the post-election violence in Kenya in 2008, the Occupy movements in New York, Hong Kong and elsewhere, the Arab Spring, Stuttgart 21, Fukushima, Taksim Gezi Park in Istanbul, and the Vinegar Movement in Brazil in 2013. These examples of civic action shape the dynamics of governments and, in turn, call for new processes to be incorporated into governance structures. Participatory research into these new processes across the triad of people, place and technology is a significant and timely investment to foster productive, sustainable, and liveable human habitats. With this article, we want to reframe the current debates in academia and the priorities in industry and government to allow citizens and civic actors to take their rightful place at the centre of civic movements. This calls for new participatory approaches for co-inquiry and co-design. It is an evolving process with an explicit agenda to facilitate change, and we propose participatory action research (PAR) as an indispensable component in the journey to develop new governance infrastructures and practices for civic engagement. We do not limit our definition of civic technologies to tools specifically designed to simply enhance government and governance, such as renewing your car registration online or casting your vote electronically on election day. Rather, we are interested in civic media and technologies that foster citizen engagement in the widest sense, and particularly in the participatory design of such civic technologies, which strive to involve citizens in political debate and action as well as question conventional approaches to political issues. The rationale for this approach is to offer an alternative to smart cities in a "perpetual tomorrow," based on the many weak and strong signals of civic action revolving around technology seen today.
It seeks to emphasise and direct attention to active citizenry over passive consumerism, human actors over human factors, culture over infrastructure, and prosperity over efficiency. First, we will have a look at some fundamental issues arising from applying simplistic smart city visions to the kind of problem a city poses. We focus on the touch points between "the city" and its civic body, the citizens. In order to provide for meaningful civic engagement, the city must provide appropriate interfaces.
Abstract:
Triggered by the very quick proliferation of Internet connectivity, electronic document management (EDM) systems are now rapidly being adopted for managing the documentation that is produced and exchanged in construction projects. Nevertheless, there are still substantial barriers to the efficient use of such systems, mainly of a psychological nature and related to insufficient training. This paper presents the results of empirical studies carried out during 2002 concerning the current usage of EDM systems in the Finnish construction industry. The studies employed three different methods in order to provide a multifaceted view of the problem area, at both the industry and individual project level. In order to provide an accurate measurement of overall usage volume in the industry as a whole, telephone interviews were conducted with key personnel from 100 randomly chosen construction projects. The interviews showed that while around one third of big projects have already adopted the use of EDM, very few small projects have adopted this technology. The barriers to introduction were investigated through interviews with representatives of half a dozen providers of systems and ASP services. These interviews shed considerable light on the dynamics of the market for this type of service and illustrated the diversity of business strategies adopted by vendors. In the final study, log files from a project which had used an EDM system were analysed in order to determine usage patterns. The results illustrated that use is still incomplete in coverage and that only a part of the individuals involved in the project used the system efficiently, either as information producers or consumers. The study also provided feedback on the usefulness of the log files.
Abstract:
The wireless LAN (WLAN) market consists of IEEE 802.11 MAC standard conformant devices (e.g., access points (APs) and client adapters) from multiple vendors. Certain third-party certifications, such as those specified by the Wi-Fi Alliance, have been widely used by vendors to ensure basic conformance to the 802.11 standard, thus leading to the expectation that the available devices exhibit identical MAC level behavior. In this paper, however, we present what we believe to be the first-ever set of experimental results that highlight the fact that WLAN devices from different vendors in the market can have heterogeneous MAC level behavior. Specifically, we demonstrate with examples and data that in certain cases devices may not be conformant with the 802.11 standard, while in other cases they may differ in significant details that are not part of the mandatory specifications of the standard. We argue that heterogeneous MAC implementations can adversely impact WLAN operations, leading to unfair bandwidth allocation, potential breakdown of related MAC functionality and difficulties in provisioning the capacity of a WLAN. On the positive side, however, MAC level heterogeneity can be useful in applications such as vendor/model level device fingerprinting.
Abstract:
Large software systems are developed by composing multiple programs. If the programs manipulate and exchange complex data, such as network packets or files, it is essential to establish that they follow compatible data formats. Most of the complexity of data formats is associated with the headers. In this paper, we address compatibility of programs operating over headers of network packets, files, images, etc. As format specifications are rarely available, we infer the format associated with headers by a program as a set of guarded layouts. In terms of these formats, we define and check compatibility of (a) producer-consumer programs and (b) different versions of producer (or consumer) programs. A compatible producer-consumer pair is free of type mismatches and logical incompatibilities such as the consumer rejecting valid outputs generated by the producer. A backward compatible producer (resp. consumer) is guaranteed to be compatible with consumers (resp. producers) that were compatible with its older version. With our prototype tool, we identified 5 known bugs and 1 potential bug in (a) sender-receiver modules of Linux network drivers of 3 vendors and (b) different versions of a TIFF image library.
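The following toy Java sketch illustrates the compatibility notion described above: a header format is represented as a set of guarded layouts, and a producer-consumer pair is flagged as incompatible if the producer can emit a guarded layout that no consumer layout accepts. The representation and the exact matching rule are illustrative assumptions, not the tool's actual format inference.

```java
import java.util.*;

/**
 * Toy illustration of producer-consumer compatibility over guarded layouts:
 * a format is a set of (guard over header fields, field layout) pairs, and
 * the pair is incompatible if the producer can emit a guarded layout that
 * no consumer layout accepts. The record type and matching rule are
 * illustrative assumptions only.
 */
public class GuardedLayoutCheck {

    record GuardedLayout(String guard, List<String> fieldTypes) {}

    /** Return producer layouts that no consumer layout accepts. */
    static List<GuardedLayout> incompatibilities(Set<GuardedLayout> producer,
                                                 Set<GuardedLayout> consumer) {
        List<GuardedLayout> rejected = new ArrayList<>();
        for (GuardedLayout p : producer) {
            boolean accepted = consumer.stream().anyMatch(c ->
                    c.guard().equals(p.guard())
                            && c.fieldTypes().equals(p.fieldTypes()));
            if (!accepted) rejected.add(p);   // valid output the consumer rejects
        }
        return rejected;
    }

    public static void main(String[] args) {
        Set<GuardedLayout> producer = Set.of(
                new GuardedLayout("version==1",
                        List.of("u16 id", "u16 len", "bytes payload")),
                new GuardedLayout("version==2",
                        List.of("u16 id", "u32 len", "bytes payload")));
        Set<GuardedLayout> consumer = Set.of(
                new GuardedLayout("version==1",
                        List.of("u16 id", "u16 len", "bytes payload")));
        // version==2 headers are valid producer outputs the consumer would reject.
        incompatibilities(producer, consumer)
                .forEach(gl -> System.out.println("Incompatible layout: " + gl));
    }
}
```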
Abstract:
The growing number of applications and processing units in modern Multiprocessor Systems-on-Chip (MPSoCs) comes along with reduced time to market. Different IP cores can come from different vendors, and their trust levels are also different, but typically they use a Network-on-Chip (NoC) as their communication infrastructure. An MPSoC can have multiple Trusted Execution Environments (TEEs). Apart from performance, power, and area research in the field of MPSoCs, robust and secure system design is also gaining importance in the research community. To build a secure system, the designer must know beforehand all kinds of attack possibilities for the respective system (MPSoC). In this paper we survey the possible attack scenarios on present-day MPSoCs and investigate a new attack scenario, i.e., a router attack targeted toward the NoC architecture. We show the validity of this attack by analyzing different present-day NoC architectures and show that they are all vulnerable to this type of attack. By launching a router attack, an attacker can control the whole chip very easily, which makes it a very serious issue. Both routing-table and routing-logic-based routers are vulnerable to such attacks. In this paper, we address attacks on routing tables. We propose different monitoring-based countermeasures against routing table-based router attacks in an MPSoC having multiple TEEs. Synthesis results show that the proposed countermeasures, viz. Runtime-monitor, Restart-monitor, Intermediate manager, and Auditor, occupy areas that are 26.6%, 22%, 0.2%, and 12.2% of a routing table-based router's area, respectively. Apart from these, we propose an Ejection address checker and a Local monitoring module inside a router, which cause 3.4% and 10.6% increases in router area, respectively. Simulation results are also given, which show the effectiveness of the proposed monitoring-based countermeasures.
Abstract:
A useful insight into managerial decision making can be gained from simulation of business systems, but existing work on simulation of supply chain behaviour has largely considered non-competitive chains. Where competitive agents have been examined, they have generally had a simple structure and been used for fundamental examination of stability and equilibria rather than providing practical guidance to managers. In this paper, a new agent for the study of competitive supply chain network dynamics is proposed. The novel features of the agent include the ability to select between competing vendors, distribute orders preferentially among many customers, manage production and inventory, and determine price based on competitive behaviour. The structure of the agent is related to existing business models and sufficient details are provided to allow implementation. The agent is tested to demonstrate that it recreates the main results of the existing modelling and management literature on supply chain dynamics. A brief exploration of competitive dynamics is given to confirm that the proposed agent can respond to competition. The results demonstrate that overall profitability for a supply chain network is maximised when businesses operate collectively. It is possible for an individual business to achieve higher profits by adopting a more competitive stance, but the consequence of this is that the overall profitability of the network is reduced. The agent will be of use for a broad range of studies on the long-run effects of management decisions on a network of suppliers and customers.
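As a rough sketch of the agent capabilities listed above (vendor selection, inventory management, competitive pricing), a minimal Java skeleton of this kind could serve as a starting point; the decision rules and numbers are illustrative assumptions rather than the calibrated agent described in the paper.

```java
import java.util.*;

/**
 * Minimal skeleton of a competitive supply chain agent: choose among
 * competing vendors, manage inventory, and set price relative to
 * competitors. Decision rules and numbers are illustrative assumptions.
 */
public class SupplyChainAgent {
    private final String name;
    private double inventory;
    private double price;

    SupplyChainAgent(String name, double initialInventory, double initialPrice) {
        this.name = name;
        this.inventory = initialInventory;
        this.price = initialPrice;
    }

    /** Vendor selection: order the shortfall from the cheapest competing vendor. */
    void replenish(double targetInventory, Map<String, Double> vendorPrices) {
        double shortfall = Math.max(0, targetInventory - inventory);
        if (shortfall == 0) return;
        String vendor = Collections.min(vendorPrices.entrySet(),
                Map.Entry.comparingByValue()).getKey();
        inventory += shortfall;   // simplification: the order arrives immediately
        System.out.printf("%s orders %.1f units from %s%n", name, shortfall, vendor);
    }

    /** Fulfil demand from inventory; unmet demand is simply lost in this sketch. */
    double fulfil(double demand) {
        double shipped = Math.min(demand, inventory);
        inventory -= shipped;
        return shipped;
    }

    /** Competitive pricing: step towards slightly undercutting the market average. */
    void updatePrice(double competitorAveragePrice) {
        price += 0.5 * (0.95 * competitorAveragePrice - price);
    }

    public static void main(String[] args) {
        SupplyChainAgent agent = new SupplyChainAgent("Firm A", 50, 10.0);
        agent.replenish(100, Map.of("VendorX", 4.0, "VendorY", 3.5));
        System.out.printf("Shipped %.1f units%n", agent.fulfil(80));
        agent.updatePrice(9.0);
    }
}
```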
Abstract:
For the first time in its history, the International Symposium on Sea Turtle Biology and Conservation migrated to a site outside of the United States. Thus, the Eighteenth edition was hosted by the Mazatlán Research Unit of the Instituto de Ciencias del Mar y Limnología of the Mexican National Autonomous University (UNAM) in Mazatlán, Sinaloa (Mexico), where it was held from 3-7 March 1998. Above all, our symposium is prominent for its dynamism and enthusiasm in bringing together specialists on the world's sea turtle populations. In an effort to extend this philosophy, and fully aware of how fast the interest in sea turtles has grown, the organizers paid special attention to bringing together as many people as possible. Thanks to the tremendous efforts of the Travel Committee, coupled with special interest from the Latin American region's devotees, we managed to bring together 653 participants from 43 countries. The number of presentations increased significantly too, reaching a total of 265 papers, ranging from cutting-edge scientific reports based on highly sophisticated methods to the experiences and successes of community-based and environmental education programs. A priority of this symposium was to support and encourage the construction of "bridges" across cultural and disciplinary barriers. We succeeded in achieving a multinational dialogue among interest groups: scientists, resource managers, decision makers, NGOs, and private industry. There was a broad representation of the interests that stretch across these sectors, yet everyone was able to listen and offer their own best contribution towards the central theme of the Symposium: the conservation of sea turtles and of the diverse marine and coastal environments in which they develop through their complicated and protracted life cycle. This multidisciplinary approach is highly important at present, as we find ourselves at a crossroads of significant initiatives in the international arena of environmental law, where the conservation of sea turtles has a key role to play. Many, many people worked hard over the previous 12 months to make the symposium a success. Our sincerest thanks to all of them: Program committee: Laura Sarti (chair), Ana Barragán, Rod Mast, Heather Kalb, Jim Spotilla, Richard Reina, Sheryan Epperly, Anna Bass, Steve Morreale, Milani Chaloupka, Robert Van Dam, Lew Ehrhart, J. Nichols, David Godfrey, Larry Herbst, René Márquez, Jack Musick, Peter Dutton, Patricia Huerta, Arturo Juárez, Debora Garcia, Carlos Suárez, German Ramírez, Raquel Briseño, Alberto Abreu; Registration and Secretary: Jane Provancha (chair), Lupita Polanco; Informatics: Germán Ramírez, Carlos Suárez; Cover art: Blas Nayar; Designs: Germán Ramírez, Raquel Briseño, Alberto Abreu; Auction: Rod Mast; Workshops and special meetings: Selina Heppell; Student prizes: Anders Rhodin; Resolutions committee: Juan Carlos Cantú; Local organizing committee: Raquel Briseño, Jane Abreu; Posters: Daniel Ríos and Jeffrey Semminoff; Travel committee: Karen Eckert (chair), Marydele Donnelly, Brendan Godley, Annette Broderick, Jack Frazier; Student travel: Francisco Silva and J. Nichols; Vendors: Tom McFarland and J. Nichols; Volunteer coordination: Richard Byles; Latin American Reunión: Angeles Cruz Morelos; Nominations committee: Randall Arauz, Colleen Coogan, Laura Sarti, Donna Shaver, Frank Paladino.
Once again, Ed Drane worked his usual magic with the Treasury of the Symposium. Significant financial contributions were generously provided by government agencies. SEMARNAP (Mexico's Ministry of Environment, Natural Resources and Fisheries), through its central office, the Mazatlán Regional Fisheries Research Center (CRIP-Mazatlán) and the National Center for Education and Capacity Building for Sustainable Development (CECADESU), contributed to the logistics and covered the costs of auditoria and audiovisual equipment for the Symposium, as well as the teachers and their hotels for the Community Development and Environmental Education workshop of the 5th Latin American Sea Turtle Specialists; DIF (Dept. of Family Affairs) provided free accommodation and food for the more than 100 participants in the Latin American Reunion. In this Reunion, the British Council-Mexico sponsored the workshop on the Project Cycle. The National Chamber of the Fisheries Industry (CANAINPES) kindly sponsored the Symposium's coffee breaks. Personnel from the local Navy (Octava Zona Naval) provided invaluable aid in transport and logistics. The Scientific Coordination Office from UNAM (CICUNAM) and the Latin American Biology Network (RELAB) also provided funding. Our most sincere recognition to all of them. In the name of this Symposium's compilers, I would like to also express our gratitude to Wayne Witzell, Technical Editor, for his guidance and insights, and to Jack Frazier for his help in translating and correcting the English of contributions from some non-native English speakers. Many thanks to Angel Fiscal and Tere Martin, who helped with the typing of the very last corrections and edits for these Proceedings. To all, from around the world, who generously helped make the 18th Symposium a huge success, shared their experiences and listened to ours, our deepest gratitude! (PDF contains 316 pages)
Proceedings of the Seventeenth Annual Sea Turtle Symposium, 4-8 March 1997, Orlando, Florida, U.S.A.
Abstract:
The 17th Annual Sea Turtle Symposium was held at the Delta Orlando Resort in Orlando, Florida, U.S.A. from March 4-8, 1997. The symposium was hosted by Florida Atlantic University, Mote Marine Laboratory, University of Central Florida, University of Florida, and the Comité Nacional para la Conservación y Protección de las Tortugas Marinas. The 17th was the largest symposium to date. A total of 720 participants registered, including sea turtle biologists, students, regulatory personnel, managers, and volunteers representing 38 countries. In addition to the United States, participants represented Australia, Austria, the Bahamas, Bonaire, Bermuda, Brazil, Canada, Colombia, Costa Rica, Croatia, Cuba, Cyprus, Dominican Republic, Ecuador, England, Guatemala, Greece, Honduras, India, Italy, Japan, Madagascar, Malaysia, Mexico, The Netherlands, Nicaragua, Peru, Philippines, Republic of Seychelles, Scotland, Spain, Sri Lanka, Switzerland, Taiwan, Turkey, Uruguay, and Venezuela. In addition to the 79 oral, 2 video, and 120 poster presentations, 3 workshops were offered: Selina Heppell (Duke University Marine Laboratory) provided "Population Modeling," Mike Walsh and Sam Dover (Sea World-Orlando) conducted "Marine Turtle Veterinary Medicine," and "Conservation on Nesting Beaches" was offered by Blair Witherington and David Arnold (Florida Department of Environmental Protection). On the first evening, P.C.H. Pritchard delivered a thoughtful retrospective on Archie Carr that showed many sides of a complex man who studied and wrote about sea turtles. It was a presentation that none of us will forget. The members considered a number of resolutions at the Thursday business meeting and passed six. Five of these resolutions are presented in the Commentaries and Reviews section of Chelonian Conservation and Biology 2(3):442-444 (1997). The symposium was fortunate to have many fine presentations competing for the Archie Carr Best Student Presentation awards. The best oral presentation award went to Amanda Southwood (University of British Columbia) for "Heart rates and dive behavior of the leatherback sea turtle during the internesting interval." The two runners-up were Richard Reina (Australian National University) for "Regulation of salt gland activity in Chelonia mydas" and Shingo Minamikawa (Kyoto University) for "The influence that artificial specific gravity change gives to diving behavior of loggerhead turtles." The winner of this year's best poster competition was Mark Roberts (University of South Florida) for his poster entitled "Global population structure of green sea turtles (Chelonia mydas) using microsatellite analysis of male-mediated gene flow." The two runners-up were Larisa Avens (University of North Carolina-Chapel Hill) for "Equilibrium responses to rotational displacements by hatchling sea turtles: maintaining a migratory heading in a turbulent ocean" and Annette Broderick (University of Glasgow) for "Female size, not length, is a correlate of reproductive output." The symposium was very fortunate to receive a matching monetary and subscription gift from Anders J. G. Rhodin of the Chelonian Research Foundation. These enabled us to more adequately reward the fine work of students. The winners of the best paper and best poster awards received $400 plus a subscription to Chelonian Conservation and Biology. Each runner-up received $100. The symposium owes a great debt to countless volunteers who helped make the meeting a success.
Those volunteers include: Jamie Serino, Alan Bolton, and Karen Bjorndal, who, along with the UF students, provided audio-visual help; John Keinath chaired the student awards committee; Mike Salmon chaired the Program Committee; Sheryan Epperly and Joanne Braun compiled the Proceedings; Edwin Drane served as treasurer and provided much logistical help; Jane Provancha coordinated volunteers; Thelma Richardson conducted registration; Vicki Wiese coordinated food and beverage services; Jamie Serino and Erik Marin coordinated entertainment; Kenneth Dodd oversaw student travel awards; Traci Guynup, Tina Brown, Jerris Foote, Dan Hamilton, Richie Moretti, and Vicki Wiese served on the time and place committee; Blair Witherington created the trivia quiz; Tom McFarland donated the symposium logo; Deborah Crouse chaired the resolutions committee; Pamela Plotkin chaired the nominations committee; Sally Krebs, Susan Schenk, and Larry Wood conducted the silent auction; and Beverly and Tom McFarland coordinated all 26 vendors. Many individuals from outside the United States were able to attend the 17th Annual Sea Turtle Symposium thanks to the tireless work of Karen Eckert, Marydele Donnelly, and Jack Frazier in soliciting travel assistance for a number of international participants. We are indebted to those donating money to the internationals' housing fund (Flo Vetter Memorial Fund, Marinelife Center of Juno Beach, Roger Mellgren, and Jane Provancha). We raise much of our money for international travel from the auction; thanks go to auctioneer Bob Shoop, who kept our auction fast-paced and entertaining, and made sure the bidding was high. The Annual Sea Turtle Symposium is unequaled in its emphasis on international participation. Through international participation we all learn a great deal more about the biology of sea turtles and the conservation issues that sea turtles face in distant waters. Additionally, those attending the symposium come away with a tremendous wealth of knowledge, professional contacts, and new friendships. The Annual Sea Turtle Symposium is a meeting in which pretenses are dropped, good science is presented, and friendly, open communication is the rule. The camaraderie that typifies these meetings ultimately translates into understanding and cooperation. These aspects, combined, have gone and will go a long way toward helping to protect marine turtles and toward aiding their recovery on a global scale. (PDF contains 342 pages)
Abstract:
The Alliance for Coastal Technologies (ACT) Workshop "Technologies and Methodologies for the Detection of Harmful Algae and their Toxins" convened in St. Petersburg, Florida, October 22-24, 2008, and was co-sponsored by ACT (http://act-us.info); the Cooperative Institute for Coastal and Estuarine Environmental Technology (CICEET, http://ciceet.unh.edu); and the Florida Fish and Wildlife Conservation Commission (FWC, http://www.myfwc.com). Participants from various sectors, including researchers, coastal decision makers, and technology vendors, collaborated to exchange information and build consensus. They focused on the status of currently available detection technologies and methodologies for harmful algae (HA) and their toxins, provided direction for developing operational use of existing technology, and addressed requirements for future technology developments in this area. Harmful algal blooms (HABs) in marine and freshwater systems are increasingly common worldwide and are known to cause extensive ecological, economic, and human health problems. In US waters, HABs are encountered in a growing number of locations and are also increasing in duration and severity. This expansion in HABs has led to elevated incidences of poisonous seafood, toxin-contaminated drinking water, mortality of fish and other animals dependent upon aquatic resources (including protected species), public health and economic impacts in coastal and lakeside communities, losses to aquaculture enterprises, and long-term aquatic ecosystem changes. This meeting represented the fourth ACT-sponsored workshop that has addressed, in some form, technology developments for improved monitoring of water-borne pathogens and HA species. A primary motivation was to assess the need and community support for an ACT-led Performance Demonstration of Harmful Algae Detection Technologies and Methodologies in order to facilitate their integration into regional ocean observing system operations. The workshop focused on the identification of region-specific monitoring needs and available technologies and methodologies for detection/quantification of harmful algal species and their toxins along the US marine and freshwater coasts. To address this critical environmental issue, several technologies and methodologies have been, or are being, developed to detect and quantify various harmful algae and their associated toxins in coastal marine and freshwater environments. There are many challenges to nationwide adoption of HAB detection as part of a core monitoring infrastructure: the geographic uniqueness of the primary algal species of concern around the country, the variety of HAB impacts, and the need for a clear vision of the operational requirements for monitoring the various species. Nonetheless, it was the consensus of the workshop participants that ACT should support the development of HA detection technology performance demonstrations, but that these would need to be tuned regionally to the algal species and toxins of concern in order to promote the adoption of state-of-the-art technologies into HAB monitoring networks. [PDF contains 36 pages]