997 results for Remote Collaboration


Relevance:

30.00%

Publisher:

Abstract:

Creative and ad-hoc work often involves non-digital artifacts such as whiteboards and post-it notes. While these preferred methods of brainstorming and idea development facilitate work among collocated participants, they make it particularly difficult to involve remote participants, let alone to handle cases where live social involvement is required and the number and location of remote participants can be vast. Our work originally focused on large distributed teams in business entities, as the vast majority of teams in large organizations are distributed. Our team of corporate researchers set out to identify state-of-the-art technologies that could facilitate the scenarios mentioned above. This paper is an account of a currently running corporate research project in the area of enterprise collaboration, with a strong focus on human-computer interaction in mixed-mode environments, especially in areas of collaboration where computers still play a secondary role. In this paper we signal the potential use of the technology in situations where community involvement is either required or desirable. The goal of the paper is to initiate a discussion on the use of technologies initially designed to support enterprise collaboration in situations requiring community engagement. In other words, it is a contribution of technically focused research exploring the uses of the technology in areas such as social engagement and community involvement. © 2012 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

The Alliance for Coastal Technologies (ACT) convened a workshop, sponsored by the Hawaii-Pacific and Alaska Regional Partners, entitled Underwater Passive Acoustic Monitoring for Remote Regions, at the Hawaii Institute of Marine Biology from February 7-9, 2007. The workshop was designed to summarize existing passive acoustic technologies and their uses, as well as to make strategic recommendations for future development and collaborative programs that use passive acoustic tools for scientific investigation and resource management. The workshop was attended by 29 people representing three sectors: research scientists, resource managers, and technology developers. The majority of passive acoustic tools are being developed by individual scientists for specific applications, and few tools are available commercially. Most scientists are developing hydrophone-based systems to listen for species-specific information on fish or cetaceans; a few scientists are listening for biological indicators of ecosystem health. Resource managers are interested in passive acoustics primarily for vessel detection in remote protected areas and secondarily to obtain biological and ecological information. The military has been monitoring with hydrophones for decades; however, data and signal processing software have not been readily available to the scientific community, and future collaboration is greatly needed. The challenges that impede future development of passive acoustics are surmountable with greater collaboration. Hardware exists and is accessible; the limits are in the software and in the interpretation of sounds and their correlation with ecological events. Collaboration with the military and the private companies it contracts will assist scientists and managers with obtaining and developing software and data analysis tools.
Collaborative proposals among scientists to receive larger pools of money for exploratory acoustic science will further develop the ability to correlate noise with ecological activities. The existing technologies and data analysis are adequate to meet resource managers' needs for vessel detection. However, collaboration is needed among resource managers to prepare large-scale programs that include centralized processing, in an effort to address the lack of local capacity within management agencies to analyze and interpret the data. Workshop participants suggested that ACT might facilitate such collaborations through its website and by providing recommendations to key agencies and programs, such as DOD, NOAA, and IOOS. There is a need to standardize data formats and archive acoustic environmental data at the national and international levels. Specifically, there is a need for local training and primers for public education, as well as for pilot demonstration projects, perhaps in conjunction with National Marine Sanctuaries. Passive acoustic technologies should be implemented immediately to address vessel monitoring needs. Ecological and health monitoring applications should be developed as vessel monitoring programs provide additional data and opportunities for more exploratory research. Passive acoustic monitoring should also be correlated with water quality monitoring to ease integration into long-term monitoring programs, such as the ocean observing systems. [PDF contains 52 pages]

Relevance:

30.00%

Publisher:

Abstract:

This thesis describes applications of cavity-enhanced spectroscopy to remote sensing, chemical kinetics and the detection of transient radical molecular species. Both direct absorption spectroscopy and cavity ring-down spectroscopy are used in this work. Frequency-stabilized cavity ring-down spectroscopy (FS-CRDS) was utilized for measurements of spectral lineshapes of O2 and CO2, to obtain laboratory reference data in support of NASA's OCO-2 mission. FS-CRDS is highly sensitive (> 10 km absorption path length) and precise (> 10000:1 SNR), making it ideal for studying subtle non-Voigt lineshape effects. These advantages of FS-CRDS were further extended to the measurement of kinetic isotope effects: a dual-wavelength variation of FS-CRDS was used to measure precise D/H and 13C/12C methane isotope ratios (sigma > 0.026%) for the purpose of measuring the temperature-dependent kinetic isotope effects of methane oxidation by O(1D) and OH radicals. Finally, direct absorption spectroscopic detection of the trans-DOCO radical via a frequency comb spectrometer was conducted in collaboration with Professor Jun Ye at JILA/University of Colorado.

Relevance:

30.00%

Publisher:

Abstract:

The Finnish Meteorological Institute, in collaboration with the University of Helsinki, has established a new ground-based remote-sensing network in Finland. The network consists of five topographically, ecologically and climatically different sites distributed from southern to northern Finland. The main goal of the network is to monitor air pollution and boundary layer properties in near real time, with a Doppler lidar and a ceilometer at each site. In addition to these operational tasks, two sites are members of the Aerosols, Clouds and Trace gases Research InfraStructure Network (ACTRIS); a Ka-band cloud radar at Sodankylä will provide cloud retrievals within CloudNet, and a multi-wavelength Raman lidar, PollyXT (POrtabLe Lidar sYstem eXTended), in Kuopio provides optical and microphysical aerosol properties through EARLINET (the European Aerosol Research Lidar Network). Three C-band weather radars are located in the Helsinki metropolitan area and are deployed for operational and research applications. We performed two inter-comparison campaigns to investigate the Doppler lidar performance, to compare the backscatter signal and wind profiles, and to optimize the lidar sensitivity by adjusting the telescope focus length and data-integration time, ensuring sufficient signal-to-noise ratio (SNR) in low-aerosol-content environments. In terms of statistical characterization, the wind-profile comparison showed good agreement between the different lidars. Initially, there was a discrepancy in the SNR and attenuated backscatter coefficient profiles, which arose from an incorrectly reported telescope focus setting on one instrument, together with the need for calibration. After diagnosing the true telescope focus length, calculating a new attenuated backscatter coefficient profile with the new telescope function and taking calibration into account, the resulting attenuated backscatter profiles all showed good agreement with each other.
It was thought that harsh Finnish winters could pose problems, but, due to the built-in heating systems, low ambient temperatures had no, or only a minor, impact on lidar operation, including scanning-head motion. However, accumulation of snow and ice on the lens has been observed, which can lead to the formation of a water/ice layer that attenuates the signal inconsistently. Care must therefore be taken to ensure continuous snow removal.
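The sensitivity optimization described above exists to keep the signal-to-noise ratio usable in low-aerosol conditions. As a purely illustrative sketch, not part of the network's actual processing chain, a common quality-control step masks range gates whose SNR falls below a threshold; the profile values and the threshold here are invented:

```python
# Illustrative only: mask backscatter range gates with insufficient SNR.
# The threshold value and the profiles below are hypothetical.
def mask_low_snr(backscatter, snr, threshold=1.008):
    """Keep a gate's backscatter only when its SNR meets the threshold."""
    return [b if s >= threshold else None for b, s in zip(backscatter, snr)]

beta = [2.1e-6, 1.3e-6, 0.9e-6, 0.7e-6]   # attenuated backscatter, m^-1 sr^-1
snr  = [1.100, 1.020, 1.005, 1.001]       # per-gate signal-to-noise ratio
masked = mask_low_snr(beta, snr)
print(masked)  # the last two gates are rejected
```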

Relevance:

30.00%

Publisher:

Abstract:

The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The architecture of the new system uses the Java language as the programming environment. Since application parameters and hardware in a joint experiment are complex, with a large variability of components, requirements and specification solutions need to be flexible and modular, independent of operating system and computer architecture. To describe and organize the information on all the components and the connections among them, the systems are developed using eXtensible Markup Language (XML) technology. Communication between clients and servers uses remote procedure calls (RPC) based on XML (the RPC-XML technology). The integration of the Java language, XML and RPC-XML technologies makes it easy to develop a standard data and communication access layer between users and laboratories using common software libraries and a Web application. The libraries allow data retrieval using the same methods for all user laboratories in the joint collaboration, and the Web application provides a simple graphical user interface (GUI). The TCABR tokamak team, in collaboration with the IPFN (Instituto de Plasmas e Fusao Nuclear, Instituto Superior Tecnico, Universidade Tecnica de Lisboa), is implementing these remote participation technologies. The first version was tested at the Joint Experiment on TCABR (TCABRJE), a Host Laboratory Experiment organized in cooperation with the IAEA (International Atomic Energy Agency) in the framework of the IAEA Coordinated Research Project (CRP) on "Joint Research Using Small Tokamaks". (C) 2010 Elsevier B.V. All rights reserved.
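The TCABR access layer itself is written in Java, but the client-server pattern the abstract describes, remote procedure calls carried over XML, can be sketched with Python's standard library. Everything below (the `get_signal` method, the shot number and the payload) is hypothetical and for illustration only:

```python
# Illustrative sketch of RPC over XML using Python's stdlib; the method
# name get_signal, the shot number and the returned payload are invented,
# not the real TCABR (Java-based) access layer.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def get_signal(shot, name):
    # A real server would query the laboratory database here.
    return {"shot": shot, "name": name, "data": [0.0, 0.5, 1.0]}

server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(get_signal)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any client language with an XML-RPC library can call the same method,
# which is what gives every laboratory a uniform data access layer.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.get_signal(25432, "plasma_current")
print(result["name"])  # -> plasma_current
server.shutdown()
```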

Relevance:

30.00%

Publisher:

Abstract:

As information expands and comprehension becomes more complex, the need increases to develop focused areas of knowledge and skill acquisition. However, as the number of specialty areas increases, the languages that define each separate knowledge base become increasingly remote from one another. Hence, concepts and viewpoints that were once considered part of a whole become detached. This phenomenon is typical of the development of tertiary education, especially within professionally oriented courses, where disciplines and sub-disciplines have grown further apart and the ability to communicate has become increasingly fragmented.
One individual and visionary who was well acquainted with the shortcomings of piecemeal development between the disciplines was Professor Sir Edmund Happold, the leader of the prestigious group known as Structures 3 at Ove Arup and Partners, which was responsible for realizing some of the landmark buildings of its time, including the Sydney Opera House and the Pompidou Centre, and the founding professor of the Bath School of Architecture and Civil Engineering in 1975. While retaining a profound respect for the knowledge bases of the different professions within the building and construction industry, Professor Happold was also well aware of the extraordinary synergies in design and innovation that could come about when the disciplines of Architecture and Civil Engineering were brought together at the outset of the design process.
This paper discusses the rationale behind Professor Happold's cross-discipline model of education and reflects on the method, execution and pedagogical worth of the joint studio-based projects which formed a core aspect of the third-year program at the School of Architecture and Civil Engineering at the University of Bath.

Relevance:

30.00%

Publisher:

Abstract:

Aims & Rationale/Objectives
The aim is to establish the frequency of counselling by general practitioners (GPs) and community pharmacists (CPs) for patients with uncontrolled cardiovascular disease (CVD) risk factors. This will identify conditions for which CPs might collaborate with GPs in addressing evidence-treatment gaps.

Methods
A population survey was undertaken in the Wimmera region of Victoria in 2006. 1425 adults aged 25-84 years were randomly selected using age/sex-stratified electoral roll samples. A representative 723 participants were recruited.

Principal Findings
Data on GP and CP visits were available for 694 participants. Overall, participants visited GPs 4.6 times and CPs 6.0 times per annum. However, one third of participants never consulted a pharmacist in 12 months, compared to just 11.5% for GPs. Among obese patients (BMI ≥ 30), the average number of visits per annum was 4.5 to GPs and 6.8 to CPs. The equivalent numbers were 5.6 and 8.6 respectively for those with systolic BP ≥ 140 mmHg; 3.7 and 5.5 for total cholesterol > 5.0 mmol/L; and 6.7 and 14.6 for patients with random blood glucose concentrations ≥ 7.0 mmol/L.

Implications

People with suboptimal status for the most common CVD risk factors are counselled frequently by CPs. A coordinated approach with GPs to the delivery of cardiovascular health promotion could provide valuable reinforcement of key messages and offers greater opportunity to identify at-risk individuals.

Acknowledgements: KM is a pharmacist-academic at Greater Green Triangle UDRH, a position funded by the Department of Health and Ageing through the Rural and Remote Pharmacy Workforce Development Program.

Relevance:

30.00%

Publisher:

Abstract:

In Australia, it is commonplace for tertiary mental health care to be provided in large regional centres or metropolitan cities. Rural and remote consumers must be transferred long distances, and this inevitably results in difficulties with the integration of their care between primary and tertiary settings. To address these issues and improve the transfer process, a research project was commissioned by a national government department to be conducted in South Australia. The aim of the project was to document the experiences of mental health consumers travelling from the country to the city for acute care and to make policy recommendations to improve transitions of care. Six purposively sampled case studies were conducted, collecting data through semi-structured interviews with consumers, country professional and occupational groups, and tertiary providers. Data were analysed to produce themes for consumers and for country and tertiary mental healthcare providers. The study found that consumers saw transfer to the city for mental health care as beneficial, in spite of the challenges of being transferred over long distances while very unwell and of being separated from family and friends. Country care providers noted that the disjointed nature of the mental health system caused problems with key aspects of transfer of care, including transport and information flow, and with achieving integration between the primary and tertiary settings. Improving transfer of care involves overcoming the systemic barriers to integration and moving to a primary care-led model of care. The distance consultation and liaison model provided by the Rural and Remote Mental Health Services, the major tertiary provider of services for country consumers, uses a primary care-led approach and was highly regarded by research participants. Extending the use of this model to other primary mental healthcare providers and tertiary facilities will improve transfer of care.

Relevance:

30.00%

Publisher:

Abstract:

The work described in this thesis aims to support the distributed design of integrated systems and considers specifically the need for collaborative interaction among designers. Particular emphasis was given to issues that were only marginally considered in previous approaches, such as the abstraction of the distribution of design automation resources over the network, the possibility of both synchronous and asynchronous interaction among designers, and the support for extensible design data models. Such issues demand a rather complex software infrastructure, as possible solutions must encompass a wide range of software modules: from user interfaces to middleware to databases. To build such a structure, several engineering techniques were employed and some original solutions were devised. The core of the proposed solution is based on the joint application of two homonymic technologies: CAD frameworks and object-oriented frameworks. The former concept was coined in the late 1980s within the electronic design automation community and comprises a layered software environment that aims to support CAD tool developers, CAD administrators/integrators and designers. The latter, developed during the last decade by the software engineering community, is a software architecture model for building extensible and reusable object-oriented software subsystems. In this work, we propose an object-oriented framework that includes extensible sets of design data primitives and design tool building blocks. This object-oriented framework is included within a CAD Framework, where it plays important roles in typical CAD Framework services such as design data representation and management, versioning, user interfaces, design management and tool integration.
The implemented CAD Framework, named Cave2, followed the classical layered architecture presented by Barnes, Harrison, Newton and Spickelmier, but the possibilities granted by the object-oriented framework foundations allowed a series of improvements that were not available in previous approaches:
- Object-oriented frameworks are extensible by design, so this is also true of the implemented sets of design data primitives and design tool building blocks. Both the design representation model and the software modules dealing with it can be upgraded or adapted to a particular design methodology, and such extensions and adaptations still inherit the architectural and functional aspects implemented in the object-oriented framework foundation.
- The design semantics and the design visualization are both part of the object-oriented framework, but in clearly separated models. This allows different visualization strategies for a given design data set, which gives collaborating parties the flexibility to choose individual visualization settings.
- The control of the consistency between semantics and visualization, a particularly important issue in a design environment with multiple views of a single design, is also included in the foundations of the object-oriented framework. This mechanism is generic enough to be used by further extensions of the design data model, as it is based on the inversion of control between view and semantics: the view receives the user input and propagates the event to the semantic model, which evaluates whether a state change is possible and, if so, triggers the change of state of both semantics and view. Our approach took advantage of this inversion of control and included a layer between semantics and view to account for multi-view consistency.
- To optimize the consistency control mechanism between views and semantics, we propose an event-based approach that captures each discrete interaction of a designer with his or her design views. The information about each interaction is encapsulated in an event object, which may be propagated to the design semantics, and thus to other possible views, according to the consistency policy in use. Furthermore, the use of event pools allows late synchronization between view and semantics in case of unavailability of a network connection between them.
- The use of proxy objects significantly raised the abstraction of the integration of design automation resources, as both remote and local tools and services are accessed through method calls on a local object. The connection to remote tools and services using a look-up protocol also completely abstracts the network location of such resources, allowing resource addition and removal at runtime.
- The implemented CAD Framework is based entirely on Java technology, relying on the Java Virtual Machine as the layer that grants independence between the CAD Framework and the operating system.
All these improvements contributed to a higher abstraction of the distribution of design automation resources and also introduced a new paradigm for remote interaction between designers. The resulting CAD Framework is able to support fine-grained collaboration based on events, so every single design update performed by a designer can be propagated to the rest of the design team regardless of their location in the distributed environment.
This can increase group awareness and allow a richer transfer of experiences among designers, improving significantly the collaboration potential when compared to previously proposed file-based or record-based approaches. Three different case studies were conducted to validate the proposed approach, each one focusing on a subset of the contributions of this thesis. The first uses the proxy-based resource distribution architecture to implement a prototyping platform using reconfigurable hardware modules. The second extends the foundations of the implemented object-oriented framework to support interface-based design; such extensions, design representation primitives and tool blocks, are used to implement a design entry tool named IBlaDe, which allows the collaborative creation of functional and structural models of integrated systems. The third case study concerns the integration of multimedia metadata into the design data model. This possibility is explored in the framework of an online educational and training platform.
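The inversion of control between view and semantics described above can be sketched as a small observer-style model: views forward user events to the semantic model, which validates the change and, only if it is valid, updates itself and refreshes every registered view. This is not the Cave2 code, which is Java-based; the class and method names below are invented for illustration:

```python
# Hypothetical sketch of view/semantics inversion of control with
# multi-view consistency; not the actual Cave2 (Java) implementation.
class SemanticModel:
    def __init__(self):
        self.state = {}
        self.views = []

    def attach(self, view):
        self.views.append(view)

    def handle_event(self, key, value):
        """Called by a view; returns True only if the change is accepted."""
        if value is None:          # stand-in for real validation rules
            return False
        self.state[key] = value
        for view in self.views:    # multi-view consistency: refresh all views
            view.refresh(key, value)
        return True

class View:
    def __init__(self, model):
        self.shown = {}
        self.model = model
        model.attach(self)

    def user_input(self, key, value):
        # The view never changes itself; it delegates to the semantics.
        self.model.handle_event(key, value)

    def refresh(self, key, value):
        self.shown[key] = value

model = SemanticModel()
schematic, layout = View(model), View(model)
schematic.user_input("gate_width", 90)
print(layout.shown["gate_width"])  # -> 90: the other view stays consistent
```

An event-pool variant would queue the event objects instead of propagating them immediately, allowing the late synchronization mentioned above.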

Relevance:

30.00%

Publisher:

Abstract:

Each plasma physics laboratory has its own proprietary control and data acquisition system, usually different from one laboratory to another. This means that each laboratory has its own way to control the experiment and retrieve data from the database. Fusion research relies to a great extent on international collaboration, and these private systems make it difficult to follow the work remotely. The TCABR data analysis and acquisition system has been upgraded to support a joint research programme using remote participation technologies. The choice of MDSplus (Model Driven System plus) is justified by the fact that it is widely used, so scientists from different institutions may use the same system in different experiments on different tokamaks without needing to know how each system handles its data acquisition and analysis. Another important point is that MDSplus has a library system that allows communication between different languages (Java, Fortran, C, C++, Python) and programs such as MATLAB, IDL and OCTAVE. In the case of the TCABR tokamak, interfaces (the object of this paper) between the system already in use and MDSplus were developed, instead of using MDSplus at all stages, from control and data acquisition to data analysis. This was done to preserve a complex system already in operation, which would otherwise take a long time to migrate. This implementation also allows new components to be added using MDSplus fully at all stages. (c) 2012 Elsevier B.V. All rights reserved.
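The interfacing strategy described, wrapping the existing acquisition system rather than migrating it, is essentially an adapter pattern: a thin layer exposes legacy data through a uniform, tree-path-style access method. The sketch below is purely illustrative; the class names and path syntax are invented and do not reflect the real MDSplus or TCABR APIs:

```python
# Hypothetical adapter sketch: expose a legacy acquisition store through a
# uniform, MDSplus-style access method. LegacyStore, MdsLikeAdapter and
# getSignal are invented names, not the real MDSplus or TCABR interfaces.
class LegacyStore:
    """Stand-in for the acquisition system already in operation."""
    def __init__(self):
        self._raw = {("shot42", "ip"): [1.0, 2.0, 3.0]}

    def fetch(self, shot, channel):
        return self._raw[(shot, channel)]

class MdsLikeAdapter:
    """Presents legacy data through a single tree-path-style entry point."""
    def __init__(self, store):
        self.store = store

    def getSignal(self, path):
        # Map a path like "\\shot42::ip" onto the legacy (shot, channel) key.
        shot, channel = path.lstrip("\\").split("::")
        return self.store.fetch(shot, channel)

adapter = MdsLikeAdapter(LegacyStore())
print(adapter.getSignal("\\shot42::ip"))  # -> [1.0, 2.0, 3.0]
```

Because remote clients only see the adapter's interface, the legacy system behind it can later be replaced piece by piece, which is the migration-preserving property the abstract emphasizes.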

Relevance:

30.00%

Publisher:

Abstract:

This project presents a brief overview of several concepts, such as Renewable Energy Resources, Distributed Energy Resources and Distributed Generation, and describes the general architecture of an electrical microgrid, either isolated or connected to the Medium Voltage Network. Moreover, it focuses on a project carried out by the GRECDH Department in collaboration with the CITCEA Department, both belonging to the Universitat Politècnica de Catalunya, concerning isolated microgrids employing renewable energy resources in two communities in northern Peru. Several solutions found using optimization software, covering different generation systems (wind and photovoltaic) and different energy demand scenarios, are discussed and analyzed from an electrical point of view. Furthermore, some proposals are made to improve microgrid performance, in particular to increase the voltage at each load connected to the microgrid. The extra costs required by the proposed solutions are calculated and their effect on the total microgrid cost is taken into account; finally, some considerations are given about the impact of the project on the population and on people's daily life.

Relevance:

30.00%

Publisher:

Abstract:

The integration of remote monitoring techniques at different scales is of crucial importance for the monitoring of volcanoes and assessment of the associated hazard. In this respect, technological advancement and collaboration between research groups also play a key role. Vhub is a community cyberinfrastructure platform designed for collaboration in volcanology research. Within the Vhub framework, this dissertation focuses on two research themes, both representing novel applications of remotely sensed data in volcanology: advancement in the acquisition of topographic data via active techniques, and application of passive multi-spectral satellite data to the monitoring of vegetated volcanoes. Measuring surface deformation is a critical issue in analogue modelling of Earth science phenomena. I present a novel application of the Microsoft Kinect sensor to the measurement of vertical and horizontal displacements in analogue models. Specifically, I quantified vertical displacement in a scaled analogue model of Nisyros volcano, Greece, simulating magmatic deflation and inflation and the related surface deformation, and included the horizontal component to reconstruct 3D models of pit crater formation. The detection of active faults around volcanoes is important for seismic and volcanic hazard assessment, but not a simple task to achieve using analogue models. I present new evidence of neotectonic deformation along a north-south-trending fault in the Mt Shasta debris avalanche deposit (DAD), northern California. The fault was identified in an airborne LiDAR campaign covering part of the region affected by the DAD and then confirmed in the field. High-resolution LiDAR can also be utilized for geomorphological assessment of DADs, and I describe a size-distance analysis to document geomorphological aspects of hummocks in the Shasta DAD.
Relating remote observations of volcanic passive degassing to conditions and impacts on the ground provides an increased understanding of volcanic degassing and of how satellite-based monitoring can be used to inform hazard management strategies in near-real time. Combining a variety of satellite-based spectral time series, I aim to perform the first space-based assessment of the impacts of sulfur dioxide emissions from Turrialba volcano, Costa Rica, on vegetation in the surrounding environment, and to establish whether vegetation indices could be used more broadly to detect volcanic unrest.
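A vegetation index of the kind proposed above is typically a band ratio; the most widely used is the Normalized Difference Vegetation Index, NDVI = (NIR - Red) / (NIR + Red), which drops as vegetation is stressed or killed. A minimal sketch on an invented reflectance grid (real inputs would be the satellite's red and near-infrared bands):

```python
# Minimal NDVI sketch on a hypothetical 2x2 reflectance grid; real inputs
# would be co-registered red and near-infrared satellite bands.
def ndvi(nir, red):
    """Per-pixel NDVI = (NIR - Red) / (NIR + Red); 0.0 where the sum is 0."""
    return [
        [(n - r) / (n + r) if (n + r) else 0.0 for n, r in zip(nrow, rrow)]
        for nrow, rrow in zip(nir, red)
    ]

nir = [[0.5, 0.4], [0.6, 0.1]]
red = [[0.1, 0.2], [0.1, 0.1]]
grid = ndvi(nir, red)
print(round(grid[0][0], 3))  # -> 0.667: healthy vegetation reflects NIR strongly
```

A time series of such grids around a degassing volcano would show vegetation damage as a persistent NDVI decrease downwind of the vent.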

Relevance:

30.00%

Publisher:

Abstract:

Mt Etna's activity has increased during the last decade, with a tendency towards more explosive eruptions that produce paroxysmal lava fountains. From January 2011 to April 2012, 25 lava fountaining episodes took place at Etna's New South-East Crater (NSEC). Improved understanding of the mechanism driving these explosive basaltic eruptions is needed to reduce volcanic hazards. This type of activity produces high sulfur dioxide (SO2) emissions, associated with lava flows and ash fall-out, but to date the SO2 emissions associated with Etna's lava fountains have been poorly constrained. The Ultraviolet (UV) Ozone Monitoring Instrument (OMI) on NASA's Aura satellite and the Atmospheric Infrared Sounder (AIRS) on Aqua were used to measure the SO2 loadings. Ground-based data from the Observatoire de Physique du Globe de Clermont-Ferrand (OPGC) L-band Doppler radar, VOLDORAD 2B, used in collaboration with the Italian National Institute of Geophysics and Volcanology in Catania (INGV-CT), also detected the associated ash plumes, giving precise timing and duration for the lava fountains. This study resulted in the first detailed analysis of the OMI and AIRS SO2 data for Etna's lava fountains during the 2011-2012 eruptive cycle. The HYSPLIT trajectory model is used to constrain the altitude of the observed SO2 clouds, and results show that the SO2 emission usually coincided with the lava fountain peak intensity as detected by VOLDORAD. The UV OMI and IR AIRS SO2 retrievals permit quantification of the SO2 loss rate in the volcanic SO2 clouds, many of which were tracked for several days after emission. A first attempt to quantitatively validate AIRS SO2 retrievals with OMI data revealed a good correlation for high-altitude SO2 clouds. Using estimates of the SO2 emitted at the time of each paroxysm, we observe a correlation with the inter-paroxysm repose time.
We therefore suggest that our data set supports the collapsing foam (CF) model [1] as the driving mechanism for the paroxysmal events at the NSEC. Using VOLDORAD-based estimates of the erupted magma mass, we observe a large excess of SO2 in the eruption clouds. Satellite measurements indicate that SO2 emissions from Etnean lava fountains can reach the lower stratosphere and hence could pose a hazard to aviation.
[1] Parfitt, E.A. (2004). A discussion of the mechanisms of explosive basaltic eruptions. J. Volcanol. Geotherm. Res. 134, 77-107.
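One standard way to quantify the SO2 loss rate from a cloud tracked over several days is to fit an exponential decay M(t) = M0 * exp(-t/tau) to successive mass loadings, taking the slope of ln(mass) versus time. The sketch below uses synthetic values, not actual OMI or AIRS retrievals:

```python
import math

# Hedged sketch of an e-folding loss-rate fit. The mass loadings below are
# synthetic (a clean tau = 24 h decay), not real OMI/AIRS retrievals.
def efolding_time(times_h, masses_kt):
    """Least-squares slope of ln(mass) vs time gives -1/tau (tau in hours)."""
    ys = [math.log(m) for m in masses_kt]
    n = len(times_h)
    xbar, ybar = sum(times_h) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(times_h, ys)) / \
            sum((x - xbar) ** 2 for x in times_h)
    return -1.0 / slope

# One loading per satellite overpass, decaying from 10 kt with tau = 24 h.
times = [0.0, 12.0, 24.0, 48.0]
masses = [10.0 * math.exp(-t / 24.0) for t in times]
print(round(efolding_time(times, masses), 1))  # -> 24.0
```

With real retrievals the scatter between overpasses dominates, so the fit is usually reported with its uncertainty rather than as a single number.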