767 results for Collaborative Filtering
Abstract:
Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested and maintained, these shared components are repeatedly manipulated. As a result, there are often significant overlaps and synergies across and among the test efforts of different component-based systems. In practice, however, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve test effectiveness for each component, as well as for each component-based software system, by using information obtained when testing across multiple components. To achieve this goal I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying them. The dissertation research is organized in three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured to exist. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes implementing different collaborative testing algorithms and applied them to large, actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components.
With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.
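The effort savings from avoiding redundant tests of shared components can be illustrated with a minimal sketch of a shared test-result registry. The class, function names, and versioning scheme below are hypothetical illustrations, not the dissertation's actual infrastructure:

```python
# Sketch of a shared test-result registry: testers of different systems that
# reuse the same component version can skip tests already run elsewhere.
# All names here are illustrative, not the dissertation's tooling.

class SharedTestRegistry:
    def __init__(self):
        # (component, version, test_id) -> "pass" / "fail"
        self._results = {}

    def record(self, component, version, test_id, outcome):
        self._results[(component, version, test_id)] = outcome

    def lookup(self, component, version, test_id):
        return self._results.get((component, version, test_id))

def run_suite(registry, component, version, tests):
    """Run only the tests with no recorded result for this component version."""
    executed, reused = [], []
    for test_id, test_fn in tests.items():
        cached = registry.lookup(component, version, test_id)
        if cached is not None:
            reused.append((test_id, cached))   # another tester already ran it
            continue
        outcome = "pass" if test_fn() else "fail"
        registry.record(component, version, test_id, outcome)
        executed.append((test_id, outcome))
    return executed, reused

# Two systems share component "parser" at version 1.2:
registry = SharedTestRegistry()
tests = {"t_empty": lambda: True, "t_unicode": lambda: True}
ran_a, reused_a = run_suite(registry, "parser", "1.2", tests)  # system A runs both
ran_b, reused_b = run_suite(registry, "parser", "1.2", tests)  # system B reuses both
```

In this toy run the second system executes nothing and reuses both recorded outcomes, which is the kind of redundancy elimination the abstract argues for.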
Abstract:
Stand-alone and networked virtual-reality-based surgical simulators have been proposed as a means to train surgical skills with or without a supervisor near the student or trainee. However, the teaching of surgical skills in medical schools and hospitals is changing, requiring the development of new tools that focus on: (i) the importance of the mentor's role, (ii) teamwork skills and (iii) remote training support. For these reasons, a surgical simulator should not only allow training involving a student and an instructor who are located remotely, but also the collaborative training of users adopting different medical roles during the training session. Collaborative Networked Virtual Surgical Simulators (CNVSS) allow collaborative training of surgical procedures in which remotely located users with different surgical roles can take part in the training session. To provide successful training with good collaborative performance, a CNVSS should handle heterogeneity factors such as users' machine capabilities and network conditions, among others. Several systems for collaborative training of surgical procedures have been developed as research projects; to the best of our knowledge, none has focused on handling heterogeneity in CNVSS. Handling heterogeneity in this type of collaborative session is important because not all remotely located users have homogeneous internet connections, the same interaction devices and displays, or the same computational resources, among other factors. Additionally, if heterogeneity is not handled properly, it will have an adverse impact on the performance of each user during the collaborative session. In this document, the development of a context-aware architecture for collaborative networked virtual surgical simulators is proposed, in order to handle the heterogeneity involved in the collaboration session. To achieve this, the thesis makes the following main contributions: (i) the infrastructure heterogeneity factors that affect the collaboration of two users performing a virtual surgical procedure, and how they affect it, were determined and analyzed through a set of experiments involving collaborating users; (ii) a context-aware software architecture for a CNVSS was proposed and implemented, which handles the heterogeneity factors affecting collaboration by applying various adaptation mechanisms; and finally, (iii) a mechanism for handling the heterogeneity factors involved in a CNVSS is described, implemented and validated in a set of testing scenarios.
Abstract:
This dissertation examines the intersections between difference, participation, and planning processes. Rooted in scholarly conversations about deliberative democracy, collaborative planning, and nonprofit organizations in civil society, this research considers how planning practitioners can better plan across difference. Through case study research, this dissertation examines a collaborative planning process conducted by a nonprofit organization. Unlike more conventional participatory planning processes, the organization utilized scenario planning. Exercising its position in civil society, the organization did not open participation in the process to all community members; instead, it carefully selected a diverse set of participants. Findings from this research project indicate that this process, by moving away from a strict definition of rational discourse, focusing on multiple futures as opposed to a single, utopian future, and deliberately bringing together a broad cross-section of community members, allowed participants to speak freely and learn from one another's perspectives and experiences. The experiences of process participants also demonstrate the degree to which cultural backgrounds shape participation in, and expectations of, planning processes. While there remains no clear answer as to how to represent and respond to cultural differences in planning processes, the experiences of the organization, program staff, and community participants help scholars and practitioners move closer to planning across difference.
Abstract:
French Impressionism is a term often used in discussing music originating in France towards the end of the nineteenth century. The term Spanish Impressionism could equally be used for music written by the Spanish composers who studied and worked in Paris at the same time as their French counterparts; after all, Spanish music written during this period exhibits many of the same characteristics and aesthetics as the French music of the same era. This dissertation focuses on the French and Spanish composers writing during that exciting time. Musical impressionism emphasizes harmonic effects and rhythmic fluidity in the pursuit of evocative moods and sound pictures of nature or places, over formal structure and thematic concerns. The music of this time is highly virtuosic as well as musically demanding, since many of the composers were brilliant pianists. My three dissertation recitals concentrated on works which exhibited the many facets of impressionism as well as its technical and musical challenges. The repertoire included selections by the Spanish composers Manuel de Falla, Isaac Albéniz, Enrique Granados, Joaquín Turina, and Joaquín Rodrigo and the French composers Claude Debussy and Maurice Ravel. The recitals took place on April 30, 2013, February 23, 2014 and October 11, 2015. They included solo piano works by Granados and Albéniz, vocal works by Debussy, Ravel, de Falla, Turina and Rodrigo, piano trios by Granados and Turina, instrumental duos by Debussy, Ravel and de Falla, and a two-piano work by Debussy transcribed by Ravel. All three recitals were held in Gildenhorn Recital Hall at the University of Maryland; copies of this dissertation and recordings of each recital may be found through the Digital Repository at the University of Maryland (DRUM).
Abstract:
We have designed this flowchart to help you choose the web filtering option that best suits your needs from three alternatives: our free standard web filtering service, enhanced user-based filtering, or a solution from our framework agreement.
Abstract:
This dissertation presents a case study of collaborative research through design with Floracaching, a gamified mobile application for citizen science biodiversity data collection. One contribution of this study is the articulation of collaborative research through design (CRtD), an approach that blends cooperative design approaches with the research through design methodology (RtD). Collaborative research through design is thus defined as an iterative process of cooperative design, where the collaborative vision of an ideal state is embedded in a design. Applying collaborative research through design with Floracaching illustrates how a number of cooperative techniques—especially contextual inquiry, prototyping, and focus groups—may be applied in a research through design setting. Four suggestions for collaborative research through design (recruit from a range of relevant backgrounds; take flexibility as a goal; enable independence and agency; and, choose techniques that support agreement or consensus) are offered to help others who wish to experiment with this new approach. Applying collaborative research through design to Floracaching yielded a new prototype of the application, accompanied by design annotations in the form of framing constructs for designing to support mobile, place-based citizen science activities. The prototype and framing constructs, which may inform other designers of similar citizen science technologies, are a second contribution of this research.
Abstract:
The spike-diffuse-spike (SDS) model describes a passive dendritic tree with active dendritic spines. Spine-head dynamics is modeled with a simple integrate-and-fire process, whilst communication between spines is mediated by the cable equation. In this paper we develop a computational framework that allows the study of multiple spiking events in a network of such spines embedded on a simple one-dimensional cable. In the first instance this system is shown to support saltatory waves with the same qualitative features as those observed in a model with Hodgkin-Huxley kinetics in the spine-head. Moreover, there is excellent agreement with the analytically calculated speed for a solitary saltatory pulse. Upon driving the system with time-varying external input we find that the distribution of spines can play a crucial role in determining spatio-temporal filtering properties. In particular, the response of the SDS model to a periodic pulse train shows a positive correlation between spine density and low-pass temporal filtering that is consistent with the experimental results of Rose and Fortune [1999, Mechanisms for generating temporal filters in the electrosensory system, The Journal of Experimental Biology 202, 1281-1289]. Further, we demonstrate the robustness of the observed wave properties to natural sources of noise that arise both in the cable and in the spine-head, and highlight the possibility of purely noise-induced waves and coherent oscillations.
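The SDS setup can be sketched numerically as a passive cable with integrate-and-fire spine heads at discrete locations. The parameter values, the explicit Euler discretization, and the rectangular pulse shape below are my illustrative assumptions, not those of the paper; the sketch only shows the qualitative mechanism of a wave hopping from spine to spine:

```python
# Toy spike-diffuse-spike simulation: dV/dt = D * V_xx - V/tau on a 1-D cable,
# with integrate-and-fire spine heads that inject a rectangular pulse when the
# local voltage crosses threshold. All parameter values are illustrative.
import numpy as np

nx, dx, dt = 200, 0.1, 0.001          # grid points, spacing, time step
D, tau = 1.0, 1.0                      # diffusion coefficient, leak time constant
steps = 8000
V = np.zeros(nx)                       # membrane potential along the cable
spine_idx = np.arange(10, nx, 10)      # evenly spaced spine locations
threshold, refractory = 0.05, 2.0      # spine firing threshold, refractory period
pulse_amp, pulse_len = 5.0, 0.2        # injected pulse strength and duration
last_fire = np.full(spine_idx.size, -1e9)
fire_times = []                        # (time, grid index) of each spine firing

V[:3] = 1.0                            # initial stimulus at the left end
for step in range(steps):
    t = step * dt
    # Passive cable, explicit Euler with sealed (zero-flux) ends.
    lap = np.empty(nx)
    lap[1:-1] = (V[2:] - 2.0 * V[1:-1] + V[:-2]) / dx**2
    lap[0] = (V[1] - V[0]) / dx**2
    lap[-1] = (V[-2] - V[-1]) / dx**2
    dV = D * lap - V / tau
    # Integrate-and-fire spine heads: a threshold crossing outside the
    # refractory period triggers a rectangular current pulse into the cable.
    for k, i in enumerate(spine_idx):
        if V[i] > threshold and t - last_fire[k] > refractory:
            last_fire[k] = t
            fire_times.append((t, int(i)))
        if 0.0 <= t - last_fire[k] < pulse_len:
            dV[i] += pulse_amp / dx
    V += dt * dV
```

Running this, the stimulus recruits the first spine, whose pulse recruits the next, and the wave travels saltatorily down the cable; varying `spine_idx` spacing is the crude analogue of the spine-density effects the abstract discusses.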
Abstract:
One challenge in data assimilation (DA) methods is how to compute the error covariance of the model state. Ensemble methods have been proposed for producing error covariance estimates, as the error is propagated in time using the non-linear model. Variational methods, on the other hand, use concepts from control theory, whereby the state estimate is optimized from both the background and the measurements. Numerical optimization schemes are applied which avoid the memory storage and huge matrix inversions needed by classical Kalman filter methods. The Variational Ensemble Kalman Filter (VEnKF), a method inspired by the Variational Kalman Filter (VKF), enjoys the benefits of both ensemble and variational methods. It avoids the filter inbreeding problems which emerge when the ensemble spread underestimates the true error covariance; in VEnKF this is tackled by resampling the ensemble every time measurements become available. One advantage of VEnKF over VKF is that it needs neither tangent linear code nor adjoint code. In this thesis, VEnKF has been applied to a two-dimensional shallow water model simulating a dam-break experiment. The model is a public code, with water height measurements recorded at seven stations along the mid-line of the 21.2 m long, 1.4 m wide flume. Because the data were too sparse to constrain the 30 171-element model state vector, we chose to interpolate the data both in time and in space. The results of the assimilation were compared with those of a pure simulation. We found that the results produced by VEnKF were more realistic, without the numerical artifacts present in the pure simulation. Creating wrapper code for a model and a DA scheme can be challenging, especially when the two were designed independently or are poorly documented. In this thesis we present a non-intrusive approach to coupling the model and a DA scheme: an external program sends and receives information between the model and the DA procedure using files.
The advantage of this method is that the changes needed in the model code are minimal: only a few lines which handle input and output. Apart from being simple to couple, the approach can be employed even if the two were written in different programming languages, because the communication does not go through code. The non-intrusive approach accommodates parallel computing by simply telling the control program to wait until all the processes have finished before the DA procedure is invoked. It is worth mentioning the overhead introduced by the approach, since at every assimilation cycle both the model and the DA procedure have to be re-initialized. Nonetheless, the method can be an ideal approach for a benchmark platform for testing DA methods. The non-intrusive VEnKF has been applied to a multi-purpose hydrodynamic model, COHERENS, to assimilate Total Suspended Matter (TSM) in lake Säkylän Pyhäjärvi. The lake has an area of 154 km² with an average depth of 5.4 m. Turbidity and chlorophyll-a concentrations from MERIS satellite images were available for 7 days between May 16 and July 6, 2009. The effect of the organic matter was computationally eliminated to obtain TSM data. Because of the computational demands of both COHERENS and VEnKF, we chose to use a 1 km grid resolution. The results of the VEnKF were compared with the measurements recorded at an automatic station located in the north-western part of the lake; however, because the TSM data were sparse in both time and space, the match was not close. The use of multiple automatic stations with real-time data is important to avoid the time-sparsity problem; with DA, this would help, for instance, in better understanding environmental hazard variables. We found that using a very large ensemble size does not necessarily improve the results, because there is a limit beyond which additional ensemble members add very little to the performance.
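The file-based coupling loop can be sketched as a small driver program. The file names, the toy "model", and the toy analysis step below are illustrative stand-ins (the real setup couples COHERENS with VEnKF); what the sketch preserves is the protocol: the driver writes a state file, launches the model as an external process, reads the forecast back from a file, and applies the analysis:

```python
# Sketch of non-intrusive, file-based model/DA coupling: an external driver
# exchanges the state with a model executable through files, so the model
# itself only needs a few added I/O lines. Everything here is a toy stand-in.
import json, os, subprocess, sys, tempfile

workdir = tempfile.mkdtemp()
state_file = os.path.join(workdir, "state_in.json")
forecast_file = os.path.join(workdir, "state_out.json")

# Stand-in "model executable": reads state_in, advances it, writes state_out.
# In the real setup this would be the unmodified model binary, launched e.g.
# with subprocess.run([model_exe, state_file, forecast_file]).
MODEL = (
    "import json,sys;"
    "s=json.load(open(sys.argv[1]));"
    "s['x']=[v*0.9+1.0 for v in s['x']];"   # toy forecast dynamics
    "f=open(sys.argv[2],'w');json.dump(s,f);f.close()"
)

def analysis(x, obs):
    """Toy analysis step: nudge the forecast toward the observation."""
    return [v + 0.5 * (obs - v) for v in x]

state = {"x": [0.0, 0.0, 0.0]}
observations = [4.0, 4.5, 5.0]            # one toy observation per cycle
for obs in observations:
    with open(state_file, "w") as f:       # hand the state to the model
        json.dump(state, f)
    subprocess.run([sys.executable, "-c", MODEL, state_file, forecast_file],
                   check=True)             # model forecast step (external process)
    with open(forecast_file) as f:         # collect the forecast
        forecast = json.load(f)
    state = {"x": analysis(forecast["x"], obs)}   # DA analysis step
```

Because the exchange happens through files, the model and the DA code never link against each other, which is why mixed programming languages and per-cycle re-initialization, with its attendant overhead, both fall out naturally.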
The successful implementation of the non-intrusive VEnKF, together with the ensemble size limit on performance, points to the emerging area of Reduced Order Modelling (ROM). To save computational resources, ROM avoids running the full-blown model. Applying ROM with the non-intrusive DA approach might yield a cheaper algorithm that relaxes the computational challenges existing in the fields of modelling and DA.
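As background to the ensemble-size discussion, a minimal stochastic ensemble Kalman analysis step shows how an ensemble supplies the forecast error covariance. This is the generic EnKF update with perturbed observations, not the VEnKF algorithm itself, and the toy dimensions and noise levels are my assumptions:

```python
# Generic stochastic EnKF analysis step: the forecast error covariance is
# estimated from ensemble anomalies, then each member is updated with its own
# perturbed observation. Illustrative only; not the thesis's VEnKF.
import numpy as np

rng = np.random.default_rng(0)

def enkf_analysis(ensemble, H, y, r):
    """ensemble: (n_state, n_members); H: (n_obs, n_state) observation
    operator; y: (n_obs,) observations; r: observation error variance."""
    n_obs, n_ens = H.shape[0], ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # anomalies
    P = A @ A.T / (n_ens - 1)                             # sample forecast covariance
    S = H @ P @ H.T + r * np.eye(n_obs)                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                        # Kalman gain K = P Hᵀ S⁻¹
    # Perturbed observations: one noisy realization per ensemble member.
    Y = y[:, None] + rng.normal(0.0, np.sqrt(r), size=(n_obs, n_ens))
    return ensemble + K @ (Y - H @ ensemble)

# Toy check: 3-component state, only the first component is observed.
n_state, n_ens = 3, 50
ens = rng.normal(5.0, 2.0, size=(n_state, n_ens))   # forecast ensemble
H = np.array([[1.0, 0.0, 0.0]])
y = np.array([0.0])
post = enkf_analysis(ens, H, y, r=0.1)
```

The update pulls the observed component toward the measurement and shrinks its spread; the sampling noise in `P` at small `n_ens` is exactly the kind of covariance underestimation that VEnKF's per-cycle resampling is designed to counter.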