Abstract:
Project work can involve multiple people from varying disciplines coming together to solve problems as a group. Large-scale interactive displays present new opportunities to support such interactions with interactive and semantically enabled cooperative work tools such as intelligent mind maps. In this paper, we present a novel digital, touch-enabled mind-mapping tool as a first step towards achieving such a vision. This first prototype allows an evaluation of the benefits of a digital environment for a task that would otherwise be performed on paper or flat interactive surfaces. Observations and surveys of 12 participants in 3 groups allowed the formulation of several recommendations for further research into: new methods for capturing text input on touch screens; inclusion of complex structures; multi-user environments and how users make the shift from single-user applications; and how best to navigate large screen real estate in a touch-enabled, co-present multi-user setting.
Abstract:
A fractional FitzHugh–Nagumo monodomain model with zero Dirichlet boundary conditions is presented, generalising the standard monodomain model that describes the propagation of the electrical potential in heterogeneous cardiac tissue. The model consists of a coupled fractional Riesz space nonlinear reaction-diffusion model and a system of ordinary differential equations describing the ionic fluxes as a function of the membrane potential. We solve this model by decoupling the space-fractional partial differential equation and the system of ordinary differential equations at each time step. This amounts to treating the fractional Riesz space nonlinear reaction-diffusion model as if the nonlinear source term were only locally Lipschitz. The fractional Riesz space nonlinear reaction-diffusion model is solved using an implicit numerical method with the shifted Grünwald–Letnikov approximation, and the stability and convergence are discussed in detail in the context of the local Lipschitz property. Some numerical examples are given to show the consistency of our computational approach.
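The shifted Grünwald–Letnikov coefficients that drive such implicit schemes follow a simple recurrence. A minimal sketch (the function name and recurrence-based implementation are illustrative, not taken from the paper):

```python
def gl_weights(alpha, n):
    """First n+1 Grünwald–Letnikov coefficients g_k = (-1)^k * C(alpha, k),
    computed via the recurrence g_0 = 1, g_k = g_{k-1} * (k - 1 - alpha) / k."""
    g = [1.0]
    for k in range(1, n + 1):
        g.append(g[-1] * (k - 1 - alpha) / k)
    return g

# For alpha = 2 the weights reduce to the classical second-difference
# stencil [1, -2, 1], a quick sanity check on the recurrence.
```

For a fractional order 1 < alpha < 2, these weights discretise the Riesz space derivative once shifted by one index, which is the role they would play inside the implicit time-stepping described above.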
Abstract:
The application of robotics to protein crystallization trials has resulted in the production of millions of images. Manual inspection of these images to find crystals and other interesting outcomes is a major rate-limiting step. As a result, there has been intense activity in developing automated algorithms to analyse these images. The very first step for most systems described in the literature is to delineate each droplet. Here, a novel approach that reaches over a 97% success rate and subsecond processing times is presented. This will form the seed of a new high-throughput system to scrutinize massive crystallization campaigns automatically.
Abstract:
Objectives: To develop a new measure of dysfunctional thoughts for family caregivers of people living with dementia. These thoughts can contribute to negative outcomes, but they may be modifiable. Method: A stepwise process was used to develop the Thoughts Questionnaire, commencing with item generation, concept mapping, and pilot testing in a sample of professional and nonprofessional caregivers of people with dementia (n = 18). Next, an independent sample of 35 family caregivers of people with dementia (30 female; M age = 64.30, standard deviation = 10.65) completed: (a) the Thoughts Questionnaire; (b) an existing measure of dysfunctional thoughts, the Dementia Thoughts Caregivers Questionnaire; and (c) separate validated measures of depressive symptoms, caregiver stress, and coping. Results: The level of agreement with dysfunctional thought statements from the Dementia Thoughts Caregivers Questionnaire and the Thoughts Questionnaire was low. However, a small number of Thoughts Questionnaire statements were strongly endorsed by over 85% of the sample. Both dysfunctional thought measures had adequate reliability, but total scores were not significantly intercorrelated (r = .287, p = .095). Only the Thoughts Questionnaire was significantly, positively correlated with most caregiver stress measures. Thoughts Questionnaire items required a much lower reading level than the Dementia Thoughts Caregivers Questionnaire items. Discussion: This study provides preliminary data on a tool for assessing the negative role-related thoughts that family caregivers of people with dementia may experience. Given that these thoughts are implicated in depression but may be modifiable, the capacity to identify dysfunctional thoughts may prove useful in caregiver support programs.
Abstract:
Ambiguity validation, an important procedure in integer ambiguity resolution, tests the correctness of the fixed integer ambiguity of phase measurements before it is used for positioning computation. Most existing investigations on ambiguity validation focus on the test statistic; how to determine the threshold more reasonably is less well understood, although it is one of the most important topics in ambiguity validation. Currently, there are two threshold determination methods in the ambiguity validation procedure: the empirical approach and the fixed failure rate (FF-) approach. The empirical approach is simple but lacks a theoretical basis. The fixed failure rate approach has a rigorous probability-theory basis, but it requires a more complicated procedure. This paper focuses on how to determine the threshold easily and reasonably. Both the FF-ratio test and the FF-difference test are investigated in this research, and extensive simulation results show that the FF-difference test can achieve comparable or even better performance than the well-known FF-ratio test. Another benefit of adopting the FF-difference test is that its threshold can be expressed as a function of the integer least-squares (ILS) success rate with a specified failure rate tolerance. Thus, a new threshold determination method, named the threshold function, is proposed for the FF-difference test. The threshold function method preserves the fixed failure rate characteristic and is also easy to apply. The performance of the threshold function is validated with simulated data. The validation results show that with the threshold function method, the impact of the modelling error on the failure rate is less than 0.08%. Overall, the threshold function for the FF-difference test is a very promising threshold validation method, and it makes the FF-approach applicable to real-time GNSS positioning applications.
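The two acceptance tests compared above can be stated in a few lines. In this sketch, q1 <= q2 are the squared residual norms of the best and second-best integer candidates; the thresholds c and d are placeholders for values that, in the FF-approach, would be derived from the tolerated failure rate (for the difference test, via the proposed threshold function of the ILS success rate):

```python
def ff_ratio_test(q1, q2, c):
    """FF-ratio test: accept the fixed ambiguities when the ratio of the
    second-best to the best squared residual norm reaches threshold c."""
    return q2 / q1 >= c

def ff_difference_test(q1, q2, d):
    """FF-difference test: accept when the gap between the two smallest
    squared residual norms reaches threshold d."""
    return q2 - q1 >= d
```

The difference form is what allows the threshold to be tabulated as a function of the ILS success rate, since d does not depend on the scale of q1 in the same way the ratio does.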
Abstract:
This paper describes our participation in the Chinese word segmentation task of CIPS-SIGHAN 2010. We implemented an n-gram mutual information (NGMI) based segmentation algorithm with mixed features from unsupervised, supervised and dictionary-based segmentation methods. This algorithm is also combined with a simple strategy for out-of-vocabulary (OOV) word recognition. The evaluation for both open and closed training shows encouraging results for our system. However, the results for OOV word recognition in the closed training evaluation were unsatisfactory.
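Mutual information between adjacent character n-grams is the core signal in such segmenters. A minimal pointwise-mutual-information sketch from corpus counts (the function and the thresholding idea are illustrative; the paper's NGMI combines further supervised and dictionary-based features):

```python
from math import log

def pmi(count_xy, count_x, count_y, total):
    """Pointwise mutual information log(p(xy) / (p(x) p(y))) of an
    adjacent character pair, estimated from corpus counts. A segmenter
    would place a word boundary where this score falls below a threshold,
    since low PMI suggests the two characters co-occur only by chance."""
    return log((count_xy / total) / ((count_x / total) * (count_y / total)))
```

A pair that co-occurs exactly as often as independence predicts scores 0; strongly associated pairs, which likely belong to one word, score well above it.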
Abstract:
Nowadays, the integration of small-scale electricity generators, known as Distributed Generation (DG), into distribution networks has become increasingly popular. This tendency, together with the falling price of DG units, gives DG a better chance to participate in the voltage regulation process, in parallel with other regulating devices already available in distribution systems. The voltage control issue turns out to be a very challenging problem for distribution engineers, since existing control coordination schemes need to be reconsidered to take DG operation into account. In this paper, a control coordination approach is proposed that utilizes the ability of DG to act as a voltage regulator while minimizing the interaction of a DG unit with other DG units or other active devices, such as an On-load Tap Changing transformer (OLTC). The proposed technique has been developed based on protection principles (magnitude grading and time grading) for response coordination of DG and other regulating devices, and uses Advanced Line Drop Compensators (ALDCs) for implementation. A distribution feeder with a tap-changing transformer and DG units has been extracted from a practical system to test the proposed control technique. The results show that the proposed method provides an effective solution for the coordination of DG with other DG units or voltage regulating devices, and that the integration of protection principles has considerably reduced the control interaction needed to achieve the desired voltage correction.
Abstract:
With the increasing availability of high-quality digital cameras that are easily operated by the non-professional photographer, the use of digital images to assess endpoints in clinical research of skin lesions is gaining acceptance. However, rigorous protocols and descriptions of experience with digital image collection and assessment are not readily available, particularly for research conducted in remote settings. We describe the development and evaluation of a protocol for digital image collection by the non-professional photographer in a remote-setting research trial, together with a novel methodology for assessment of clinical outcomes by an expert panel blinded to treatment allocation.
Abstract:
Due to the availability of a huge number of Web services, finding an appropriate Web service according to the requirements of a service consumer remains a challenge. Moreover, sometimes a single Web service is unable to fully satisfy the requirements of the service consumer. In such cases, combinations of multiple inter-related Web services can be utilised. This paper proposes a method that first utilises a semantic kernel model to find related services and then models these related Web services as nodes of a graph. An all-pairs shortest-path algorithm is applied to find the best compositions of Web services that are semantically related to the service consumer's requirement. Finally, individual Web services and Web service compositions are recommended for a service request. Empirical evaluation confirms that the proposed method significantly improves the accuracy of service discovery in comparison to traditional keyword-based discovery methods.
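The composition step can be sketched as a standard all-pairs shortest-path computation (Floyd–Warshall) over a graph whose nodes are the semantically related services. The edge weights below are hypothetical semantic distances, not the paper's actual kernel scores:

```python
def all_pairs_shortest_paths(n, edges):
    """Floyd–Warshall over n service nodes. edges maps (u, v) to a
    nonnegative weight (e.g. a semantic distance between two services).
    Returns the matrix of cheapest composition costs between all pairs."""
    INF = float("inf")
    dist = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for (u, v), w in edges.items():
        dist[u][v] = min(dist[u][v], w)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# A chain of two closely related services (0 -> 1 -> 2) can beat the
# direct edge 0 -> 2, so the recommended composition routes through 1.
```

With semantic distances as weights, the cheapest path between a request node and a target capability corresponds to the most semantically coherent chain of services.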
Abstract:
This paper explores the concept that individual dancers leave traces in a choreographer’s body of work and, similarly, that dancers carry forward residue of embodied choreographies into other working processes. This presentation will be grounded in a study of the multiple iterations of a programme of solo works commissioned in 2008 from choreographers John Jasperse, Jodi Melnick, Liz Roche and Rosemary Butcher and danced by the author. This includes an exploration of the development by John Jasperse of themes from his solo into the pieces PURE (2008) and Truth, Revised Histories, Wishful Thinking and Flat Out Lies (2009); an adaptation of the solo Business of the Bloom by Jodi Melnick in 2008; and a further adaptation of Business of the Bloom by this author in 2012. It will map some of the developments that occurred through a number of further performances over five years of the solo Shared Material on Dying by Liz Roche, and the working process of the (uncompleted) solo Episodes of Flight by Rosemary Butcher. The purpose is to reflect back on authorship in dance, an art form in which lineages of influence can often be clearly observed. Normally, once a choreographic work is created and performed, it is archived through video recording, notation and/or reviews. The dancer is no longer called upon to represent the dance piece within the archive, and thus her/his lived presence and experiential perspective disappears. The author will draw on the different traces still inhabiting her body as pathways towards understanding how choreographic movement circulates beyond the moment of performance. This will include interrogating the ownership of choreographic movement: once it becomes integrated into the body of the dancer, who owns the dance?
Furthermore, certain dancers, through their individual physical characteristics and moving identities, can deeply influence the formation of choreographic signatures, a proposition that challenges the sole authorship role of the choreographer in dance production. This paper will be delivered in a presentation format that will bleed into movement demonstrations alongside video footage of the works and auto-ethnographic accounts of dancing experience. A further source of knowledge will be drawn from extracts of interviews with other dancers including Sara Rudner, Rebecca Hilton and Catherine Bennett.
Abstract:
We identified, mapped, and characterized a widespread area (>1,020 km²) of patterned ground in the Saginaw Lowlands of Michigan, a wet, flat plain composed of waterlain tills, lacustrine deposits, or both. The polygonal patterned ground is interpreted as a possible relict permafrost feature, formed in the Late Wisconsin when this area was proximal to the Laurentide ice sheet. Cold-air drainage off the ice sheet might have pooled in the Saginaw Lowlands, which sloped toward the ice margin, possibly creating widespread but short-lived permafrost on this glacial lake plain. The majority of the polygons occur between the Glacial Lake Warren strandline (~14.8 cal. ka) and the shoreline of Glacial Lake Elkton (~14.3 cal. ka), providing a relative age bracket for the patterned ground. Most of the polygons formed in dense, wet, silt loam soils on flat-lying sites and take the form of reticulate nets with polygon long axes of 150 to 160 m and short axes of 60 to 90 m. Interpolygon swales, often shown as dark curvilinears on aerial photographs, are typically slightly lower than are the polygon centers they bound. Some portions of these interpolygon swales are infilled with gravel-free, sandy loam sediments. The subtle morphology and sedimentological characteristics of the patterned ground in the Saginaw Lowlands suggest that thermokarst erosion, rather than ice-wedge replacement, was the dominant geomorphic process associated with the degradation of the Late-Wisconsin permafrost in the study area and, therefore, was primarily responsible for the soil patterns seen there today.
Abstract:
Primary objective: To investigate whether assessment method influences the type of post-concussion-like symptoms. Methods and procedures: Participants were 73 Australian undergraduate students (M age = 24.14, SD = 8.84; 75.3% female) with no history of mild traumatic brain injury (mTBI). Participants reported symptoms experienced over the previous 2 weeks in response to an open-ended question (free report), a mock interview and a standardized checklist (British Columbia Post-concussion Symptom Inventory; BC-PSI). Main outcomes and results: In the free report and checklist conditions, cognitive symptoms were reported significantly less frequently than affective (free report: p < 0.001; checklist: p < 0.001) or somatic symptoms (free report: p < 0.001; checklist: p = 0.004). However, in the mock structured interview condition, cognitive and somatic symptoms were reported significantly less frequently than affective symptoms (both p < 0.001). No participants reported at least one symptom from all three domains when assessed by free report, whereas most participants did so when symptoms were assessed by a mock structured interview (75%) or checklist (90%). Conclusions: Previous studies have shown that the method used to assess symptoms affects the number reported. This study shows that the assessment method also affects the type of reported symptoms.
Abstract:
Structural damage detection using measured dynamic data for pattern recognition is a promising approach. These pattern recognition techniques utilize artificial neural networks and genetic algorithms to match pattern features. In this study, an artificial neural network–based damage detection method using frequency response functions is presented, which can effectively detect nonlinear damage for a given level of excitation. The main objective of this article is to present a feasible method for structural vibration–based health monitoring which reduces the dimension of the initial frequency response function data, transforms it into new damage indices, and employs an artificial neural network to detect different levels of nonlinearity using damage patterns recognized by the proposed algorithm. Experimental data from the three-story bookshelf structure at Los Alamos National Laboratory are used to validate the proposed method. The results showed that levels of nonlinear damage can be identified precisely by the developed artificial neural networks. Moreover, artificial neural networks trained with summation frequency response functions give more precise damage detection results than artificial neural networks trained with individual frequency response functions. The proposed method is therefore a promising tool for structural assessment of real structures, since it shows reliable results with experimental data for nonlinear damage detection and renders the frequency response function–based method convenient for structural health monitoring.
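The "summation frequency response function" input mentioned above can be illustrated simply: magnitude FRFs from several sensor locations are summed frequency bin by frequency bin into a single aggregate curve before index extraction. This is a sketch of that aggregation step only; the paper's actual dimension-reduction and network-training details are not reproduced here:

```python
def summation_frf(frfs):
    """Sum the magnitudes of FRFs measured at several sensor locations,
    frequency bin by frequency bin, into one aggregate curve. frfs is a
    list of equal-length per-sensor FRF sequences (real or signed values
    standing in for complex magnitudes in this sketch)."""
    return [sum(abs(h[i]) for h in frfs) for i in range(len(frfs[0]))]
```

Feeding one aggregate curve per test case, instead of one curve per sensor, is what reduces the input dimension presented to the network.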
Abstract:
A precise representation of the spatial distribution of hydrophobicity, hydrophilicity and charges on the molecular surface of proteins is critical for understanding the interaction with small molecules and larger systems. The representation of hydrophobicity is rarely done at atom level, as this property is generally assigned to residues. A new methodology for the derivation of atomic hydrophobicity from any amino acid-based hydrophobicity scale was used to derive 8 sets of atomic hydrophobicities, one of which was used to generate the molecular surfaces for 35 proteins with convex structures, 5 of which, i.e., lysozyme, ribonuclease, hemoglobin, albumin and IgG, have been analyzed in more detail. Sets of the molecular surfaces of the model proteins have been constructed using spherical probes with increasingly large radii, from 1.4 to 20 Å, followed by the quantification of (i) the surface hydrophobicity; (ii) the respective molecular surface areas, i.e., total, hydrophilic and hydrophobic area; and (iii) their relative densities, i.e., divided by the total molecular area, or specific densities, i.e., divided by the property-specific area. Compared with the amino acid-based formalism, the atom-level description reveals molecular surfaces which (i) present approximately two times more hydrophilic area; (ii) have less extended, but 2 to 5 times more intense, hydrophilic patches; and (iii) have 3 to 20 times more extended hydrophobic areas. The hydrophobic areas are also approximately 2 times more hydrophobicity-intense. This more pronounced, "leopard skin"-like design of the protein molecular surface has been confirmed by comparing the results for a restricted set of homologous proteins, i.e., hemoglobins diverging by only one residue (Trp37).
These results suggest that representing hydrophobicity on protein molecular surfaces at atom-level resolution, coupled with probing the molecular surface at different geometric resolutions, can capture processes that are otherwise hidden from the amino acid-based formalism.