953 results for Respiration, Artificial [methods]


Relevance:

20.00%

Publisher:

Abstract:

With the increasing number of XML documents in varied domains, it has become essential to identify ways of finding interesting information in these documents, and data mining techniques are used to derive such information. Mining XML documents is affected by the model chosen to represent them, owing to their semi-structured nature. Hence, in this chapter we present an overview of the various models of XML documents, how these models have been used for mining, and some of the issues and challenges these models raise. In addition, this chapter provides some insights into future models of XML documents that effectively capture their two key features, namely structure and content, for mining.
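A common way to expose the structure of an XML document to mining algorithms is to model it as a labelled tree and extract structural features such as root-to-node label paths. A minimal, hypothetical sketch using the Python standard library (the toy document and the `paths` helper are illustrative, not from the chapter):

```python
import xml.etree.ElementTree as ET

# Toy XML document; the nesting carries the structural information
# that tree-based mining models exploit.
doc = """<library>
  <book genre="data-mining"><title>Mining XML</title></book>
  <book genre="databases"><title>Semi-structured Data</title></book>
</library>"""

root = ET.fromstring(doc)

def paths(node, prefix=""):
    """Enumerate root-to-node label paths, a simple structural
    feature usable alongside the textual content of each node."""
    here = f"{prefix}/{node.tag}"
    yield here
    for child in node:
        yield from paths(child, here)

print(sorted(set(paths(root))))  # distinct structural paths
```

Structure-based miners typically work on feature sets like these paths, while content-based miners use the node text; hybrid models combine both.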

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a new research method, Participatory Action Design Research (PADR), for studies in the Urban Informatics domain. PADR supports Urban Informatics research in developing new technological means (e.g. using mobile and ubiquitous computing) to resolve contemporary issues or support everyday life in urban environments. The paper discusses the nature, aims and inherent methodological needs of Urban Informatics research, and proposes PADR as a method to address these needs. Situated in a socio-technical context, Urban Informatics requires a close dialogue between social and design-oriented fields of research as well as their methods. PADR combines Action Research and Design Science Research, both of which are used in Information Systems, another field with a strong socio-technical emphasis, and further adapts them to the cross-disciplinary needs and research context of Urban Informatics.

Relevance:

20.00%

Publisher:

Abstract:

Recent algorithms for monocular motion capture (MoCap) estimate weak-perspective camera matrices between images using a small subset of approximately rigid points on the human body (i.e. the torso and hips). A problem with this approach, however, is that these points are often close to coplanar, causing canonical linear factorisation algorithms for rigid structure from motion (SFM) to become extremely sensitive to noise. In this paper, we propose an alternative solution to weak-perspective SFM based on a convex relaxation of graph rigidity. We demonstrate the success of our algorithm on both synthetic and real-world data, allowing for much improved solutions to markerless MoCap problems on human bodies. Finally, we propose an approach to resolving the two-fold ambiguity over bone direction using a k-nearest-neighbour kernel density estimator.
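The final disambiguation step can be illustrated generically: score both candidate bone directions under a kernel density estimate whose bandwidth at the query comes from its k-th nearest neighbour, and keep the denser candidate. This sketch and its toy data are assumptions, not the paper's exact estimator:

```python
import numpy as np

def knn_kde(query, samples, k=5):
    """Gaussian kernel density score whose bandwidth at the query is
    the distance to its k-th nearest sample (a common adaptive-bandwidth
    variant; the paper's exact estimator may differ)."""
    d = np.linalg.norm(samples - query, axis=1)
    h = np.sort(d)[k - 1] + 1e-9               # adaptive bandwidth
    return np.mean(np.exp(-(d / h) ** 2 / 2)) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
samples = rng.normal(0, 1, size=(200, 3))       # e.g. observed bone vectors
up = np.array([0.1, 0.0, 0.0])                  # candidate near the data
down = np.array([5.0, 5.0, 5.0])                # flipped/implausible candidate
best = "up" if knn_kde(up, samples) > knn_kde(down, samples) else "down"
print(best)
```

The scores are only used for comparison here, so the exact normalisation matters less than the adaptive bandwidth that keeps the estimate stable in sparse regions.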

Relevance:

20.00%

Publisher:

Abstract:

Bioinformatics involves the analysis of biological data such as DNA sequences, microarrays and protein-protein interaction (PPI) networks. Its two main objectives are the identification of genes or proteins and the prediction of their functions. Biological data often contain uncertain and imprecise information; fuzzy theory provides useful tools for dealing with this type of information and has hence played an important role in the analysis of biological data. In this thesis, we aim to develop new fuzzy techniques and apply them to DNA microarrays and PPI networks. We focus on three problems: (1) clustering of microarrays; (2) identification of disease-associated genes in microarrays; and (3) identification of protein complexes in PPI networks. The first part of the thesis aims to detect, by the fuzzy C-means (FCM) method, clustering structures in DNA microarrays corrupted by noise. Because of the presence of noise, some clustering structures found in random data may not have any biological significance. In this part, we propose to combine FCM with empirical mode decomposition (EMD) for clustering microarray data. The purpose of EMD is to reduce, preferably to remove, the effect of noise, resulting in what is known as denoised data. We call this method the fuzzy C-means method with empirical mode decomposition (FCM-EMD). We applied this method to yeast and serum microarrays, and silhouette values were used to assess the quality of clustering. The results indicate that the clustering structures of denoised data are more reasonable, implying that genes have tighter association with their clusters. Furthermore, we found that estimating the fuzzy parameter m, which is a difficult step, can be avoided to some extent by analysing denoised microarray data. The second part aims to identify disease-associated genes from DNA microarray data generated under different conditions, e.g. from patients and healthy controls.
We developed a type-2 fuzzy membership (FM) function for the identification of disease-associated genes. This approach was applied to diabetes and lung cancer data, and a comparison with the original FM test was carried out. Among the ten best-ranked diabetes genes identified by the type-2 FM test, seven have been confirmed as diabetes-associated genes according to gene description information in GenBank and the published literature, and an additional gene was newly identified. Among the ten best-ranked genes identified in the lung cancer data, seven are confirmed to be associated with lung cancer or its treatment. The type-2 FM-d values are significantly different, which makes the identifications more convincing than those of the original FM test. The third part of the thesis aims to identify protein complexes in large interaction networks. Identifying protein complexes is crucial for understanding the principles of cellular organisation and for predicting protein functions. In this part, we propose a novel method that combines fuzzy clustering and interaction probability to identify both overlapping and non-overlapping community structures in PPI networks, and then to detect protein complexes in these sub-networks. Our method is based on both the fuzzy relation model and the graph model. We applied the method to several PPI networks and compared it with a popular protein complex identification method, the clique percolation method; for the same data, our method detected more protein complexes. We also applied our method to two social networks. The results show that our method works well for detecting sub-networks and gives a reasonable understanding of these communities.
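The FCM step at the heart of FCM-EMD alternates membership and centroid updates until convergence. A minimal NumPy sketch on toy two-cluster data (the EMD denoising stage, the real microarrays and the silhouette assessment are omitted for brevity):

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Plain fuzzy C-means: alternate fuzzy-membership and centroid
    updates.  The thesis pairs this with EMD denoising of the data,
    which is not shown here."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)                 # fuzzy memberships
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]  # weighted centroids
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))                # membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Two well-separated toy "expression profiles".
X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
               np.random.default_rng(2).normal(5, 0.3, (20, 2))])
centers, U = fcm(X)
labels = U.argmax(axis=1)   # hard assignment from fuzzy memberships
```

The fuzzifier `m` controls how soft the memberships are; the thesis's observation is that working on denoised data reduces the sensitivity of the result to this parameter.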

Relevance:

20.00%

Publisher:

Abstract:

This position paper provides an overview of work conducted and an outlook of future directions within the field of Information Retrieval (IR) that aims to develop novel models, methods and frameworks inspired by Quantum Theory (QT).

Relevance:

20.00%

Publisher:

Abstract:

Background: Whether suicide in China shows significant seasonal variation is unclear. The aim of this study is to examine the seasonality of suicide in Shandong, China, and to assess the associations of suicide seasonality with gender, residence, age and method of suicide. Methods: Three tests (chi-square, Edwards' T and Roger's log method) were used to detect seasonality in suicide data extracted from the official mortality data of the Shandong Disease Surveillance Point (DSP) system. Peak/low ratios (PLRs) and 95% confidence intervals (CIs) were calculated to indicate the magnitude of seasonality. Results: A statistically significant seasonality, with a single peak in suicide rates in spring and early summer and a dip in winter, was observed and remained relatively consistent over the years. Regardless of gender, suicide seasonality was more pronounced in rural areas, in younger age groups and for non-violent methods, in particular self-poisoning by pesticide. Conclusions: There are statistically significant seasonal variations in completed suicide for both men and women in Shandong, China. Differences exist between residences (urban/rural), age groups and suicide methods. The results appear to support a sociological explanation of suicide seasonality.
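Of the three tests, the chi-square test of uniform monthly occurrence is the simplest to sketch. The monthly counts below are hypothetical, chosen only to mimic a spring/early-summer peak like the one reported:

```python
import numpy as np

def chi_square_seasonality(monthly_counts, days_in_month):
    """Chi-square statistic against the null of uniform daily risk,
    with expected monthly counts adjusted for unequal month lengths."""
    n = monthly_counts.sum()
    expected = n * days_in_month / days_in_month.sum()
    return ((monthly_counts - expected) ** 2 / expected).sum()

days = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
# Hypothetical counts, Jan..Dec, peaking in spring/early summer.
counts = np.array([60, 65, 90, 110, 120, 115, 85, 75, 70, 65, 60, 55])
stat = chi_square_seasonality(counts, days)
print(stat > 19.675)  # compare to the df=11, alpha=0.05 critical value
```

Edwards' T and Roger's log method additionally model the *shape* of the seasonal curve (e.g. a single sinusoidal peak), which the plain chi-square test ignores.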

Relevance:

20.00%

Publisher:

Abstract:

Damage detection in structures has become increasingly important in recent years. While a number of damage detection and localization methods have been proposed, very few attempts have been made to explore structural damage with noise-polluted data, an unavoidable effect in the real world. Measurement data are contaminated by noise from the test environment as well as from electronic devices, and this noise tends to produce erroneous results with structural damage identification methods. It is therefore important to investigate methods that perform well with noise-polluted data. This paper introduces a new damage index using principal component analysis (PCA) for damage detection of building structures that accepts noise-polluted frequency response functions (FRFs) as input. The FRF data are obtained from the MATLAB function datagen, which is available on the website of the IASC-ASCE (International Association for Structural Control - American Society of Civil Engineers) Structural Health Monitoring (SHM) Task Group. The proposed method involves a five-stage process: calculation of FRFs, calculation of damage index values using the proposed algorithm, development of artificial neural networks, introduction of the damage indices as input parameters, and damage detection of the structure. This paper briefly describes the methodology and the results obtained in detecting damage in all six cases of the benchmark study with different noise levels. The proposed method is applied to a benchmark problem sponsored by the IASC-ASCE Task Group on Structural Health Monitoring, which was developed to facilitate the comparison of various damage identification methods. The results show that the PCA-based algorithm is effective for structural health monitoring with noise-polluted FRFs, a common occurrence when dealing with industrial structures.
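The abstract does not give the damage index formula, but a generic PCA-based index can be sketched: learn a principal subspace from baseline (undamaged) FRFs, and treat the residual energy of new measurements outside that subspace as the index. The synthetic FRF-like data and the index itself are illustrative assumptions:

```python
import numpy as np

def pca_damage_index(frf_baseline, frf_test, n_pc=3):
    """Project test FRFs onto the principal subspace learned from
    baseline FRFs; the residual norm serves as a simple damage index
    (a generic PCA residual index, not necessarily the paper's)."""
    mu = frf_baseline.mean(axis=0)
    _, _, Vt = np.linalg.svd(frf_baseline - mu, full_matrices=False)
    V = Vt[:n_pc].T                                  # principal subspace
    centered = frf_test - mu
    r = centered - centered @ V @ V.T                # out-of-subspace part
    return np.linalg.norm(r, axis=1)                 # one index per sample

rng = np.random.default_rng(0)
modes = rng.normal(size=(3, 40))                     # three dominant "modes"
base = rng.normal(size=(50, 3)) @ modes + 0.05 * rng.normal(size=(50, 40))
healthy = rng.normal(size=(5, 3)) @ modes + 0.05 * rng.normal(size=(5, 40))
damaged = healthy + 1.0                              # systematic FRF shift
idx_h = pca_damage_index(base, healthy)
idx_d = pca_damage_index(base, damaged)
print(idx_d.mean() > idx_h.mean())
```

In the paper's pipeline such indices would then feed an artificial neural network for the final damage decision.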

Relevance:

20.00%

Publisher:

Abstract:

Under pressure from both the ever-increasing level of market competition and the global financial crisis, clients in the consumer electronics (CE) industry are keen to understand how to choose the most appropriate procurement method and hence improve their competitiveness. Four rounds of a Delphi questionnaire survey were conducted with 12 experts in order to identify the most appropriate procurement method in the Hong Kong CE industry. Five key selection criteria in the CE industry are highlighted: product quality, capability, price competition, flexibility and speed. The study also revealed that product quality was the most important criterion for “First type used commercially” and “Major functional improvements” projects. For “Minor functional improvements” projects, price competition was the most crucial factor to consider during procurement method selection. These findings provide owners with useful insights for selecting procurement strategies.

Relevance:

20.00%

Publisher:

Abstract:

Research Interests: Are parents complying with the legislation? Is this the same for urban, regional and rural parents? Indigenous parents? What difficulties do parents experience in complying? Do parents understand why the legislation was put in place? Have there been negative consequences for other organisations or sectors of the community?

Relevance:

20.00%

Publisher:

Abstract:

Recent studies have started to explore context-awareness as a driver in the design of adaptable business processes. The emerging challenge of identifying and considering contextual drivers in the environment of a business process is well understood; however, the typical methods used in business process modeling do not yet consider this additional contextual information in their process designs. In this chapter, we describe our research towards innovative and advanced process modeling methods that include mechanisms to incorporate relevant contextual drivers and their impacts on business processes in process design models. We report on our ongoing work with an Australian insurance provider and describe the design science approach we employed to develop these innovative and useful artifacts as part of a context-aware method framework. We discuss the utility of these artifacts in an application to the claims handling process at the case organization.

Relevance:

20.00%

Publisher:

Abstract:

A central topic in economics is the existence of social preferences. Behavioural economics in general has approached the issue from several angles: controlled experimental settings, surveys and field experiments are able to show that, in a number of economic environments, people usually care about immaterial things such as fairness or the equity of allocations. Findings from experimental economics in particular have led to a large increase in theories addressing social preferences. Most (pro)social phenomena are well understood in experimental settings but very difficult to observe 'in the wild'. One criticism in this regard is that many findings are bound to the artificial environment of the computer lab or the survey method used. A further criticism is that the traditional methods also fail to directly attribute the observed behaviour to the mental constructs that are expected to stand behind it. This thesis first examines the usefulness of sports data for testing social preference models in a field environment, thus overcoming the lab's limitations with regard to applicability in other, non-artificial, environments. The second major contribution of this research is to establish a new neuroscientific tool, the measurement of heart rate variability, to observe participants' emotional reactions in a traditional experimental setup.
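Heart rate variability is typically summarised by time-domain statistics over the intervals between successive heartbeats (RR intervals); RMSSD is a standard example. This sketch is illustrative only, since the abstract does not specify which HRV measure the thesis uses:

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive RR-interval differences,
    a common time-domain heart rate variability (HRV) measure."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Toy RR series (ms): regular beats vs a more variable, aroused state.
calm = [800, 810, 795, 805, 800, 808]
aroused = [800, 700, 850, 720, 880, 690]
print(rmssd(calm) < rmssd(aroused))
```

Comparing such statistics across experimental conditions is one simple way to link observed choices to an emotional (physiological) correlate.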

Relevance:

20.00%

Publisher:

Abstract:

This paper reviews the current state of the application of infrared methods, particularly mid-infrared (mid-IR) and near-infrared (NIR), for the evaluation of the structural and functional integrity of articular cartilage. It is noted that while a considerable amount of research has been conducted on tissue characterization using mid-IR, it is almost certain that full-thickness cartilage assessment is not feasible with this method. In contrast, the considerably greater penetration capacity of NIR suggests that it is a suitable candidate for full-thickness cartilage evaluation. Nevertheless, significant research is still required to improve the specificity and clinical applicability of the method before it can be used to distinguish functional from dysfunctional cartilage.

Relevance:

20.00%

Publisher:

Abstract:

The surface amorphous layer of articular cartilage is of primary importance to its load-bearing and lubrication function. This lipid-filled layer is degraded, disrupted or eliminated when cartilage degenerates due to disease. This article further examines the characteristics of this surface overlay using a combination of microscopy and imaging methods to evaluate the hypothesis that the surface of articular cartilage can be repaired by exposing degraded cartilage to aqueous synthetic lipid mixtures. The preliminary results demonstrate that it is possible to create a new surface layer of phospholipids on the surface of cartilage following artificial lipid removal, but such a layer does not possess enough mechanical strength for physiological function when created with either the unsaturated palmitoyloleoyl-phosphatidylcholine or the saturated dipalmitoyl-phosphatidylcholine component of the joint lipid composition alone. We conclude that this may be due to low structural cohesivity, inadequate time of exposure, and the mix/content of lipid in the incubation environment.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: To compare the accuracy of different methods for calculating human lens power when lens thickness is not available. Methods: Lens power was calculated by four methods. Three methods were used with previously published biometry and refraction data of 184 emmetropic and myopic eyes of 184 subjects (age range [18, 63] years, spherical equivalent range [-12.38, +0.75] D). These three methods consist of the Bennett method, which uses lens thickness, and our modification of the Stenström method and the Bennett-Rabbetts method, neither of which requires knowledge of lens thickness. These methods include c constants, which represent distances from the lens surfaces to the principal planes. Lens powers calculated with these methods were compared with those calculated using phakometry data available for a subgroup of 66 emmetropic eyes (66 subjects). Results: Lens powers obtained from the Bennett method corresponded well with those obtained by phakometry for emmetropic eyes, although individual differences of up to 3.5 D occurred. Lens powers obtained from the modified Stenström and Bennett-Rabbetts methods deviated significantly from those obtained with either the Bennett method or phakometry. Customizing the c constants improved this agreement, but applying these constants to the entire group gave mean lens power differences of 0.71 ± 0.56 D compared with the Bennett method. By further optimizing the c constants, agreement with the Bennett method was within ±1 D for 95% of the eyes. Conclusion: With appropriate constants, the modified Stenström and Bennett-Rabbetts methods provide a good approximation of the Bennett lens power in emmetropic and myopic eyes.
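The vergence bookkeeping behind such lens power calculations can be sketched as follows. The c constants, vertex distance and biometry values below are illustrative assumptions in the spirit of the Bennett method, not the paper's calibrated values:

```python
N_AQUEOUS = 1.336   # refractive index of aqueous/vitreous (reduced eye)

def propagate(vergence_D, distance_mm, n=N_AQUEOUS):
    """Propagate a vergence (D) over a distance (mm) inside medium n."""
    return vergence_D / (1 - (distance_mm / 1000 / n) * vergence_D)

def lens_power(S, K, ACD, T, L, c1=0.596, c2=-0.358, vertex_mm=14):
    """Crystalline lens power from spectacle refraction S (D), corneal
    power K (D), anterior chamber depth ACD, lens thickness T and axial
    length L (all mm).  c1*T and c2*T locate the lens principal planes;
    these constants are illustrative, not the paper's values."""
    S_cornea = S / (1 - (vertex_mm / 1000) * S)        # refraction at cornea
    V_in = propagate(S_cornea + K, ACD + c1 * T)        # vergence at H1
    V_out = 1000 * N_AQUEOUS / (L - ACD - T - c2 * T)   # needed at H2
    return V_out - V_in

# Typical emmetropic biometry (hypothetical values).
print(round(lens_power(S=0.0, K=43.0, ACD=3.5, T=3.6, L=23.5), 1))
```

Customizing `c1`/`c2` per population, as the paper does, shifts the principal-plane positions and hence the computed power.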

Relevance:

20.00%

Publisher:

Abstract:

Digital human modelling (DHM) has today matured from research into industrial application. In the automotive domain, DHM has become a commonly used tool in virtual prototyping and human-centred product design. While this generation of DHM supports the ergonomic evaluation of new vehicle designs during the early design stages of the product by modelling anthropometry, posture and motion or predicting discomfort, the future of DHM will be dominated by CAE methods, realistic 3D design, and musculoskeletal and soft-tissue modelling down to the micro-scale of molecular activity within single muscle fibres. As a driving force for DHM development, the automotive industry has traditionally used human models in the manufacturing sector (production ergonomics, e.g. assembly) and the engineering sector (product ergonomics, e.g. safety, packaging). In product ergonomics applications, DHM share many common characteristics, creating a unique subset of DHM. These models are optimised for a seated posture, interface to a vehicle seat through standardised methods and provide linkages to vehicle controls. As a tool, they need to interface with other analytic instruments and integrate into complex CAD/CAE environments. Important aspects of current DHM research are functional analysis, model integration and task simulation. Digital (virtual, analytic) prototypes or digital mock-ups (DMU) provide expanded support for testing and verification and consider task-dependent performance and motion. Beyond rigid-body mechanics, soft-tissue modelling is evolving to become standard in future DHM. When addressing advanced issues beyond the physical domain, for example anthropometry and biomechanics, modelling of human behaviours and skills is also integrated into DHM. The latest developments include a more comprehensive approach through implementing perceptual, cognitive and performance models, representing human behaviour on a non-physiological level. Through the integration of algorithms from the artificial intelligence domain, a vision of the virtual human is emerging.