986 results for Causal tree method


Relevance:

20.00%

Publisher:

Abstract:

This paper presents two novel concepts to enhance the accuracy of damage detection using the Modal Strain Energy based Damage Index (MSEDI) in the presence of noise in the mode shape data. Firstly, the paper presents a sequential curve fitting technique that reduces the effect of noise on the calculation of the MSEDI more effectively than the two commonly used curve fitting techniques, namely polynomial and Fourier series fitting. Secondly, a probability-based Generalized Damage Localization Index (GDLI) is proposed as a viable improvement to the damage detection process. The study uses a validated ABAQUS finite-element model of a reinforced concrete beam to obtain mode shape data in the undamaged and damaged states. Noise is simulated by adding three levels of random noise (1%, 3%, and 5%) to the mode shape data. Results show that damage detection is enhanced with an increased number of modes and samples used with the GDLI.
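The sequential curve fitting technique itself is not reproduced in this abstract, but the underlying idea of suppressing mode shape noise with a fitted low-degree polynomial can be sketched in a few lines. Everything below is illustrative: the beam mode, the 3% noise level, and the degree-4 fit are assumptions, and `polyfit` is a plain normal-equations least-squares fit, not the authors' sequential method.

```python
import math
import random

def polyfit(xs, ys, deg):
    """Least-squares polynomial fit via normal equations (fine for low degrees)."""
    n = deg + 1
    ata = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for col in range(n):                      # Gaussian elimination, partial pivoting
        piv = max(range(col, n), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, n):
            f = ata[r][col] / ata[col][col]
            for c in range(col, n):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (aty[r] - sum(ata[r][c] * coeffs[c]
                                  for c in range(r + 1, n))) / ata[r][r]
    return coeffs                             # c0 + c1*x + c2*x^2 + ...

random.seed(1)
L = 1.0
xs = [i * L / 40 for i in range(41)]
clean = [math.sin(math.pi * x / L) for x in xs]      # first bending mode shape
noisy = [v + random.gauss(0, 0.03) for v in clean]   # ~3% random noise
c = polyfit(xs, noisy, 4)
smooth = [sum(ck * x ** k for k, ck in enumerate(c)) for x in xs]

def rms(a, b):
    return (sum((u - v) ** 2 for u, v in zip(a, b)) / len(a)) ** 0.5

print(rms(noisy, clean) > rms(smooth, clean))  # the fit suppresses the noise
```

The degree is kept low enough that the polynomial cannot track the noise, which is why the fitted shape sits closer to the clean mode than the raw measurements do.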

Relevance:

20.00%

Publisher:

Abstract:

Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, time intervals for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of population pharmacokinetic models, which are generally nonlinear mixed effects models, no analytical solution is available to determine sampling windows. We propose a method for determining sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The method is applicable to any nonlinear mixed effects model, although our work focuses on an application to population pharmacokinetic models.
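The abstract does not give the MCMC construction, so the sketch below is only a toy illustration of the general idea: treat a (hypothetical) efficiency profile around an optimal sampling time as an unnormalised target density, draw from it with random-walk Metropolis, and report a central interval of the draws as the sampling window. The Gaussian profile, the optimal time of 2 h, and all tuning values are assumptions.

```python
import math
import random

random.seed(0)

# Toy efficiency profile: relative design efficiency of sampling at time t,
# peaked at a hypothetical optimal time t* = 2 h (values are illustrative).
def efficiency(t):
    return math.exp(-0.5 * ((t - 2.0) / 0.5) ** 2) if t > 0 else 0.0

# Random-walk Metropolis targeting the efficiency profile as an
# (unnormalised) density over candidate sampling times.
def metropolis(n_iter, step=0.3, t0=2.0):
    samples, t = [], t0
    for _ in range(n_iter):
        cand = t + random.gauss(0, step)
        if random.random() < min(1.0, efficiency(cand) / efficiency(t)):
            t = cand
        samples.append(t)
    return samples

draws = metropolis(20000)[2000:]                     # discard burn-in
draws.sort()
lo = draws[int(0.05 * len(draws))]                   # central 90% interval
hi = draws[int(0.95 * len(draws))]
print(f"sampling window: {lo:.2f}-{hi:.2f} h")
```

For this toy target the window straddles the optimum at 2 h, roughly 2 ± 1.6 times the profile's spread.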

Relevance:

20.00%

Publisher:

Abstract:

Traditional recommendation methods offer users items, which are inanimate, in one-way recommendations. Emerging applications such as online dating and job recruitment require reciprocal people-to-people recommendations, which are animate and two-way. In this paper, we propose a reciprocal collaborative method based on the concepts of users' similarities and common neighbors. The dataset employed for the experiment is gathered from a real-life online dating network. The proposed method is compared with baseline methods that use traditional collaborative algorithms. Results show the proposed method achieves noticeably better performance than the baselines.
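The abstract does not give the scoring formula, so the following is a minimal sketch of one reciprocal common-neighbour scheme under assumed toy data: each side's one-way interest is predicted from the contacts of similar users, and the two directions are combined with a harmonic mean so that the score is high only when interest runs both ways. The user names and the harmonic-mean combination are illustrative assumptions, not the paper's method.

```python
# Toy interaction data: user -> set of users they contacted (hypothetical).
contacts = {
    "m1": {"f1", "f2"}, "m2": {"f1", "f3"}, "m3": {"f2"},
    "f1": {"m1", "m3"}, "f2": {"m2"}, "f3": {"m1", "m2"},
}

def sim(u, v):
    """Common-neighbour similarity: how many users both u and v contacted."""
    return len(contacts[u] & contacts[v])

def pref(a, b):
    """Predicted one-way interest of a in b: users similar to a contacted b."""
    return sum(sim(a, u) for u in contacts if u != a and b in contacts[u])

def reciprocal(a, b):
    """Harmonic mean rewards mutual interest; zero if either side is zero."""
    ab, ba = pref(a, b), pref(b, a)
    return 0.0 if min(ab, ba) == 0 else 2 * ab * ba / (ab + ba)
```

With this toy data `reciprocal("m1", "f3")` is 1.0, while `reciprocal("m3", "f2")` is 0.0 because the predicted interest is one-sided, which is exactly the asymmetry a reciprocal recommender has to respect.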

Relevance:

20.00%

Publisher:

Abstract:

Modern structural diagnosis relies on vibration characteristics to assess the serviceability level of a structure. This paper examines the potential of the change-in-flexibility method for use in the damage detection process, together with two main practical constraints associated with it. The first constraint addressed is the reduction in the number of data acquisition points due to a limited number of sensors. Results show that the accuracy of the change-in-flexibility method is influenced by the number of data acquisition points/sensor locations in real structures. Secondly, the effect of higher modes on the damage detection process is studied, addressing the difficulty of extracting higher-order modal data with the available sensors. Four damage indices are presented to identify their potential for damage detection with respect to different locations and severities of damage. A simply supported beam with two degrees of freedom at each node is considered, for single damage cases only, throughout the paper.
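The change-in-flexibility method rests on the fact that the modal flexibility matrix converges quickly with the lowest modes, F ≈ Σ (1/ωi²) φi φiᵀ for mass-normalised shapes φi, so damage shows up as a change in F between the two states. A minimal sketch with invented two-mode data (the frequencies, shapes, and three sensor locations are all illustrative assumptions):

```python
# Modal flexibility from the first m modes: F ~= sum_i (1/w_i^2) phi_i phi_i^T,
# with mass-normalised mode shapes phi_i and natural frequencies w_i (rad/s).
def flexibility(freqs, shapes):
    n = len(shapes[0])
    F = [[0.0] * n for _ in range(n)]
    for w, phi in zip(freqs, shapes):
        for r in range(n):
            for c in range(n):
                F[r][c] += phi[r] * phi[c] / w ** 2
    return F

# Hypothetical 2-mode data at 3 sensor locations (illustrative numbers only).
freqs_u, shapes_u = [10.0, 25.0], [[0.5, 1.0, 0.5], [1.0, 0.0, -1.0]]
# "Damage" near the middle sensor: frequencies drop, shapes shift there.
freqs_d, shapes_d = [9.0, 24.0], [[0.55, 1.05, 0.5], [1.0, 0.05, -1.0]]

Fu = flexibility(freqs_u, shapes_u)
Fd = flexibility(freqs_d, shapes_d)
# Damage indicator per location: largest absolute change in that column of F.
delta = [max(abs(Fd[r][c] - Fu[r][c]) for r in range(3)) for c in range(3)]
print(delta.index(max(delta)))  # → 1, the middle sensor, nearest the damage
```

The 1/ω² weighting is why the lowest modes dominate F, and why the resolution of the index is bounded by the number of sensor locations, the two constraints the paper examines.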

Relevance:

20.00%

Publisher:

Abstract:

The purpose of this study was to investigate whether an approach to developing word lists centred on etymological roots would improve the spelling performance of older primary school students. Participants were 46 students in the last year of primary school in south-east Queensland (31 girls and 15 boys) across three classes, with two classes being assigned to control conditions. Students were evaluated pre- and post-intervention on three dependent measures: British Spelling Test Series spelling, spelling in writing and writing. The results of this intervention revealed improvements in spelling for girls but not for boys. The implications for improved teaching methods are discussed.

Relevance:

20.00%

Publisher:

Abstract:

This article sets out the results of an empirical research study into the uses to which the Australian patent system is being put in the early 21st century. The focus of the study is business method patents, which are of interest because they are a controversial class of patent that are thought to differ significantly from the mechanical, chemical and industrial inventions that have traditionally been the mainstay of the patent system. The purpose of the study is to understand what sort of business method patent applications have been lodged in Australia in the first decade of this century and how the patent office is responding to those applications.

Relevance:

20.00%

Publisher:

Abstract:

A vertex-centred finite volume method (FVM) for the Cahn-Hilliard (CH) and recently proposed Cahn-Hilliard-reaction (CHR) equations is presented. Information at control volume faces is computed using a high-order least-squares approach based on Taylor series approximations. This least-squares problem explicitly includes the variational boundary condition (VBC), which ensures that the discrete equations satisfy all of the boundary conditions. We use this approach to solve the CH and CHR equations in one and two dimensions and show that our scheme satisfies the VBC to at least second order. For the CH equation we show evidence of conservative, gradient-stable solutions; for the CHR equation, however, strict gradient stability is more challenging to achieve.
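As a much-simplified 1D analogue of the least-squares face reconstruction (the actual scheme works on control-volume faces and builds the variational boundary condition into the least-squares problem), the sketch below recovers the value and derivatives at a face location by expanding nearby cell-centre values in a Taylor series about the face. The stencil and the quadratic test function are assumptions chosen so the reconstruction should be exact.

```python
# Face-centred least-squares reconstruction in 1D: expand each stencil value
# in a Taylor series about the face location x_f and solve the overdetermined
# system for (u, u_x, u_xx) at the face via the normal equations.
def face_reconstruct(x_f, xs, us):
    # Design matrix rows: [1, dx, dx^2/2] for dx = x_j - x_f.
    rows = [[1.0, x - x_f, 0.5 * (x - x_f) ** 2] for x in xs]
    # Normal equations A^T A c = A^T u (3x3, solved by Cramer's rule).
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    atu = [sum(r[i] * u for r, u in zip(rows, us)) for i in range(3)]
    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det3(ata)
    sol = []
    for k in range(3):
        mk = [[atu[i] if j == k else ata[i][j] for j in range(3)] for i in range(3)]
        sol.append(det3(mk) / d)
    return sol  # [u at face, du/dx at face, d2u/dx2 at face]

# Check on u(x) = x^2, which a quadratic expansion reproduces exactly:
# four cell centres straddling a face at x_f = 0.5.
xs = [0.05, 0.35, 0.65, 0.95]
u_f, du, d2u = face_reconstruct(0.5, xs, [x * x for x in xs])
print(round(u_f, 6), round(du, 6), round(d2u, 6))  # → 0.25 1.0 2.0
```

Because the expansion carries second-derivative information, face fluxes built from it can reach the high order the paper requires; the real 2D scheme augments the same least-squares system with the VBC rows.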

Relevance:

20.00%

Publisher:

Abstract:

Breast cancer is a leading contributor to the burden of disease in Australia. Fortunately, the recent introduction of diverse therapeutic strategies has improved the survival outcome for many women. Despite this, the clinical management of breast cancer remains problematic, as not all approaches are sufficiently sophisticated to take into account the heterogeneity of this disease, and they cannot predict disease progression, in particular, metastasis. As such, women with good prognostic outcomes are exposed to the side effects of therapies without added benefit. Furthermore, women with aggressive disease for whom these advanced treatments would deliver benefit cannot be distinguished, and opportunities for more intensive or novel treatment are lost. This study is designed to identify novel factors associated with disease progression that have the potential to inform disease prognosis. Frequently overlooked, yet common mediators of disease are the interactions that take place between the insulin-like growth factor (IGF) system and the extracellular matrix (ECM). Our laboratory has previously demonstrated that multiprotein insulin-like growth factor-I (IGF-I): insulin-like growth factor binding protein (IGFBP): vitronectin (VN) complexes stimulate migration of breast cancer cells in vitro, via the cooperative involvement of the insulin-like growth factor type I receptor (IGF-IR) and VN-binding integrins. However, the effects of IGF and ECM protein interactions on the dissemination and progression of breast cancer in vivo are unknown. It was hypothesised that interactions between proteins required for IGF induced signalling events and those within the ECM contribute to breast cancer metastasis and are prognostic and predictive indicators of patient outcome.
To address this hypothesis, semiquantitative immunohistochemistry (IHC) analyses were performed to compare the extracellular and subcellular distribution of IGF and ECM induced signalling proteins between matched normal, primary cancer, and metastatic cancer among archival formalin-fixed paraffin-embedded (FFPE) breast tissue samples collected from women attending the Princess Alexandra Hospital, Brisbane. Multivariate Cox proportional hazards (PH) regression survival models in conjunction with a modified "purposeful selection of covariates" method were applied to determine the prognostic potential of these proteins. This study provides the first in-depth, compartmentalised analysis of the distribution of IGF and ECM induced signalling proteins. As protein function and protein localisation are closely correlated, these findings provide novel insights into IGF signalling and ECM protein function during breast cancer development and progression. Distinct IGF signalling and ECM protein immunoreactivity was observed in the stroma and/or in subcellular locations in normal breast, primary cancer and metastatic cancer tissues. Analysis of the presence and location of stratifin (SFN) suggested a causal relationship in ECM remodelling events during breast cancer development and progression. The results of this study have also suggested that fibronectin (FN) and β1 integrin are important for the formation of invadopodia and epithelial-to-mesenchymal transition (EMT) events.
Our data also highlighted the importance of the temporal and spatial distribution of IGF induced signalling proteins in breast cancer metastasis; in particular, SFN, enhancer-of-split and hairy-related protein 2 (SHARP-2), total-akt/protein kinase B 1 (Total-AKT1), phosphorylated-akt/protein kinase B (P-AKT), extracellular signal-related kinase-1 and extracellular signal-related kinase-2 (ERK1/2) and phosphorylated-extracellular signal-related kinase-1 and extracellular signal-related kinase-2 (P-ERK1/2). Multivariate survival models were created from the immunohistochemical data. These models were found to fit well with these data with very high statistical confidence. Numerous prognostic confounding effects and effect modifications were identified among elements of the ECM and IGF signalling cascade and corroborate the survival models. This finding provides further evidence for the prognostic potential of IGF and ECM induced signalling proteins. In addition, the adjusted measures of associations obtained in this study have strengthened the validity and utility of the resulting models. The findings from this study provide insights into the biological interactions that occur during the development of breast tissue and contribute to disease progression. Importantly, these multivariate survival models could provide important prognostic and predictive indicators that assist the clinical management of breast disease, namely in the early identification of cancers with a propensity to metastasise, and/or recur following adjuvant therapy. The outcomes of this study further inform the development of new therapeutics to aid patient recovery. The findings from this study have widespread clinical application in the diagnosis of disease and prognosis of disease progression, and inform the most appropriate clinical management of individuals with breast cancer.
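The study's multivariate Cox proportional hazards models were fitted to the immunohistochemistry data; none of that data appears in this abstract, so the sketch below only illustrates the PH machinery on invented survival data with a single binary marker, exploiting the concavity of the Cox partial log-likelihood to find the maximum by bisection on its derivative. The data values and the bracketing interval are assumptions.

```python
import math

# Toy survival data: (time, event flag, x) with x = 1 for "marker positive";
# the marker-positive group fails earlier (all values are hypothetical).
data = [(2, 1, 1), (3, 1, 1), (4, 1, 1), (5, 0, 1), (6, 1, 1),
        (5, 1, 0), (7, 1, 0), (8, 0, 0), (9, 1, 0), (10, 1, 0)]

def score(b):
    """Derivative of the Cox partial log-likelihood for one covariate
    (Breslow handling of ties): sum over events of x_i - E_b[x | risk set]."""
    s = 0.0
    for t, e, x in data:
        if not e:
            continue                      # censored: contributes no event term
        risk = [(xj, math.exp(b * xj)) for tj, _, xj in data if tj >= t]
        s0 = sum(w for _, w in risk)
        s1 = sum(xj * w for xj, w in risk)
        s += x - s1 / s0
    return s

# The partial log-likelihood is concave, so its derivative is decreasing
# and the maximiser can be bracketed and located by bisection.
lo, hi = -5.0, 5.0
for _ in range(60):
    mid = (lo + hi) / 2
    if score(mid) > 0:
        lo = mid
    else:
        hi = mid
beta = (lo + hi) / 2
print(f"hazard ratio for marker-positive: {math.exp(beta):.2f}")
```

A positive coefficient (hazard ratio above 1) is how a marker like those in the study would flag cancers with a propensity to progress; the real models handle many covariates at once, which is what the "purposeful selection of covariates" step manages.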

Relevance:

20.00%

Publisher:

Abstract:

The most common software analysis tools available for measuring fluorescence images are for two-dimensional (2D) data; they rely on manual settings for inclusion and exclusion of data points, and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, providing a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed from the approximation and assumptions of the original model-based stereology(1), even in complex tissue sections(2). Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures.
Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (a tree-like structure). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells(3); however, the output data describe an extended cellular network using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell. This was a scientific project with the Eidgenössische Technische Hochschule, developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because it ideally builds the cell surface without void spaces. To our knowledge, no user-modifiable automated approach has yet been developed that provides morphometric information from 3D fluorescence images and achieves cellular spatial information for an undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.
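The Imaris/MATLAB platform itself is not shown here, but the kind of shape-agnostic 3D measurement it enables can be illustrated with a toy example: threshold a 3D intensity volume and size connected objects with a 6-connected flood fill, with no assumption about object shape. The volume, the threshold, and the blob positions are all invented for illustration.

```python
from collections import deque

# Tiny synthetic 3D "image" indexed (z, y, x): two bright blobs in a 4x4x4 volume.
Z = Y = X = 4
vol = [[[0.0] * X for _ in range(Y)] for _ in range(Z)]
for z, y, x in [(0, 0, 0), (0, 0, 1), (1, 0, 0)]:
    vol[z][y][x] = 1.0          # blob 1: three face-connected voxels
vol[3][3][3] = 1.0              # blob 2: a single voxel

def count_objects(vol, thresh=0.5):
    """Threshold, then 6-connected flood fill; returns per-object voxel counts."""
    seen = set()
    sizes = []
    for z in range(Z):
        for y in range(Y):
            for x in range(X):
                if vol[z][y][x] > thresh and (z, y, x) not in seen:
                    q, size = deque([(z, y, x)]), 0
                    seen.add((z, y, x))
                    while q:
                        cz, cy, cx = q.popleft()
                        size += 1
                        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                            nz, ny, nx = cz + dz, cy + dy, cx + dx
                            if (0 <= nz < Z and 0 <= ny < Y and 0 <= nx < X
                                    and vol[nz][ny][nx] > thresh
                                    and (nz, ny, nx) not in seen):
                                seen.add((nz, ny, nx))
                                q.append((nz, ny, nx))
                    sizes.append(size)
    return sizes

print(sorted(count_objects(vol)))  # → [1, 3]
```

Operating on connectivity rather than a cell template is what lets this style of measurement handle amorphous or discontinuous fluorescence signals that template-based modules such as Filament Tracer or Imaris Cell cannot.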

Relevance:

20.00%

Publisher:

Abstract:

Phenomenology is a term that has been described as a philosophy, a research paradigm, a methodology, and equated with qualitative research. In this paper, we first clarify phenomenology by tracing its movement both as a philosophy and as a research method. Next, we make a case for the use of phenomenology in empirical investigations of management phenomena. The paper discusses a selection of central concepts pertaining to phenomenology as a scientific research method, including description, phenomenological reduction and free imaginative variation. In particular, the paper elucidates the efficacy of Giorgi's descriptive phenomenological research praxis as a qualitative research method and how it can be applied to create a deeper and richer understanding of management practice.

Relevance:

20.00%

Publisher:

Abstract:

The identification of the primary drivers of stock returns has been of great interest to both financial practitioners and academics alike for many decades. Influenced by classical financial theories such as the CAPM (Sharpe, 1964; Lintner, 1965) and APT (Ross, 1976), a linear relationship is conventionally assumed between company characteristics, as derived from their financial accounts, and forward returns. Whilst this assumption may be a fair approximation to the underlying structural relationship, it is often adopted for the purpose of convenience. It is actually quite rare that the assumptions of distributional normality and a linear relationship are explicitly assessed in advance, even though this information would help to inform the appropriate choice of modelling technique. Non-linear models have nevertheless been applied successfully to the task of stock selection in the past (Sorensen et al., 2000). However, their take-up by the investment community has been limited despite the fact that researchers in other fields have found them to be a useful way to express knowledge and aid decision-making...
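The conventional linear assumption the passage describes can be written down in a few lines: regress forward returns cross-sectionally on a company characteristic and read the slope as the return earned per unit of that characteristic. The "value score", the true coefficients, and the noise level below are invented purely for illustration.

```python
import random

random.seed(7)

# Cross-section of hypothetical company characteristics (e.g. a value score)
# and next-period returns generated with a known linear relation plus noise.
n = 200
value = [random.gauss(0, 1) for _ in range(n)]
ret = [0.002 + 0.01 * v + random.gauss(0, 0.02) for v in value]

# OLS slope and intercept: the conventional linear-factor assumption.
mx = sum(value) / n
my = sum(ret) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(value, ret))
         / sum((x - mx) ** 2 for x in value))
intercept = my - slope * mx
print(f"estimated return per unit of value score: {slope:.4f}")
```

With the generating slope set to 0.01, the estimate lands close to it; the paper's point is that when the true relationship is non-linear, this fitted line is a convenience rather than a description of the structure.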

Relevance:

20.00%

Publisher:

Abstract:

This paper describes a method for measuring the creative potential of computer games. The research approach applies a behavioral and verbal protocol to analyze the factors that influence the creative processes used by people as they play computer games from the puzzle genre. Creative potential is measured by examining task motivation and domain-relevant and creativity-relevant skills. This paper focuses on the reliability of the factors used for measurement, determining those factors that are more strongly related to creativity. The findings show that creative potential may be determined by examining the relationship between skills required and the effect of intrinsic motivation within game play activities.

Relevance:

20.00%

Publisher:

Abstract:

This paper studies time integration methods for large stiff systems of ordinary differential equations (ODEs) of the form u'(t) = g(u(t)). For such problems, implicit methods generally outperform explicit methods, since the time step is usually less restricted by stability constraints. Recently, however, explicit so-called exponential integrators have become popular for stiff problems due to their favourable stability properties. These methods use matrix-vector products involving exponential-like functions of the Jacobian matrix, which can be approximated using Krylov subspace methods that require only matrix-vector products with the Jacobian. In this paper, we implement exponential integrators of second, third and fourth order and demonstrate that they are competitive with well-established approaches based on the backward differentiation formulas and a preconditioned Newton-Krylov solution strategy.
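For a scalar problem the exponential Euler method reduces to u_{n+1} = u_n + h φ1(hJ) g(t_n, u_n) with φ1(z) = (e^z − 1)/z, which makes the stability advantage easy to see without any Krylov machinery. The stiff test equation below is an assumption chosen so the exact solution is sin t; with hλ = −10, far outside the explicit Euler stability interval, the exponential integrator stays accurate while explicit Euler diverges.

```python
import math

# Stiff test problem u' = g(t, u) = lam*(u - sin t) + cos t, exact u = sin t
# for u(0) = 0; the Jacobian is simply J = lam in this scalar case.
lam = -1000.0

def g(t, u):
    return lam * (u - math.sin(t)) + math.cos(t)

def phi1(z):
    """phi_1(z) = (e^z - 1)/z, the kernel function of exponential Euler."""
    return 1.0 if z == 0 else (math.exp(z) - 1.0) / z

def exp_euler(h, steps):
    t, u = 0.0, 0.0
    for _ in range(steps):
        u += h * phi1(h * lam) * g(t, u)   # u_{n+1} = u_n + h*phi1(h*J)*g(t_n,u_n)
        t += h
    return u

def explicit_euler(h, steps):
    t, u = 0.0, 0.0
    for _ in range(steps):
        u += h * g(t, u)
        t += h
    return u

h, steps = 0.01, 100                      # h*lam = -10
u_exp = exp_euler(h, steps)
print(abs(u_exp - math.sin(1.0)))         # small error at t = 1
print(abs(explicit_euler(h, steps)))      # explicit Euler has blown up
```

For systems, h·J is a matrix and φ1(hJ)·v is exactly the matrix-function-times-vector product the paper approximates with Krylov subspace methods, using only J-times-vector products.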

Relevance:

20.00%

Publisher:

Abstract:

Client owners usually need an estimate or forecast of their likely building costs in advance of detailed design in order to confirm the financial feasibility of their projects. Because of their timing in the project life cycle, these early stage forecasts are characterized by the minimal amount of information available concerning the new (target) project to the point that often only its size and type are known. One approach is to use the mean contract sum of a sample, or base group, of previous projects of a similar type and size to the project for which the estimate is needed. Bernoulli’s law of large numbers implies that this base group should be as large as possible. However, increasing the size of the base group inevitably involves including projects that are less and less similar to the target project. Deciding on the optimal number of base group projects is known as the homogeneity or pooling problem. A method of solving the homogeneity problem is described involving the use of closed form equations to compare three different sampling arrangements of previous projects for their simulated forecasting ability by a cross-validation method, where a series of targets are extracted, with replacement, from the groups and compared with the mean value of the projects in the base groups. The procedure is then demonstrated with 450 Hong Kong projects (with different project types: Residential, Commercial centre, Car parking, Social community centre, School, Office, Hotel, Industrial, University and Hospital) clustered into base groups according to their type and size.
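The pooling trade-off can be simulated directly: for each candidate base-group size k, forecast every project as the mean contract sum of its k nearest-sized peers and score the forecasts by leave-one-out cross-validation. The cost model, noise level, and candidate group sizes below are invented; the paper's closed-form comparison of sampling arrangements is not reproduced.

```python
import random

random.seed(3)

# Hypothetical base group: (floor area in m^2, contract sum in $k), with cost
# roughly proportional to size plus noise (numbers are illustrative only).
areas = [random.uniform(500, 5000) for _ in range(60)]
projects = [(a, 1.8 * a + random.gauss(0, 400)) for a in areas]

def loo_rmse(k):
    """Cross-validate a base group of size k: forecast each target project as
    the mean contract sum of its k nearest-sized peers."""
    total = 0.0
    for i, (area, cost) in enumerate(projects):
        rest = [p for j, p in enumerate(projects) if j != i]
        rest.sort(key=lambda p: abs(p[0] - area))       # most similar first
        forecast = sum(c for _, c in rest[:k]) / k
        total += (forecast - cost) ** 2
    return (total / len(projects)) ** 0.5

errors = {k: loo_rmse(k) for k in (1, 5, 10, 20, 59)}
best_k = min(errors, key=errors.get)
print(best_k, round(errors[best_k]))
```

Tiny groups forecast from noise and the largest possible group averages over dissimilar projects, so the cross-validated error is minimised at an intermediate k, which is the homogeneity problem in miniature.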

Relevance:

20.00%

Publisher:

Abstract:

This study examined the effects of pre-cooling duration on performance and neuromuscular function during self-paced intermittent-sprint shuttle running in the heat. Eight male team-sport athletes completed two 35-min bouts of intermittent-sprint shuttle running separated by a 15-min recovery on three separate occasions (33°C, 34% relative humidity). Mixed-method pre-cooling was completed for 20 min (COOL20), 10 min (COOL10) or not at all (CONT), and was reapplied for 5 min mid-exercise. Performance was assessed via sprint times, percentage decline and shuttle-running distance covered. Maximal voluntary contractions (MVC), voluntary activation (VA) and evoked twitch properties were recorded pre- and post-intervention and mid- and post-exercise. Core temperature (Tc), skin temperature, heart rate, capillary blood metabolites, sweat losses, perceptual exertion and thermal stress were monitored throughout. Venous blood draws pre- and post-exercise were analyzed for muscle damage and inflammation markers. Shuttle-running distances covered increased 5.2 ± 3.3% following COOL20 (P < 0.05), with no differences observed between COOL10 and CONT (P > 0.05). COOL20 aided the maintenance of mid- and post-exercise MVC (P < 0.05; d > 0.80), despite no conditional differences in VA (P > 0.05). Pre-exercise Tc was reduced by 0.15 ± 0.13°C with COOL20 (P < 0.05; d > 1.10), and remained lower throughout both COOL20 and COOL10 compared to CONT (P < 0.05; d > 0.80). Pre-cooling reduced sweat losses by 0.4 ± 0.3 kg (P < 0.02; d > 1.15), with COOL20 reducing losses a further 0.2 ± 0.4 kg relative to COOL10 (P = 0.19; d = 1.01). Increased pre-cooling duration lowered physiological demands during exercise heat stress and facilitated the maintenance of self-paced intermittent-sprint performance in the heat. Importantly, the dose-response interaction of pre-cooling and sustained neuromuscular responses may explain the improved exercise performance in hot conditions.