937 results for subset consistency


Relevance:

10.00%

Publisher:

Abstract:

Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. In this review we neglect electromagnetic modelling of processes in the Earth's core and focus on four types of coupling that underpin fundamental instabilities in the Earth: thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth's surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, which span micrometres to tens of kilometres, compounded by the additional need to consider microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation).
Another challenge is the important factor of time, which implies that the geomaterial is often very far from initial yield and flowing on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. Unlike consistency plasticity, this framework allows the description of both solid-mechanical and fluid-dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, in which ductile compaction bands appear out of the fluid-dynamic solution; these are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales and are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while the centre of the thrust shows a mm-cm wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid-dynamic instability.


Ghrelin is a peptide hormone produced in the stomach and a range of other tissues, where it has endocrine, paracrine and autocrine roles in both normal and disease states. Ghrelin has been shown to be an important growth factor for a number of tumours, including prostate and breast cancers. In this study, we examined the expression of the ghrelin axis (ghrelin and its receptor, the growth hormone secretagogue receptor, GHSR) in endometrial cancer. Ghrelin is expressed in a range of endometrial cancer tissues, while its cognate receptor, GHSR1a, is expressed in a small subset of normal and cancer tissues. Low to moderately invasive endometrial cancer cell lines were examined by RT-PCR and immunoblotting, demonstrating that ghrelin axis mRNA and protein expression correlate with differentiation status of Ishikawa, HEC1B and KLE endometrial cancer cell lines. Moreover, treatment with ghrelin potently stimulated cell proliferation and inhibited cell death. Taken together, these data indicate that ghrelin promotes the progression of endometrial cancer cells in vitro, and may contribute to endometrial cancer pathogenesis and represent a novel treatment target.


Several researchers have reported that cultural and language differences can affect online interactions and communication between students from different cultural backgrounds. Other researchers have asserted that online learning is a tool that can improve teaching and learning skills, but that its effectiveness depends on how the tool is used. To delve into these aspects further, this study set out to investigate the kinds of learning difficulties encountered by international students and how they actually coped with online learning. The modified Online Learning Environment Survey (OLES) instrument was used to collect data from a sample of 109 international students at a university in Brisbane. A smaller group of 35 domestic students was also included for comparison purposes. Contrary to assumptions from previous research, the findings revealed only a few differences between the international Asian and Australian students with regard to their perceptions of online learning. Recommendations based on the findings of this study were made for Australian universities where Asian international students study online. Specifically, the recommendations highlighted the importance of upskilling lecturers' ability to structure their teaching online and to apply strong theoretical underpinnings when designing learning activities such as discussion forums, and of the university establishing a degree of consistency in how content is located and displayed in a learning management system such as Blackboard.


Motivated by the need for private set operations in a distributed environment, we extend the two-party private matching problem proposed by Freedman, Nissim and Pinkas (FNP) at Eurocrypt '04 to the distributed setting. Using a secret sharing scheme, we provide a distributed solution to FNP private matching, called distributed private matching. In our scheme, we use a polynomial to represent one party's dataset, as in FNP, and then distribute the polynomial to multiple servers. We extend our solution to distributed set intersection and the cardinality of the intersection, and further show how to apply distributed private matching to compute the distributed subset relation. Our work extends the private matching and set intersection primitives of Freedman et al. Our distributed construction may be of great value when the dataset is outsourced and its privacy is the main concern: our distributed solutions keep the utility of these set operations without compromising dataset privacy. Compared with previous work, we achieve a more efficient solution in terms of computation. All protocols constructed in this paper are provably secure against a semi-honest adversary under the Decisional Diffie-Hellman assumption.
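
The polynomial encoding at the heart of FNP-style matching can be sketched in the clear: a dataset is represented by the polynomial whose roots are its elements, so evaluating the polynomial at a query element yields zero exactly when the element is in the set. This toy omits the encryption and secret sharing that make the protocol private; it only illustrates the representation.

```python
def set_to_poly(dataset):
    """Coefficients (lowest degree first) of prod(x - s) for s in dataset."""
    coeffs = [1]
    for s in dataset:
        shifted = [0] + coeffs                    # x * c(x)
        scaled = [s * c for c in coeffs] + [0]    # s * c(x), padded to same length
        coeffs = [a - b for a, b in zip(shifted, scaled)]
    return coeffs

def eval_poly(coeffs, y):
    """Horner evaluation; zero iff y is a root, i.e. y is in the dataset."""
    acc = 0
    for c in reversed(coeffs):
        acc = acc * y + c
    return acc

# {2, 5} becomes (x - 2)(x - 5) = x^2 - 7x + 10
coeffs = set_to_poly([2, 5])
```

In the actual protocol the coefficients are encrypted (and, in the distributed variant, secret-shared across servers), so a server can evaluate the polynomial on a query without learning the dataset.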


Analysis of behavioural consistency is an important aspect of software engineering. In process and service management, consistency verification of behavioural models has manifold applications. For instance, a business process model used as a system specification and a corresponding workflow model used as an implementation have to be consistent. Another example is analysing to what degree a process log of executed business operations is consistent with the corresponding normative process model. Typically, existing notions of behavioural equivalence, such as bisimulation and trace equivalence, are applied as consistency notions. However, these notions are exponential to compute and yield only a Boolean result. In many cases a quantification of behavioural deviation is needed, along with concepts to isolate the source of deviation. In this article, we propose causal behavioural profiles as the basis for a consistency notion. These profiles capture essential behavioural information, such as order, exclusiveness, and causality between pairs of activities of a process model. Consistency based on these profiles is weaker than trace equivalence, but can be computed efficiently for a broad class of models. We introduce techniques for the computation of causal behavioural profiles using structural decomposition techniques for sound free-choice workflow systems whose unstructured net fragments are acyclic or can be traced back to S- or T-nets. We also elaborate on the findings of applying our technique to three industry model collections.
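
The pairwise relations such a profile captures can be illustrated by deriving them naively from a trace log. This toy classifies each ordered pair of activities as strict order, exclusiveness, or interleaving; it does not implement the article's structural decomposition over workflow nets, and it omits the causality relation.

```python
from itertools import product

def profile(traces):
    """Classify activity pairs from a trace log as 'order', 'exclusive',
    or 'interleaving' (log-based toy, not a net-based computation)."""
    acts = sorted({a for t in traces for a in t})
    before = set()       # (a, b): a occurs before b in some trace
    together = set()     # {a, b}: a and b co-occur in some trace
    for t in traces:
        for i in range(len(t)):
            for j in range(i + 1, len(t)):
                before.add((t[i], t[j]))
                together.add(frozenset((t[i], t[j])))
    rel = {}
    for a, b in product(acts, acts):
        if a == b:
            continue
        if frozenset((a, b)) not in together:
            rel[(a, b)] = 'exclusive'        # never in the same trace
        elif (a, b) in before and (b, a) not in before:
            rel[(a, b)] = 'order'            # a always precedes b
        else:
            rel[(a, b)] = 'interleaving'     # both orders observed
    return rel
```

A quantified consistency notion can then compare two such profiles pair by pair instead of returning a single Boolean verdict.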


Identification of behavioural contradictions is an important aspect of software engineering, in particular for checking the consistency between a business process model used as a system specification and a corresponding workflow model used as an implementation. In this paper, we propose causal behavioural profiles, which capture essential behavioural information such as order, exclusiveness, and causality between pairs of activities, as the basis for a consistency notion. Existing notions of behavioural equivalence, such as bisimulation and trace equivalence, might also be applied as consistency notions, but they are exponential to compute. Our novel concept of causal behavioural profiles provides a weaker behavioural consistency notion that can be computed efficiently using structural decomposition techniques for sound free-choice workflow systems whose unstructured net fragments are acyclic or can be traced back to S- or T-nets.


We study two problems of online learning under restricted information access. In the first problem, prediction with limited advice, we consider a game of prediction with expert advice in which on each round we query the advice of a subset of M out of N experts. We present an algorithm that achieves O(√((N/M) T ln N)) regret on T rounds of this game. The second problem, the multiarmed bandit with paid observations, is a variant of the adversarial N-armed bandit game in which on round t we can observe the reward of any number of arms, but each observation has a cost c. We present an algorithm that achieves O((cN ln N)^(1/3) T^(2/3) + √(T ln N)) regret on T rounds of this game in the worst case. Furthermore, we present a number of refinements that treat arm- and time-dependent observation costs and achieve lower regret under benign conditions. We present lower bounds showing that, apart from logarithmic factors, the worst-case regret bounds cannot be improved.
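
One natural way to handle restricted feedback of this kind is exponential weights driven by importance-weighted loss estimates from a uniformly sampled subset of M experts (each expert is observed with probability M/N, so dividing by M/N keeps the estimate unbiased). The sketch below illustrates that idea under our own simplifications; it is not claimed to be the paper's exact algorithm or to attain its regret bound.

```python
import math
import random

def limited_feedback_hedge(losses, M, eta, seed=0):
    """Exponential weights over N experts where only the losses of a
    uniformly sampled M-subset are observed each round.

    losses: list of rounds, each a list of N losses in [0, 1].
    Returns the cumulative expected loss of the sampling distribution.
    """
    rng = random.Random(seed)
    N = len(losses[0])
    logw = [0.0] * N
    expected_loss = 0.0
    for loss_t in losses:
        m = max(logw)                                  # stabilise the exponentials
        w = [math.exp(x - m) for x in logw]
        total = sum(w)
        p = [x / total for x in w]
        expected_loss += sum(pi * li for pi, li in zip(p, loss_t))
        for i in rng.sample(range(N), M):              # observe M experts
            logw[i] -= eta * loss_t[i] * N / M         # unbiased: P(observe i) = M/N
    return expected_loss
```

With one consistently good expert the distribution concentrates on it despite observing only M losses per round, at the price of higher-variance estimates than full information.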


Background Lumbar epidural steroid injections (ESIs) have previously been shown to provide some degree of pain relief in sciatica. The number needed to treat (NNT) to achieve 50% pain relief has been estimated at 7 from the results of randomised controlled trials. Pain relief is temporary. ESIs remain one of the most commonly provided procedures in the UK. It is unknown whether this pain relief represents good value for money. Methods 228 patients were randomised into a multi-centre double-blind randomised controlled trial. Subjects received up to 3 ESIs or intra-spinous saline, depending on response to, and fall-off of, the first injection. All other treatments were permitted. All received a review of analgesia, education and physical therapy. Quality of life was assessed using the SF-36 at 6 time points and compared using independent-sample t-tests. Follow-up was up to 1 year. Missing data were imputed using last observation carried forward (LOCF). QALYs (quality-adjusted life years) were derived from preference-based health values (summary health utility scores); the SF-6D health state classification was derived from SF-36 raw score data. Standard gambles (SG) were calculated using Model 10. SG scores were calculated from trial results; LOCF was not used for this. Instead, average SG scores were derived for a subset of patients with observations for all visits up to week 12. Incremental QALYs were derived as the difference in area between the SG curves of the active and placebo groups. Results SF-36 domains showed a significant improvement in pain at week 3, but this was not sustained (mean 54 active vs 61 placebo, P<0.05). Other domains did not show any significant gains compared with placebo. For derivation of SG, the number in the sample in each period differed. In week 12, average SG scores for active and placebo converged; in other words, the health gain for the active group as measured by SG was achieved by the placebo group by week 12.
The incremental QALY gained for a patient under the trial protocol compared with the standard care package was 0.0059350, equivalent to an additional 2.2 days of full health. The cost per QALY gained to the provider from a patient management strategy administering one epidural, as suggested by the results, was £25 745.68. This result was derived assuming that the QALY gain calculated for patients under the trial protocol would approximate that under a patient management strategy based on the trial results (one ESI). This is above the threshold suggested by some for a cost-effective treatment. Conclusions The transient benefit in pain relief afforded by ESIs does not appear to be cost-effective. Further work is needed to develop more cost-effective conservative treatments for sciatica.
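
The area-between-curves derivation of incremental QALYs can be illustrated with the trapezoidal rule over utility scores; the utility values below are hypothetical, not the trial's data.

```python
def auc(weeks, utilities):
    """Area under a utility curve by the trapezoidal rule, in QALYs
    (times given in weeks, converted to years)."""
    area = 0.0
    for i in range(len(weeks) - 1):
        area += (utilities[i] + utilities[i + 1]) / 2 \
                * (weeks[i + 1] - weeks[i]) / 52.18    # weeks -> years
    return area

# hypothetical SG utilities at weeks 0, 3, 6 and 12 for the two arms,
# converging by week 12 as the abstract describes
weeks = [0, 3, 6, 12]
active = [0.60, 0.68, 0.66, 0.65]
placebo = [0.60, 0.63, 0.64, 0.65]

incremental_qaly = auc(weeks, active) - auc(weeks, placebo)
days_of_full_health = incremental_qaly * 365.25
```

A tiny QALY difference (here a few thousandths) translates into only a day or two of full health, which is why the cost per QALY can be high even when the per-injection cost is modest.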


The Child Feeding Questionnaire (CFQ) developed by Birch and colleagues (2001) is a widely used tool for measuring parental feeding beliefs, attitudes and practices. However, the appropriateness of the CFQ for use with Chinese populations is unknown. This study tested the construct validity of a novel Chinese version of the CFQ using confirmatory factor analysis (CFA). Participants included a convenience sample of 254 Chinese-Australian mothers of children aged 1-4 years. Prior to testing, the questionnaire was translated into Chinese using a translation-back-translation method, one item was re-worded to be culturally appropriate, a new item was added (monitoring), and five items that were not age-appropriate for the sample were removed. Based on previous literature, both a 7-factor and an 8-factor model were assessed via CFA. Results showed that the 8-factor model, which separated restriction and use of food rewards, improved the conceptual clarity of the constructs and provided a good fit to the data. Internal consistency of all eight factors was acceptable (Cronbach’s α: .60−.93). This modified 8-factor CFQ appears to be a linguistically and culturally appropriate instrument for assessing feeding beliefs and practices in Chinese-Australian mothers of young children.
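
The internal-consistency statistic reported here, Cronbach's α, can be computed directly from item-level scores: α = k/(k−1) · (1 − Σ var(item) / var(total)). The sketch below uses toy data, not the study's responses.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns
    (one list per item, respondents in the same order)."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        # population variance; alpha is unchanged as long as the
        # same divisor is used for items and totals
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))
```

Perfectly redundant items give α = 1; the .60-.93 range reported for the eight CFQ factors reflects factors whose items agree to varying degrees.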


Background Quality of life (QOL) measures are an important patient-relevant outcome measure for clinical studies. Currently there is no fully validated cough-specific QOL measure for paediatrics. The objective of this study was to validate a cough-specific QOL questionnaire for paediatric use. Method 43 children (28 males, 15 females; median age 29 months, IQR 20–41 months) newly referred for chronic cough participated. One parent of each child completed the 27-item Parent Cough-Specific QOL questionnaire (PC-QOL), and the generic child (Pediatric QOL Inventory 4.0 (PedsQL)) and parent QOL questionnaires (SF-12) and two cough-related measures (visual analogue score and verbal category descriptive score) on two occasions separated by 2–3 weeks. Cough counts were also objectively measured on both occasions. Results Internal consistency for both the domains and total PC-QOL at both test times was excellent (Cronbach alpha range 0.70–0.97). Evidence for repeatability and criterion validity was established, with significant correlations over time and significant relationships with the cough measures. The PC-QOL was sensitive to change across the test times and these changes were significantly related to changes in cough measures (PC-QOL with: verbal category descriptive score, rs=−0.37, p=0.016; visual analogue score, rs=−0.47, p=0.003). Significant correlations of the difference scores for the social domain of the PC-QOL and the domain and total scores of the PedsQL were also noted (rs=0.46, p=0.034). Conclusion The PC-QOL is a reliable and valid outcome measure that assesses QOL related to childhood cough at a given time point and measures changes in cough-specific QOL over time.


Introduction The dose to the skin surface is an important factor for many radiotherapy treatment techniques. It is known that TPS-predicted surface doses can differ significantly from actual ICRP skin doses as defined at 70 μm depth. A number of methods have been implemented for the accurate determination of surface dose, including the use of specific dosimeters such as TLDs and radiochromic film, as well as Monte Carlo calculations. Stereotactic radiosurgery involves delivering very high doses per treatment fraction using small X-ray fields. To date, there has been limited data on surface doses for these very small field sizes. The purpose of this work is to evaluate surface doses by both measurement and Monte Carlo calculation for very small field sizes. Methods All measurements were performed on a Novalis Tx linear accelerator, which has a 6 MV SRS X-ray beam mode using a specially thinned flattening filter. Beam collimation was achieved by circular cones with apertures that gave field sizes ranging from 4 to 30 mm at the isocentre. Relative surface doses were measured using Gafchromic EBT3 film, which has its active layer at a depth similar to the ICRP skin dose depth. Monte Carlo calculations were performed using the BEAMnrc/EGSnrc Monte Carlo codes (V4 r225). The specifications of the linear accelerator, including the collimator, were provided by the manufacturer. Optimisation of the incident X-ray beam was achieved by iterative adjustment of the energy, spatial distribution and radial spread of the incident electron beam striking the target. The energy cutoff parameters were PCUT = 0.01 MeV and ECUT = 0.700 MeV. Directional bremsstrahlung splitting was switched on for all BEAMnrc calculations. Relative surface doses were determined in a layer defined in a water phantom with the same thickness and depth as the active layer in the film.
Results Measured surface doses using the EBT3 film varied between 13% and 16% for the different cones, with an uncertainty of 3%. Monte Carlo calculated surface doses agreed with the measured doses to better than 2% for all treatment cones. Discussion and conclusions This work has shown the consistency of surface dose measurements using EBT3 film with Monte Carlo predicted values, within the uncertainty of the measurements. As such, EBT3 film is recommended for in vivo surface dose measurements.


Introduction The consistency of small-field output factor measurements is greatly increased by reporting the measured dosimetric field size for each factor, as opposed to simply stating the nominal field size [1]; this, however, requires the measurement of cross-axis profiles in a water tank, making output factor measurements time consuming. This project establishes the field size above which the accuracy of output factors is not affected by the use of potentially inaccurate nominal field sizes, which we believe establishes a practical working definition of a 'small' field. The physical components of the radiation beam that contribute to the rapid change in output factor at small field sizes are examined in detail. The physical interaction that dominates the rapid dose reduction is quantified, leading to a theoretical definition of a 'small' field. Methods Current recommendations suggest that radiation collimation systems and isocentre-defining lasers should both be calibrated to permit a maximum positioning uncertainty of 1 mm [2]. The proposed practical definition of a small field is as follows: if the output factor changes by ±1.0% given a change in either field size or detector position of up to ±1 mm, then the field should be considered small. Monte Carlo modelling was used to simulate output factors of a 6 MV photon beam for square fields with side lengths from 4.0 to 20.0 mm in 1.0 mm increments. The dose was scored in a 0.5 mm wide and 2.0 mm deep cylindrical volume of water within a cubic water phantom, at a depth of 5 cm and an SSD of 95 cm. The maximum difference due to a collimator error of ±1 mm was found by comparing the output factors of adjacent field sizes. The output factor simulations were repeated 1 mm off-axis to quantify the effect of detector misalignment. Further simulations separated the total output factor into a collimator scatter factor and a phantom scatter factor.
The collimator scatter factor was further separated into primary source occlusion effects and 'traditional' effects (a combination of flattening filter and jaw scatter, etc.). The phantom scatter was separated into photon scatter and electronic disequilibrium. Each of these factors was plotted as a function of field size to quantify how each contributed to the change at small field sizes. Results The use of our practical definition resulted in field sizes of 15 mm or less being characterised as 'small'. The change in field size had a greater effect than detector misalignment. For field sizes of 12 mm or less, electronic disequilibrium was found to cause the largest change in dose on the central axis (d = 5 cm). Source occlusion also caused a large change in output factor for field sizes less than 8 mm. Discussion and conclusions The measurement of cross-axis profiles is only required for output factor measurements at field sizes of 15 mm or less (for a 6 MV beam on a Varian iX linear accelerator). This is expected to depend on linear accelerator spot size and photon energy. While some electronic disequilibrium was shown to occur at field sizes as large as 30 mm (the 'traditional' definition of a small field [3]), it does not cause a greater change than photon scatter until the field size falls to 12 mm, at which point it becomes by far the most dominant effect.
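
The practical definition can be applied mechanically to a table of output factors at 1 mm field-size increments: flag a field size as 'small' when a ±1 mm neighbour shifts the output factor by more than 1.0%. The values below are illustrative, not the simulated data.

```python
def flag_small_fields(field_sizes_mm, output_factors, tolerance=0.01):
    """Flag field sizes where a +/-1 mm field-size change shifts the
    output factor by more than `tolerance` (default 1.0%).
    Assumes field sizes are tabulated in 1 mm steps."""
    small = set()
    for i in range(len(field_sizes_mm)):
        for j in (i - 1, i + 1):                       # +/-1 mm neighbours
            if 0 <= j < len(output_factors):
                rel = abs(output_factors[j] - output_factors[i]) / output_factors[i]
                if rel > tolerance:
                    small.add(field_sizes_mm[i])
    return sorted(small)

# illustrative output factors: steep below ~15 mm, flat above
sizes = [13, 14, 15, 16, 17]
ofs = [0.80, 0.83, 0.855, 0.86, 0.864]
```

With these illustrative numbers the 13-15 mm fields are flagged, mirroring the 15 mm threshold the simulations produced.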


As high-throughput genetic marker screening systems are essential for a range of genetics studies and plant breeding applications, the International RosBREED SNP Consortium (IRSC) has utilized the Illumina Infinium® II system to develop a medium- to high-throughput SNP screening tool for genome-wide evaluation of allelic variation in apple (Malus×domestica) breeding germplasm. For genome-wide SNP discovery, 27 apple cultivars were chosen to represent worldwide breeding germplasm and re-sequenced at low coverage with the Illumina Genome Analyzer II. Following alignment of these sequences to the whole genome sequence of 'Golden Delicious', SNPs were identified using SoapSNP. A total of 2,113,120 SNPs were detected, corresponding to one SNP to every 288 bp of the genome. The Illumina GoldenGate® assay was then used to validate a subset of 144 SNPs with a range of characteristics, using a set of 160 apple accessions. This validation assay enabled fine-tuning of the final subset of SNPs for the Illumina Infinium® II system. The set of stringent filtering criteria developed allowed choice of a set of SNPs that not only exhibited an even distribution across the apple genome and a range of minor allele frequencies to ensure utility across germplasm, but also were located in putative exonic regions to maximize genotyping success rate. A total of 7867 apple SNPs was established for the IRSC apple 8K SNP array v1, of which 5554 were polymorphic after evaluation in segregating families and a germplasm collection. This publicly available genomics resource will provide an unprecedented resolution of SNP haplotypes, which will enable marker-locus-trait association discovery, description of the genetic architecture of quantitative traits, investigation of genetic variation (neutral and functional), and genomic selection in apple.


Aim. This paper reports the development and validation of a new job performance scale based on an established job performance model. Background. Previous measures of nursing quality are atheoretical and fail to incorporate the complete range of behaviours performed. Thus, an up-to-date measure of job performance is required for assessing nursing quality. Methods. Test construction involved systematic generation of test items using focus groups, a literature review, and an expert review of test items. A pilot study was conducted to determine the multidimensional nature of the taxonomy and its psychometric properties. All data were collected in 2005. Findings. The final version of the nursing performance taxonomy included 41 behaviours across eight dimensions of job performance. Results from preliminary psychometric investigations suggest that the nursing performance scale has good internal consistency, good convergent validity and good criterion validity. Conclusion. The findings give preliminary support for the new job performance scale as a reliable and valid tool for assessing nursing quality. However, further research using a larger sample and nurses from a broader geographical region is required to cross-validate the measure. This scale may be used to guide hospital managers regarding the quality of nursing care within units and to guide future research in the area.


Smartphone technology provides free or inexpensive access to mental health and wellbeing resources. As a result, the use of mobile applications for these purposes has increased significantly in recent years. Yet there is currently no app quality assessment alternative to the popular 'star' ratings, which are often unreliable. This presentation describes the development of the Mobile Application Rating Scale (MARS), a new measure for classifying and rating the quality of mobile applications. A review of existing literature on app and web quality identified 25 published papers, conference proceedings, and online resources (published since 1999), which yielded 372 explicit quality criteria. Qualitative analysis identified five broad categories of app quality rating criteria: engagement, functionality, aesthetics, information quality, and overall satisfaction, which were refined into the 23-item MARS. Independent ratings of 50 randomly selected mental health and wellbeing mobile apps indicated that the MARS had excellent levels of internal consistency (α = 0.92) and inter-rater reliability (ICC = 0.85). The MARS provides practitioners and researchers with an easy-to-use, simple, objective and reliable tool for assessing mobile app quality. It also provides mHealth professionals with a checklist for the design and development of high-quality apps.