96 results for General approach
Abstract:
BACKGROUND CONTEXT: The Neck Disability Index is frequently used to measure neck outcomes. The statistical rigor of the Neck Disability Index has been assessed with conflicting outcomes. To date, Confirmatory Factor Analysis of the Neck Disability Index has not been reported for a suitably large population study. Because the Neck Disability Index is not a condition-specific measure of neck function, initial Confirmatory Factor Analysis should consider problematic neck patients as a homogeneous group. PURPOSE: We sought to analyze the factor structure of the Neck Disability Index through Confirmatory Factor Analysis in a symptomatic, homogeneous neck population, with respect to the pooled population and sex subgroups. STUDY DESIGN: This was a secondary analysis of pooled data. PATIENT SAMPLE: A total of 1,278 symptomatic neck patients (67.5% female, median age 41 years): 803 with nonspecific neck pain and 475 with whiplash-associated disorder. OUTCOME MEASURES: The Neck Disability Index was used to measure outcomes. METHODS: We analyzed pooled baseline data from six independent studies of patients with neck problems who completed Neck Disability Index questionnaires at baseline. The Confirmatory Factor Analysis was considered in three scenarios: the full sample, males only, and females only. Models were compared empirically for best fit. RESULTS: Two-factor models have good psychometric properties across both the pooled sample and the sex subgroups. However, according to these analyses, the one-factor solution is preferable on grounds of both statistical fit and parsimony. The two-factor model was close to significant for the male subgroup (p<.07), where questions separated into constructs of mental function (pain, reading, headaches, and concentration) and physical function (personal care, lifting, work, driving, sleep, and recreation).
CONCLUSIONS: The Neck Disability Index demonstrated a one-factor structure when analyzed by Confirmatory Factor Analysis in a pooled, homogeneous sample of patients with neck problems. However, a two-factor model approached significance for male subjects, where questions separated into constructs of mental and physical function. Further investigations in different conditions and in subgroup- and sex-specific populations are warranted.
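Short of a full CFA, the one-factor question can be probed by the eigenvalue spectrum of the item correlation matrix: a dominant first eigenvalue is consistent with unidimensionality. The sketch below uses a synthetic 10-item correlation matrix implied by a single common factor, not the study's data.

```python
import numpy as np

# Hypothetical illustration: the NDI has 10 items. Build a synthetic
# correlation matrix consistent with a single common factor (loading 0.7
# on every item), then inspect its eigenvalues. These are NOT the study's
# data, just a sketch of the unidimensionality check.
n_items = 10
loading = 0.7
# A one-factor model implies corr(i, j) = loading * loading for i != j.
R = np.full((n_items, n_items), loading * loading)
np.fill_diagonal(R, 1.0)

eigenvalues = np.sort(np.linalg.eigvalsh(R))[::-1]
first_share = eigenvalues[0] / eigenvalues.sum()
print(f"first eigenvalue explains {first_share:.0%} of total variance")
```

With these synthetic loadings the first eigenvalue carries just over half the total variance and the remaining nine are equal, the signature of a one-factor structure.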
Abstract:
For most people, speech production is relatively effortless and error-free. Yet it has long been recognized that we need some type of control over what we are currently saying and what we plan to say. Precisely how we monitor our internal and external speech has been a topic of research interest for several decades. The predominant approach in psycholinguistics has assumed monitoring of both is accomplished via systems responsible for comprehending others' speech. This special topic aimed to broaden the field, firstly by examining proposals that speech production might also engage more general systems, such as those involved in action monitoring. A second aim was to examine proposals for a production-specific, internal monitor. Both aims require that we also specify the nature of the representations subject to monitoring.
Abstract:
The microbially mediated production of nitrous oxide (N2O) and its reduction to dinitrogen (N2) via denitrification represent a loss of nitrogen (N) from fertilised agro-ecosystems to the atmosphere. Although denitrification has received considerable interest from biogeochemists in recent decades, the magnitude of N2 losses and the related N2:N2O ratios from soils are still largely unknown due to methodological constraints. We present a novel 15N tracer approach, based on a previously developed tracer method for studying denitrification in pure bacterial cultures, modified for use on soil incubations in a completely automated laboratory setup. In this method, the background air in the incubation vessels is replaced with a helium-oxygen gas mixture with a 50-fold reduced N2 background (2% v/v). This allows for direct and sensitive quantification of N2 and N2O emissions from the soil by isotope-ratio mass spectrometry after 15N labelling of denitrification N substrates, while minimising sensitivity to the intrusion of atmospheric N2. The incubation setup was used to determine the influence of different soil moisture levels on N2 and N2O emissions from a sub-tropical pasture soil in Queensland, Australia. The soil was labelled with an equivalent of 50 μg N per gram dry soil by broadcast application of KNO3 solution (4 at.% 15N) and incubated for 3 days at 80% and 100% water-filled pore space (WFPS), respectively. The headspace of the incubation vessel was sampled automatically over 12 hrs each day, and 3 samples (0, 6, and 12 hrs after incubation start) of headspace gas were analysed for N2 and N2O with an isotope-ratio mass spectrometer (DELTA V Plus, Thermo Fisher Scientific, Bremen, Germany). In addition, the soil was analysed for 15N NO3- and NH4+ using the 15N diffusion method, which enabled us to obtain a complete N balance.
The method proved to be highly sensitive for N2 and N2O emissions, detecting N2O emissions ranging from 20 to 627 μg N kg-1 soil hr-1 and N2 emissions ranging from 4.2 to 43 μg N kg-1 soil hr-1 for the different treatments. The main end-product of denitrification was N2O for both water contents, with N2 accounting for 9% and 13% of the total denitrification losses at 80% and 100% WFPS, respectively. Between 95% and 100% of the added 15N fertiliser could be recovered. Gross nitrification over the 3 days amounted to 8.6 μg N g-1 soil and 4.7 μg N g-1 soil, and denitrification to 4.1 μg N g-1 soil and 11.8 μg N g-1 soil at 80% and 100% WFPS, respectively. The results confirm that the tested method allows for direct and highly sensitive detection of N2 and N2O fluxes from soils and hence offers a sensitive tool to study denitrification and N turnover in terrestrial agro-ecosystems.
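The N2 share of total denitrification loss follows directly from the paired fluxes. A small worked check, using hypothetical hourly fluxes chosen within the ranges reported above (not actual treatment means):

```python
def n2_share(n2_flux, n2o_flux):
    """Fraction of the total denitrification loss (N2 + N2O) emitted as N2."""
    return n2_flux / (n2_flux + n2o_flux)

# Hypothetical hourly fluxes within the reported ranges
# (N2O: 20-627, N2: 4.2-43 ug N kg-1 soil hr-1); not treatment means.
n2o = 300.0
n2 = 43.0
share = n2_share(n2, n2o)
ratio = n2 / n2o
print(f"N2 share of denitrification loss: {share:.1%}, N2:N2O ratio: {ratio:.3f}")
```

For these assumed fluxes the N2 share comes out near the 9-13% range the incubations found, which is how the "N2O as main end-product" conclusion is read off the data.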
Abstract:
Mechanical flexibility is considered an asset in consumer electronics and next-generation electronic systems. Printed and flexible electronic devices could be embedded into clothing or other surfaces at home or in the office, or into many products such as low-cost sensors integrated in transparent and flexible surfaces. In this context, inks based on graphene and related two-dimensional materials (2DMs) are gaining increasing attention owing to their exceptional (opto)electronic, electrochemical and mechanical properties. The current limitation lies in the solvents used to provide stable dispersions of graphene and 2DMs that also fit the fluidic requirements for printing: in general, these are not environmentally benign and have high boiling points. Non-toxic, low-boiling-point solvents do not possess the rheological properties (i.e., surface tension, viscosity and density) required for the solution processing of graphene and 2DMs. Such solvents (e.g., water, alcohols) require the addition of stabilizing agents such as polymers or surfactants to disperse graphene and 2DMs, which, however, unavoidably degrade their properties, preventing their use for the target application. Here, we demonstrate a viable strategy to tune the fluidic properties of water/ethanol mixtures (low-boiling-point solvents) to first effectively exfoliate graphite and then disperse graphene flakes to formulate graphene-based inks. We demonstrate that such inks can be used to print conductive stripes (sheet resistance of ~13 kΩ/□) on flexible substrates (polyethylene terephthalate), moving a step forward towards the realization of graphene-based printed electronic devices.
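The quoted sheet resistance translates to the resistance of a printed line via the standard relation R = R_sheet x (length / width). The stripe geometry below is assumed purely for illustration and does not come from the paper.

```python
def stripe_resistance(sheet_res_ohm_sq, length_mm, width_mm):
    """Resistance of a printed stripe: R = R_sheet * (length / width)."""
    return sheet_res_ohm_sq * (length_mm / width_mm)

# Assumed geometry for illustration only: a 20 mm x 1 mm printed line
# with the abstract's ~13 kOhm per square sheet resistance.
r = stripe_resistance(13e3, length_mm=20.0, width_mm=1.0)
print(f"stripe resistance ~ {r / 1e3:.0f} kOhm")
```

Because sheet resistance is per square, only the aspect ratio of the stripe matters, not its absolute size at fixed ink thickness.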
Abstract:
We consider the analysis of longitudinal data when the covariance function is modeled by parameters additional to the mean parameters. In general, inconsistent estimators of the covariance (variance/correlation) parameters are produced when the "working" correlation matrix is misspecified, which may result in a great loss of efficiency for the mean parameter estimators (although their consistency is preserved). We consider using different "working" correlation models for the variance and the mean parameters. In particular, we find that an independence working model should be used for estimating the variance parameters to ensure their consistency in case the correlation structure is misspecified. The designated "working" correlation matrices should be used for estimating the mean and the correlation parameters to attain high efficiency in estimating the mean parameters. Simulation studies indicate that the proposed algorithm performs very well. We also apply the different estimation procedures to a data set from a clinical trial for illustration.
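The consistency point (mean parameters remain consistently estimated even when within-subject correlation is ignored) can be shown with a toy simulation. This is a sketch of the general principle behind working-independence estimation, not the paper's algorithm; all parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate longitudinal data: n subjects, m repeated measures each, with
# exchangeable within-subject correlation induced by a subject effect.
# A working-independence fit (plain least squares pooling all
# observations) should still recover the mean parameters.
n, m = 2000, 4
beta0, beta1 = 1.0, 2.0
x = rng.normal(size=(n, m))
subject_effect = rng.normal(scale=1.0, size=(n, 1))  # shared within subject
noise = rng.normal(scale=1.0, size=(n, m))
y = beta0 + beta1 * x + subject_effect + noise

X = np.column_stack([np.ones(n * m), x.ravel()])
beta_hat, *_ = np.linalg.lstsq(X, y.ravel(), rcond=None)
print(f"estimates: {beta_hat[0]:.3f}, {beta_hat[1]:.3f} (truth: 1, 2)")
```

The estimates land close to the truth despite the misspecified (ignored) correlation; what the working-independence fit sacrifices is efficiency, which is exactly the trade-off the abstract addresses.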
Abstract:
Quasi-likelihood (QL) methods are often used to account for overdispersion in categorical data. This paper proposes a new way of constructing a QL function that stems from the conditional mean-variance relationship. Unlike traditional QL approaches to categorical data, this QL function is, in general, not a scaled version of the ordinary log-likelihood function. A simulation study is carried out to examine the performance of the proposed QL method. Fish mortality data from quantal response experiments are used for illustration.
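Overdispersion in quantal-response data is commonly quantified with the Pearson-based moment estimate of the dispersion factor; this baseline (which the paper's new QL construction generalises, and which is not the paper's method) can be sketched directly. The mortality counts below are invented.

```python
import numpy as np

# Standard moment estimate of overdispersion for grouped binomial data.
# phi > 1 signals overdispersion relative to the binomial model.
deaths = np.array([28, 30, 35, 42, 19, 25])  # hypothetical quantal counts
n_fish = np.array([50, 50, 50, 50, 50, 50])  # fish per tank
p_hat = deaths.sum() / n_fish.sum()          # pooled mortality estimate

pearson = ((deaths - n_fish * p_hat) ** 2
           / (n_fish * p_hat * (1 - p_hat))).sum()
phi = pearson / (len(deaths) - 1)            # dispersion estimate
print(f"estimated dispersion phi = {phi:.2f}")
```

A phi well above 1, as here, indicates more tank-to-tank variation than a common binomial mortality rate can explain, which is the situation QL methods are designed to handle.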
Abstract:
One hundred and seven children with faecal incontinence were evaluated and managed over a 3 year period by a multidisciplinary team. After initial clinical assessment, evaluation of defaecatory mechanisms (using a balloon model) and assessment of personal-social development and self-concept were undertaken. Management was based on initial bowel evacuation, short-term laxatives, and habit training involving systematic use of positive reinforcement; 69 children received biofeedback conditioning. Idiopathic megacolon with constipation and soiling was the most common finding (98 cases). Other diagnoses included previously undiagnosed neurogenic bowel (three cases), post-surgical anal anomalies (four cases), and psychogenic encopresis (two cases). Idiopathic megacolon was characterized by decreased rectal sensation, increased threshold for external sphincter relaxation and an inability to evacuate. Faecal incontinence was associated with an undesirably low social self-concept (70% of the 40 evaluated), but was not related to a delay in development (mean general developmental quotient = 105 ± 8, for the 35 tested). Family psychopathology warranting referral for family therapy was found in 14 children (13%). The management programme yielded a short-term (3 months) cure rate of 68% and a long-term (12 months) cure rate of 90%, with 10% having continued soiling which varied from occasional to several incidents/week. No significant improvement in self-concept was observed overall, although marked improvements were observed in some children. We conclude that disordered defaecatory dynamics are a major determinant of faecal incontinence in children. Undesirably low social self-concepts but normal developmental ability accompany this condition. Management is facilitated by a multidisciplinary approach, acknowledging the role of both behavioural and physiological components of the problem. 
This approach is effective in eradicating soiling in the majority of cases, comparing favourably with other published data.
Abstract:
Mathematical models describing the movement of multiple interacting subpopulations are relevant to many biological and ecological processes. Standard mean-field partial differential equation descriptions of these processes suffer from the limitation that they implicitly neglect to incorporate the impact of spatial correlations and clustering. To overcome this, we derive a moment dynamics description of a discrete stochastic process which describes the spreading of distinct interacting subpopulations. In particular, we motivate our model by mimicking the geometry of two typical cell biology experiments. Comparing the performance of the moment dynamics model with a traditional mean-field model confirms that the moment dynamics approach always outperforms the traditional mean-field approach. To provide more general insight we summarise the performance of the moment dynamics model and the traditional mean-field model over a wide range of parameter regimes. These results help distinguish between those situations where spatial correlation effects are sufficiently strong, such that a moment dynamics model is required, from other situations where spatial correlation effects are sufficiently weak, such that a traditional mean-field model is adequate.
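The failure mode of mean-field descriptions can be sketched with a minimal single-species example (far simpler than the paper's interacting-subpopulation model, and with invented parameters): agents on a 1D lattice proliferate only into empty nearest-neighbour sites, so occupancy clusters and growth lags the logistic mean-field prediction that ignores spatial correlations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1D lattice proliferation process. Clustering means occupied sites
# tend to have occupied neighbours, so placement attempts fail more often
# than the mean-field logistic model dC/dt = lam * C * (1 - C) assumes.
L, lam, dt, t_end = 200, 1.0, 0.1, 10.0
sites = np.zeros(L, dtype=bool)
sites[rng.choice(L, size=10, replace=False)] = True  # initial density 0.05

for _ in range(int(t_end / dt)):
    for i in np.flatnonzero(sites):
        if rng.random() < lam * dt:
            target = (i + rng.choice([-1, 1])) % L  # periodic boundary
            if not sites[target]:
                sites[target] = True

density_sim = sites.mean()
c0 = 0.05
density_mf = c0 * np.exp(lam * t_end) / (1 - c0 + c0 * np.exp(lam * t_end))
print(f"lattice: {density_sim:.3f}  mean-field: {density_mf:.3f}")
```

The lattice density falls well short of the near-saturated logistic prediction at the same time point: exactly the strong-correlation regime where the abstract argues a moment dynamics model is required.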
Abstract:
Aims and objectives: To determine consensus across acute care specialty areas on core physical assessment skills necessary for early recognition of changes in patient status in general wards. Background: Current approaches to physical assessment are inconsistent and have not evolved to meet increased patient and system demands. New models of nursing assessment are needed in general wards that ensure a proactive, patient-safety approach. Design: A modified Delphi study. Methods: Focus group interviews with 150 acute care registered nurses (RNs) at a large tertiary referral hospital generated a framework of core skills that were developed into a web-based survey. We then sought consensus with a panel of 35 senior acute care RNs following a classical Delphi approach over three rounds. Consensus was predefined as at least 80% agreement for each skill across specialty areas. Results: Content analysis of focus group transcripts identified 40 discrete core physical assessment skills. In the Delphi rounds, 16 of these were consensus-validated as core skills and were conceptually aligned with the primary survey: (Airway) Assess airway patency; (Breathing) Measure respiratory rate, Evaluate work of breathing, Measure oxygen saturation; (Circulation) Palpate pulse rate and rhythm, Measure blood pressure by auscultation, Assess urine output; (Disability) Assess level of consciousness, Evaluate speech, Assess for pain; (Exposure) Measure body temperature, Inspect skin integrity, Inspect and palpate skin for signs of pressure injury, Observe any wounds, dressings, drains and invasive lines, Observe ability to transfer and mobilise, Assess bowel movements. Conclusions: Among a large and diverse group of experienced acute care RNs, consensus was achieved on a structured core physical assessment to detect early changes in patient status.
Relevance to clinical practice: Although further research is needed to refine the model, clinical application should promote systematic assessment and clinical reasoning at the bedside.
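The predefined consensus rule is simple to state in code. Only the panel size (35) and the 80% threshold come from the abstract; the helper and the endorsement counts below are hypothetical.

```python
# Hypothetical helper for the study's predefined Delphi consensus rule:
# a skill is retained when at least 80% of panellists endorse it.
CONSENSUS_THRESHOLD = 0.80

def reaches_consensus(endorsements, panel_size,
                      threshold=CONSENSUS_THRESHOLD):
    """True when the endorsement proportion meets the Delphi threshold."""
    return endorsements / panel_size >= threshold

# Panel of 35 senior RNs, as in the abstract; the counts are made up.
print(reaches_consensus(29, 35))  # 82.9% -> True (consensus)
print(reaches_consensus(27, 35))  # 77.1% -> False (no consensus)
```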
Abstract:
A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, for many organisations, different aspects of these data are often recorded in different databases (e.g. work order notifications, event logs, condition monitoring data, and process control data). These recorded data cannot be interpreted individually, since they typically do not have all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for the extraction of failure and preventive maintenance times using commonly available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive. The accuracy of the algorithm is assessed and the classified failure time data are then presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
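The classification step can be sketched with a from-scratch multinomial Naïve Bayes with Laplace smoothing. The training descriptions below are invented, and the paper's keyword-extraction stage is not reproduced; this is a minimal illustration of the technique, not the paper's pipeline.

```python
import math
from collections import Counter, defaultdict

# Minimal multinomial Naive Bayes with Laplace smoothing, labelling
# stoppage descriptions as 'failure' or 'preventive'. Texts are invented.
train = [
    ("bearing seized during operation", "failure"),
    ("unexpected trip on motor overload", "failure"),
    ("pump tripped vibration alarm", "failure"),
    ("scheduled lubrication and inspection", "preventive"),
    ("planned overhaul of gearbox", "preventive"),
    ("routine filter replacement", "preventive"),
]

class_counts = Counter(label for _, label in train)
word_counts = defaultdict(Counter)
for text, label in train:
    word_counts[label].update(text.split())
vocab = {w for counter in word_counts.values() for w in counter}

def classify(text):
    """Return the class with the highest smoothed log-posterior."""
    best_label, best_score = None, -math.inf
    for label in class_counts:
        score = math.log(class_counts[label] / len(train))
        total = sum(word_counts[label].values())
        for word in text.split():
            # Laplace (add-one) smoothing over the shared vocabulary.
            score += math.log(
                (word_counts[label][word] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("gearbox overhaul planned for tomorrow"))  # preventive
print(classify("motor seized with overload trip"))        # failure
```

Smoothing matters here: unseen words (e.g. "tomorrow") would otherwise zero out every class posterior, which is why add-one counts appear in both numerator and denominator.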
Abstract:
Objective: To nationally trial the Primary Care Practice Improvement Tool (PC-PIT), an organisational performance improvement tool previously co-created with Australian primary care practices to increase their focus on relevant quality improvement (QI) activities. Design: The study was conducted from March to December 2015 with volunteer general practices from a range of Australian primary care settings. We used a mixed-methods approach in two parts. Part 1 involved staff in Australian primary care practices assessing how well they perceived their practice to meet (or not meet) each of the 13 PC-PIT elements of high-performing practices, using a 1–5 Likert scale. In Part 2, two external raters conducted a practice visit to independently assess each practice against objective indicators for the 13 elements, using the same 1–5 Likert scale, providing a check on the subjective self-assessment from Part 1. Concordance between the raters was determined by comparing their ratings. In-depth interviews conducted during the practice visits explored practice managers' experiences and their perceived support and resource needs for undertaking organisational improvement in practice. Results: Data were available for the 34 general practices participating in Part 1. For Part 2, independent practice visits and the inter-rater comparison were conducted for a purposeful sample of 19 of the 34 practices. Overall concordance between the two raters was excellent for each of the assessed elements. Three practice types across a continuum of higher- to lower-scoring practices were identified, each using the PC-PIT in a unique way. During the in-depth interviews, practice managers identified benefits of having additional QI tools that relate to the PC-PIT elements. Conclusions: The PC-PIT is an organisational performance tool that is acceptable, valid and relevant to our range of partners and the end users (general practices).
Work is continuing with our partners and end users to embed the PC-PIT in existing organisational improvement programs.
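The abstract does not name the concordance statistic used. Cohen's kappa is one common choice for paired ratings and is sketched below on invented 1–5 Likert ratings from two raters; this is an illustration of inter-rater concordance in general, not the study's analysis.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two raters on paired items."""
    n = len(ratings_a)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    # Expected agreement under independent raters with these marginals.
    expected = sum(count_a[k] * count_b[k]
                   for k in set(count_a) | set(count_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Invented 1-5 Likert ratings for ten assessed elements.
rater1 = [5, 4, 4, 3, 5, 2, 4, 3, 5, 4]
rater2 = [5, 4, 3, 3, 5, 2, 4, 3, 4, 4]
print(f"kappa = {cohens_kappa(rater1, rater2):.2f}")
```

Kappa discounts the agreement two raters would reach by chance given their rating distributions, so it is a stricter summary than raw percent agreement.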