311 results for STATISTICAL METHODOLOGY
Abstract:
Over the past few decades a major paradigm shift has occurred in the conceptualisation of chronic pain as a complex multidimensional phenomenon. Yet pain experienced by individuals with a primary disability continues to be understood largely through a traditional biomedical model, despite its inherent limitations. This is reflected in the body of literature on the topic, which is driven primarily by positivist assumptions and the search for etiologic pain mechanisms. Conversely, little is known about the experiences of, and meanings attributed to, disability-related pain. Thus, the purpose of this paper is to discuss the use of focus group methodology in elucidating the meanings and experiences of this population. Here, a distinction is made between the focus group as method and focus group research as methodology. Typically, the focus group is presented as a seemingly atheoretical method of research. Drawing on research into the impact of chronic pain in people with multiple sclerosis, this paper seeks to theorise the focus group, arguing for the methodological congruence of focus group research and the study of pain experience. It is argued that group interaction and shared experiences in focus group discussions produce data and insights less accessible through more structured research methods. It is concluded that a biopsychosocial perspective on chronic pain may only ever be appreciated when the person-in-context is the unit of investigation.
Abstract:
This chapter argues for the need to restructure children's statistical experiences from the beginning years of formal schooling. The ability to understand and apply statistical reasoning is paramount across all walks of life, as seen in the variety of graphs, tables, diagrams, and other data representations requiring interpretation. Young children are immersed in our data-driven society, with early access to computer technology and daily exposure to the mass media. With the rate of data proliferation have come increased calls for advancing children's statistical reasoning abilities, commencing with the earliest years of schooling (e.g., Langrall et al. 2008; Lehrer and Schauble 2005; Shaughnessy 2010; Whitin and Whitin 2011). Several articles (e.g., Franklin and Garfield 2006; Langrall et al. 2008) and policy documents (e.g., National Council of Teachers of Mathematics 2006) have highlighted the need for a renewed focus on this component of early mathematics learning, with children working mathematically and scientifically in dealing with real-world data. One approach to this component in the beginning school years is through data modelling (English 2010; Lehrer and Romberg 1996; Lehrer and Schauble 2000, 2007)...
Abstract:
In many bridges, vertical displacements are among the most relevant parameters for structural health monitoring in both the short and long terms. Bridge managers around the globe are always looking for a simple way to measure the vertical displacements of bridges, yet such measurements remain difficult to carry out. In recent years, with the advancement of fibre-optic technologies, fibre Bragg grating (FBG) sensors have become more common in structural health monitoring because of their outstanding advantages, including multiplexing capability, immunity to electromagnetic interference, and high resolution and accuracy. For these reasons, a methodology for measuring the vertical displacements of bridges using FBG sensors is proposed. The methodology comprises two approaches: one is based on curvature measurements, while the other utilises inclination measurements from successfully developed FBG tilt sensors. A series of simulation tests of a full-scale bridge shows that both approaches can measure the vertical displacements of bridges with various support conditions and varying stiffness along the spans, without any prior knowledge of the loading. A static beam test with increasing loads at the mid-span and a beam test with different loading locations were conducted to measure vertical displacements using FBG strain sensors and tilt sensors. The results show that both approaches can successfully measure vertical displacements.
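As a worked illustration of the curvature-based approach, the sketch below recovers a deflection profile by integrating curvature twice and enforcing the support conditions. It is a minimal sketch only: the span, sensor spacing and curvature values are hypothetical, and the conversion from FBG strain pairs to curvature is assumed to have been done upstream.

```python
import numpy as np
from scipy.integrate import cumulative_trapezoid

def deflection_from_curvature(x, kappa, v0=0.0, vL=0.0):
    """Recover vertical displacement v(x) from curvature kappa(x) = v''(x).

    Integrates twice, then adds the linear term needed to satisfy the
    support conditions v(x[0]) = v0 and v(x[-1]) = vL.
    """
    slope = cumulative_trapezoid(kappa, x, initial=0.0)  # v'(x) up to a constant
    v = cumulative_trapezoid(slope, x, initial=0.0)      # v(x) up to a linear term
    a = v0 - v[0]
    b = (vL - v[-1] - a) / (x[-1] - x[0])
    return v + a + b * (x - x[0])

# Example: simply supported 10 m span under constant curvature (pure bending);
# the midspan deflection should approach -kappa * L**2 / 8 = -1.25 mm.
x = np.linspace(0.0, 10.0, 101)    # positions along the span, m
kappa = np.full_like(x, 1e-4)      # curvature derived from FBG strain pairs, 1/m
print(deflection_from_curvature(x, kappa).min())
```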
Abstract:
Background A public health intervention program with active involvement of local stakeholders was piloted in the Bien Hoa dioxin hot spot (2007-2009) and then expanded to the Da Nang dioxin hot spot in Vietnam (2009-2011). It aimed to reduce the risk of dioxin exposure through foods for local residents. This article presents the results of the intervention in Da Nang. Methodology To assess the results of this intervention program, pre-intervention and post-intervention knowledge-attitude-practice (KAP) surveys were implemented in 400 households, randomly selected from four wards surrounding Da Nang Airbase, in 2009 and 2011, respectively. Results After the intervention, knowledge of the existence of dioxin in food, dioxin exposure pathways, potential high-risk foods and preventive measures significantly increased (p < 0.05), and 98% of respondents were willing to follow advice on preventing dioxin exposure. Practices to reduce the risk of dioxin exposure also improved to a statistically significant degree (p < 0.05). After the intervention, 60.4% of households undertook exposure preventive measures, significantly higher than in the pre-intervention survey (39.6%; χ² = 40.15, p < 0.001). Daily consumption rates of high-risk foods were quite low (0% to 2.5%) and were significantly reduced (p < 0.05). Conclusions This is seen as an effective intervention strategy for reducing the risk of human exposure to dioxin at dioxin hot spots. While greater efforts are needed to remediate dioxin-polluted areas inside airbases, there is also evidence to suggest that, over the past four decades, pollution has been spreading to the surrounding areas. For this reason, this model should be quickly expanded to the remaining dioxin hot spots in Vietnam to further reduce the exposure risk in these areas.
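For readers wanting to reproduce the style of the pre/post comparison, the sketch below runs a chi-square test on a 2x2 table reconstructed from the reported percentages, assuming 400 households in each survey; because the exact cell counts are not given in the abstract, the statistic will only approximate the reported χ² = 40.15.

```python
from scipy.stats import chi2_contingency

n = 400                                  # households per survey (as reported)
pre_yes  = round(0.396 * n)              # taking preventive measures, 2009
post_yes = round(0.604 * n)              # taking preventive measures, 2011
table = [[pre_yes,  n - pre_yes],
         [post_yes, n - post_yes]]
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.1e}")  # approximates the reported statistic
```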
Abstract:
Purpose - Contemporary offshore Information System Development (ISD) outsourcing is becoming ever more complex. Outsourcing partners have begun 're-outsourcing' components of their projects to other outsourcing companies to minimize cost and gain efficiencies. This paper aims to explore intra-organizational Information Asymmetry in re-outsourced offshore ISD outsourcing projects. Design/methodology/approach - An online survey was conducted to obtain an overall view of Information Asymmetry between Principal and Agents (as per Agency theory). Findings - Statistical analysis showed significant differences between the Principal and Agent on the clarity of requirements, common domain knowledge and communication effectiveness constructs, implying an unbalanced relationship between the parties. Moreover, the results showed that these three are significant measurement constructs of Information Asymmetry. Research limitations/implications - The study considered only three factors (common domain knowledge, clarity of requirements and communication effectiveness) as measurement constructs of Information Asymmetry; researchers are therefore encouraged to test the proposed constructs further to increase their precision. Practical implications - The analysis indicates significant differences in all three measurement constructs, implying the difficulty of ensuring that the Agent performs according to the requirements of the Principal. Using Agency theory as a theoretical lens, this study sheds light on contract governance methods that minimize Information Asymmetry between the multiple partners within ISD outsourcing organizations. Originality/value - To the best of the authors' knowledge, no prior study has examined intra-organizational Information Asymmetry in re-outsourced offshore ISD outsourcing projects.
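The kind of Principal-versus-Agent construct comparison described can be sketched as below; the Likert-scale scores, group sizes and the use of an independent-samples t-test are all illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
# Hypothetical 7-point Likert scores for one construct, e.g. "clarity of
# requirements", rated by each side of the outsourcing relationship.
principal = rng.normal(5.4, 1.0, 60).clip(1, 7)
agent     = rng.normal(4.6, 1.0, 60).clip(1, 7)

t, p = ttest_ind(principal, agent)
print(f"t = {t:.2f}, p = {p:.4f}")  # a significant gap signals Information Asymmetry
```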
Abstract:
Spatial organisation of proteins according to their function plays an important role in the specificity of their molecular interactions. Emerging proteomics methods seek to assign proteins to sub-cellular locations by partial separation of organelles and computational analysis of protein abundance distributions among the partially separated fractions. Such methods permit simultaneous analysis of unpurified organelles and promise proteome-wide localisation in scenarios wherein perturbation may prompt dynamic re-distribution. A possible shortcoming is resolving organelles that display similar behaviour during a protocol designed to provide only partial enrichment. We employ the Localisation of Organelle Proteins by Isotope Tagging (LOPIT) organelle proteomics platform to demonstrate that combining information from distinct separations of the same material can improve organelle resolution and the assignment of proteins to sub-cellular locations. Two previously published experiments, whose distinct gradients alone are unable to fully resolve six known protein-organelle groupings, are subjected to a rigorous analysis to assess protein-organelle association via a contemporary pattern recognition algorithm. Upon straightforward combination of the single-gradient data, we observe significant improvement in protein-organelle association via both a non-linear support vector machine algorithm and partial least-squares discriminant analysis. The outcome yields suggestions for further improvements to present organelle proteomics platforms, and a robust analytical methodology via which to associate proteins with sub-cellular organelles.
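The combination step lends itself to a simple sketch: concatenate the fraction-abundance profiles of the two gradients protein by protein and compare classifier performance against either gradient alone. Everything below (matrix shapes, six classes, random data, an RBF support vector machine) is an illustrative assumption, not the published analysis.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X1 = rng.random((300, 8))     # abundance profiles across gradient-1 fractions
X2 = rng.random((300, 10))    # abundance profiles across gradient-2 fractions
y  = rng.integers(0, 6, 300)  # six known organelle classes for marker proteins

X_combined = np.hstack([X1, X2])  # straightforward combination of the gradients
clf = SVC(kernel="rbf", gamma="scale")
for name, X in [("gradient 1", X1), ("gradient 2", X2), ("combined", X_combined)]:
    print(name, cross_val_score(clf, X, y).mean())
```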
Abstract:
Purpose – The purpose of this paper is to develop an effective methodology for implementing lean manufacturing strategies, together with a leanness evaluation metric using continuous performance measurement (CPM). Design/methodology/approach – Based on five lean principles, a systematic lean implementation methodology for manufacturing organizations is proposed. A simplified leanness evaluation metric consisting of both efficiency and effectiveness attributes of manufacturing performance is developed for continuous evaluation of lean implementation. A case study to validate the proposed methodology was conducted, and the proposed CPM metric was used to assess manufacturing leanness. Findings – The proposed methodology is able to systematically identify manufacturing wastes, select appropriate lean tools, identify relevant performance indicators, achieve significant performance improvement, and establish a lean culture in the organization. Continuous performance measurement metrics in terms of efficiency and effectiveness proved to be appropriate for the continuous evaluation of lean performance. Research limitations/implications – The effectiveness of the method has been demonstrated by applying it in a real-life assembly process; however, more tests/applications will be necessary to generalize the findings. Practical implications – The results show that, by applying the methods developed, managers can successfully identify and remove manufacturing wastes from their production processes. By improving process efficiency, they can optimize their resource allocations. Manufacturers now have a validated step-by-step methodology for successfully implementing lean strategies. Originality/value – To the best of the authors' knowledge, this is the first known study to propose a systematic lean implementation methodology based on lean principles and continuous improvement techniques. Evaluating the performance improvement delivered by lean strategies is a critical issue. This study develops a simplified leanness evaluation metric considering both efficiency and effectiveness attributes and integrates it with the lean implementation methodology.
Abstract:
Exposure control or case-control methodologies are common techniques for estimating crash risks; however, they require either observational data on control cases or exogenous exposure data, such as vehicle-kilometres travelled. This study proposes an alternative methodology for estimating the crash risk of road user groups, whilst controlling for exposure under a variety of roadway, traffic and environmental factors, using readily available police-reported crash data. In particular, the proposed method employs a combination of a log-linear model and the quasi-induced exposure technique to identify significant interactions among a range of roadway, environmental and traffic conditions and to estimate the associated crash risks. The methodology is illustrated using police-reported crash data from January 2004 to June 2009 on roadways in Queensland, Australia. Exposure-controlled crash risks of motorcyclists involved in multi-vehicle crashes at intersections were estimated under various combinations of variables such as posted speed limit, intersection control type, intersection configuration, and lighting condition. Results show that the crash risk of motorcycles at three-legged intersections is high if the posted speed limits along the approaches are greater than 60 km/h, and also when such intersections are unsignalized. Dark lighting conditions appear to increase the crash risk of motorcycles at signalized intersections, but the problem of night-time conspicuity of motorcyclists at intersections is lessened on approaches with lower speed limits. This study demonstrates that the combined methodology is a promising tool for gaining new insights into the crash risks of road user groups, and that it is transferable to other road users.
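The quasi-induced exposure idea at the heart of this approach is easy to sketch: not-at-fault parties in multi-vehicle crashes serve as a proxy sample of on-road exposure, so a group's relative involvement ratio is its share of at-fault drivers divided by its share of not-at-fault drivers. The counts below are hypothetical and serve only to show the arithmetic.

```python
# Hypothetical at-fault / not-at-fault counts from two-vehicle crash records.
at_fault     = {"motorcycle": 120, "car": 880}
not_at_fault = {"motorcycle": 60,  "car": 940}

total_af, total_naf = sum(at_fault.values()), sum(not_at_fault.values())
for group in at_fault:
    ratio = (at_fault[group] / total_af) / (not_at_fault[group] / total_naf)
    print(f"{group}: relative crash involvement ratio = {ratio:.2f}")
# A ratio above 1 indicates over-involvement relative to the group's exposure.
```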
Abstract:
Porn studies researchers in the humanities have tended to use different research methods from those in the social sciences, and there has been surprisingly little conversation between the two groups about methodology. This article presents a basic introduction to textual analysis and statistical analysis, aiming to give all porn studies researchers a familiarity with these two quite distinct traditions of data analysis. Comparing the two approaches, the article suggests that social science approaches are often strongly reliable but can sacrifice validity to that end, whereas textual analysis is much less reliable but has the capacity to be strongly valid. Statistical methods tend to produce a picture of human beings as groups, in terms of what they have in common, whereas humanities approaches often seek out uniqueness. Social science approaches have also asked a more limited range of questions than have the humanities. The article ends with a call to mix up the kinds of research methods that are applied to various objects of study.
Abstract:
The methoxyamine group represents an ideal protecting group for the nitroxide moiety. It can be easily and selectively introduced in high yield (typically >90%) onto a range of functionalised nitroxides using FeSO4·7H2O and H2O2 in DMSO. Its removal is readily achieved under mild conditions and in high yield (70-90%) using mCPBA in a Cope-type elimination process.
Abstract:
A multi-resource multi-stage scheduling methodology is developed to solve the short-term open-pit mine production scheduling problem as a generic multi-resource multi-stage scheduling problem. The problem is modelled using essential characteristics of short-term mining production operations, such as drilling, sampling, blasting and excavating, under the capacity constraints of the mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible, near-optimal short-term open-pit mine production schedules. The proposed methodology and its solution quality are verified and validated using a real mining case study.
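Once the shifting bottleneck procedure has fixed the disjunctive (machine-ordering) arcs, evaluating a schedule reduces to a longest-path pass over the resulting directed acyclic graph. The sketch below shows that core step for a single block flowing through hypothetical drill, sample, blast and excavate stages; the durations and the single-chain precedence structure are illustrative only.

```python
from graphlib import TopologicalSorter

# Processing times (hypothetical) and precedence arcs: each operation lists
# its predecessors (conjunctive arcs plus any fixed disjunctive arcs).
duration = {"drill": 3, "sample": 2, "blast": 1, "excavate": 4}
preds = {"drill": [], "sample": ["drill"],
         "blast": ["sample"], "excavate": ["blast"]}

start = {}
for op in TopologicalSorter(preds).static_order():
    start[op] = max((start[p] + duration[p] for p in preds[op]), default=0)
makespan = max(start[op] + duration[op] for op in start)
print(start, makespan)  # earliest start times and schedule completion time
```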
Abstract:
Public health research consistently demonstrates the salience of neighbourhood as a determinant of both health-related behaviours and outcomes across the human life course. This paper reports findings from a mixed-methods Brisbane-based study that explored how mothers of primary school children from both high and low socioeconomic suburbs use the local urban environment for physical activity. First, we present findings from an innovative methodology that used the geographic information systems (GIS) embedded in social media platforms on mobile phones to track the families' locations, resource use, distances travelled, and modes of transport in real time; second, we report qualitative data that provide insight into the reasons for differential use of the environment by the two groups. Spatial/mapping and statistical data showed that while mothers from both groups had similar daily routines, mothers from the high-SEP suburb engaged in higher levels of physical activity, travelled less frequently and over shorter distances by car, and walked more for transport. The qualitative data revealed differences in the psychosocial processes and characteristics of the households and neighbourhoods of the respective groups, with mothers in the lower-SEP suburb reporting more stress, higher conflict, and lower-quality relationships with neighbours.
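One step in this kind of GIS analysis, computing total distance travelled from a sequence of GPS fixes, can be sketched with the haversine formula; the coordinates below are illustrative only, not study data.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(p, q):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # Earth mean radius ~6371 km

track = [(-27.470, 153.021), (-27.475, 153.030), (-27.482, 153.028)]
print(sum(haversine_km(a, b) for a, b in zip(track, track[1:])))  # km travelled
```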
Abstract:
BACKGROUND Pandemic influenza A (H1N1) has a significant public health impact. This study aimed to examine the effect of socio-ecological factors on the transmission of H1N1 in Brisbane, Australia. METHODOLOGY We obtained data from Queensland Health on the numbers of laboratory-confirmed daily H1N1 cases in Brisbane by statistical local area (SLA) in 2009. Data on weather and the socio-economic index were obtained from the Australian Bureau of Meteorology and the Australian Bureau of Statistics, respectively. A Bayesian spatial conditional autoregressive (CAR) model was used to quantify the relationship between the variation in H1N1 and the independent factors and to determine its spatiotemporal patterns. RESULTS Our results show that the average increases in weekly H1N1 cases were 45.04% (95% credible interval (CrI): 42.63-47.43%) for a 1 °C decrease in average weekly maximum temperature at a lag of one week, and 23.20% (95% CrI: 16.10-32.67%) for a 10 mm decrease in average weekly rainfall at a lag of one week. An interactive effect between temperature and rainfall on H1N1 incidence was found (change: 0.71%; 95% CrI: 0.48-0.98%). The auto-regression term was significantly associated with H1N1 transmission (change: 2.5%; 95% CrI: 1.39-3.72%). No significant association between the Socio-Economic Indexes for Areas (SEIFA) and H1N1 was observed at the SLA level. CONCLUSIONS Our results demonstrate that average weekly temperature and rainfall at a lag of one week were substantially associated with H1N1 incidence at the SLA level. These ecological factors appear to have played an important role in H1N1 transmission cycles in Brisbane, Australia.
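Percent changes of this kind follow from a log-link regression coefficient: a change of dx in a predictor multiplies the expected count by exp(beta * dx). The sketch below back-calculates illustrative coefficients from the reported effect sizes to show the arithmetic; the beta values are not the paper's fitted estimates.

```python
import math

# Back-calculated for illustration from the reported effects (not fitted values):
beta_temp = math.log(1.4504)         # per 1 degree C decrease at a one-week lag
beta_rain = math.log(1.2320) / 10.0  # per mm decrease at a one-week lag

def pct_change(beta, dx):
    """Percent change in expected counts for a change dx in the predictor."""
    return (math.exp(beta * dx) - 1.0) * 100.0

print(f"{pct_change(beta_temp, 1):.2f}% per 1 C decrease in max temperature")
print(f"{pct_change(beta_rain, 10):.2f}% per 10 mm decrease in rainfall")
```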
Abstract:
The Department of Culture and the Arts undertook the first mapping of Perth's creative industries in 2007, in partnership with the City of Perth and the Departments of Industry and Resources and the Premier and Cabinet. The 2013 Creative Industries Statistical Analysis for Western Australia report updates that mapping with 2011 Census employment data, providing invaluable information for the State's creative industries, their peak associations and potential investors. The report maps sector employment numbers and growth between the 2006 and 2011 Censuses in the areas of music, visual and performing arts, film, TV and radio, advertising and marketing, software and digital content, publishing, and architecture and design, which includes designer fashion.
Abstract:
Purpose The goal of this work was to set out a methodology for measuring and reporting small field relative output, and to assess the application of published correction factors across a population of linear accelerators. Methods and materials Measurements were made at 6 MV on five Varian iX accelerators using two PTW T60017 unshielded diodes. Relative output readings and profile measurements were made for nominal square field sizes of side 0.5 to 1.0 cm. The actual in-plane (A) and cross-plane (B) field widths were taken to be the FWHM of the profiles at the 50% isodose level. An effective field size, defined as $FS_{eff} = \sqrt{A \cdot B}$, was calculated and is presented as a field size metric. $FS_{eff}$ was used to linearly interpolate between published Monte Carlo (MC) calculated $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ values to correct for the diode over-response in small fields. Results The relative output data, reported as a function of the nominal field size, differed across the accelerator population by up to nearly 10%. However, reporting against the effective field size showed that the actual output ratios were consistent across the accelerator population to within the experimental uncertainty of ±1.0%. Correcting the measured relative output using $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ at both the nominal and effective field sizes produced output factors that were not identical, but the differences were much smaller than the reported experimental and/or MC statistical uncertainties. Conclusions In general, the proposed methodology removes much of the ambiguity in reporting and interpreting small field dosimetric quantities and facilitates a clear dosimetric comparison across a population of linacs.
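A minimal sketch of the reported workflow appears below, under two stated assumptions: the effective field size is taken as the geometric mean of the measured FWHMs, and the correction-factor table being interpolated is purely illustrative (published $k_{Q_{clin},Q_{msr}}^{f_{clin},f_{msr}}$ values for the actual diode would be substituted).

```python
import numpy as np

def effective_field_size(A, B):
    """Geometric-mean equivalent square of in-plane/cross-plane FWHM (cm)."""
    return np.sqrt(A * B)

fs_table = np.array([0.5, 0.6, 0.8, 1.0])      # field size metric, cm
k_table  = np.array([0.95, 0.96, 0.98, 1.00])  # hypothetical correction factors

A, B = 0.54, 0.58                              # measured FWHMs, cm
fs_eff = effective_field_size(A, B)
k = np.interp(fs_eff, fs_table, k_table)       # diode over-response correction
raw_output_ratio = 0.70                        # example diode reading ratio
print(fs_eff, k, raw_output_ratio * k)         # corrected relative output
```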