861 results for qualitative and quantitative methods
Abstract:
This doctoral thesis responds to the need for greater understanding of small businesses and their inherently distinctive problem-types. Integral to the investigation is the theme that, for governments to influence small business effectively, a sound understanding of the factors they seek to influence is essential. Moreover, recognising the many shortcomings in management research, in particular that the research methods and approaches adopted often fail to give an adequate understanding of the issues under study, the study attempts to develop an innovative and creative research approach. The aim is thus to produce not only advances in small business management knowledge, from the standpoints of both government policy makers and the 'recipient' small business, but also insights into potential future research methods for the continued development of that knowledge. The origins of the methodology lie in the non-acceptance of traditional philosophical positions in epistemology and ontology, with a philosophical standpoint of internal realism underpinning the research. Internal realism provides the basis for the potential co-existence of qualitative and quantitative research strategies, and underlines the crucial contributory role of research method in establishing the factual status of research findings. The concept of epistemological bootstrapping is thus used to develop a 'partial' research framework as a foothold for case study research, thereby avoiding the limitations of objectivism and brute inductivism. The major insights and issues highlighted by the 'bootstrap' guide the researcher through the participant case studies. A novel contextualist (linked multi-level and processual) analysis was attempted in the major in-depth case study, with two further cases playing a supporting role and contributing to a balanced emphasis of empirical research within the time constraints inherent in part-time research.
Abstract:
Healthcare providers and policy makers are faced with an ever-increasing number of medical publications. Searching for relevant information and keeping up to date with new research findings remains a constant challenge. It has been widely acknowledged that narrative reviews of the literature are susceptible to several types of bias, and that a systematic approach may protect against these biases. The aim of this thesis was to apply quantitative methods to the assessment of outcomes of topical therapies for psoriasis and, in particular, to systematically examine the comparative efficacy, tolerability and cost-effectiveness of topical calcipotriol in the treatment of mild-to-moderate psoriasis. Over the years, a wide range of techniques have been used to evaluate the severity of psoriasis and the outcomes of treatment. This lack of standardisation complicates the direct comparison of results and ultimately the pooling of outcomes from different clinical trials. There is a clear requirement for more comprehensive tools for measuring drug efficacy and disease severity in psoriasis. Ideally, the outcome measures need to be simple, relevant, practical and widely applicable, and the instruments should be reliable, valid and responsive. The results of the meta-analysis reported herein show that calcipotriol is an effective antipsoriatic agent. In the short term, the pooled data found calcipotriol to be more effective than calcitriol, tacalcitol, coal tar and short-contact dithranol. Only potent corticosteroids appeared to have comparable efficacy, with fewer short-term side-effects. Potent corticosteroids also added to the antipsoriatic effect of calcipotriol, and appeared to suppress the occurrence of calcipotriol-induced irritation. There was insufficient evidence to support any large improvement in efficacy when calcipotriol is used in combination with systemic therapies in patients with severe psoriasis. However, there was a total absence of long-term morbidity data on the effectiveness of any of the interventions studied. Decision analysis showed that, from the perspective of the NHS as payer, the relatively small differences in efficacy between calcipotriol and short-contact dithranol lead to large differences in the direct cost of treating patients with mild-to-moderate plaque psoriasis. Further research is needed to examine the clinical and economic issues affecting patients under treatment for psoriasis in the UK. In particular, the maintenance value and cost/benefit ratio of the various treatment strategies, and the assessment of patients' preferences, have not yet been adequately addressed for this chronic recurring disease.
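To make the pooling concrete, here is a minimal Python sketch of the inverse-variance (fixed-effect) weighting that underlies meta-analyses of this kind; the effect sizes and variances are illustrative, not the thesis's data:

```python
# Fixed-effect meta-analysis: weight each trial's effect by 1/variance,
# then combine. Numbers below are hypothetical per-trial estimates.
import numpy as np

effects   = np.array([0.42, 0.31, 0.55, 0.28])  # per-trial effect sizes
variances = np.array([0.02, 0.04, 0.03, 0.05])  # their sampling variances

weights = 1.0 / variances
pooled = np.sum(weights * effects) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))             # standard error of the pool
print(f"pooled effect = {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * se:.2f} to {pooled + 1.96 * se:.2f})")
```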
Abstract:
The last decade has seen a considerable increase in the application of quantitative methods to the study of histological sections of brain tissue, and especially to the study of neurodegenerative disease. These disorders are characterised by the deposition and aggregation of abnormal or misfolded proteins in the form of extracellular protein deposits such as senile plaques (SP) and intracellular inclusions such as neurofibrillary tangles (NFT). Quantifying brain lesions and studying the relationships between lesions and normal anatomical features of the brain, including neurons, glial cells, and blood vessels, have become important methods of elucidating disease pathogenesis. This review describes measures of the abundance of a histological feature, such as density, frequency, and 'load', and the sampling methods by which quantitative measures can be obtained, including plot/quadrat sampling, transect sampling, and the point-quarter method. In addition, methods for determining the spatial pattern of a histological feature, i.e., whether the feature is distributed at random, regularly, or is aggregated into clusters, are described. These methods include the use of the Poisson and binomial distributions, pattern analysis by regression, Fourier analysis, and methods based on mapped point patterns. Finally, the statistical methods available for studying the degree of spatial correlation between pathological lesions and neurons, glial cells, and blood vessels are described.
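As a concrete illustration of one of the methods surveyed, the following minimal Python sketch (not from the review itself) tests quadrat counts of a feature for departure from spatial randomness using the Poisson variance-to-mean ratio; the counts are hypothetical:

```python
# Index-of-dispersion test: under a random (Poisson) spatial pattern the
# variance/mean ratio of quadrat counts is ~1; a ratio >1 suggests
# clustering, and <1 suggests a regular distribution.
import numpy as np
from scipy import stats

def dispersion_test(counts):
    counts = np.asarray(counts, dtype=float)
    n = counts.size
    vmr = counts.var(ddof=1) / counts.mean()
    chi2 = vmr * (n - 1)                  # ~ chi-square, n-1 df under H0
    p = 2 * min(stats.chi2.cdf(chi2, n - 1), stats.chi2.sf(chi2, n - 1))
    return vmr, p

# Hypothetical counts of senile plaques in 12 equal-sized quadrats:
vmr, p = dispersion_test([0, 3, 1, 7, 0, 0, 5, 1, 0, 6, 0, 2])
print(f"variance/mean ratio = {vmr:.2f}, two-sided p = {p:.3f}")
```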
Abstract:
This thesis describes advances in the characterisation, calibration and data processing of optical coherence tomography (OCT) systems. Femtosecond (fs) laser inscription was used to produce OCT phantoms. Transparent materials are generally inert to infrared radiation, but with fs lasers material modification occurs via non-linear processes when the highly focused light source interacts with the material. This modification is confined to the focal volume and is highly reproducible. To select the best inscription parameters, combinations of different inscription parameters were tested, using three fs laser systems with different operating properties, on a variety of materials. This facilitated an understanding of the key characteristics of the produced structures, with the aim of producing viable OCT phantoms. OCT phantoms were then successfully designed and fabricated in fused silica, and their use to characterise many properties of an OCT system (resolution, distortion, sensitivity decay, scan linearity) was demonstrated. Quantitative methods were developed both to support the characterisation of an OCT system collecting images from phantoms and to improve the quality of the OCT images. Characterisation methods include the measurement of the spatially variant resolution (point spread function (PSF) and modulation transfer function (MTF)), sensitivity and distortion. Processing of OCT data is computationally intensive: standard central processing unit (CPU) based processing can take several minutes to a few hours for acquired data, so data processing is a significant bottleneck. An alternative is expensive hardware-based processing such as field programmable gate arrays (FPGAs); however, graphics processing unit (GPU) based methods have recently been developed to minimise processing and rendering time. These techniques include standard-processing methods, comprising a set of algorithms that process the raw interference data obtained by the detector and generate A-scans. The work presented here describes accelerated data processing and post-processing techniques for OCT systems. The GPU-based processing developed during the PhD was later implemented in a custom-built Fourier domain optical coherence tomography (FD-OCT) system, which currently processes and renders data in real time; its throughput is currently limited by the camera capture rate. The OCT phantoms have been heavily used for the qualitative characterisation and fine-tuning of the operating conditions of the OCT system, and investigations are under way to characterise OCT systems using our phantoms. The work presented in this thesis demonstrates several novel techniques for fabricating OCT phantoms and for accelerating OCT data processing using GPUs. In the process of developing the phantoms and quantitative methods, a thorough understanding and practical knowledge of OCT and fs laser processing systems was developed. This understanding led to several novel pieces of research that are not only relevant to OCT but have broader importance. For example, extensive understanding of the properties of fs-inscribed structures will be useful in other photonic applications such as the making of phase masks, waveguides and microfluidic channels. Acceleration of data processing with GPUs is also useful in other fields.
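To make the standard-processing step concrete, here is a minimal Python sketch of a typical FD-OCT A-scan computation; it assumes a generic pipeline (background subtraction, windowing, inverse FFT) rather than reproducing the thesis's GPU code, and the synthetic spectrum is illustrative:

```python
# One detector spectrum (sampled linearly in wavenumber k) becomes an
# A-scan: remove the DC/reference-arm term, window, inverse-FFT.
import numpy as np

def spectrum_to_ascan(spectrum, background):
    fringe = spectrum - background              # remove DC / reference term
    fringe = fringe * np.hanning(fringe.size)   # suppress FFT side lobes
    depth = np.fft.ifft(fringe)                 # depth-encoded reflectivity
    return np.abs(depth[: fringe.size // 2])    # keep positive depths only

# Synthetic example: a single reflector produces a cosine fringe.
k = np.linspace(0, 2 * np.pi, 2048)
background = np.ones_like(k)
spectrum = background + 0.3 * np.cos(120 * k)   # reflector at depth bin ~120
ascan = spectrum_to_ascan(spectrum, background)
print(ascan.argmax())                           # peak near bin 120
```

On a GPU, the same chain maps onto batched FFTs (for example, CuPy's cupy.fft.ifft applied to a whole B-scan at once), which is where the real-time throughput described above would come from.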
Abstract:
This paper advances a philosophically informed rationale for the broader, reflexive and practical application of arts-based methods to benefit research, practice and pedagogy. It addresses the complexity and diversity of learning and knowing, foregrounding a cohabitative position and recognition of a plurality of research approaches, tailored and responsive to context. Appreciation of art and aesthetic experience is situated in the everyday, underpinned by multi-layered exemplars of pragmatic visual-arts narrative inquiry undertaken in the third, creative and communications sectors. The discussion considers the semi-guided use of arts-based methods as a conduit for topic engagement, reflection and intersubjective agreement, alongside observation and interpretation of organically employed approaches used by participants within daily norms. Techniques span the handcrafted (drawing), digital (photography), hybrid (cartooning), performance dimensions (improvised installations) and music (metaphor and structure). The process of creation, the artefact/outcome produced and the experience of consummation are all significant, each with specific reflexivity impacts. Exploring methodology and epistemology, both the "doing" and its interpretation are explicated to inform method selection, replication, utility, evaluation and the development of cross-media skills literacy. The approaches are found to be engaging, accessible and empowering, with nuanced capabilities to alter relationships with phenomena, experiences and people. By building a discursive space that reduces barriers, they support emancipation, interaction, polyphony, letting-go and the progressive unfolding of thoughts, benefiting ways of knowing, narrative (re)construction, sensory perception and capacities to act. They can also present underexplored researcher risks with respect to emotion work, self-disclosure, identity and agenda. The paper therefore elucidates the complex, intricate relationships between form and content, the represented and the representation or performance, researcher and participant, and the self and other. This benefits understanding of phenomena including personal experience, sensitive issues, empowerment, identity, transition and liminality. The observations are relevant to qualitative and mixed methods researchers and a multidisciplinary audience, with explicit identification of challenges, opportunities and implications.
Abstract:
This dissertation analyzed and compared variables affecting the interest rate and yield of certificates of participation (COPs), tax-exempt revenue bonds and tax-exempt general obligation bonds. The study employed both qualitative and quantitative analysis methods. Qualitative research methods included surveys, interviews and focus groups. The survey solicited debt load information from 67 Florida school districts (21 responded) and addressed the question of which districts used certificates of participation and why. Eight individuals with experience of all three debt instruments were interviewed, and a follow-up focus group of six school district financial officers gathered additional data. Results from the qualitative methods revealed that school districts used certificates of participation based on the amount of millage authority available relative to the overall tax base. Also identified was the belief that there is a significant difference between the costs of certificates of participation and those of the other two debt instrument types. The study's quantitative methods analyzed 1998 and 1999 initial issues of Moody's AAA-rated certificates of participation, tax-exempt revenue bonds and tax-exempt general obligation bonds. Through an analysis of covariance (ANCOVA), the study examined interest rates and yields while controlling for the covariates of credit enhancement, issue size, and maturity date. The analysis identified no significant difference between the interest rates of certificates of participation and tax-exempt general obligation bonds (p < 0.05), but there was a significant difference between the interest rates of tax-exempt revenue bonds and tax-exempt general obligation bonds. The study likewise discerned no significant difference between the yields of certificates of participation and tax-exempt general obligation bonds, but identified a difference in yield between both of these instruments and tax-exempt revenue bonds. The study found COPs to have lower overall costs than revenue bonds, as well as quicker entry into the market, resulting in construction cost savings. Finally, the study identified policy implications, such as investment portfolio limitations and public choice issues, concerning the use of COPs as a mechanism to grow government.
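To illustrate the ANCOVA design described above, here is a minimal Python sketch using statsmodels; the data, column names and instrument labels are hypothetical, not the dissertation's dataset:

```python
# ANCOVA: instrument type as the factor, issue size and maturity as
# covariates; the F-test asks whether rates differ across instruments
# after adjusting for the covariates.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

bonds = pd.DataFrame({
    "rate":       [4.8, 5.1, 4.7, 5.3, 4.9, 5.2, 4.6, 5.0],   # interest rate, %
    "instrument": ["COP", "REV", "GO", "REV", "COP", "REV", "GO", "COP"],
    "size_musd":  [25, 40, 30, 55, 20, 35, 45, 28],           # issue size, $M
    "maturity":   [10, 20, 15, 25, 10, 20, 15, 12],           # years
})

model = smf.ols("rate ~ C(instrument) + size_musd + maturity", data=bonds).fit()
print(anova_lm(model, typ=2))   # type-II F-test for the instrument effect
```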
Abstract:
Rapid developments in industry have contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnostics (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault detection capability; however, it can only detect a fault and identify the few variables contributing to its occurrence, and is therefore imprecise as a diagnostic tool. Hence, KPCA was used to detect abnormal events, and the most strongly contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis phase was carried out in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal relationships between the most strongly contributing variables in the occurrence of the fault. For a more quantitative diagnosis, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
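A minimal sketch of the detection step, assuming scikit-learn's KernelPCA (the thesis's own implementation may differ): the model is fitted on normal operating data, and a sample is flagged as faulty when its reconstruction error exceeds a control limit estimated from the training data. All data here are synthetic:

```python
# KPCA-based fault detection via reconstruction error in input space.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 10))                  # normal operating data
X_test = np.vstack([rng.normal(size=(5, 10)),
                    rng.normal(loc=4.0, size=(5, 10))])  # last 5 are faulty

kpca = KernelPCA(n_components=5, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True).fit(X_normal)

def recon_error(model, X):
    # Squared distance between a sample and its KPCA reconstruction.
    return np.sum((X - model.inverse_transform(model.transform(X))) ** 2, axis=1)

threshold = np.percentile(recon_error(kpca, X_normal), 99)  # 99% control limit
print(recon_error(kpca, X_test) > threshold)   # True marks a detected fault
```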
Abstract:
In 2013 the European Commission launched its new green infrastructure strategy in another attempt to stop, and possibly reverse, the loss of biodiversity by 2020 by connecting habitats in the wider landscape. This means that conservation would go beyond current practices to include landscapes dominated by conventional agriculture, where biodiversity conservation plays a minor role at best. The green infrastructure strategy aims at bottom-up rather than top-down implementation and suggests including local and regional stakeholders. It is therefore important to know which stakeholders influence land-use decisions concerning green infrastructure at the local and regional levels. The research presented in this paper served to select stakeholders in preparation for a participatory scenario development process to analyze the consequences of different implementation options of the European green infrastructure strategy. We used a mix of qualitative and quantitative social network analysis (SNA) methods to combine actors' attributes, especially their perceived influence, with structural and relational measures. Our analysis also provides information on the institutional backgrounds and governance settings for green infrastructure and agricultural policy. The investigation started at the regional level with key informant interviews in administrative units responsible for relevant policies and procedures, such as regional planners and representatives of federal ministries, and continued at the local level with farmers and other members of the community. The analysis revealed the importance of information flows and regulations, but also of social pressure, in considerably influencing biodiversity governance with respect to green infrastructure and biodiversity.
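To illustrate how relational measures and actor attributes can be combined in such a mixed SNA, here is a minimal Python sketch using networkx; the actors, ties and influence scores are hypothetical, not the paper's data:

```python
# Combine a structural measure (betweenness centrality on information-flow
# ties) with an interview-derived attribute (perceived influence).
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("ministry", "regional_planner"), ("regional_planner", "municipality"),
    ("municipality", "farmer_A"), ("municipality", "farmer_B"),
    ("farmers_union", "farmer_A"), ("regional_planner", "farmers_union"),
])
influence = {"ministry": 5, "regional_planner": 4, "municipality": 3,
             "farmers_union": 3, "farmer_A": 2, "farmer_B": 1}  # 1-5 scores

betweenness = nx.betweenness_centrality(G)
for actor in G:
    print(f"{actor:18s} betweenness={betweenness[actor]:.2f} "
          f"influence={influence[actor]}")
```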
Study of white spot disease in four native species in Persian Gulf by histopathology and PCR methods
Abstract:
Serious disease outbreaks caused by a then-new virus, white spot virus (WSV), have occurred among cultured penaeid shrimps in Asian countries such as China since 1993, and subsequently in Latin American countries. During June and July 2002, rapid and high mortality in cultured Penaeus indicus in the Abadan region of southern Iran, with typical signs and symptoms of White Spot Syndrome Virus, was confirmed by histopathology, PCR, TEM and virology. The present study, conducted in 2005, set out to determine the prevalence (rate of infection, ROI) and to grade the severity of infection (SOI) of white spot disease in five species: 150 samples of captured shrimps and 90 samples of cultured ones, comprising Penaeus indicus, P. semisulcatus, P. merguiensis, Parapenaopsis styliferus, and Metapenaeus affinis. Of the 240 samples, 136 showed clinical and macroscopic signs and symptoms, including white spots on the carapace (0.5-2 mm), easy removal of the cuticle, fragility of the hepatopancreas, and reddening of the locomotory limbs. Histopathological changes such as specific intranuclear inclusion bodies (Cowdry type A) were observed in all target tissues (gill, epidermis, haemolymph and midgut) but not in the hepatopancreas, among shrimps collected from various farms in the south as well as in shrimps captured from the Persian Gulf, even those without clinical signs. ROI for each species was estimated using the Natividad & Lightner (1992b) formula, and SOI was graded using the generalized scheme of Lightner (1996) for assigning a numerical qualitative value to the severity grade of infection, based on histopathology and counts of specific inclusion bodies at different stages (as modified by B. Gholamhoseini). Samples with clinical signs showed grades above 2. Most P. semisulcatus and M. affinis samples showed a grade of 3, whereas most P. styliferus samples showed a grade of 4, suggesting differing sensitivities among species. All samples were tested by nested PCR using the IQ 2000 WSSV kit; 183 of the 240 samples were positive, and the three levels of infection indicated by this PCR confirmed our SOI grades while being more specific.
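For reference, the ROI statistic above is a prevalence percentage; a standard form consistent with Natividad & Lightner's approach (assumed here, not quoted from the thesis) is:

```latex
\mathrm{ROI} = \frac{\text{number of infected shrimp}}{\text{number of shrimp examined}} \times 100\%
```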
Abstract:
The potential impacts of climate change and environmental variability are already evident in most parts of the world, which are witnessing rising temperatures and prolonged flood or drought conditions that affect agricultural activities and nature-dependent livelihoods. This study was conducted in Mwanga District in the Kilimanjaro region of Tanzania to assess the nature and impacts of climate change and environmental variability on agriculture-dependent livelihoods and the adaptation strategies adopted by small-scale rural farmers. To attain this objective, the study employed a mixed methods approach in which both qualitative and quantitative techniques were used. The study shows that farmers are highly aware of their local environment and are conscious of the ways environmental changes affect their livelihoods. Farmers perceived that changes in climatic variables such as rainfall and temperature had occurred in their area over the past three decades, and associated these changes with climate change and environmental variability. Farmers' perceptions were confirmed by evidence from rainfall and temperature data obtained from local and national weather stations, which showed that temperature and rainfall in the study area had become more variable over the past three decades. Farmers' knowledge and perceptions of climate change vary depending on the location, age and gender of the respondents. The findings show that the farmers have a limited understanding of the causes of climatic conditions and environmental variability, as some respondents associated climate change and environmental variability with social, cultural and religious factors. The study suggests that, despite the changing climatic conditions and environmental variability, farmers have developed and implemented a number of agricultural adaptation strategies that enable them to reduce their vulnerability to the changing conditions. The findings show that these strategies comprise both planned and autonomous adaptation. However, the study shows that increasing drought conditions, rainfall variability, declining soil fertility and the use of cheap farming technology are among the challenges that limit the effective implementation of adaptation strategies. The study recommends further research on varieties of drought-resilient crops, the development of small-scale irrigation schemes to reduce dependence on rain-fed agriculture, and the improvement of crop production on a given plot of land. With respect to the development of adaptation strategies, the study recommends the involvement of local farmers and consideration of their knowledge and experience of farming activities as well as of the conditions of their local environment. The findings of this study may thus be helpful at various levels of decision making with regard to the development of climate change and environmental variability policies and strategies for reducing farmers' vulnerability to current and expected future changes.
Abstract:
Quantitative Methods (QM) is a compulsory course in the Social Science program in CEGEP. Many QM instructors assign a number of homework exercises to give students the opportunity to practice the statistical methods, which enhances their learning. However, traditional written exercises have two significant disadvantages: the feedback process is often very slow, and written exercises can generate a large amount of correcting for the instructor. WeBWorK is an open-source system that allows instructors to write exercises which students answer online. Although originally designed for exercises in math and science, WeBWorK programming allows for the creation of a variety of questions which can be used in the Quantitative Methods course. Because many statistical exercises generate objective and quantitative answers, the system is able to instantly assess students' responses and tell them whether they are right or wrong. This immediate feedback has been shown to be theoretically conducive to positive learning outcomes. In addition, the system can be set up to allow students to retry a problem if they got it wrong, which has benefits both for student motivation and for reinforcing learning. Through a quasi-experiment, this research project measured and analysed the effects of using WeBWorK exercises in the Quantitative Methods course at Vanier College. Three specific research questions were addressed. First, we looked at whether students who did the WeBWorK exercises got better grades than students who did written exercises. Second, we looked at whether students who completed more of the WeBWorK exercises got better grades than students who completed fewer of them. Finally, we used a self-report survey to find out students' perceptions and opinions of the WeBWorK and written exercises. For the first research question, a crossover design was used to compare whether the group that did WeBWorK problems during one unit would score significantly higher on that unit test than the group that did the written problems. We found no significant difference in grades between students who did the WeBWorK exercises and students who did the written exercises. For the second research question, the straight-line relationship between the number of WeBWorK exercises completed and grades was positive in both groups; however, the correlation coefficients for these two variables showed no real pattern. The third research question was investigated using a survey to elicit students' perceptions and opinions regarding the WeBWorK and written exercises. Students reported no difference in the amount of effort put into completing each type of exercise. Students were also asked to rate each type of exercise along six dimensions, and a composite score was calculated. Overall, students gave a significantly higher score to the written exercises, reporting that the written exercises were better for understanding the basic statistical concepts and for learning the basic statistical methods. However, when presented with the choice of having only written or only WeBWorK exercises, slightly more students preferred or strongly preferred having only WeBWorK exercises.
The results of this research suggest that the advantages of using WeBWorK to teach Quantitative Methods are variable. The WeBWorK system offers immediate feedback, which often seems to motivate students to try again if they do not have the correct answer; however, this does not necessarily translate into better performance on written tests or the final exam. What has been learned is that the WeBWorK system can be used by interested instructors to enhance student learning in the Quantitative Methods course. Further research could examine more specifically how the system can be used most effectively.
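As a concrete illustration of the second research question's analysis, here is a minimal Python sketch (with hypothetical numbers, not the study's data) of correlating the number of WeBWorK exercises completed with unit-test grades:

```python
# Pearson correlation between exercises completed and grades.
from scipy.stats import pearsonr

exercises_completed = [2, 5, 8, 3, 10, 7, 4, 9, 6, 1]
grades = [55, 62, 74, 58, 70, 69, 66, 72, 71, 50]

r, p = pearsonr(exercises_completed, grades)
print(f"r = {r:.2f}, p = {p:.3f}")  # a positive slope need not be a strong pattern
```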
Abstract:
Background: Food allergy (FA) is a heavy burden for patients and their families and can significantly reduce the quality of life (QoL) of both. To provide adequate support, qualitative and quantitative evaluation of the parents' QoL may be helpful. The objective of this study was to develop and validate a Japanese version of the Food Allergy QoL Questionnaire-Parent Form (FAQLQ-PF-J), an internationally validated disease-specific QoL measure of the parental burden of having a child with FA. Methods: The FAQLQ-PF and the Food Allergy Independent Measure (FAIM), an instrument to test the construct validity of the FAQLQ-PF-J, were translated into Japanese. After language validation, the questionnaires were administered to parents of children with FA aged 0-12 years and to parents of age-matched healthy (without FA) children. Internal consistency (by Cronbach's α) and test-retest reliability were evaluated, and construct validity and discriminant validity were also examined. Results: One hundred and twenty-seven parents of children with FA and 48 parents of healthy children completed the questionnaire. The FAQLQ-PF-J showed excellent internal consistency (Cronbach's α > 0.77) and test-retest reliability. Good construct validity was demonstrated by significant correlations between the FAQLQ-PF-J and FAIM-J scores, and the instrument discriminated parents of children with FA from those without. Scores were significantly higher (indicating lower QoL) for parents of FA children with a history of anaphylaxis than for those without, and for parents of children who had experienced more than six FA-related symptoms than for those with fewer. Conclusions: The FAQLQ-PF-J is a reliable and valid measure of the parental burden of FA in children.
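For reference, the internal-consistency statistic reported above can be computed as follows; this is a minimal Python sketch of the standard Cronbach's alpha formula with synthetic item scores, not the study's data:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).
import numpy as np

def cronbach_alpha(items):
    # items: array of shape (n_respondents, n_items)
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(1)
trait = rng.normal(size=(30, 1))                  # latent parental burden
scores = trait + 0.5 * rng.normal(size=(30, 10))  # ten correlated items
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```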
Abstract:
Organisations are increasingly investing in complex technological innovations such as enterprise information systems, with the aim of improving business operations and thereby gaining competitive advantage. However, the implementation of technological innovations tends to focus excessively on either technology innovation effectiveness (also known as system effectiveness) or the resulting operational effectiveness; focusing on either one alone is detrimental to long-term enterprise benefits, through failure to realise the real value of technological innovations. The lack of research on the dimensions and performance objectives on which organisations should focus is the main reason for this misalignment. This research uses a combined qualitative and quantitative, three-stage methodological approach. Initial findings suggest that factors such as quality of information, from technology innovation effectiveness, and quality and speed, from operational effectiveness, are important and significantly correlated factors that promote alignment between technology innovation effectiveness and operational effectiveness.
Abstract:
The author undertook a qualitative and quantitative survey of 130 guidance counsellors and primary school principals, focusing on perceptions of what school guidance and counselling will be like in 25 years. Generally, the participants held similar beliefs and were bullish about employment prospects.