905 results for computer-aided qualitative data analysis software
Abstract:
One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
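As a rough illustration of the "frame by frame" idea described above, the following Python sketch (synthetic data; the parametrization is a generic Box-Cox power family, not necessarily the author's) recomputes a PCA map as a power parameter steps from 1 toward 0, i.e. from plain PCA toward PCA of log-transformed data:

```python
# Illustrative sketch: morph a PCA map by stepping a Box-Cox power
# parameter alpha from 1 (raw data) toward 0 (log transform), recomputing
# the map "frame by frame". Not the author's exact parametrization.
import numpy as np

rng = np.random.default_rng(0)
X = rng.gamma(shape=2.0, scale=1.0, size=(30, 5))  # synthetic positive data

def boxcox(X, alpha):
    # Box-Cox family: (x^alpha - 1)/alpha for alpha > 0, log(x) at alpha = 0
    return np.log(X) if alpha == 0 else (X**alpha - 1.0) / alpha

def pca_map(X, k=2):
    # Principal coordinates of the row points (classical PCA via SVD);
    # sign alignment across frames is omitted for brevity.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k]

frames = [pca_map(boxcox(X, a)) for a in np.linspace(1.0, 0.0, 21)]
# Rendering successive frames animates the smooth transition between methods.
print(frames[0][:3], frames[-1][:3], sep="\n")
```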
Abstract:
Cobalt-labelled motoneuron dendrites of the frog spinal cord at the level of the second spinal nerve were photographed in the electron microscope from long series of ultrathin sections. Three-dimensional computer reconstructions of 120 dendrite segments were analysed. The samples were taken from two locations: proximal to the cell body and distal, as defined in a transverse plane of the spinal cord. The dendrites showed highly irregular outlines with many 1-2 µm-long 'thorns' (on average 8.5 thorns per 100 µm² of dendritic area). Taken together, the reconstructed dendrite segments from the proximal sites had a total length of about 250 µm; those from the distal locations, 180 µm. On all segments together there were 699 synapses. Nine percent of the synapses were on thorns, and many more close to their base on the dendritic shaft. The synapses were classified into four groups. One third of the synapses were asymmetric with spherical vesicles; one half were symmetric with spherical vesicles; and one tenth were symmetric with flattened vesicles. A fourth, small class of asymmetric synapses had dense-core vesicles. The area of the active zones was large for the asymmetric synapses (median value 0.20 µm²) and small for the symmetric ones (median value 0.10 µm²), and the difference was significant. On average, the areas of the active zones of the synapses on thin dendrites were larger than those of synapses on large-calibre dendrites. About every 4 µm² of dendritic area received one contact. There was a significant difference between the areas of the active zones of the synapses at the two locations. Moreover, the number of synapses per unit dendritic length was correlated with dendrite calibre. On average, the active zones covered more than 4% of the dendritic area; this value for thin dendrites was about twice as large as that for large-calibre dendrites. We suggest that the larger active zones and the larger synaptic coverage of the thin dendrites compensate for the longer electrotonic distance of these synapses from the soma.
Abstract:
The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool for determining the parameters that define this structure, which are then applied in the interpolation of values at unsampled points by kriging techniques. However, parameter estimation can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques in Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in these data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.
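A minimal sketch of the kind of sensitivity check described above, assuming a Gaussian spatial model with exponential covariance and synthetic data (this is not the paper's code or its specific diagnostic): fit by maximum likelihood, perturb one observation, and refit to see how far the estimates move.

```python
# Minimal sketch: ML fit of a Gaussian spatial model with exponential
# covariance, then a crude influence check by perturbing one observation.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

rng = np.random.default_rng(1)
coords = rng.uniform(0, 10, size=(60, 2))
D = cdist(coords, coords)
true_cov = 1.0 * np.exp(-D / 2.0) + 0.1 * np.eye(len(D))
z = rng.multivariate_normal(np.zeros(len(D)), true_cov)

def neg_log_lik(theta, z, D):
    sill, rng_par, nugget = np.exp(theta)     # log-parametrized, stays positive
    K = sill * np.exp(-D / rng_par) + nugget * np.eye(len(D))
    sign, logdet = np.linalg.slogdet(K)
    return 0.5 * (logdet + z @ np.linalg.solve(K, z))

def fit(z, D):
    res = minimize(neg_log_lik, np.log([1.0, 1.0, 0.1]), args=(z, D),
                   method="Nelder-Mead")
    return np.exp(res.x)                      # (sill, range, nugget)

base = fit(z, D)
z_pert = z.copy()
z_pert[0] += 3 * z.std()                      # inject an atypical observation
pert = fit(z_pert, D)
print("ML estimates  :", base)
print("after perturb.:", pert)                # large shifts flag influence
```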
Abstract:
Background and Aims: The international EEsAI study group is currently developing an activity index for Eosinophilic Esophagitis (EoE). A potential discrepancy between patient-reported and physician-reported EoE symptoms has not yet been assessed. We therefore aimed to evaluate patient-reported items describing EoE activity and to compare these with the physicians' perception. Methods: A questionnaire was sent to 100 EoE patients in Switzerland. Patients reported EoE-related symptoms dependent on and independent of food intake. Results were analyzed using a qualitative content analysis and compared with symptoms reported by international EoE experts in Delphi rounds. Results: The questionnaire response rate was 64/100. The following items were developed by combining categories based on patients' answers: food-consistency-related dysphagia, frequency and severity of dysphagia, food impaction, strategies to avoid food impaction, food allergy, and drinking-related retrosternal pain. The following food categories associated with dysphagia were identified: meat, rice, dry bread, French fries, raw, fibrous foods, and others. Sports and psychological stress were identified as triggers for EoE symptoms not related to food intake. A good correlation was found between patient-reported EoE-related symptoms and the physicians' perception. Conclusions: There is a good correlation between patient-reported symptoms and the physicians' perception of clinical items as reported by international EoE experts. These patient-reported outcomes will now be incorporated into the EEsAI questionnaire that measures EoE activity.
Abstract:
A computer-aided method to improve the thickness uniformity attainable when coating multiple substrates inside a thermal evaporation physical vapor deposition unit is presented. The study is developed for the classical spherical (dome-shaped) calotte and also for a plane sector reversible holder setup. This second arrangement is very useful for coating both sides of the substrate, such as antireflection multilayers on lenses. The design of static correcting shutters for both kinds of configurations is also discussed. Some results of using the method are presented as an illustration.
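For context, the classical small-area-source emission law underlying such uniformity calculations can be sketched as follows (an illustrative calculation with assumed geometry, not the paper's method):

```python
# Illustrative sketch: relative film thickness on a flat substrate plane at
# height h above a small-area (cosine-emitting) evaporation source.
import numpy as np

def relative_thickness(r, h):
    # Small-area source: t ~ cos(phi)*cos(theta)/d^2 with
    # cos(phi) = cos(theta) = h/d and d^2 = h^2 + r^2, giving
    # t(r)/t(0) = 1 / (1 + (r/h)^2)^2
    return 1.0 / (1.0 + (r / h) ** 2) ** 2

h = 40.0                         # source-to-substrate distance (cm), assumed
for ri in np.linspace(0.0, 20.0, 5):
    print(f"r = {ri:5.1f} cm  ->  t/t0 = {relative_thickness(ri, h):.3f}")
# A static correcting shutter would be shaped to shadow the regions of
# highest rate, equalizing the thickness accumulated over a rotation.
```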
Abstract:
The class of Schoenberg transformations, embedding Euclidean distances into higher-dimensional Euclidean spaces, is presented and derived from theorems on positive definite and conditionally negative definite matrices. Original results on the arc lengths, angles and curvature of the transformations are proposed, and visualized on artificial data sets by classical multidimensional scaling. A distance-based discriminant algorithm and a robust multidimensional centroid estimate illustrate the theory, which is closely connected to the Gaussian kernels of machine learning.
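A minimal sketch of one classical member of this class, the power family φ(d) = d^a with 0 < a ≤ 1, visualized by classical multidimensional scaling as in the abstract (synthetic data; illustrative only):

```python
# Minimal sketch: apply a power-family Schoenberg transformation to a
# Euclidean distance matrix and embed the result by classical MDS.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)
X = rng.normal(size=(25, 3))
D = cdist(X, X)                        # Euclidean distances

def classical_mds(D, k=2):
    # Double-center the squared distances (Torgerson/Gower) and
    # eigendecompose; a nonnegative spectrum means Euclidean embeddability.
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    w, V = np.linalg.eigh(B)
    idx = np.argsort(w)[::-1][:k]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

Y = classical_mds(D ** 0.5)            # phi(d) = d^0.5 is a Schoenberg transform
print(Y[:3])
# For 0 < a <= 1 the transformed distances remain Euclidean-embeddable,
# typically in a space of higher dimension than the original.
```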
Abstract:
In response to the mandate on Load and Resistance Factor Design (LRFD) implementations by the Federal Highway Administration (FHWA) on all new bridge projects initiated after October 1, 2007, the Iowa Highway Research Board (IHRB) sponsored these research projects to develop regional LRFD recommendations. The LRFD development was performed using the Iowa Department of Transportation (DOT) Pile Load Test database (PILOT). To increase the data points for LRFD development, develop LRFD recommendations for dynamic methods, and validate the results of LRFD calibration, 10 full-scale field tests on the most commonly used steel H-piles (e.g., HP 10 x 42) were conducted throughout Iowa. Detailed in situ soil investigations were carried out, push-in pressure cells were installed, and laboratory soil tests were performed. Pile responses during driving, at the end of driving (EOD), and at re-strikes were monitored using the Pile Driving Analyzer (PDA), followed by CAse Pile Wave Analysis Program (CAPWAP) analyses. The hammer blow counts were recorded for the Wave Equation Analysis Program (WEAP) and dynamic formulas. Static load tests (SLTs) were performed and the pile capacities were determined based on Davisson's criterion. The extensive experimental research studies generated important data for analytical and computational investigations. The SLT-measured load-displacements were compared with the simulated results obtained using a model of the TZPILE program and using the modified borehole shear test method. Two analytical pile setup quantification methods, in terms of soil properties, were developed and validated. A new calibration procedure was developed to incorporate pile setup into LRFD.
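For orientation, below is a hedged sketch of the standard first-order second-moment (FOSM) closed-form resistance factor commonly used in pile LRFD calibrations; it is not necessarily this study's procedure, and the input statistics are placeholders, not values from PILOT.

```python
# Hedged sketch of the standard FOSM closed-form resistance factor used in
# many pile LRFD calibrations; all input statistics are placeholders.
import math

def fosm_phi(lam_R, cov_R, beta_T, qd_ql=2.0,
             gam_D=1.25, gam_L=1.75,
             lam_QD=1.05, lam_QL=1.15, cov_QD=0.1, cov_QL=0.2):
    # lam_R, cov_R: bias and COV of the resistance prediction method
    # beta_T: target reliability index; qd_ql: dead-to-live load ratio
    cov_Q2 = cov_QD**2 + cov_QL**2
    num = lam_R * (gam_D * qd_ql + gam_L) * math.sqrt(
        (1 + cov_Q2) / (1 + cov_R**2))
    den = (lam_QD * qd_ql + lam_QL) * math.exp(
        beta_T * math.sqrt(math.log((1 + cov_R**2) * (1 + cov_Q2))))
    return num / den

print(f"phi = {fosm_phi(lam_R=1.1, cov_R=0.4, beta_T=2.33):.2f}")  # ~0.49
```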
Abstract:
The aim of this study was to determine the effect of using video analysis software on the interrater reliability of visual assessments of gait videos in children with cerebral palsy. Two clinicians viewed the same random selection of 20 sagittal and frontal video recordings of 12 children with cerebral palsy routinely acquired during outpatient rehabilitation clinics. Both observers rated these videos in a random sequence for each lower limb using the Observational Gait Scale, once with standard video software and once with video analysis software (Dartfish®), which can perform angle and timing measurements. The video analysis software improved interrater agreement, measured by weighted Cohen's kappa, for the total score (κ 0.778→0.809) and for all items that required angle and/or timing measurements (knee position mid-stance κ 0.344→0.591; hindfoot position mid-stance κ 0.160→0.346; foot contact mid-stance κ 0.700→0.854; timing of heel rise κ 0.769→0.835). The use of video analysis software is an efficient approach to improving the reliability of visual video assessments.
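The weighted Cohen's kappa used here can be computed, for example, with scikit-learn; the ratings below are synthetic, not the study's data:

```python
# Quick illustration of weighted Cohen's kappa for interrater agreement
# on ordinal scores (synthetic ratings from two hypothetical raters).
from sklearn.metrics import cohen_kappa_score

rater_1 = [2, 1, 0, 2, 1, 1, 0, 2, 2, 1]
rater_2 = [2, 1, 1, 2, 1, 0, 0, 2, 1, 1]

kappa = cohen_kappa_score(rater_1, rater_2, weights="linear")
print(f"weighted kappa = {kappa:.3f}")
```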
Abstract:
This report presents the results of work zone field data analyzed on interstate highways in Missouri to determine the mean breakdown and queue-discharge flow rates as measures of capacity. Several days of traffic data collected at a work zone near Pacific, Missouri, with a speed limit of 50 mph were analyzed in both the eastbound and westbound directions. A total of eleven breakdown events were identified using average speed profiles, and the traffic flows prior to and after the onset of congestion were studied. Breakdown flow rates ranged from 1194 to 1404 vphpl, with an average of 1295 vphpl, and a mean queue-discharge rate of 1072 vphpl was determined. Mean queue discharge, expressed in pcphpl as used by the Highway Capacity Manual 2000 (HCM), was found to be 1199, well below the HCM's average capacity of 1600 pcphpl. This reduced capacity at the site is attributable mainly to the narrower lane width and the high percentage of heavy vehicles, around 25%, in the traffic stream. The difference found between mean breakdown flow (1295 vphpl) and queue-discharge flow (1072 vphpl) has been observed widely and is due to reduced traffic flow once traffic breaks down and queues start to form. The Missouri DOT currently uses a spreadsheet for work zone planning applications that assumes the same values for breakdown and mean queue-discharge flow rates. This study proposes that breakdown flow rates be used to forecast the onset of congestion, whereas mean queue-discharge flow rates be used to estimate delays under congested conditions. Hence, it is recommended that the spreadsheet be refined accordingly.
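As a back-of-envelope check, the HCM heavy-vehicle adjustment f_HV = 1/(1 + P_T(E_T − 1)), with the reported 25% heavy vehicles and an assumed level-terrain passenger-car equivalent E_T = 1.5 (not stated in the abstract), converts the measured 1072 vphpl to approximately the reported 1199 pcphpl:

```python
# Back-of-envelope vphpl -> pcphpl conversion with the HCM heavy-vehicle
# adjustment factor; E_T = 1.5 is an assumed level-terrain value.
P_T = 0.25          # proportion of heavy vehicles reported at the site
E_T = 1.5           # HCM passenger-car equivalent, level terrain (assumed)
f_HV = 1.0 / (1.0 + P_T * (E_T - 1.0))

q_vphpl = 1072                      # measured mean queue discharge
q_pcphpl = q_vphpl / f_HV
print(f"f_HV = {f_HV:.3f}, q = {q_pcphpl:.0f} pcphpl")  # ~1206, close to 1199
```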
Abstract:
Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by an estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations, in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that poses the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
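A toy sketch of the data-driven flavor of prediction described above (synthetic data, not Chernobyl measurements): a small ensemble of neural networks whose member-to-member spread serves as a crude stand-in for the multiple realizations that geostatistical stochastic simulation would provide.

```python
# Toy sketch: an ensemble of MLPs for spatial prediction; the spread across
# members gives a rough uncertainty map alongside the mean estimate.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
xy = rng.uniform(0, 1, size=(200, 2))                  # sample locations
z = np.sin(3 * xy[:, 0]) * np.cos(3 * xy[:, 1]) \
    + 0.1 * rng.normal(size=200)                       # 'measured' field

grid = np.stack(np.meshgrid(np.linspace(0, 1, 20),
                            np.linspace(0, 1, 20)), -1).reshape(-1, 2)

preds = []
for seed in range(10):                                 # ensemble members
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                       random_state=seed).fit(xy, z)
    preds.append(net.predict(grid))
preds = np.array(preds)
mean_map, spread_map = preds.mean(0), preds.std(0)     # estimate + uncertainty
print(mean_map[:5], spread_map[:5], sep="\n")
```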
Abstract:
There is a lack of dedicated tools for business model design at a strategic level. Yet in today's economy, the ability to quickly reinvent a company's business model is essential to staying competitive. This research focused on identifying the functionalities that are necessary in a computer-aided design (CAD) tool for designing business models in a strategic context. Using design science research methodology, a series of techniques and prototypes were designed and evaluated to offer solutions to the problem. The work is a collection of articles that can be grouped into three parts. The first part establishes the context of how the Business Model Canvas (BMC) is used to design business models and explores the ways in which CAD can contribute to the design activity. The second part extends this by proposing new techniques and tools that support the elicitation, evaluation (assessment) and evolution of business model designs with CAD. These include features such as multi-color tagging to easily connect elements, rules to validate the coherence of business models, and features adapted to the business model proficiency level of the users. A new way to describe and visualize multiple versions of a business model, and thereby address the business model as a dynamic object, was also researched. The third part explores extensions to the Business Model Canvas, such as an intermediary model that supports IT alignment by connecting the business model and the enterprise architecture, and a business model pattern for privacy in a mobile environment that uses privacy as a key value proposition. The prototyped techniques and the propositions for using CAD tools in business model design will allow commercial CAD developers to create tools that are better suited to the needs of practitioners.
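As an illustration of the kind of machine-checkable coherence rule mentioned above (a hypothetical structure, not the thesis prototype): every value proposition on the canvas should be linked to at least one customer segment.

```python
# Hypothetical sketch of a coherence rule over a Business Model Canvas:
# flag value propositions that are not linked to any customer segment.
from dataclasses import dataclass, field

@dataclass
class Canvas:
    value_propositions: set[str] = field(default_factory=set)
    customer_segments: set[str] = field(default_factory=set)
    links: set[tuple[str, str]] = field(default_factory=set)  # (vp, segment)

def unlinked_propositions(c: Canvas) -> set[str]:
    linked = {vp for vp, _ in c.links}
    return c.value_propositions - linked

bmc = Canvas(value_propositions={"freemium app", "premium support"},
             customer_segments={"SMEs"},
             links={("freemium app", "SMEs")})
print(unlinked_propositions(bmc))   # {'premium support'} -> incoherent design
```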
Abstract:
In my thesis I present the findings of a multiple-case study on the CSR approach of three multinational companies, applying Basu and Palazzo's (2008) CSR-character as a process model of sensemaking, Suchman's (1995) framework on legitimation strategies, and Habermas's (1996) concept of deliberative democracy. The theoretical framework is based on the assumption of a postnational constellation (Habermas, 2001), which sends multinational companies into a process of sensemaking (Weick, 1995) with regard to their responsibilities in a globalizing world. The major reason is that mainstream CSR concepts rest on the assumption of a liberal market economy embedded in a nation state, an assumption that does not fit the changing conditions for the legitimation of corporate behavior in a globalizing world. For the purpose of this study, I primarily looked at two research questions: (i) How can the CSR approach of a multinational corporation be systematized empirically? (ii) What is the impact of the changing conditions in the postnational constellation on the CSR approach of the studied multinational corporations? For the analysis, I adopted a holistic approach (Patton, 1980), combining elements of deductive and inductive theory-building methodology (Eisenhardt, 1989b; Eisenhardt & Graebner, 2007; Glaser & Strauss, 1967; Van de Ven, 1992) and rigorous qualitative data analysis. Primary data were collected through 90 semi-structured interviews, conducted in two rounds, with executives and managers in three multinational companies and their respective stakeholders. Raw data originating from interview tapes, field notes, and contact sheets were processed, stored, and managed using the software program QSR NVIVO 7. In the analysis, I applied qualitative methods to strengthen the interpretative part as well as quantitative methods to identify dominant dimensions and patterns. I found three different coping behaviors that provide insights into the corporate mindset. The results suggest that multinational corporations increasingly turn towards relational approaches to CSR to achieve moral legitimacy in formalized dialogical exchanges with their stakeholders, since legitimacy can no longer be derived from a national framework alone. I also looked at the degree to which they have reacted to the postnational constellation by assuming former state duties, and at the underlying reasoning. The findings indicate that CSR approaches become increasingly comprehensive by integrating political strategies that reflect the growing (self-)perception of multinational companies as political actors. Based on the results, I developed a model that relates the different dimensions of corporate responsibility to the discussion on deliberative democracy, global governance and social innovation, in order to provide guidance for multinational companies in a postnational world. With my thesis, I contribute to management research by (i) delivering a comprehensive critique of the mainstream CSR literature and (ii) filling the gap of thorough qualitative research on CSR in a globalizing world using the CSR-character as an empirical device, and to organizational studies by (iii) further advancing the deliberative view of the firm proposed by Scherer and Palazzo (2008).