893 results for data analysis: algorithms and implementation


Relevance: 100.00%

Abstract:

The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).
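
As a minimal illustration of two of the quantitative measures discussed, the sketch below computes lesion density and frequency from hypothetical quadrat (plot) counts; the counts and field area are invented for the example, not taken from the article.

```python
import numpy as np

# Hypothetical quadrat counts: lesions observed in each of N sample fields
# of known area (values are illustrative only).
counts = np.array([3, 0, 5, 2, 0, 4, 1, 0, 2, 3])   # lesions per field
field_area_mm2 = 0.25                                # area of one field

# Density: mean number of lesions per unit area.
density = counts.sum() / (counts.size * field_area_mm2)

# Frequency: proportion of fields in which at least one lesion occurs.
frequency = np.mean(counts > 0)

print(f"density   = {density:.2f} lesions/mm^2")
print(f"frequency = {frequency:.2f}")
```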

Relevance: 100.00%

Abstract:

This paper reports on the results of research into the connections between transaction attributes and buyer-supplier relationships (BSRs) in advanced manufacturing technology (AMT) acquisition and implementation. The investigation began by examining the impact of the different patterns of BSR on the performance of the AMT acquisition. In understanding the phenomena, the study drew upon and integrated the literature of transaction cost economics theory, BSRs, and AMT, and used this as the basis for a theoretical framework and hypotheses development. This framework was then empirically tested using data that were gathered through a questionnaire survey with 147 companies and analyzed using a structural equation modeling technique. The results of the analysis indicated that the higher the level of technological specificity and uncertainty, the more firms are likely to engage in a stronger relationship with technology suppliers. However, the complexity of the technology being implemented was associated with BSR only indirectly through its association with the level of uncertainty (which has a direct impact upon BSR). The analysis also provided strong support for the premise that developing strong BSR could lead to an improved performance in acquiring and implementing AMT. The implications of the study are offered for both academic and practitioner audiences.
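
The paper does not publish its model syntax; as a hedged sketch of how a comparable latent-variable model could be specified in Python with the semopy package, with every indicator name and the input file invented for illustration:

```python
import pandas as pd
import semopy

# Hypothetical lavaan-style specification mirroring the paper's logic:
# transaction attributes -> strength of the buyer-supplier relationship
# -> acquisition performance. Indicator names are invented stand-ins.
desc = """
specificity =~ spec_1 + spec_2 + spec_3
uncertainty =~ unc_1 + unc_2 + unc_3
bsr         =~ bsr_1 + bsr_2 + bsr_3
performance =~ perf_1 + perf_2

bsr         ~ specificity + uncertainty
performance ~ bsr
"""

survey = pd.read_csv("amt_survey.csv")   # hypothetical: one row per firm
model = semopy.Model(desc)
model.fit(survey)
print(model.inspect())                    # path estimates and p-values
```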

Relevance: 100.00%

Abstract:

In some studies, the data are not measurements but comprise counts or frequencies of particular events. In such cases, an investigator may be interested in whether one specific event happens more frequently than another or whether an event occurs with a frequency predicted by a scientific model.
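
For example, whether two events predicted to occur in a 1:1 ratio actually do so can be checked with a chi-square goodness-of-fit test; a minimal sketch with invented counts follows.

```python
from scipy.stats import chisquare

# Hypothetical counts of two mutually exclusive events, with a model
# predicting they occur equally often (expected 1:1 ratio).
observed = [58, 42]
expected = [50, 50]

stat, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
# A small p-value suggests the observed frequencies deviate from the model.
```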

Relevance: 100.00%

Abstract:

PCA/FA is a method of analyzing complex data sets in which there are no clearly defined X or Y variables. It has multiple uses, including the study of the pattern of variation between individual entities, such as patients with particular disorders, and the detailed study of descriptive variables. In most applications, variables are related to a smaller number of ‘factors’ or PCs that account for the maximum variance in the data and hence may explain important trends among the variables. An increasingly important application of the method is in the ‘validation’ of questionnaires that attempt to relate subjective aspects of a patient's experience to more objective measures of vision.
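
As a minimal sketch of the core computation, using scikit-learn on invented scores standing in for questionnaire items:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
scores = rng.normal(size=(100, 8))      # hypothetical: 100 patients, 8 items

# Standardise, then extract components ordered by explained variance.
pca = PCA(n_components=3)
components = pca.fit_transform(StandardScaler().fit_transform(scores))

print(pca.explained_variance_ratio_)    # variance accounted for by each PC
print(pca.components_)                  # loadings: item contributions per PC
```

Inspecting the loadings, i.e. how strongly each original item contributes to each component, is what supports questionnaire validation.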

Relevance: 100.00%

Abstract:

This thesis reports the results of research into the connections between transaction attributes and buyer-supplier relationships (BSR) in advanced manufacturing technology (AMT) acquisition and implementation. It also examines the impact of the different patterns of BSR on performance. Specifically, it addresses how the three transaction attributes, namely the level of complexity, the level of asset specificity, and the level of uncertainty, affect the relationship between the technology buyer and supplier in AMT acquisition and implementation, and then examines the impact of different patterns of BSR on two aspects of performance, namely technology performance and implementation performance. In understanding the phenomena, the study draws on and integrates the literature of transaction cost economics theory, buyer-supplier relationships, and advanced manufacturing technology as the basis of its theoretical framework and hypotheses development. Data were gathered through a questionnaire survey with 147 responses and seven semi-structured interviews of manufacturing firms in Malaysia. Quantitative data were analysed mainly using the AMOS (Analysis of Moment Structures) package for structural equation modelling and SPSS (Statistical Package for the Social Sciences) for analysis of variance (ANOVA). Data from the interview sessions were used to develop a case study, with the intention of providing a richer and deeper understanding of the subject under investigation and of offering triangulation in the research process. The results of the questionnaire survey indicate that the higher the level of technological specificity and uncertainty, the more firms are likely to engage in a closer relationship with technology suppliers. However, the complexity of the technology being implemented is associated with BSR only indirectly, through its association with the level of uncertainty, which has a direct impact upon BSR. The analysis also provides strong support for the premise that developing strong BSR could lead to improved performance. However, at high levels of the transaction attributes, implementation performance suffers more when firms have weak relationships with technology suppliers than at moderate and low levels of the transaction attributes. The implications of the study are offered for both academic and practitioner audiences. The thesis closes with a report on its limitations and suggestions for further research that would address some of them.
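
Alongside the SEM analysis (cf. the semopy sketch earlier in these results), the thesis used ANOVA in SPSS; a minimal Python analogue of a one-way ANOVA on invented performance scores grouped by relationship strength is sketched below.

```python
from scipy.stats import f_oneway

# Hypothetical implementation-performance scores for firms with weak,
# moderate, and strong supplier relationships (values invented).
weak     = [2.1, 2.8, 2.4, 3.0, 2.2]
moderate = [3.1, 3.4, 2.9, 3.6, 3.3]
strong   = [3.9, 4.2, 3.7, 4.4, 4.0]

stat, p = f_oneway(weak, moderate, strong)
print(f"F = {stat:.2f}, p = {p:.4f}")
# A small p-value indicates mean performance differs across BSR groups.
```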

Relevance: 100.00%

Abstract:

INTAMAP is a Web Processing Service for the automatic spatial interpolation of measured point data. Requirements were (i) using open standards for spatial data, such as those developed in the context of the Open Geospatial Consortium (OGC); (ii) using a suitable environment for statistical modelling and computation; and (iii) producing an integrated, open-source solution. The system couples an open-source Web Processing Service (developed by 52°North), accepting data in the form of standardised XML documents (conforming to the OGC Observations and Measurements standard), with a computing back-end realised in the R statistical environment. The probability distribution of interpolation errors is encoded with UncertML, a markup language designed to encode uncertain data. Automatic interpolation needs to be useful for a wide range of applications, and the algorithms have been designed to cope with anisotropy, extreme values, and data with known error distributions. Besides a fully automatic mode, the system can be used with different levels of user control over the interpolation process.
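
INTAMAP's computing back-end is implemented in R, so the following is not the project's code; it is a hedged Python sketch, using the PyKrige package and invented points, of the kind of automatic interpolation with an uncertainty output that the service performs.

```python
import numpy as np
from pykrige.ok import OrdinaryKriging

# Hypothetical measured point data: coordinates and observed values.
x = np.array([0.5, 1.2, 3.1, 4.7, 2.2, 3.9])
y = np.array([0.8, 3.3, 1.1, 2.9, 4.1, 0.4])
z = np.array([1.1, 0.7, 1.8, 2.4, 0.9, 2.0])

# Ordinary kriging; PyKrige fits the spherical variogram parameters
# automatically from the data.
ok = OrdinaryKriging(x, y, z, variogram_model="spherical")

# Predict on a regular grid; `ss` holds the kriging variance, i.e. the
# interpolation uncertainty that INTAMAP would encode with UncertML.
gridx = np.linspace(0.0, 5.0, 50)
gridy = np.linspace(0.0, 5.0, 50)
zhat, ss = ok.execute("grid", gridx, gridy)
print(zhat.shape, ss.shape)
```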

Relevance: 100.00%

Abstract:

We introduce a flexible visual data mining framework which combines advanced projection algorithms from the machine learning domain and visual techniques developed in the information visualization domain. The advantage of such an interface is that the user is directly involved in the data mining process. We integrate principled projection algorithms, such as generative topographic mapping (GTM) and hierarchical GTM (HGTM), with powerful visual techniques, such as magnification factors, directional curvatures, parallel coordinates and billboarding, to provide a visual data mining framework. Results on a real-life chemoinformatics dataset using GTM are promising and have been analytically compared with the results from the traditional projection methods. It is also shown that the HGTM algorithm provides additional value for large datasets. The computational complexity of these algorithms is discussed to demonstrate their suitability for the visual data mining framework.
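
GTM and HGTM are not available in mainstream Python libraries, so the sketch below plainly swaps in scikit-learn's t-SNE as a stand-in projection to illustrate the generic project-then-visualise pattern the framework builds on; the data are invented.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

rng = np.random.default_rng(1)
data = rng.normal(size=(500, 16))   # invented chemoinformatics-like features

# Stand-in projection: t-SNE instead of GTM. Both map high-dimensional
# points onto a 2-D latent space that an analyst can inspect visually.
embedding = TSNE(n_components=2, random_state=1).fit_transform(data)

plt.scatter(embedding[:, 0], embedding[:, 1], s=8)
plt.title("2-D projection for visual data mining (t-SNE stand-in for GTM)")
plt.show()
```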

Relevance: 100.00%

Abstract:

A recent novel approach to the visualisation and analysis of datasets, and one which is particularly applicable to those of a high dimension, is discussed in the context of real applications. A feed-forward neural network is utilised to effect a topographic, structure-preserving, dimension-reducing transformation of the data, with an additional facility to incorporate different degrees of associated subjective information. The properties of this transformation are illustrated on synthetic and real datasets, including the 1992 UK Research Assessment Exercise for funding in higher education. The method is compared and contrasted to established techniques for feature extraction, and related to topographic mappings, the Sammon projection and the statistical field of multidimensional scaling.
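
Among the established techniques the abstract compares against, multidimensional scaling is readily available in scikit-learn; a minimal sketch on invented data follows.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(2)
data = rng.normal(size=(200, 10))   # invented high-dimensional dataset

# Metric MDS: find a 2-D configuration whose pairwise distances
# approximate those of the original space (cf. the Sammon projection,
# which weights the preservation of small distances more heavily).
mds = MDS(n_components=2, random_state=2)
low_dim = mds.fit_transform(data)
print(low_dim.shape, f"stress = {mds.stress_:.1f}")
```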

Relevance: 100.00%

Abstract:

Due to incomplete paperwork, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Abstract:

At the end of the 1990s, the World Bank launched a large-scale international performance-assessment programme among water and wastewater utilities. Within this initiative, called the International Benchmarking Network for Water and Sanitation Utilities (IBNET), utilities report information about their operations on a standardised questionnaire. The individual, firm-level data are compiled into a database that enables comparative analysis of the utilities' operations, a practice also known as benchmarking. Information in English about the programme and its results to date is available at www.ib-net.org. The survey has so far been carried out in more than 70 countries, including Hungary, where REKK was commissioned to perform the task. Beyond the data collection, we used the database to carry out a statistical analysis of the water utility companies of Central and Eastern European countries, exploring the relationship between their basic operating conditions and their costs.
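
The abstract does not give the model specification; purely as an illustration of a cost-driver analysis of this general kind, here is a hedged OLS sketch with statsmodels in which every variable name and value is invented rather than taken from IBNET.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical firm-level data in the spirit of the IBNET questionnaire;
# the variable names are invented, not IBNET's actual indicators.
rng = np.random.default_rng(3)
n = 120
utilities = pd.DataFrame({
    "log_cost":     rng.normal(10, 1, n),      # log total operating cost
    "log_volume":   rng.normal(8, 1, n),       # log water delivered
    "network_km":   rng.normal(500, 120, n),   # network length
    "share_nonrev": rng.uniform(0.1, 0.5, n),  # non-revenue water share
})

# Relate costs to basic operating conditions.
fit = smf.ols("log_cost ~ log_volume + network_km + share_nonrev",
              data=utilities).fit()
print(fit.summary())
```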

Relevance: 100.00%

Abstract:

This study examines the congruency of planning between organizational structure and process through an evaluation and planning model known as the Micro/Macro Dynamic Planning Grid. The model compares day-to-day planning within an organization to planning imposed by organizational administration and accrediting agencies. A survey instrument was developed to assess the micro and macro sociological analysis elements utilized by an organization.

The Micro/Macro Dynamic Planning Grid consists of four quadrants, each containing characteristics that reflect the interaction between the micro and macro elements of planning, objectives, and goals within an organization. Quadrant 1, Over Macro/Over Micro, contains attributes that reflect a tremendous amount of action and ongoing adjustment, typical of an organization undergoing significant changes in leadership, program, and/or structure. Quadrant 2, Over Macro/Under Micro, reflects planning characteristics found in large, bureaucratic systems that give little regard to the workings of their component parts. Quadrant 3, Under Macro/Over Micro, reflects the uncooperative, uncoordinated organization, one that contains a multiplicity of viewpoints, language, objectives, and goals. Quadrant 4, Under Macro/Under Micro, represents the worst-case scenario for any organization; its attributes are reactive, chaotic, non-productive, and redundant.

There were three phases to the study: development of the initial instrument; pilot testing of the initial instrument and item revision; and administration and assessment of the refined instrument. The survey instrument was found to be valid and reliable for the purposes and audiences described. To extend the applicability of the instrument to other organizational settings, the survey was administered to three professional colleges within a university.

The first three specific research questions collectively answered, in the affirmative, the basic research question: can the Micro/Macro Dynamic Planning Grid be applied to an organization through an organizational development tool? The first specific question asked whether an instrument can be constructed that applies the Micro/Macro Dynamic Planning Grid; the second, whether the constructed instrument is valid and reliable; and the third, whether an instrument that applies the Micro/Macro Dynamic Planning Grid assesses the congruency of micro and macro planning, goals, and objectives within an organization. The fourth specific research question, concerning differences in responses based on roles and responsibilities within the organization, involved statistical analysis of the response data and comparisons with the demographic data.
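
The abstract reports validity and reliability without naming the statistics used; one conventional reliability check for such an instrument is Cronbach's alpha, sketched below on invented item responses.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents x items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(4)
base = rng.normal(size=(60, 1))                          # shared trait
responses = base + rng.normal(scale=0.8, size=(60, 8))   # 60 respondents, 8 items
print(f"alpha = {cronbach_alpha(responses):.2f}")
```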

Relevance: 100.00%

Abstract:

The purpose of this study was to explore the relationship between faculty perceptions, selected demographics, implementation of elements of transactional distance theory, and online web-based course completion rates. This theory posits that the high transactional distance of online courses makes it difficult for students to complete these courses successfully; too often this is associated with low completion rates. Faculty members play an indispensable role in course design, whether online or face-to-face. They also influence the course delivery format from design through implementation and, ultimately, how students will experience the course. This study used transactional distance theory as the conceptual framework to examine the relationship between teaching and learning strategies used by faculty members to help students complete online courses. Faculty members' sex, number of years teaching online at the college, and online course completion rates were considered. A researcher-developed survey was used to collect data from 348 faculty members who teach online at two prominent colleges in the southeastern United States. An exploratory factor analysis resulted in six factors related to transactional distance theory. The factors accounted for slightly over 65% of the variance of transactional distance scores as measured by the survey instrument. The results provided support for Moore's (1993) theory of transactional distance. Female faculty members scored higher than male faculty members on all the factors of transactional distance theory. Faculty members' number of years teaching online at the college correlated significantly with all the elements of transactional distance theory. Regression analysis determined that two of the factors, instructor interface and instructor-learner interaction, accounted for 12% of the variance in student online course completion rates. In conclusion, of the six factors found, the two with the highest percentage scores were instructor interface and instructor-learner interaction. This finding, while in alignment with the literature concerning the dialogue element of transactional distance theory, draws particular attention to the importance of instructor interface as a factor. Surprisingly, given the reviewed literature on transactional distance theory, faculty perceptions concerning learner-learner interaction were not an important factor, and no learner-content interaction factor emerged.
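
The abstract does not name the software used for the exploratory factor analysis; as a hedged sketch of that step, using the Python factor_analyzer package on invented responses (348 respondents, a hypothetical 24-item survey):

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer

rng = np.random.default_rng(5)
# Hypothetical: 348 respondents answering 24 Likert-type survey items.
responses = pd.DataFrame(rng.normal(size=(348, 24)),
                         columns=[f"item_{i}" for i in range(24)])

# Extract six factors, as in the study, with an orthogonal rotation.
fa = FactorAnalyzer(n_factors=6, rotation="varimax")
fa.fit(responses)

loadings = fa.loadings_              # item-factor loadings
variance = fa.get_factor_variance()  # (variance, proportion, cumulative)
print(f"cumulative variance explained: {variance[2][-1]:.2%}")
```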

Relevance: 100.00%

Abstract:

This is an empirical study whose purpose was to examine the process of innovation adoption as an adaptive response by a public organization and its subunits existing under varying degrees of environmental uncertainty. Meshing organizational innovation research and contingency theory to form a theoretical framework, an exploratory case study design was undertaken in a large, metropolitan government located in an area with the fourth-highest prevalence rate of HIV/AIDS in the country. A number of environmental and organizational factors were examined for their influence upon decision making in the adoption or non-adoption, as well as the implementation, of AIDS-related policies, practices, and programs.

The major findings of the study are as follows. For the county government itself (the macro level), no AIDS-specific workplace policies have been adopted. AIDS activities (AIDS education, an AIDS Task Force, an AIDS Coordinator, etc.), adopted county-wide early in the epidemic, have all been abandoned. Worker infection rates, in the aggregate and throughout the epidemic, have been small. As a result, absent co-worker conflict (isolated and negligible), with no increase in employee health care costs, no litigation regarding discrimination, and no major impact on workforce productivity, AIDS has basically become a non-issue at the strategic core of the organization. At the departmental level, policy adoption decisions varied widely. Here the predominant issue is occupational risk, both objective and perceived. As expected, more AIDS-related activities (policies, practices, and programs) were found in departments with workers known to have significant risk of exposure to the AIDS virus (fire rescue, medical examiner, police, etc.). AIDS-specific policies, in the form of OSHA's Bloodborne Pathogen Standard, took effect primarily because they were legislatively mandated. Union participation varied widely, although not necessarily based upon worker risk; in several departments, the union was a primary factor bringing about adoption decisions. Additional factors identified included the organizational presence of AIDS expertise, the availability of slack resources, and the existence of a policy champion. Other variables, such as subunit size, centralization of decision making, and formalization, were not consistent factors explaining adoption decisions.