932 results for Qualitative data analysis software


Relevance:

100.00%

Abstract:

The objective of this study was to describe the Supplemental Nursing Staff's experiences at different hospital units. A qualitative phenomenological approach was used; purposeful and theoretical sampling was implemented with supplemental nursing staff at Santa Barbara Hospital of Soria (Spain) to gain a more in-depth understanding of the Supplemental Nursing Staff's experience. Data were collected through in-depth interviews and a field notebook, and were analyzed following Giorgi's proposal. Twenty-one nurses with a mean age of 46 years were included. Three main topics emerged from the data analysis: building the first contact, carving out a niche, and establishing interprofessional/interpersonal relationships. We conclude that the experience of hosting supplemental nursing staff in changing clinical environments is conditioned by various factors. It is necessary for nurses and hospital managers to establish clear objectives with regard to the supplemental nursing staff's role in the units.

Relevance:

100.00%

Abstract:

Differences in the physico-chemical characteristics of bone grafts used to fill bone defects have been demonstrated to influence in vitro bacterial biofilm formation. The aim of the study was to investigate in vivo staphylococcal biofilm formation on different calcium phosphate bone substitutes. A foreign-body guinea-pig infection model was used. Teflon cages prefilled with β-tricalcium phosphate, calcium-deficient hydroxyapatite, or dicalcium phosphate (DCP) scaffolds were implanted subcutaneously. Scaffolds were infected with 2 × 10^3 colony-forming units of Staphylococcus aureus (two strains) or S. epidermidis and explanted after 3, 24 or 72 h of biofilm formation. Quantitative and qualitative biofilm analyses were performed by sonication followed by viable counts and by microcalorimetry, respectively. Independently of the material, S. aureus formed increasing amounts of biofilm on the surface of all scaffolds over time, as determined by both methods. For S. epidermidis, the biofilm amount decreased over time, and no biofilm was detected by microcalorimetry on the DCP scaffolds after 72 h of infection. However, when a higher S. epidermidis inoculum was used, increasing amounts of biofilm were formed on all scaffolds as determined by microcalorimetry. No significant variation in staphylococcal in vivo biofilm formation was observed between the different materials tested. This study highlights the importance of in vivo studies, in addition to in vitro studies, when investigating biofilm formation on bone grafts.

Relevance:

100.00%

Abstract:

Short description of the proposed presentation (less than 100 words): This paper describes the interdisciplinary work done in Uspantán, Guatemala, a city vulnerable to natural hazards. We investigated local responses to landslides that happened in 2007 and 2010 and had a strong impact on the local community. We show a complete example of a systemic approach that incorporates physical, social and environmental aspects in order to understand risks. The objective of this work is to present the combination of social and geological data (mapping) and to describe the methodology used for the identification and assessment of risk. The article discusses both the limitations and the methodological challenges encountered when conducting interdisciplinary research.

Why it is important to present this topic at the Global Platform (less than 50 words): This work shows the benefits of addressing risk from an interdisciplinary perspective, in particular how integrating the social sciences can help identify new phenomena and natural hazards and assess risk. It gives a practical example of how data from different fields can be integrated.

What is innovative about this presentation: The use of mapping to combine qualitative and quantitative data. By coupling approaches, we could associate a hazard map with qualitative data gathered through interviews with the population. This map is an important document for the authorities: it allows them to be aware of the most dangerous zones, the affected families and the places where it is most urgent to intervene.

Relevance:

100.00%

Abstract:

We consider two fundamental properties in the analysis of two-way tables of positive data: the principle of distributional equivalence, one of the cornerstones of correspondence analysis of contingency tables, and the principle of subcompositional coherence, which forms the basis of compositional data analysis. For an analysis to be subcompositionally coherent, it suffices to analyse the ratios of the data values. The usual approach to dimension reduction in compositional data analysis is to perform principal component analysis on the logarithms of ratios, but this method does not obey the principle of distributional equivalence. We show that by introducing weights for the rows and columns, the method achieves this desirable property. This weighted log-ratio analysis is theoretically equivalent to spectral mapping, a multivariate method developed almost 30 years ago for displaying ratio-scale data from biological activity spectra. The close relationship between spectral mapping and correspondence analysis is also explained, as well as their connection with association modelling. The weighted log-ratio methodology is applied here to frequency data in linguistics and to chemical compositional data in archaeology.
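As a point of reference, the weighted variant described above can be sketched in a few lines of linear algebra: take the logarithm of the positive table, double-centre it with the row and column masses as weights, and decompose with a weighted SVD. The sketch below is an illustrative reading of that recipe; the function name and the toy frequency table are ours, not the authors' code.

```python
import numpy as np

def weighted_log_ratio_analysis(N, n_dims=2):
    """Weighted log-ratio analysis (spectral map) of a strictly positive matrix N.
    Row and column weights are the margins of N, as in correspondence analysis."""
    N = np.asarray(N, dtype=float)
    P = N / N.sum()                       # correspondence matrix
    r = P.sum(axis=1)                     # row masses (weights)
    c = P.sum(axis=0)                     # column masses (weights)
    L = np.log(P)                         # log-transform; requires all values > 0

    # Weighted double-centering: subtract weighted row and column means,
    # add back the weighted grand mean.
    row_mean = L @ c
    col_mean = r @ L
    grand = r @ L @ c
    Y = L - row_mean[:, None] - col_mean[None, :] + grand

    # Weighted SVD: scale by the square roots of the masses before decomposing.
    S = np.sqrt(r)[:, None] * Y * np.sqrt(c)[None, :]
    U, s, Vt = np.linalg.svd(S, full_matrices=False)

    rows = (U * s) / np.sqrt(r)[:, None]      # row principal coordinates
    cols = (Vt.T * s) / np.sqrt(c)[:, None]   # column principal coordinates
    return rows[:, :n_dims], cols[:, :n_dims], s

# Example with a small positive table (e.g. word frequencies per text)
counts = np.array([[12, 5, 9], [3, 14, 7], [8, 6, 20]], dtype=float)
row_coords, col_coords, sv = weighted_log_ratio_analysis(counts)
```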

Relevance:

100.00%

Abstract:

Purpose: After tobacco and alcohol, cannabis is the most used substance among adolescents in Switzerland. Our aim is to assess whether cannabis use has become an ordinary means of socialization. We hypothesize that cannabis consumption has become a normative, although still illegal, behavior. Methods: As part of a larger qualitative study aimed at assessing new ways [patterns] of cannabis consumption, 16 daily cannabis consumers (11 males) and 2 former heavy consumers (both females), aged 15 to 20 years, participated in interviews and focus groups. Data were transcribed verbatim and analyzed using Atlas.ti qualitative analysis software. Results: Most consumers define the beginning of their consumption as a moment when they made new friends. They commonly use cannabis in group settings, which encourages the belief that all adolescents use cannabis. Thus, cannabis is mainly identified as an everyday social act. Joints are smoked like cigarettes: at all times of the day, during or after school or work with peers, often starting at lunch break, and mostly in public places. Friends offer a joint in a group setting, much like beer in a bar, as a means of making contact. Consumption invariably increases while socializing on vacation: "During vacation, we smoke up to 10-15 joints a day; at the end we're just dead." Additionally, in order to obtain cannabis, consumers have to be part of the right networks; they generally have several dealers to assure their supply, buy and sell themselves, or practice group-buying. As a result, all friends or acquaintances of consumers are themselves cannabis users. For instance, 4 boys, who say they are best friends, always smoke together and that, in order to quit, "All four of us should say to ourselves, 'Okay, now, let's all stop smoking'. That would be the only solution. . .but it would be impossible!" The 2 former consumers state that when they started using cannabis, "I found myself little by little in a vicious circle where I saw only people who also smoked". When they quit, they separated from their group of friends: "Either you make new friends who don't smoke or you smoke." Conclusions: Discussions with consumers demonstrate a normative facet of cannabis consumption as part of teenage socialization. Consequently, cannabis consumers develop a significant dependency since a majority of their friends use cannabis and their consumption involves most of their daily social life. Our study highlights the need for clear messages about the harmful aspects of using this substance while also suggesting that cessation efforts should include helping users separate from their consumption milieu. Sources of Support: Dept. of Public Health of the canton of Vaud.

Relevance:

100.00%

Abstract:

The coverage and volume of geo-referenced datasets are extensive and incessantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in exploratory data analysis and in the extraction of knowledge embedded in these data. However, new challenges in visualization and clustering are posed when dealing with the special characteristics of these data, for instance their complex structures, large quantity of samples, variables involved in a temporal context, high dimensionality and large variability in cluster shapes.

The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization, in order to assist the extraction of knowledge from spatio-temporal geo-referenced data and thus improve decision-making processes.

I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and the second for exploratory visual data analysis, the Tree-structured Self-Organizing Maps Component Planes. In addition, I present methodologies that, combined with FGHSON and the Tree-structured SOM Component Planes, allow the integration of space and time seamlessly and simultaneously in order to extract knowledge embedded in a temporal context.

The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability of cluster shapes, variances, densities and number of clusters. The most important characteristics of the FGHSON include: (1) it does not require an a-priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, hence, when dealing with large datasets, the processes can be distributed, reducing the computational cost; and (3) only three parameters are necessary to set up the algorithm.

In the case of the Tree-structured SOM Component Planes, the novelty of the algorithm lies in its ability to create a structure that allows the visual exploratory data analysis of large high-dimensional datasets. The algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging similar variables' projections in the same branches of the tree. Hence, similarities in variables' behavior can be easily detected (e.g. local correlations, maximal and minimal values and outliers).

Both FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving to be very efficient in the exploratory analysis and clustering of spatio-temporal datasets.

In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and a third that is our original contribution, the FGHSON. Although the algorithms presented here have been used in several areas, to my knowledge there is no other work applying and comparing the performance of these techniques when dealing with spatio-temporal geospatial data, as is presented in this thesis.

I propose original methodologies to explore spatio-temporal geo-referenced datasets through time. Our approach uses time windows to capture temporal similarities and variations by using the FGHSON clustering algorithm. The developed methodologies are used in two case studies: in the first, the objective was to find similar agroecozones through time, and in the second, to find similar environmental patterns shifted in time. Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production.

Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface tool which integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
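The FGHSON itself is not publicly available and is not reproduced here; as background, the sketch below shows only the plain online self-organizing-map update on which growing, hierarchical and fuzzy variants of this kind build. The grid size, learning-rate schedule and neighbourhood schedule are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid=(6, 6), epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Online training of a plain self-organizing map: the competitive-learning
    core underlying hierarchical/fuzzy growing variants such as FGHSON."""
    rng = np.random.default_rng(seed)
    n_units = grid[0] * grid[1]
    # unit positions on the 2-D grid, used by the neighbourhood function
    pos = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)
    W = rng.normal(size=(n_units, data.shape[1]))          # codebook vectors

    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)                    # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5        # shrinking neighbourhood
        for x in rng.permutation(data):                    # one pass over the samples
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))    # best-matching unit
            d2 = ((pos - pos[bmu]) ** 2).sum(axis=1)       # grid distances to the BMU
            h = np.exp(-d2 / (2 * sigma ** 2))             # neighbourhood kernel
            W += lr * h[:, None] * (x - W)                 # pull units toward x
    return W

# Usage: map a toy spatio-temporal feature table (rows = samples) onto the grid
X = np.random.default_rng(1).normal(size=(200, 4))
codebook = train_som(X)
```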

Relevance:

100.00%

Abstract:

Whether for investigative or intelligence aims, crime analysts often face the necessity of analysing the spatio-temporal distribution of crimes or of traces left by suspects. This article presents a visualisation methodology supporting recurrent practical analytical tasks such as the detection of crime series or the analysis of traces left by digital devices like mobile phones or GPS devices. The proposed approach has led to the development of a dedicated tool that has proven its effectiveness in real inquiries and intelligence practices. It supports a more fluent visual analysis of the collected data and may provide critical clues to support police operations, as exemplified by the presented case studies.
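The dedicated tool itself is not described in implementable detail, so the following is only a generic illustration of the kind of spatio-temporal view such analyses rely on: plotting trace points (here, made-up GPS coordinates) with colour encoding time, so that recurring, spatially close events stand out.

```python
import matplotlib.pyplot as plt
import numpy as np

# Hypothetical trace: (hour of event, longitude, latitude), e.g. from a GPS device.
rng = np.random.default_rng(0)
hours = rng.uniform(0, 24, 150)
lon = 6.6 + 0.05 * rng.normal(size=150)
lat = 46.5 + 0.05 * rng.normal(size=150)

# A single scatter where colour encodes time of day makes temporal clusters
# of spatially close events (candidate series) visible at a glance.
sc = plt.scatter(lon, lat, c=hours, cmap="viridis", s=20)
plt.colorbar(sc, label="hour of day")
plt.xlabel("longitude")
plt.ylabel("latitude")
plt.title("Trace points coloured by time of day")
plt.show()
```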

Relevance:

100.00%

Abstract:

MicroRNAs (miRs) are involved in the pathogenesis of several neoplasms; however, there are no data on their expression patterns and possible roles in adrenocortical tumors. Our objective was to study adrenocortical tumors by an integrative bioinformatics analysis involving miR and transcriptomics profiling, pathway analysis, and a novel, tissue-specific miR target prediction approach. Thirty-six tissue samples including normal adrenocortical tissues, benign adenomas, and adrenocortical carcinomas (ACC) were studied by simultaneous miR and mRNA profiling. A novel data-processing software was used to identify all predicted miR-mRNA interactions retrieved from PicTar, TargetScan, and miRBase. Tissue-specific target prediction was achieved by filtering out mRNAs with undetectable expression and searching for mRNA targets with inverse expression alterations as their regulatory miRs. Target sets and significant microarray data were subjected to Ingenuity Pathway Analysis. Six miRs with significantly different expression were found. miR-184 and miR-503 showed significantly higher, whereas miR-511 and miR-214 showed significantly lower expression in ACCs than in other groups. Expression of miR-210 was significantly lower in cortisol-secreting adenomas than in ACCs. By calculating the difference between dCT(miR-511) and dCT(miR-503) (delta cycle threshold), ACCs could be distinguished from benign adenomas with high sensitivity and specificity. Pathway analysis revealed the possible involvement of G2/M checkpoint damage in ACC pathogenesis. To our knowledge, this is the first report describing miR expression patterns and pathway analysis in sporadic adrenocortical tumors. miR biomarkers may be helpful for the diagnosis of adrenocortical malignancy. This tissue-specific target prediction approach may be used in other tumors too.
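The exact data-processing software is not published with the abstract; the snippet below is only a schematic reading of the tissue-specific filtering rule it describes: discard predicted targets that are not detectably expressed, and keep those whose expression moves in the opposite direction to their regulating miR. The fold-change threshold and gene names are illustrative assumptions.

```python
def tissue_specific_targets(predicted_pairs, mir_log2fc, mrna_log2fc,
                            expressed_mrnas, fc_threshold=0.58):
    """Keep predicted miR->mRNA pairs in which the target mRNA is detectably
    expressed and changes in the opposite direction to its regulating miR."""
    kept = []
    for mir, mrna in predicted_pairs:
        if mrna not in expressed_mrnas:
            continue                                  # drop undetectable transcripts
        d_mir = mir_log2fc.get(mir, 0.0)
        d_mrna = mrna_log2fc.get(mrna, 0.0)
        inverse = d_mir * d_mrna < 0                  # opposite directions of change
        if inverse and abs(d_mir) >= fc_threshold and abs(d_mrna) >= fc_threshold:
            kept.append((mir, mrna))
    return kept

# Illustrative call with made-up log2 fold changes (tumor vs. normal tissue)
pairs = [("miR-503", "GeneA"), ("miR-511", "GeneB")]
mir_fc = {"miR-503": 2.1, "miR-511": -1.4}
mrna_fc = {"GeneA": -1.2, "GeneB": 0.9}
print(tissue_specific_targets(pairs, mir_fc, mrna_fc, expressed_mrnas={"GeneA", "GeneB"}))
```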

Relevance:

100.00%

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that should be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. Since indoor radon is a multivariate process, it was important at first to define the influence of each factor. In particular, it was important to define the influence of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data, but geology did not prove to be the sole determinant for the spatial modeling.

The statistical analysis of the data, at both the univariate and multivariate levels, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) was proposed as a procedure to evaluate data clustering as a function of the radon level. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase was accompanied by the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis followed a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common spatial modeling methods such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools.

In the following section, different spatial interpolation methods were applied to a particular dataset. A bottom-to-top approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests for data consistency and method robustness was performed, which led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions.

The last section was dedicated to modeling methods with probabilistic interpretations. Data transformation and simulations allowed the use of multigaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Within the classification methods, probabilistic neural networks (PNN) proved better adapted for modeling high-threshold categorization and for automation, whereas support vector machines (SVM) performed well under balanced category conditions. In general, it was concluded that no single prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to support efficient decision making on indoor radon.
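None of the thesis code is reproduced here; as an illustration of the exploratory step it describes (KNN prediction with cross-validation used for tuning and filtering), a minimal sketch with synthetic coordinates and scikit-learn follows. The grid of k values and the synthetic data are assumptions.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical measurements: 2-D coordinates and a log indoor-radon concentration.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(500, 2))
log_radon = np.sin(coords[:, 0] / 15) + 0.3 * rng.normal(size=500)

# Exploratory KNN prediction: choose the number of neighbours by cross-validation,
# in the spirit of using KNN and CV-based filtering as exploratory spatial tools.
scores = {}
for k in (2, 4, 8, 16, 32):
    model = KNeighborsRegressor(n_neighbors=k, weights="distance")
    scores[k] = cross_val_score(model, coords, log_radon,
                                scoring="neg_mean_squared_error", cv=10).mean()
best_k = max(scores, key=scores.get)
print("best k:", best_k, "CV MSE:", -scores[best_k])
```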

Relevance:

100.00%

Abstract:

Purpose: To assess the relation between cannabis and tobacco consumption among adolescents in Switzerland and whether cannabis and tobacco co-users can quit cigarette smoking. Methods: Based on individual interviews and focus groups, 22 youths aged 15-20 discussed cannabis consumption behaviours. Twenty (14 males) were cannabis consumers, of whom 18 also smoked tobacco and 2 had quit tobacco smoking, and 2 were former cannabis consumers (both females and daily smokers). Data were transcribed verbatim and analyzed using Atlas.ti qualitative analysis software. Results: Among the co-consumers, 9 started with tobacco, 7 with cannabis, and 2 with both. The main consumption mode among all cannabis consumers is joints, while other modes of consumption such as food preparations and water pipes are rare and experimental. Joints always mix cannabis with tobacco, for three reasons: the joint needs tobacco to burn correctly, pure cannabis is too strong, and smoking cannabis alone is too expensive. Two cannabis consumers - one former tobacco smoker and one occasional tobacco smoker - consider rolling tobacco less addictive than cigarette tobacco and hence use it in their joints. Overall, cannabis is considered 'natural' and less harmful to health than tobacco. Thus, many users describe their wish, in the longer term, to quit tobacco consumption without excluding occasional cannabis consumption. Nonetheless, all co-consumers declare that they smoke cigarettes as a substitute for cannabis: For example, "If I don't have a joint, I need fags; if I don't have fags, I need joints; and if I don't have anything, I go crazy!" or "About 20 minutes after smoking a joint we feel like smoking something again, because in the joint there is pure tobacco without a filter as in cigarettes, and that creates a crazy dependency!". Finally, all co-consumers state that the consumption of one of the substances increases when trying to diminish the other: "A few months ago I stopped smoking joints for a month. Well I was smoking more than a pack [of cigarettes] a day." Similarly, the 2 former cannabis consumers have increased their cigarette use since quitting cannabis. Conclusions: The majority of cannabis users co-consume tobacco as a way of compensating for one substance or the other. Using tobacco within joints implies a risk that even occasional joints can revive nicotine addiction. Consequently, health professionals wishing to help adolescents in substance use cessation and prevention efforts should consider both substances in a global perspective. Sources of Support: Dept. of Public Health of the canton of Vaud.

Relevance:

100.00%

Abstract:

The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concentrate on the decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residuals sequential simulations (MLRSS). The models are based on multilayer perceptron and support vector regression ML algorithms, used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information extracted from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in the decision-making process. (C) 2003 Elsevier Ltd. All rights reserved.
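The exact MLRSS implementation is not given in the abstract; the sketch below only illustrates the two-stage idea it describes: an ML model for the long-range trend, followed by stochastic simulation of the residuals. Sequential Gaussian simulation is replaced here by Gaussian-process posterior draws purely as a stand-in, and the kernel, SVR parameters and exceedance threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical contamination data: 2-D coordinates and a measured value.
rng = np.random.default_rng(0)
X = rng.uniform(0, 50, size=(300, 2))
z = np.sin(X[:, 0] / 10) + 0.1 * X[:, 1] + 0.2 * rng.normal(size=300)

# Step 1: machine-learning model for the long-range (non-stationary) trend.
trend = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, z)
residuals = z - trend.predict(X)

# Step 2: stochastic simulation of the (closer-to-stationary) residuals.
# Posterior draws from a Gaussian process stand in for sequential Gaussian simulation.
gp = GaussianProcessRegressor(kernel=RBF(5.0) + WhiteKernel(0.05)).fit(X, residuals)
grid = np.array([(i, j) for i in range(0, 50, 2) for j in range(0, 50, 2)], float)
sims = gp.sample_y(grid, n_samples=20, random_state=1)      # 20 residual realisations

# Step 3: add the trend back to every realisation for probability mapping.
realisations = trend.predict(grid)[:, None] + sims
prob_exceed = (realisations > 1.0).mean(axis=1)             # P(value > illustrative threshold)
```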

Relevance:

100.00%

Abstract:

Performance optimization of a complex computer system requires an understanding of the system's runtime behaviour. As software grows in size and complexity, performance optimization becomes an increasingly important part of the product development process. With the use of more powerful processors, energy consumption and heat generation have also become ever larger problems, particularly in small, portable devices. To limit thermal and energy problems, performance scaling methods have been developed, which in turn further increase system complexity and the need for performance optimization. In this work, a visualization and analysis tool was developed to make runtime behaviour easier to understand. In addition, a performance measure was developed that enables different scaling methods to be compared and evaluated independently of the execution environment, based either on an execution trace or on theoretical analysis. The tool presents a trace collected at run time in an easily understandable way. Using three-dimensional graphics, it shows, among other things, the processes, the processor load, the operation of the scaling methods and the energy consumption. The tool also produces, for a user-selected part of the execution trace, numerical information that includes several relevant performance values and statistics. The applicability of the tool was examined by analysing an execution trace obtained from a real device as well as a simulation of performance scaling. The effect of the scaling mechanism's parameters on the performance of the simulated device was analysed.
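The thesis tool itself is not available; the fragment below merely illustrates the kind of numerical summary it produces for a user-selected slice of an execution trace (average and peak CPU load, energy over the interval). The trace format and field names are assumptions.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Sample:
    t: float         # seconds since trace start
    cpu_load: float  # 0.0 - 1.0
    power_w: float   # instantaneous power draw in watts

def interval_stats(trace, t_start, t_end):
    """Summary figures for a user-selected slice of an execution trace:
    average CPU load, peak load, and energy consumed over the interval."""
    window = [s for s in trace if t_start <= s.t < t_end]
    if len(window) < 2:
        return None
    # left-endpoint approximation of the integral of power over time
    energy_j = sum((b.t - a.t) * a.power_w for a, b in zip(window, window[1:]))
    return {"avg_cpu": mean(s.cpu_load for s in window),
            "peak_cpu": max(s.cpu_load for s in window),
            "energy_J": energy_j}

# Usage with a made-up trace sampled every 10 ms
trace = [Sample(t=i * 0.01, cpu_load=0.3 + 0.01 * (i % 50), power_w=1.2) for i in range(1000)]
print(interval_stats(trace, 1.0, 5.0))
```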

Relevance:

100.00%

Abstract:

Software engineering is criticized as not being engineering or a 'well-developed' science at all. Software engineers seem not to know exactly how long their projects will last, what they will cost, and whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes. In this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to the measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis but avoids the problems and pitfalls found in that method. It is relatively easy to use and learn. Effort estimation accuracy has improved significantly after taking this model into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement. The author of this thesis has developed a three-level solution for the estimation model. All currently used size metrics are static in nature, but this newly proposed metric is dynamic: it makes use of the increased understanding of the nature of the work as specification and design work proceeds, and thus 'grows up' along with the software project. The effort estimation model cannot be developed without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used in a proper way, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism called a measurement framework is also introduced briefly. The purpose of the framework is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines; it requires effort even to maintain an achieved level of estimation accuracy. Estimation results in several successive releases are analyzed. It is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity. An example is shown to shed more light on the calibration and the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
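The thesis' estimation model is not public, so the sketch below only illustrates the surrounding workflow: produce an effort estimate from a size measure and check estimation accuracy against actuals with MMRE, a standard accuracy measure. The toy size-to-effort formula and its parameters are illustrative assumptions, not the author's model.

```python
def estimate_effort(size_units, productivity=0.8, overhead_h=40.0):
    """Toy size-driven effort model: effort (hours) = overhead + size / productivity.
    Stands in for the FPA-like model described in the thesis, whose details are not public."""
    return overhead_h + size_units / productivity

def mmre(actuals, estimates):
    """Mean Magnitude of Relative Error, a common effort-estimation accuracy measure."""
    return sum(abs(a - e) / a for a, e in zip(actuals, estimates)) / len(actuals)

# Usage: compare estimates against actuals collected from finished releases.
sizes = [120, 300, 75, 510]                       # hypothetical size measurements
actual_hours = [210, 410, 140, 690]               # hypothetical recorded efforts
estimated_hours = [estimate_effort(s) for s in sizes]
print("MMRE:", round(mmre(actual_hours, estimated_hours), 2))
```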

Relevance:

100.00%

Abstract:

The purpose of this study was to analyse the nursing student-patient relationship and factors associated with this relationship from the point of view of both students and patients, and to identify factors that predict the type of relationship. The ultimate goal is to improve supervised clinical practicum with a view to supporting students in their reciprocal collaborative relationships with patients, increasing their preparedness to meet patients' health needs, and thus enhancing the quality of patient care. The study was divided into two phases. In the first phase (1999-2005), a literature review concerning the student-patient relationship was conducted (n=104 articles) and semi-structured interviews were carried out with nursing students (n=30) and internal medicine patients (n=30). Data analysis was by means of qualitative content analysis and the Student-Patient Relationship (SPR) Scales, which were specially developed for this research. In the second phase (2005-2007), the data were collected with the SPR scales among nursing students (n=290) and internal medicine patients (n=242). The data were analysed statistically using SPSS 12.0 software. The results revealed three types of student-patient relationship: a mechanistic relationship focusing on the student's learning needs; an authoritative relationship focusing on what the student assumes is in the patient's best interest; and a facilitative relationship focusing on the common good of both student and patient. Students viewed their relationship with patients more often as facilitative and authoritative than mechanistic, while in patients' assessments the authoritative relationship occurred most frequently and the facilitative relationship least frequently. Furthermore, students' and patients' views on their relationships differed significantly. A number of background factors, contextual factors and consequences of the relationship were found to be associated with the type of relationship. In the student data, factors that predicted the type of relationship were age, current year of study and support received in the relationship with the patient. The higher the student's age, the more likely the relationship with the patient was facilitative. Fourth-year studies and the support of a person other than a supervisor were significantly associated with an authoritative relationship. Among patients, several factors were found to predict the type of nursing student-patient relationship. Significant factors associated with a facilitative relationship were university-level education, several previous hospitalizations, admission to hospital for a medical problem, experience of caring for an ill family member, and the patient's positive perception of the atmosphere during collaboration and of the student's personal and professional growth. In patients, on the other hand, positive perceptions of the student's personal and professional attributes, the patient's improved health and a greater commitment to self-care were significantly associated with an authoritative relationship, whereas positive perceptions of one's own attributes as a patient were significantly associated with a mechanistic relationship. It is recommended that further research on the student-patient relationship and related factors should focus on questions of content, methodology and education.

Relevance:

100.00%

Abstract:

The evaluation of investments in advanced technology is one of the most important decision-making tasks. The importance is even more pronounced considering the huge budgets involved and the strategic, economic and analytic justification required to shorten design and development time. Choosing the most appropriate technology requires an accurate and reliable system that can guide decision makers through such a complicated task. Currently, several Information and Communication Technology (ICT) manufacturers that design global products are seeking local firms to act as their sales and services representatives (called distributors) to the end user. At the same time, the end user or customer is also searching for the best possible deal for their investment in ICT projects. Therefore, the objective of this research is to present a holistic decision support system to assist decision makers in Small and Medium Enterprises (SMEs), working either individually or in a group, in evaluating the investment required to become an ICT distributor or an ICT end user. The model is composed of the Delphi/MAH (Maximising Agreement Heuristic) Analysis, a well-known quantitative method in Group Support Systems (GSS), which is applied to gather average ranking data from the Decision Makers (DMs). After that, the Analytic Network Process (ANP) is brought in to analyse the problem holistically: it performs quantitative and qualitative analysis simultaneously. The illustrative data are obtained from industrial entrepreneurs by using the Group Support System (GSS) laboratory facilities at Lappeenranta University of Technology, Finland, and in Thailand. The result of the research, which is currently implemented in Thailand, can provide benefits to the industry in the evaluation of becoming an ICT distributor or an ICT end user, particularly in the assessment of an Enterprise Resource Planning (ERP) programme. After the model is put to the test in an in-depth collaboration with industrial entrepreneurs in Finland and Thailand, a sensitivity analysis is performed to validate its robustness. The contribution of this research lies in developing a new approach, and the Delphi/MAH software, for analysing the value of becoming an ERP distributor or end user; the approach is flexible and applicable to entrepreneurs looking for the most appropriate investment. The main advantage of this research over others is that the model can deliver the value of becoming an ERP distributor or end user in a single number, which makes it easier for DMs to choose the most appropriate ERP vendor. The associated advantage is that the model can include qualitative data as well as quantitative data, as the results from using quantitative data alone can be misleading and inadequate. There is a need to utilise quantitative and qualitative analysis together, as can be seen from the case studies.
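The Delphi/MAH and ANP implementations are not given in the abstract; the fragment below only illustrates the first step in spirit: collecting individual rankings from several decision makers and combining them into a single group ordering. Averaging ranks is a simplification of MAH (which additionally iterates to maximise agreement), and the vendor names and rankings are made up.

```python
def average_group_ranking(rankings):
    """Combine individual rankings (1 = best) into a group ordering by mean rank.
    A simplified stand-in for the Delphi/MAH step, which additionally iterates
    to maximise agreement among decision makers."""
    alternatives = rankings[0].keys()
    mean_rank = {a: sum(r[a] for r in rankings) / len(rankings) for a in alternatives}
    return sorted(mean_rank, key=mean_rank.get), mean_rank

# Three decision makers rank three candidate ERP vendors (illustrative data)
dm_rankings = [
    {"vendor_A": 1, "vendor_B": 2, "vendor_C": 3},
    {"vendor_A": 2, "vendor_B": 1, "vendor_C": 3},
    {"vendor_A": 1, "vendor_B": 3, "vendor_C": 2},
]
order, scores = average_group_ranking(dm_rankings)
print(order, scores)
```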