48 results for computer-aided qualitative data analysis software


Relevance:

100.00%

Publisher:

Abstract:

GODIVA2 is a dynamic website that provides visual access to several terabytes of physically distributed, four-dimensional environmental data. It allows users to explore large datasets interactively without the need to install new software or download and understand complex data. Through the use of open international standards, GODIVA2 maintains a high level of interoperability with third-party systems, allowing diverse datasets to be mutually compared. Scientists can use the system to search for features in large datasets and to diagnose the output from numerical simulations and data processing algorithms. Data providers around Europe have adopted GODIVA2 as an INSPIRE-compliant dynamic quick-view system for providing visual access to their data.
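GODIVA2's interoperability rests on open international standards for map delivery, such as the OGC Web Map Service (WMS). The sketch below shows how a client might request a rendered view of a four-dimensional dataset from such a service; the endpoint URL, layer name and style are hypothetical placeholders, not GODIVA2's actual configuration.

```python
# Minimal sketch: requesting a rendered map image from an OGC WMS
# endpoint of the kind GODIVA2 builds on. Server URL, layer name and
# style are hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import urlretrieve

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_water_temperature",  # hypothetical layer name
    "STYLES": "boxfill/rainbow",        # hypothetical style
    "CRS": "CRS:84",
    "BBOX": "-180,-90,180,90",          # lon/lat bounding box
    "WIDTH": "512",
    "HEIGHT": "256",
    "FORMAT": "image/png",
    "TIME": "2010-01-01T00:00:00Z",     # 4-D data: select a time slice
    "ELEVATION": "0",                   # 4-D data: select a depth slice
}
url = "https://example.org/wms?" + urlencode(params)
urlretrieve(url, "map.png")  # saves the rendered PNG locally
```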

Relevance:

100.00%

Publisher:

Abstract:

Technology-enhanced or Computer Aided Learning (e-learning) can be institutionally integrated and supported by learning management systems or Virtual Learning Environments (VLEs) to offer efficiency gains, effectiveness and scalability. However, this can only be achieved by integrating pedagogically intelligent approaches and lesson-preparation tools with a VLE that is well accepted by both students and teachers. This paper critically explores some of the issues relevant to the scalable routinisation of e-learning at the tertiary level, typically among first-year university undergraduates, taking the teaching of Relational Data Analysis (RDA), as supported by multimedia authoring, as a case study. The paper concludes that blended learning approaches, which balance the deployment of e-learning against other modalities of learning delivery such as instructor-mediated group learning, offer the most flexible and scalable route to e-learning. This, however, requires the graceful integration of platforms for multimedia production, distribution and delivery through advanced interactive spaces that provoke learner engagement and promote learning autonomy and group learning, facilitated by a cooperative-creative learning environment that remains open to personal exploration of constructivist-constructionist pathways to learning.

Relevance:

100.00%

Publisher:

Abstract:

The role of users is an often-overlooked aspect of studies of innovation and diffusion. Using an actor-network theory (ANT) approach, four case studies examine the processes of implementing a piece of CAD (computer aided design) software, BSLink, in different organisations and describe the tailoring done by users to embed the software into working practices. This not only results in different practices of use at different locations, but also transforms BSLink itself into a proliferation of BSLinks-in-use. A focus group for BSLink users further reveals the gaps between different users' expectations and ways of using the software, and between different BSLinks-in-use. It also demonstrates the contradictory demands this places on its further development. The ANT-informed approach used treats both innovation and diffusion as processes of translation within networks. It also emphasises the political nature of innovation and implementation, and the efforts of various actors to delegate manoeuvres for increased influence onto technological artefacts.

Relevance:

100.00%

Publisher:

Abstract:

Virtual reality has the potential to improve the visualisation of building design and construction, but its implementation in the industry has yet to reach maturity. Present-day translation of building data to virtual reality is often unidirectional and unsatisfactory. Three different approaches to the creation of models are identified and described in this paper. Consideration is given to the potential of both advances in computer-aided design and the emerging standards for data exchange to facilitate an integrated use of virtual reality. Commonalities and differences between computer-aided design and virtual reality packages are reviewed, and trials of current systems are described. The trials were conducted to explore the technical issues related to the integrated use of CAD and virtual environments within the house-building sector of the construction industry, and to investigate the practical use of the new technology.

Relevance:

100.00%

Publisher:

Abstract:

Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate the comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research, and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as ¹⁵N labels the entire element content of the sample, i.e. for ¹⁵N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
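To see why the data analysis is more demanding, consider how the ¹⁵N mass shift varies with sequence. The sketch below uses standard amino-acid nitrogen counts and the ¹⁴N/¹⁵N mass difference; it illustrates the principle and is not part of the software described.

```python
# Minimal sketch: why the 15N mass shift depends on peptide sequence.
# Each residue contributes a fixed number of nitrogen atoms (backbone
# plus side chain); full labelling shifts each N by ~0.997035 Da.
N_PER_RESIDUE = {
    "G": 1, "A": 1, "S": 1, "P": 1, "V": 1, "T": 1, "C": 1, "L": 1,
    "I": 1, "M": 1, "F": 1, "Y": 1, "D": 1, "E": 1,
    "N": 2, "Q": 2, "K": 2, "W": 2, "H": 3, "R": 4,
}
MASS_SHIFT_PER_N = 0.997035  # Da, mass(15N) - mass(14N)

def n15_mass_shift(peptide: str) -> float:
    """Mass difference between fully 15N-labelled and unlabelled peptide."""
    n_atoms = sum(N_PER_RESIDUE[aa] for aa in peptide)
    return n_atoms * MASS_SHIFT_PER_N

# Two peptides of equal length can have very different shifts:
print(n15_mass_shift("AGILVF"))  # 6 N  -> ~5.98 Da
print(n15_mass_shift("RKHRQN"))  # 17 N -> ~16.95 Da
```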

Relevance:

100.00%

Publisher:

Abstract:

The organization of non-crystalline polymeric materials at a local level, namely on a spatial scale between a few and 100 Å, is still unclear in many respects. The determination of the local structure in terms of the configuration and conformation of the polymer chain, and of the packing characteristics of the chain in the bulk material, represents a challenging problem. Data from wide-angle diffraction experiments are very difficult to interpret due to the very large amount of information that they carry, that is, the large number of correlations present in the diffraction patterns. We describe new approaches that permit a detailed analysis of the complex neutron diffraction patterns characterizing polymer melts and glasses. The coupling of different computer modelling strategies with neutron scattering data over a wide Q range allows the extraction of detailed quantitative information on the structural arrangements of the materials of interest. Proceeding from modelling routes as diverse as force-field calculations, single-chain modelling and reverse Monte Carlo, we show the successes and pitfalls of each approach in describing model systems, illustrating the need to attack the data analysis problem simultaneously from several fronts.
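As an illustration of one of the modelling routes named above, the sketch below shows the core accept/reject step of a reverse Monte Carlo refinement against a measured structure factor; compute_structure_factor and the data arrays are hypothetical placeholders for a real RMC engine.

```python
# Minimal sketch of the reverse Monte Carlo (RMC) acceptance step:
# perturb one atom, recompute the model structure factor S(Q), and
# accept or reject the move based on the change in misfit chi^2.
import numpy as np

rng = np.random.default_rng(0)

def chi_squared(s_model: np.ndarray, s_exp: np.ndarray, sigma: float) -> float:
    """Misfit between model and experimental structure factors."""
    return float(np.sum((s_model - s_exp) ** 2) / sigma**2)

def rmc_step(positions, s_exp, compute_structure_factor,
             sigma=0.01, max_move=0.1):
    """One RMC move on an (N, 3) array of atomic positions: accept if
    chi^2 improves, else accept with probability exp(-delta_chi2 / 2)."""
    old_chi2 = chi_squared(compute_structure_factor(positions), s_exp, sigma)
    trial = positions.copy()
    i = rng.integers(len(trial))
    trial[i] += rng.uniform(-max_move, max_move, size=3)  # displace one atom
    new_chi2 = chi_squared(compute_structure_factor(trial), s_exp, sigma)
    if new_chi2 < old_chi2 or rng.random() < np.exp(-(new_chi2 - old_chi2) / 2):
        return trial, new_chi2   # move accepted
    return positions, old_chi2   # move rejected
```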

Relevance:

100.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have a sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will enable investigation of what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have placed severe constraints on such investigations, which are now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions that are based on our best knowledge of science and the most advanced technology.

Relevance:

100.00%

Publisher:

Abstract:

The UK Department for Environment, Food and Rural Affairs (Defra) identified practices to reduce the risk of animal disease outbreaks. We report on the response of sheep and pig farmers in England to the promotion of these practices. A conceptual framework was established from research on factors influencing the adoption of animal health practices, linking knowledge, attitudes, social influences and perceived constraints to the implementation of specific practices. Qualitative data were collected from nine sheep and six pig enterprises in 2011. Thematic analysis explored attitudes and responses to the proposed practices, and factors influencing the likelihood of implementation. Most farmers feel they are doing all they can reasonably do to minimise disease risk, and that practices not being implemented are either not relevant or ineffective. There is little awareness of, or concern about, risk from unseen threats. Pig farmers place more emphasis than sheep farmers on controlling wildlife, staff and visitor management, and staff training. The main factors that influence livestock farmers' decisions on whether or not to implement a specific disease risk measure are: attitudes to, and perceptions of, disease risk; attitudes towards the specific measure and its efficacy; characteristics of the enterprise which they perceive as making a measure impractical; previous experience of a disease or of the measure; and the credibility of information and advice. Great importance is placed on access to authoritative information, with most seeing vets as the prime source to interpret generic advice from national bodies in the local context. Uptake of disease risk measures could be increased by: improved risk communication through the farming press and vets to encourage farmers to recognise hidden threats; dissemination of credible early-warning information to sharpen farmers' assessment of risk; and targeted information through training events, the farming press, vets and other advisers, and farmer groups, tailored to the different categories of livestock farmer.

Relevance:

100.00%

Publisher:

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; and finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
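As a minimal illustration of the data-parallel approach, the sketch below partitions a toy transaction dataset across worker processes, counts item frequencies locally, and merges the partial counts; it stands in for the far more elaborate Grid and Cloud frameworks the chapter discusses.

```python
# Minimal sketch of data-parallel pattern mining: each worker counts
# item frequencies in its own partition, and the partial counts are
# merged into global counts. The transactions are toy placeholders.
from collections import Counter
from multiprocessing import Pool

def count_items(partition):
    """Local counting step, run independently on each data partition."""
    counts = Counter()
    for transaction in partition:
        counts.update(set(transaction))
    return counts

if __name__ == "__main__":
    transactions = [["milk", "bread"], ["bread", "eggs"],
                    ["milk", "eggs"], ["milk", "bread", "eggs"]]
    # Split the dataset into partitions, one per worker process.
    partitions = [transactions[:2], transactions[2:]]
    with Pool(2) as pool:
        partials = pool.map(count_items, partitions)
    merged = sum(partials, Counter())                    # global merge step
    frequent = {i for i, c in merged.items() if c >= 3}  # support threshold
    print(frequent)  # all three items occur in 3 of 4 transactions
```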

Relevance:

100.00%

Publisher:

Abstract:

Construction professional services (CPSs), such as architecture, engineering, and consultancy, are not only high value-added profit centers in their own right but also have a knock-on effect on other businesses, such as construction and the export of materials and machinery. Arguably, competition in the international construction market has shifted to these knowledge-intensive CPS areas. Yet CPSs represent a research frontier that has received scant attention. This research aims to enrich the body of knowledge on CPSs by examining strengths, weaknesses, opportunities, and threats (SWOT) of Chinese CPSs (CCPSs) in the international context. It does so by triangulating theories with quantitative and qualitative data gleaned from yearbooks, annual reports, interviews, seminars, and interactions with managers in major CCPS companies. It is found that CCPSs present both strengths and weaknesses in talents, administration systems, and development strategies in dealing with the external opportunities and threats brought about by globalization and market evolution. Low price, which has helped the Chinese construction business to succeed in the international market, is also a major CCPS strength. An opportunity for CCPSs is the relatively strong delivery capability possessed by Chinese contractors; by partnering with them CCPSs can better establish themselves in the international arena. This is probably the first ever comprehensive study on the performance of CCPSs in the international marketplace. The research is conducted at an opportune time, particularly when the world is witnessing the burgeoning force of Chinese businesses in many areas including manufacturing, construction, and, potentially, professional services. It adds new insights to the knowledge body of CPSs and provides valuable references to other countries faced with the challenge of developing CPS business efficiently in the international market.

Relevance:

100.00%

Publisher:

Abstract:

In contrast to their bustling construction counterparts, Chinese construction professional services (CPS), such as architecture, engineering, and consultancy, still seem stagnant in the international market. CPS are not only high value-added profit centers in their own right, but also have a knock-on effect on subsequent businesses such as construction and the export of materials and machinery. Arguably, competition in the international construction market has shifted to knowledge-intensive CPS. Yet CPS represent a research area that has been paid scant attention. This research aims to add to the body of knowledge of CPS by examining the strengths, weaknesses, opportunities, and threats (SWOT) of Chinese CPS (CCPS) in the international context. It does so by triangulating theories with quantitative and qualitative data gleaned from yearbooks, annual reports, interviews, seminars, and interactions with managers in major CCPS companies. It is found that CCPS present both strengths and weaknesses in talents, administration systems, and development strategies in dealing with the external opportunities and threats brought about by globalization and market evolution. Low price, which has helped the Chinese construction business to succeed in the international market, is also a major CCPS strength. An opportunity for CCPS is the relatively strong delivery capability possessed by Chinese contractors; by partnering with them, CCPS can better edge into the international arena. This is probably the first comprehensive study investigating the performance of CCPS in the international market. The research is also timely, particularly as the world is witnessing the burgeoning force of Chinese businesses in many areas including manufacturing, construction, and, potentially, professional services.

Relevance:

100.00%

Publisher:

Abstract:

JASMIN is a super-data-cluster designed to provide a high-performance, high-volume data analysis environment for the UK environmental science community. Thus far JASMIN has been used primarily by the atmospheric science and earth observation communities, both to support their direct scientific workflows and to curate data products in the STFC Centre for Environmental Data Archival (CEDA). The initial JASMIN configuration and first experiences are reported here, along with useful improvements in scientific workflow. It is clear from the explosive growth in stored data and use that there was pent-up demand for a suitable big-data analysis environment. This demand is not yet satisfied, in part because JASMIN does not yet have enough compute capacity, its storage is fully allocated, and not all software needs are met. Plans to address these constraints are introduced.

Relevance:

100.00%

Publisher:

Abstract:

The flavour profiles of two genotypes of Charentais cantaloupe melons (medium shelf-life and long shelf-life), harvested at two distinct maturities (immature and mature fruit), were investigated. Dynamic headspace extraction (DHE), solid-phase extraction (SPE), gas chromatography–mass spectrometry (GC-MS) and gas chromatography–olfactometry/mass spectrometry (GC-O/MS) were used to determine volatile and semi-volatile compounds. Quantitative descriptive analysis (QDA) was used to assess the organoleptic impact of the different melons, and the sensory data were correlated with the chemical analysis. There were significant, consistent and substantial differences between the mature and immature fruit for the medium shelf-life genotype, the less mature fruit giving a green, cucumber character and lacking the sweet, fruity character of the mature fruit. However, maturity at harvest had a much smaller impact on the long shelf-life melons, and fewer differences were detected. These long shelf-life melons tasted sweet but lacked fruity flavours, instead exhibiting a musty, earthy character.
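As a minimal illustration of correlating sensory scores with chemical measurements, the sketch below computes a Pearson correlation between one QDA attribute and one volatile's GC-MS peak area across samples; the numbers are invented placeholders, not data from the study.

```python
# Minimal sketch: correlating a sensory attribute with a volatile's
# abundance across melon samples. All values are invented placeholders.
import numpy as np

# Mean panel scores for one QDA attribute ("fruity") across 5 samples
fruity = np.array([6.2, 5.8, 2.1, 2.5, 6.9])
# GC-MS peak areas (arbitrary units) of one ester in the same samples
ester = np.array([410.0, 385.0, 95.0, 120.0, 455.0])

r = np.corrcoef(fruity, ester)[0, 1]  # Pearson correlation coefficient
print(f"r = {r:.2f}")  # strongly positive -> ester tracks the fruity note
```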

Relevance:

100.00%

Publisher:

Abstract:

Smart healthcare is a complex domain for systems integration due to the human and technical factors and heterogeneous data sources involved. As part of the smart city, it is an area in which clinical functions require smart collaboration among multiple systems for effective communication between departments, and radiology is one of the areas that relies most heavily on intelligent information integration and communication. It therefore faces many challenges regarding integration and interoperability, such as information collision, heterogeneous data sources, policy obstacles, and procedure mismanagement. The purpose of this study is to conduct an analysis of the data, semantic, and pragmatic interoperability of systems integration in a radiology department, and to develop a pragmatic interoperability framework for guiding the integration. We selected an ongoing project at a local hospital as our case study. The project is to achieve data sharing and interoperability among Radiology Information Systems (RIS), Electronic Patient Record (EPR), and Picture Archiving and Communication Systems (PACS). Qualitative data collection and analysis methods were used. The data sources consisted of documentation, including publications and internal working papers, one year of non-participant observations, and 37 interviews with radiologists, clinicians, directors of IT services, referring clinicians, radiographers, receptionists and secretaries. We identified four primary phases of the data analysis process for the case study: requirements and barriers identification, integration approach, interoperability measurements, and knowledge foundations. Each phase is discussed and supported by qualitative data. Through the analysis we also developed a pragmatic interoperability framework that summarises the empirical findings and proposes recommendations for guiding integration in the radiology context.