866 results for Systematic Analysis of Change in Restaurant Operations


Relevance:

100.00%

Publisher:

Abstract:

The long-lived radionuclide 129I (T1/2 = 15.7 My) occurs in nature at very low concentrations. Since the middle of the 20th century, environmental levels of 129I have changed dramatically as a consequence of the civil and military use of nuclear fission. Its investigation in environmental materials is of interest for environmental surveillance, retrospective dosimetry and for use as a natural and man-made tracer of environmental processes. We compare the two analytical methods presently capable of determining 129I in environmental materials, namely radiochemical neutron activation analysis (RNAA) and accelerator mass spectrometry (AMS). Emphasis is placed on quality control and on the detection capabilities for the analysis of 129I in environmental materials. Some applications are discussed.
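As an aside on the half-life quoted above, the standard decay relations fix the specific activity of 129I; the short Python sketch below works through that arithmetic (the constants other than the quoted half-life, and all variable names, are ours for illustration).

import math

# Decay constant and specific activity of 129I from its half-life.
T_HALF_YEARS = 15.7e6                 # half-life of 129I in years (from the abstract)
SECONDS_PER_YEAR = 3.1557e7           # seconds in a Julian year
MOLAR_MASS = 129.0                    # g/mol, approximate atomic mass of 129I
AVOGADRO = 6.02214e23                 # atoms per mole

t_half_s = T_HALF_YEARS * SECONDS_PER_YEAR
decay_constant = math.log(2) / t_half_s                # lambda, in 1/s
atoms_per_gram = AVOGADRO / MOLAR_MASS
specific_activity = decay_constant * atoms_per_gram    # Bq per gram

print(f"lambda = {decay_constant:.3e} 1/s")
print(f"specific activity ~ {specific_activity / 1e6:.1f} MBq/g")
# Roughly 6.5 MBq/g: so little activity per gram that decay counting is
# impractical at environmental levels, which is why RNAA or AMS is needed.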

Relevance:

100.00%

Publisher:

Abstract:

To identify more mutations that can affect the early development of Myxococcus xanthus, the synthetic transposon TnT41 was designed and constructed. By virtue of its special features, it greatly facilitates mutation screening/selection, mapping, cloning and DNA sequencing. In addition, it allows for the systematic discovery of genes in regulatory hierarchies using their target promoters. In this study, the minimal regulatory region of the early developmentally regulated gene 4521 was used as a reporter in the TnT41 mutagenesis. Both positive (P) and negative (N) mutations were isolated based on their effects on 4521 expression. Four of these mutations, i.e. N1, N2, P52 and P54, were analyzed in detail. Mutations N1 and N2 are insertion mutations in a gene designated sasB. The sasB gene was also identified in this study by genetic and molecular analysis of five UV-generated 4521 suppressor mutations. The sasB gene encodes a protein without meaningful homology in the databases. It negatively regulates 4521 expression, possibly through the SasS-SasR two-component system. A wild-type sasB gene is required for normal M. xanthus fruiting body formation and sporulation. Cloning and sequencing analysis of the P52 mutation led to the identification of an operon that encodes the M. xanthus high-affinity branched-chain amino acid transporter system. This liv operon consists of five genes designated livK, livH, livM, livC, and livF. The Liv proteins are highly similar to their counterparts from other bacteria in amino acid sequence, functional motifs and predicted secondary structure. This system is required for development, since liv null mutations cause abnormal fruiting body formation and a 100-fold decrease in sporulation efficiency. Mutation P54 is a TnT41 insertion in the sscM gene of the ssc chemotaxis system, which was independently identified by Dr. Shi's lab. The sscM gene encodes an MCP (methyl-accepting chemotaxis protein) homologue. The SscM protein is predicted to contain two transmembrane domains, a signaling domain and at least one putative methylation site. Null mutations of this gene abolish the aggregation of starving cells at a very early stage, though the sporulation levels of the mutant can reach 10% of that of wild-type cells.

Relevance:

100.00%

Publisher:

Abstract:

The design of shell and spatial structures represents an important challenge even with the use of modern computer technology. If we concentrate on concrete shell structures, many problems must be faced, such as the conceptual and structural disposition, optimal shape design, analysis, construction methods and details, and all these problems are interconnected. As an example, shape optimization requires the use of several disciplines, such as structural analysis, sensitivity analysis, optimization strategies and geometrical design concepts. Similar comments apply to other space structures, such as steel trusses with single or double shape and tension structures. In relation to the analysis, the Finite Element Method appears to be the most widespread and versatile technique used in practice. In the application of this method several issues arise. First, the pertinent shell theory or, alternatively, the degenerated 3-D solid approach should be chosen. According to this choice, the suitable FE model has to be adopted, i.e. the displacement, stress or mixed formulated element. The good behavior of shell structures under dead loads, which are carried towards the supports mainly by compressive stresses, is impaired by the high imperfection sensitivity usually exhibited by these structures. This last effect is particularly important if large deformations and material nonlinearities of the shell interact unfavorably, as can be the case for thin reinforced shells. In this respect, the study of the stability of the shell represents a compulsory step in the analysis. Therefore there are currently very active fields of research, such as the different descriptions of consistent nonlinear shell models given by Simo, Fox and Rifai, Mantzenmiller, and Buchter and Ramm, among others; the consistent formulation of efficient tangent stiffness, such as those presented by Ortiz and by Schweizerhof and Wriggers, with application to concrete shells exhibiting creep behavior given by Scordelis and coworkers; and finally the development of the numerical techniques needed to trace the nonlinear response of the structure. The objective of this paper is concentrated on the last research aspect, i.e. the presentation of a state of the art of the existing solution techniques for nonlinear analysis of structures. In this presentation the following excellent reviews on this subject will be mainly used.
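The paper's survey of solution techniques is not reproduced here, but as a rough illustration of what tracing a nonlinear structural response involves, the following Python sketch applies incremental load steps with Newton-Raphson corrections to a single-degree-of-freedom spring with an invented softening law; it is a generic teaching example, not any of the reviewed formulations.

# Illustrative nonlinear response tracing: incremental loading with
# Newton-Raphson corrections on a single-DOF softening spring.
def internal_force(u):          # hypothetical softening law: f = k0*u - c*u**3
    return 100.0 * u - 40.0 * u**3

def tangent_stiffness(u):       # derivative of the internal force
    return 100.0 - 120.0 * u**2

def trace_response(total_load=40.0, steps=20, tol=1e-8, max_iter=25):
    u, path = 0.0, []
    for n in range(1, steps + 1):
        load = total_load * n / steps          # current load level
        for _ in range(max_iter):              # Newton-Raphson corrections
            residual = load - internal_force(u)
            if abs(residual) < tol:
                break
            u += residual / tangent_stiffness(u)
        path.append((load, u))
    return path

for load, u in trace_response():
    print(f"P = {load:6.2f}  u = {u:8.5f}")

# Note: this load-controlled scheme cannot pass limit points; arc-length
# (continuation) methods, among the techniques reviewed in the paper, are
# needed to trace snap-through or snap-back behaviour.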

Relevance:

100.00%

Publisher:

Abstract:

The Ising problem consists in finding the analytical solution of the partition function of a lattice once the interaction geometry among its elements is specified. No general analytical solution is available for this problem, except for the one-dimensional case. Using site-specific thermodynamics, it is shown that the partition function for ligand binding to a two-dimensional lattice can be obtained from those of one-dimensional lattices with known solution. The complexity of the lattice is reduced recursively by application of a contact transformation that involves a relatively small number of steps. The transformation, implemented in a computer code, solves the partition function of the lattice by operating on the connectivity matrix of the graph associated with it. This provides a powerful new approach to the Ising problem and enables a systematic analysis of two-dimensional lattices that model many biologically relevant phenomena. Application of this approach to finite two-dimensional lattices with positive cooperativity indicates that the binding capacity per site diverges as N^a (N = number of sites in the lattice) and experiences a phase-transition-like discontinuity in the thermodynamic limit N → ∞. The zeroes of the partition function tend to distribute on a slightly distorted unit circle in the complex plane and approach the positive real axis already for a 5×5 square lattice. When the lattice has negative cooperativity, its properties mimic those of a system composed of two classes of independent sites, with the apparent population of low-affinity binding sites increasing with the size of the lattice, thereby accounting for a phenomenon encountered in many ligand-receptor interactions.
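For the one-dimensional building block mentioned above, the partition function has a classical closed form via the transfer-matrix method; the Python sketch below evaluates it for a standard nearest-neighbour Ising ring and checks it against brute-force enumeration. This is a textbook illustration, not the authors' contact-transformation code.

import numpy as np

# Partition function of a 1D Ising ring (N spins, periodic boundary)
# via the transfer-matrix method: Z = Tr(T^N).
def ising_1d_partition(N, J=1.0, h=0.0, beta=1.0):
    T = np.array([
        [np.exp(beta * (J + h)), np.exp(-beta * J)],
        [np.exp(-beta * J),      np.exp(beta * (J - h))],
    ])
    eigenvalues = np.linalg.eigvalsh(T)     # T is symmetric
    return np.sum(eigenvalues ** N)         # trace of T^N

# Brute-force check on a small ring (enumerate all 2^N configurations).
def ising_1d_bruteforce(N, J=1.0, h=0.0, beta=1.0):
    Z = 0.0
    for state in range(2 ** N):
        spins = [1 if (state >> i) & 1 else -1 for i in range(N)]
        E = -J * sum(spins[i] * spins[(i + 1) % N] for i in range(N))
        E -= h * sum(spins)
        Z += np.exp(-beta * E)
    return Z

print(ising_1d_partition(8), ising_1d_bruteforce(8))   # the two should agree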

Relevance:

100.00%

Publisher:

Abstract:

Microarrays can measure the expression of thousands of genes to identify changes in expression between different biological states. Methods are needed to determine the significance of these changes while accounting for the enormous number of genes. We describe a method, Significance Analysis of Microarrays (SAM), that assigns a score to each gene on the basis of change in gene expression relative to the standard deviation of repeated measurements. For genes with scores greater than an adjustable threshold, SAM uses permutations of the repeated measurements to estimate the percentage of genes identified by chance, the false discovery rate (FDR). When the transcriptional response of human cells to ionizing radiation was measured by microarrays, SAM identified 34 genes that changed at least 1.5-fold with an estimated FDR of 12%, compared with FDRs of 60% and 84% obtained using conventional methods of analysis. Of the 34 genes, 19 were involved in cell cycle regulation and 3 in apoptosis. Surprisingly, four nucleotide excision repair genes were induced, suggesting that this repair pathway for UV-damaged DNA might play a previously unrecognized role in repairing DNA damaged by ionizing radiation.
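A heavily simplified sketch of the kind of statistic and permutation estimate described above is given below in Python; the exchangeability constant s0, the threshold and the toy data are placeholders, and the published SAM procedure (balanced permutations, percentile-based s0, asymmetric cut-offs) is more elaborate.

import numpy as np

rng = np.random.default_rng(0)

def sam_like_scores(control, treated, s0=0.1):
    """Relative difference d_i = (mean difference) / (pooled SE + s0) per gene."""
    diff = treated.mean(axis=1) - control.mean(axis=1)
    n1, n2 = control.shape[1], treated.shape[1]
    pooled_var = (control.var(axis=1, ddof=1) * (n1 - 1) +
                  treated.var(axis=1, ddof=1) * (n2 - 1)) / (n1 + n2 - 2)
    s = np.sqrt(pooled_var * (1.0 / n1 + 1.0 / n2))
    return diff / (s + s0)

def permutation_fdr(control, treated, threshold, n_perm=200, s0=0.1):
    """Estimate the FDR for |d| > threshold by permuting the sample labels."""
    observed = sam_like_scores(control, treated, s0)
    called = np.sum(np.abs(observed) > threshold)
    data = np.hstack([control, treated])
    n1 = control.shape[1]
    false_counts = []
    for _ in range(n_perm):
        perm = rng.permutation(data.shape[1])
        d_perm = sam_like_scores(data[:, perm[:n1]], data[:, perm[n1:]], s0)
        false_counts.append(np.sum(np.abs(d_perm) > threshold))
    return called, np.mean(false_counts) / max(called, 1)

# Toy data: 1000 genes, 4 + 4 arrays, with 50 genes truly shifted.
control = rng.normal(size=(1000, 4))
treated = rng.normal(size=(1000, 4))
treated[:50] += 1.5
print(permutation_fdr(control, treated, threshold=3.0))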

Relevance:

100.00%

Publisher:

Abstract:

Ethnopharmacological relevance and background: “Dictamnus” was a popular name for a group of medicinal herbaceous plant species of the Rutaceae and Lamiaceae which, since the 4th century BCE, have been used for gynaecological problems and other illnesses, and which still appear in numerous ethnobotanical records. Aims: This research has four overarching aims: determining the historical evolution of medical preparations labelled “Dictamnus” and the different factors affecting this long-standing herbal tradition; deciphering and differentiating those medicinal uses of “Dictamnus” which strictly correspond to Dictamnus (Rutaceae) from those of Origanum dictamnus and other Lamiaceae species; quantitatively assessing the dependence of modern Dictamnus ethnobotanical records on herbal books and the pharmaceutical tradition; and determining whether differences between Western and Eastern Europe exist with regard to Dictamnus albus uses in ethnopharmacology and ethnomedicine. Methods: An exhaustive review of herbals, classical pharmacopoeias, and the ethnobotanical and ethnopharmacological literature was conducted. Reported uses were systematically analysed and standardized according to the International Classification of Diseases (ICD-10), and multivariate analysis using factorial, hierarchical and neighbour-joining methods was undertaken. Results and discussion: The popular concept “Dictamnus” includes Origanum dictamnus L., Ballota pseudodictamnus (L.) Benth. and B. acetabulosa (L.) Benth. (Lamiaceae), as well as Dictamnus albus L. and D. hispanicus Webb ex Willk. (Rutaceae), with 86 different types of uses. Between 1000 and 1700 CE numerous complex preparations with “Dictamnus” were used in the treatment of 35 different pathologies. On biogeographical grounds the widespread D. albus is a far more likely prototypical “Dictamnus” than the Cretan endemic Origanum dictamnus; however, both form integral parts of the “Dictamnus” complex. Evidence exists for a sufficiently long and coherent tradition of D. albus and D. hispanicus use to treat 47 different categories of diseases. Conclusions: This approach is a model for understanding the cultural history of plants and their role as resources for health care. “Dictamnus” shows how the transmission of traditional knowledge about materia medica over 26 centuries represents remarkable levels of development and innovation. All this leads us to call attention to D. albus and D. hispanicus, which are highly promising as potential herbal drug leads. The next steps of research should be to systematically analyse the phytochemical, pharmacological and clinical evidence and to develop safety, pharmacology and toxicology profiles of the traditional preparations.
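As an illustration of the multivariate step mentioned in the Methods, the Python sketch below clusters taxa hierarchically by their profiles of use reports across disease categories; the matrix values are invented for illustration and do not come from the study.

import numpy as np
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import pdist

# Hypothetical matrix: rows = taxa, columns = disease categories
# (counts of use reports per category, invented values).
taxa = ["D. albus", "D. hispanicus", "O. dictamnus", "B. pseudodictamnus"]
use_matrix = np.array([
    [12, 5, 3, 0, 7],
    [10, 4, 2, 1, 6],
    [ 2, 9, 8, 6, 1],
    [ 1, 7, 9, 5, 0],
])

# Hierarchical clustering on Bray-Curtis distances between use profiles.
distances = pdist(use_matrix, metric="braycurtis")
tree = linkage(distances, method="average")
dendrogram(tree, labels=taxa, no_plot=True)   # set no_plot=False to draw it
print(tree)                                    # linkage matrix of the hierarchy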

Relevance:

100.00%

Publisher:

Abstract:

Projected air and ground temperatures are expected to be higher in Arctic and sub-Arctic latitudes and, with temperatures already close to the limit where permafrost can exist, resistance against degradation is low. As permafrost thaws, the landscape is modified with depressions in which thermokarst lakes emerge. Permafrost soils store a considerable amount of soil organic carbon, with the potential of altering climate even further if expansion and formation of new thermokarst lakes occur, as decay releases greenhouse gases (CO2 and CH4) to the atmosphere. Analyzing the spatial distribution and morphometry of thermokarst lakes and other water bodies over time is therefore important for accurately predicting carbon budgets and feedback mechanisms, as well as for assessing future landscape layout and the interaction of these features. Different types of high-spatial-resolution aerial and satellite imagery from 1963, 1975, 2003, 2010 and 2015 were used in both pre- and post-classification change detection analyses. Object-oriented segmentation in eCognition, combined with manual adjustments, resulted in digitized water bodies >28 m2 from which direction of change and morphometric values were extracted. The number of thermokarst lakes and other water bodies was n=92 in 1963 and, as a trend, decreased in succeeding years until 2010-2015, when eleven water bodies were added in 2015 (n=74 to n=85). In 1963-2003 the area of these water bodies decreased by 50 651 m2 (from 189 446 to 138 795 m2) and continued to decrease in 2003-2015, ending at 129 337 m2. Limnicity decreased from 19.9% in 1963 to 14.6% in 2003 (-5.3%), and was 13.7-13.6% in 2010 and 2015. The late increase in water bodies differs from an earlier hypothesis that sporadic permafrost regions experience a decrease in both area and number of thermokarst lakes and water bodies. During 1963-2015, land gain dominated the ratio between the two competing processes of expansion and drainage: 55/45% in 1963-1975, followed by 90/10% in 1975-2003. After major drainage events, land loss increased to 62/38% in 2010-2015. Drainage and infilling rates, calculated for 15 shorelines, varied across both the landscape and parts of shorelines, averaging 0.17/0.15/0.14 m/yr, except for 1963-1975, when the average rate of change was in the opposite direction (-0.09 m/yr), likely due to evident expansion of a large thermokarst lake. Using a square grid, the distribution of water bodies was determined, with an indistinct cluster located in the NE and central parts, especially for water bodies <250 m2, which is the dominant area class throughout 1963-2015, ranging from n=39-51. With a heterogeneous composition of both small and large thermokarst lakes, and with both expansion and drainage altering the landscape in Tavvavuoma, both positive and negative climate feedback mechanisms are in play, given that sporadic permafrost still exists.
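The limnicity and rate-of-change figures above follow from simple arithmetic on the digitized areas; the Python sketch below reproduces that arithmetic using the areas quoted in the abstract, with the study-area extent back-calculated from the 1963 limnicity (an assumption, since the extent itself is not stated here).

# Water-body areas digitized for selected years (m^2), from the abstract.
water_area = {1963: 189_446, 2003: 138_795, 2015: 129_337}

# Study-area extent is not stated in the abstract; back-calculating from the
# 1963 figures (189 446 m^2 of water at 19.9% limnicity) gives roughly:
STUDY_AREA_M2 = 189_446 / 0.199            # ~952,000 m^2 (assumption)

def limnicity(year):
    """Share of the study area covered by water bodies, in percent."""
    return 100.0 * water_area[year] / STUDY_AREA_M2

def mean_annual_change(y0, y1):
    """Average change in water area between two survey years, m^2 per year."""
    return (water_area[y1] - water_area[y0]) / (y1 - y0)

for year in water_area:
    print(year, f"limnicity ~ {limnicity(year):.1f}%")
print("1963-2003:", f"{mean_annual_change(1963, 2003):.0f} m^2/yr")
print("2003-2015:", f"{mean_annual_change(2003, 2015):.0f} m^2/yr")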

Relevance:

100.00%

Publisher:

Abstract:

A retrospective audit was conducted for 1998 and 2000 to review the physiotherapy management of hospitalized children with cystic fibrosis (CF) at the Brisbane Royal Children's Hospital (RCH). The objective was to detect and explore possible changes in patient management over this period and to investigate whether these changes reflected changes in the current theory of CF management. All children over two years of age with CF admitted during 1998 and 2000 with pulmonary manifestations and who satisfied set criteria were included (n = 249). The relative frequency of each of six treatment modalities was examined for the two years, revealing some degree of change in practice that reflected the changes in current theory. There was a significant decrease in the frequency of use of postural drainage with head-down tilt (p < 0.001) and of autogenic drainage (p < 0.001) between 1998 and 2000, whereas modified postural drainage without head-down tilt (p < 0.001) and positive expiratory pressure devices (p < 0.001) were used more frequently in 2000. No significant changes were identified in the use of the Flutter VRP1 (p = 0.145) or exercise (p = 0.763). No significant differences were found in population demographics or in the occurrence of concomitant factors that may influence patient management.
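The audit's statistics are not reproduced here, but a change in the relative frequency of a modality between the two audit years can be tested with a simple contingency-table test, as in the hedged Python sketch below; the counts are hypothetical, not the audit's data.

from scipy.stats import chi2_contingency

# Hypothetical counts (NOT the audit's data): admissions in which a modality
# was / was not used in each audit year, to illustrate how a change in
# relative frequency between 1998 and 2000 can be tested.
#                       used, not used
head_down_drainage = [[70, 50],        # 1998
                      [25, 104]]       # 2000
chi2, p_value, dof, expected = chi2_contingency(head_down_drainage)
print(f"chi2 = {chi2:.1f}, p = {p_value:.4g}")  # a small p suggests a real change in practice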

Relevance:

100.00%

Publisher:

Abstract:

In any investigation in optometry involving more than two treatment or patient groups, an investigator should use ANOVA to analyse the results, assuming that the data conform reasonably well to the assumptions of the analysis. Ideally, specific null hypotheses should be built into the experiment from the start so that the treatment variation can be partitioned to test these effects directly. If 'post-hoc' tests are used, then the experimenter should examine the degree of protection offered by the test against the possibilities of making either a type 1 or a type 2 error. All experimenters should be aware of the complexity of ANOVA. The present article describes only one common form of the analysis, viz. that which applies to a single classification of the treatments in a randomised design. There are many different forms of the analysis, each of which is appropriate to a specific experimental design. The uses of some of the most common forms of ANOVA in optometry are described in a further article. If in any doubt, an investigator should consult a statistician with experience of the analysis of experiments in optometry, since once embarked upon an experiment with an unsuitable design, there may be little that a statistician can do to help.
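A minimal Python example of the single-classification (one-way) ANOVA described above is sketched below, using invented data for three treatment groups.

import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

# Invented example: a response measure under three treatments
# (a single classification of treatments in a randomised design).
group_a = rng.normal(0.00, 0.10, size=12)
group_b = rng.normal(0.05, 0.10, size=12)
group_c = rng.normal(0.12, 0.10, size=12)

f_stat, p_value = f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A significant F only says that at least one group mean differs; planned
# contrasts or suitably protected post-hoc tests are needed to say which.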

Relevance:

100.00%

Publisher:

Abstract:

The ALBA 2002 Call for Papers asks the question ‘How do organizational learning and knowledge management contribute to organizational innovation and change?’. Intuitively, we would argue, the answer should be relatively straightforward as links between learning and change, and knowledge management and innovation, have long been commonly assumed to exist. On the basis of this assumption, theories of learning tend to focus ‘within organizations’, and assume a transfer of learning from individual to organization which in turn leads to change. However, empirically, we find these links are more difficult to articulate. Organizations exist in complex embedded economic, political, social and institutional systems, hence organizational change (or innovation) may be influenced by learning in this wider context. Based on our research in this wider interorganizational setting, we first make the case for the notion of network learning that we then explore to develop our appreciation of change in interorganizational networks, and how it may be facilitated. The paper begins with a brief review of literature on learning in the organizational and interorganizational context which locates our stance on organizational learning versus the learning organization, and social, distributed versus technical, centred views of organizational learning and knowledge. Developing from the view that organizational learning is “a normal, if problematic, process in every organization” (Easterby-Smith, 1997: 1109), we introduce the notion of network learning: learning by a group of organizations as a group. We argue that this is also a normal, if problematic, process in organizational relationships (as distinct from interorganizational learning), which has particular implications for network change. Part two of the paper develops our analysis, drawing on empirical data from two studies of learning. The first study addresses the issue of learning to collaborate between industrial customers and suppliers, leading to the case for network learning. The second, larger scale study goes on to develop this theme, examining learning around several major change issues in a healthcare service provider network. The learning processes and outcomes around the introduction of a particularly controversial and expensive technology are described, providing a rich and contrasting case with the first study. In part three, we then discuss the implications of this work for change, and for facilitating change. Conclusions from the first study identify potential interventions designed to facilitate individual and organizational learning within the customer organization to develop individual and organizational ‘capacity to collaborate’. Translated to the network example, we observe that network change entails learning at all levels – network, organization, group and individual. However, presenting findings in terms of interventions is less meaningful in an interorganizational network setting given: the differences in authority structures; the less formalised nature of the network setting; and the importance of evaluating performance at the network rather than organizational level. Academics challenge both the idea of managing change and of managing networks. Nevertheless, practitioners are faced with the issue of understanding and influencing change in the network setting.
Thus we conclude that a network learning perspective is an important development in our understanding of organizational learning, capability and change, locating this in the wider context in which organizations are embedded. This in turn helps to develop our appreciation of facilitating change in interorganizational networks, both in terms of change issues (such as introducing a new technology), and change orientation and capability.

Relevance:

100.00%

Publisher:

Abstract:

Sentiment lexicons for sentiment analysis offer a simple, yet effective way to obtain the prior sentiment information of opinionated words in texts. However, words' sentiment orientations and strengths often change throughout various contexts in which the words appear. In this paper, we propose a lexicon adaptation approach that uses the contextual semantics of words to capture their contexts in tweet messages and update their prior sentiment orientations and/or strengths accordingly. We evaluate our approach on one state-of-the-art sentiment lexicon using three different Twitter datasets. Results show that the sentiment lexicons adapted by our approach outperform the original lexicon in accuracy and F-measure in two datasets, but give similar accuracy and slightly lower F-measure in one dataset.
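As a toy illustration of the general idea (and only that; the paper's actual update rules are not reproduced here), the Python sketch below nudges each word's prior score towards the average prior of the words it co-occurs with in a small set of tweets.

from collections import defaultdict

# Toy prior lexicon: word -> sentiment score in [-1, 1].
prior = {"great": 0.8, "terrible": -0.8, "interesting": 0.3, "queue": 0.0}

def adapt_lexicon(tweets, prior, rate=0.3):
    """Shift each word's prior towards the mean prior of the words it
    co-occurs with in tweets (a crude stand-in for contextual semantics)."""
    context = defaultdict(list)
    for tweet in tweets:
        tokens = [t for t in tweet.lower().split() if t in prior]
        for token in tokens:
            others = [prior[t] for t in tokens if t != token]
            if others:
                context[token].append(sum(others) / len(others))
    adapted = dict(prior)
    for word, ctx in context.items():
        adapted[word] = (1 - rate) * prior[word] + rate * sum(ctx) / len(ctx)
    return adapted

tweets = ["great service but terrible queue", "interesting talk great crowd"]
print(adapt_lexicon(tweets, prior))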

Relevance:

100.00%

Publisher:

Abstract:

Lexicon-based approaches to Twitter sentiment analysis are gaining much popularity due to their simplicity, domain independence, and relatively good performance. These approaches rely on sentiment lexicons, in which a collection of words is marked with fixed sentiment polarities. However, words' sentiment orientation (positive, neutral, negative) and/or sentiment strength can change depending on context and targeted entities. In this paper we present SentiCircle, a novel lexicon-based approach that takes into account the contextual and conceptual semantics of words when calculating their sentiment orientation and strength in Twitter. We evaluate our approach on three Twitter datasets using three different sentiment lexicons. Results show that our approach significantly outperforms two lexicon baselines. Results are competitive but inconclusive when compared to the state-of-the-art SentiStrength, and vary from one dataset to another. SentiCircle outperforms SentiStrength in accuracy on average, but falls marginally behind in F-measure.
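A loose sketch of the geometric intuition behind a SentiCircle-style representation is given below in Python: context terms are placed in polar coordinates, with the angle derived from their prior sentiment and the radius from an association strength, and the mean vertical coordinate is read as the adjusted sentiment signal. The specific radius, angle and decision rules of the published method differ, so treat this purely as an illustration.

import math

def senticircle_like(target, tweets, prior, assoc):
    """Place each context term of `target` at polar coordinates
    (r = association strength, theta = prior sentiment * pi), then use the
    mean y-coordinate of those points as the adjusted sentiment signal."""
    points = []
    for tweet in tweets:
        tokens = tweet.lower().split()
        if target not in tokens:
            continue
        for term in tokens:
            if term == target or term not in prior:
                continue
            r = assoc.get((target, term), 1.0)    # association strength (assumed)
            theta = prior[term] * math.pi         # prior in [-1, 1] -> angle
            points.append((r * math.cos(theta), r * math.sin(theta)))
    if not points:
        return prior.get(target, 0.0)
    mean_y = sum(y for _, y in points) / len(points)
    return mean_y                                 # >0 positive, <0 negative

prior = {"great": 0.8, "awful": -0.9, "iphone": 0.0}
tweets = ["my iphone is great", "iphone battery awful"]
print(senticircle_like("iphone", tweets, prior, assoc={}))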

Relevance:

100.00%

Publisher:

Abstract:

Engineering education in the United Kingdom is at the point of embarking upon an interesting journey into uncharted waters. At no point in the past have there been so many drivers for change and so many opportunities for the development of engineering pedagogy. This paper will look at how Engineering Education Research (EER) has developed within the UK and what differentiates it from the many small-scale practitioner interventions, perhaps without a clear research question or with little evaluation, which are presented at numerous staff development sessions, workshops and conferences. From this position some examples of current projects will be described, the outcomes of funding opportunities will be summarised and the benefits of collaboration with other disciplines illustrated. In this study, I will account for how the design of the task structure according to variation theory, as well as the probe-ware technology, makes the laws of force and motion visible and learnable and, especially in the lab studied, makes Newton's third law visible and learnable. I will also include, as a comparison, data from a mechanics lab that uses the same probe-ware technology and deals with the same topics in mechanics, but uses a differently designed task structure. I will argue that the lower achievements on the FMCE test in this latter case can be attributed to these differences in the task structure of the lab instructions: according to my analysis, the necessary pattern of variation is not included in the design. I will also present a microanalysis of 15 hours of video recordings of engineering students' activities in a lab about impulse and collisions. The important object of learning in this lab is the development of an understanding of Newton's third law. The approach to analysing students' interaction using video data is inspired by ethnomethodology and conversation analysis, i.e. I will focus on students' practical, contingent and embodied inquiry in the setting of the lab. I argue that my results corroborate variation theory and show that this theory can be used as a 'tool' for designing labs as well as for analysing labs and lab instructions. Thus my results have implications outside the domain of this study, in particular for understanding critical features for student learning in labs. Engineering higher education is well used to change. As technology develops, the abilities expected of graduates by employers expand, yet our understanding of how to make informed decisions about learning and teaching strategies does not expand without a conscious effort to do so. With the numerous demands of academic life, we often fail to acknowledge our incomplete understanding of how our students learn within our discipline. The journey facing engineering education in the UK is being driven by two classes of driver: firstly, those which we have been working to expand our understanding of, such as retention and employability, and secondly, new challenges such as substantial changes to funding systems allied with an increase in student expectations. Only through continued research can priorities be identified and addressed, and a coherent and strong voice for informed change be heard within the wider engineering education community.
This new position makes it even more important that through EER we acquire the knowledge and understanding needed to make informed decisions regarding approaches to teaching, curriculum design and measures to promote effective student learning. This then raises the question 'how does EER function within a diverse academic community?' Within an existing community of academics interested in taking meaningful steps towards understanding the ongoing challenges of engineering education a Special Interest Group (SIG) has formed in the UK. The formation of this group has itself been part of the rapidly changing environment through its facilitation by the Higher Education Academy's Engineering Subject Centre, an entity which through the Academy's current restructuring will no longer exist as a discrete Centre dedicated to supporting engineering academics. The aims of this group, the activities it is currently undertaking and how it expects to network and collaborate with the global EER community will be reported in this paper. This will include explanation of how the group has identified barriers to the progress of EER and how it is seeking, through a series of activities, to facilitate recognition and growth of EER both within the UK and with our valued international colleagues.

Relevance:

100.00%

Publisher:

Abstract:

In this work, desorption/ionization mass spectrometry was employed for the analysis of sugars and small platform chemicals that are common intermediates in biomass transformation reactions. Specifically, matrix-assisted laser desorption/ionization (MALDI) and desorption electrospray ionization (DESI) mass spectrometric techniques were employed as alternatives to traditional chromatographic methods. Ionic liquid matrices (ILMs) were designed based on traditional solid MALDI matrices (2,5-dihydroxybenzoic acid (DHB) and α-cyano-4-hydroxycinnamic acid (CHCA)) and 1,3-dialkylimidazolium ionic liquids ([BMIM]Cl, [EMIM]Cl, and [EMIM]OAc) that have been employed as reaction media for biomass transformation reactions such as the conversion of carbohydrates to valuable platform chemicals. Although two new ILMs were synthesized ([EMIM][DHB] and [EMIM][CHCA] from [EMIM]OAc), chloride-containing ILs did not react with matrices and resulted in mixtures of IL and matrix in solution. Compared to the parent solid matrices, much less matrix interference was observed in the low mass region of the mass spectrum (< 500 Da) using each of the IL-matrices. Furthermore, the formation of a true ILM (i.e. a new ion pair) does not appear to be necessary for analyte ionization. MALDI sample preparation techniques were optimized based on the compatibility with analyte, IL and matrix. ILMs and IL-matrix mixtures of DHB allowed for qualitative analysis of glucose, fructose, sucrose and N-acetyl-D-glucosamine. Analogous CHCA-containing ILMs did not result in appreciable analyte signals under similar conditions. Small platform compounds such as 5-hydroxymethylfurfural (HMF) and levulinic acid were not detected by direct analysis using MALDI-MS. Furthermore, sugar analyte signals were only detected at relatively high matrix:IL:analyte ratios (1:1:1) due to significant matrix and analyte suppression by the IL ions. Therefore, chemical modification of analytes with glycidyltrimethylammonium chloride (GTMA) was employed to extend this method to quantitative applications. Derivatization was accomplished in aqueous IL solutions with fair reaction efficiencies (36.9 – 48.4 % glucose conversion). Calibration curves of derivatized glucose-GTMA yielded good linearity in all solvent systems tested, with decreased % RSDs of analyte ion signals in IL solutions as compared to purely aqueous systems (1.2 – 7.2 % and 4.2 – 8.7 %, respectively). Derivatization resulted in a substantial increase in sensitivity for MALDI-MS analyses: glucose was reliably detected at IL:analyte ratios of 100:1 (as compared to 1:1 prior to derivatization). Screening of all test analytes resulted in appreciable analyte signals in MALDI-MS spectra, including both HMF and levulinic acid. Using appropriate internal standards, calibration curves were constructed and this method was employed for monitoring a model dehydration reaction of fructose to HMF in [BMIM]Cl. Calibration curves showed wide dynamic ranges (LOD – 100 ng fructose/μg [BMIM]Cl, LOD – 75 ng HMF/μg [BMIM]Cl) with correlation coefficients of 0.9973 (fructose) and 0.9931 (HMF). LODs were estimated from the calibration data to be 7.2 ng fructose/μg [BMIM]Cl and 7.5 ng HMF/μg [BMIM]Cl; however, relatively high S/N ratios at these concentrations indicate that these values are likely overestimated. Application of this method allowed for the rapid acquisition of quantitative data without the need for prior separation of analyte and IL.
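As a hedged illustration of the calibration step described above, the Python sketch below fits a least-squares line to invented calibration points and applies one common convention (3.3 times the residual standard deviation over the slope) for a calibration-based detection limit; the thesis data and its exact LOD procedure are not reproduced here.

import numpy as np

# Illustrative calibration data (NOT the thesis values): analyte amount in
# ng fructose per ug [BMIM]Cl vs. normalised MALDI-MS signal (analyte/IS).
amount = np.array([10, 25, 50, 75, 100], dtype=float)
signal = np.array([0.21, 0.49, 0.98, 1.52, 1.97])

slope, intercept = np.polyfit(amount, signal, 1)
predicted = slope * amount + intercept
residual_sd = np.sqrt(np.sum((signal - predicted) ** 2) / (len(amount) - 2))
r = np.corrcoef(amount, signal)[0, 1]

# One common convention for a calibration-based detection limit.
lod = 3.3 * residual_sd / slope

print(f"slope = {slope:.4f}, intercept = {intercept:.4f}, r = {r:.4f}")
print(f"estimated LOD ~ {lod:.1f} ng/ug IL")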
Finally, the small-molecule platform chemicals HMF and levulinic acid were qualitatively analyzed by DESI-MS. Both HMF and levulinic acid were readily ionized, and the corresponding molecular ions were easily detected in the presence of a 10- to 100-fold excess of IL, without the need for chemical modification prior to analysis. DESI-MS analysis of ILs in positive and negative ion modes resulted in few ions in the low mass region, showing great potential for the analysis of small molecules in IL media.