948 results for extraction and separation techniques
Abstract:
Carnitine is an amino acid derivative that plays a key role in energy metabolism. Endogenous carnitine is found in its free form or esterified with acyl groups of several chain lengths. Quantification of carnitine and acylcarnitines is of particular interest for research and for screening for metabolic disorders. We developed a method with online solid-phase extraction coupled to high-performance liquid chromatography and tandem mass spectrometry to quantify carnitine and three acylcarnitines of different polarities (acetylcarnitine, octanoylcarnitine, and palmitoylcarnitine). Plasma samples were deproteinized with methanol, loaded onto a cation-exchange trapping column, and separated on a reversed-phase C8 column using heptafluorobutyric acid as an ion-pairing reagent. Given the endogenous nature of the analytes, we quantified with the standard addition method and with external deuterated standards. Solid-phase extraction and separation were achieved within 8 min. Recoveries of carnitine and acylcarnitines were between 98 and 105%. Both quantification methods were equally accurate (all values within 84 to 116% of target concentrations) and precise (day-to-day variation of less than 18%) for all carnitine species and concentrations analyzed. The method was used successfully for the determination of carnitine and acylcarnitines in different human samples. In conclusion, we present a method for simultaneous quantification of carnitine and acylcarnitines with rapid sample work-up. The approach requires small sample volumes and a short analysis time, and it can be applied to the determination of acylcarnitines other than those tested. The method is useful for applications in research and in clinical routine.
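Standard addition compensates for matrix effects by spiking aliquots of the same sample with increasing known amounts of analyte and extrapolating the response line back to the concentration axis. The sketch below only illustrates that calculation; the spike levels and instrument responses are hypothetical and are not data from the published method.

```python
import numpy as np

# Hypothetical standard-addition series for one plasma sample:
# added analyte concentration (umol/L) and measured MS/MS peak-area ratio.
added = np.array([0.0, 10.0, 20.0, 40.0])
response = np.array([0.52, 1.01, 1.55, 2.58])

# Fit a straight line: response = slope * added + intercept.
slope, intercept = np.polyfit(added, response, 1)

# The endogenous concentration equals the magnitude of the x-intercept.
endogenous = intercept / slope
print(f"Endogenous concentration ~ {endogenous:.1f} umol/L")
```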
Abstract:
In this thesis, I study skin lesion detection and its applications to skin cancer diagnosis. A skin lesion detection algorithm based on color information and thresholding is proposed. Several color spaces are studied for the proposed algorithm and the detection results are compared; experimental results show that the YUV color space achieves the best performance. I also develop a distance-histogram-based threshold selection method, which is shown to outperform other adaptive threshold selection methods for color detection. Beyond the detection algorithms, I investigate GPU speed-up techniques for skin lesion extraction, and the results show that GPUs have potential for accelerating skin lesion extraction. Based on the proposed skin lesion detection algorithms, I developed a mobile skin cancer diagnosis application. With this application installed on an iPhone, a user can use the phone as a diagnostic tool to find potential skin lesions on a person's skin and compare the lesions detected by the iPhone with skin lesions stored in a database on a remote server.
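As a rough illustration of color-based lesion detection in the YUV color space, the sketch below converts an image to YUV and thresholds the chrominance channels. The use of OpenCV, the threshold values, and the morphological cleanup are assumptions for illustration only; they are not the thesis's distance-histogram method or its actual parameters.

```python
import cv2
import numpy as np

def detect_lesion_mask(image_bgr, u_min=110, v_min=140):
    """Return a binary mask of candidate lesion pixels using simple YUV thresholds."""
    yuv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2YUV)
    _, u, v = cv2.split(yuv)
    # Lesions tend to differ from surrounding skin in chrominance; the fixed
    # thresholds below are illustrative placeholders, not tuned values.
    mask = ((u >= u_min) & (v >= v_min)).astype(np.uint8) * 255
    # Remove small speckles with a morphological opening.
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

# Usage: mask = detect_lesion_mask(cv2.imread("skin.jpg"))
```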
Abstract:
Randomised controlled trials (RCTs) of psychotherapeutic interventions assume that specific techniques are used in treatments and that these techniques are responsible for changes in the client's symptoms. This assumption also holds for meta-analyses, where evidence for specific interventions and techniques is compiled. However, it has also been argued that different treatments share important techniques and that an emerging consensus about useful treatment strategies is leading to greater integration of treatments. This makes assumptions about the effectiveness of specific intervention ingredients questionable if the shared (common) techniques are used more often in interventions than the unique techniques. This study investigated the unique and shared techniques used in RCTs of cognitive-behavioural therapy (CBT) and short-term psychodynamic psychotherapy (STPP). Psychotherapeutic techniques were coded from 42 masked treatment descriptions of RCTs in the field of depression (1979-2010). CBT techniques were often used in studies identified as either CBT or STPP. However, STPP techniques were only used in STPP-identified studies. Empirical clustering of the treatment descriptions did not confirm the original distinction between CBT and STPP, but instead showed substantial heterogeneity within both approaches. Extraction of psychotherapeutic techniques from treatment descriptions is feasible and could be used as a content-based approach to classify treatments in systematic reviews and meta-analyses.
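To make the idea of empirically clustering coded treatment descriptions concrete, the sketch below applies hierarchical clustering to binary technique-usage vectors. The technique profiles, distance metric, and linkage choice are illustrative assumptions, not the study's coding scheme or data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical binary coding: rows = treatment descriptions, columns = techniques
# (e.g., cognitive restructuring, homework, interpretation of defenses, ...).
profiles = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
])

# Jaccard distance suits presence/absence data; average linkage is one common choice.
dist = pdist(profiles, metric="jaccard")
tree = linkage(dist, method="average")
labels = fcluster(tree, t=2, criterion="maxclust")
print(labels)  # cluster assignment for each treatment description
```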
Abstract:
The chemical and isotopic characterization of porewater residing in the inter- and intragranular pore space of a low-permeability rock matrix is an important component of the site characterization and safety assessment of potential host rocks for radioactive waste disposal. The chemical and isotopic composition of porewater in such low-permeability rocks has to be derived by indirect extraction techniques applied to naturally saturated rock material. In most such indirect extraction techniques – especially for rocks with a porosity below about 2 vol.% – the original porewater concentrations are diluted and need to be back-calculated to in-situ concentrations. This requires a well-defined value for the connected porosity accessible to the different solutes under in-situ conditions. The derivation of such porosity values, as well as of the solute concentrations, is subject to various perturbations during drilling, core sampling, storage and laboratory experiments. The present study aims to demonstrate the feasibility of a variety of these techniques for characterizing porewater and solute transport in crystalline rocks. The methods, which have been developed during multiple porewater studies in crystalline environments, were applied to four core samples from the deep borehole DH-GAP04, drilled in the Kangerlussuaq area, Southwest Greenland, as part of the joint NWMO–Posiva–SKB Greenland Analogue Project (GAP). Potential artefacts that can influence the estimation of in-situ porewater chemistry and isotopes, as well as their controls, are described in detail in this report, using specific examples from borehole DH-GAP04.
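A common way to back-calculate an in-situ porewater concentration from a diluted aqueous extract is a simple mass balance over the water added and the porewater originally held in the connected porosity. The sketch below illustrates that calculation under assumed definitions and with made-up numbers; it is not data from DH-GAP04 nor the report's exact procedure.

```python
# Minimal sketch of back-calculating an in-situ porewater concentration from an
# aqueous-extraction experiment; definitions and numbers are illustrative only.

c_extract = 2.5        # solute (e.g., chloride) in the extract solution (mg/L)
m_water_added = 0.100  # mass of test water added (kg)
m_rock_dry = 0.250     # dry mass of crushed rock (kg)
water_content = 0.008  # gravimetric water content (kg water / kg dry rock),
                       # derived from the connected (water-loss) porosity

# Mass of porewater originally present in the sample.
m_porewater = m_rock_dry * water_content

# Assuming the solute is conservative and fully released into the extract,
# the in-situ concentration is the extract concentration times the dilution factor.
c_insitu = c_extract * (m_water_added + m_porewater) / m_porewater
print(f"In-situ porewater concentration ~ {c_insitu:.0f} mg/L")
```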
Abstract:
Nanotechnology is a research area of recent development that deals with the manipulation and control of matter with dimensions ranging from 1 to 100 nanometers. At the nanoscale, materials exhibit singular physical, chemical and biological phenomena, very different from those manifested at the conventional scale. In medicine, nanosized compounds and nanostructured materials offer improved drug targeting and efficacy with respect to traditional formulations, and reveal novel diagnostic and therapeutic properties. Nevertheless, the complexity of information at the nano level is much higher than the complexity at the conventional biological levels (from populations to the cell). Thus, any nanomedical research workflow inherently demands advanced information management. Unfortunately, Biomedical Informatics (BMI) has not yet provided the necessary framework to deal with such information challenges, nor adapted its methods and tools to the new research field.
In this context, the novel area of nanoinformatics aims to build new bridges between medicine, nanotechnology and informatics, allowing the application of computational methods to solve informational issues at the wide intersection between biomedicine and nanotechnology. The above observations determine the context of this doctoral dissertation, which is focused on analyzing the nanomedical domain in-depth, and developing nanoinformatics strategies and tools to map across disciplines, data sources, computational resources, and information extraction and text mining techniques, for leveraging available nanomedical data. The author analyzes, through real-life case studies, some research tasks in nanomedicine that would require or could benefit from the use of nanoinformatics methods and tools, illustrating present drawbacks and limitations of BMI approaches to deal with data belonging to the nanomedical domain. Three different scenarios, comparing both the biomedical and nanomedical contexts, are discussed as examples of activities that researchers would perform while conducting their research: i) searching over the Web for data sources and computational resources supporting their research; ii) searching the literature for experimental results and publications related to their research, and iii) searching clinical trial registries for clinical results related to their research. The development of these activities will depend on the use of informatics tools and services, such as web browsers, databases of citations and abstracts indexing the biomedical literature, and web-based clinical trial registries, respectively. For each scenario, this document provides a detailed analysis of the potential information barriers that could hamper the successful development of the different research tasks in both fields (biomedicine and nanomedicine), emphasizing the existing challenges for nanomedical research —where the major barriers have been found. The author illustrates how the application of BMI methodologies to these scenarios can be proven successful in the biomedical domain, whilst these methodologies present severe limitations when applied to the nanomedical context. To address such limitations, the author proposes an original nanoinformatics approach specifically designed to deal with the special characteristics of information at the nano level. This approach consists of an in-depth analysis of the scientific literature and available clinical trial registries to extract relevant information about experiments and results in nanomedicine —textual patterns, common vocabulary, experiment descriptors, characterization parameters, etc.—, followed by the development of mechanisms to automatically structure and analyze this information. This analysis resulted in the generation of a gold standard —a manually annotated training or reference set—, which was applied to the automatic classification of clinical trial summaries, distinguishing studies focused on nanodrugs and nanodevices from those aimed at testing traditional pharmaceuticals. The present work aims to provide the necessary methods for organizing, curating and validating existing nanomedical data on a scale suitable for decision-making. 
Similar analyses for other nanomedical research tasks would help to detect which nanoinformatics resources are required to meet current goals in the field, as well as to generate densely populated and machine-interpretable reference datasets from the literature and other unstructured sources for further testing of novel algorithms and inferring new valuable information for nanomedicine.
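As a rough illustration of the kind of gold-standard-driven classification described above, the sketch below trains a bag-of-words classifier to separate nano-related trial summaries from conventional ones. The example texts, labels, and scikit-learn pipeline are assumptions for illustration, not the thesis's corpus, features, or model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical manually annotated set: 1 = nanodrug/nanodevice trial, 0 = traditional drug.
texts = [
    "liposomal doxorubicin nanoparticle formulation for solid tumors",
    "gold nanoparticle contrast agent for tumor imaging",
    "oral metformin versus placebo in type 2 diabetes",
    "standard chemotherapy regimen with cisplatin and gemcitabine",
]
labels = [1, 1, 0, 0]

# TF-IDF features over unigrams and bigrams, followed by a linear classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["polymeric nanoparticle carrier for paclitaxel delivery"]))
```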
Abstract:
National Highway Traffic Safety Administration, Research Institute, Washington, D.C.
Abstract:
PURPOSE: To evaluate theoretically three previously published formulae that use the intra-operative aphakic refractive error to calculate intraocular lens (IOL) power, without requiring pre-operative biometry. The formulae are as follows: IOL power (D) = aphakic refraction x 2.01 [Ianchulev et al., J. Cataract Refract. Surg. 31 (2005) 1530]; IOL power (D) = aphakic refraction x 1.75 [Mackool et al., J. Cataract Refract. Surg. 32 (2006) 435]; IOL power (D) = 0.07x^2 + 1.27x + 1.22, where x = aphakic refraction [Leccisotti, Graefes Arch. Clin. Exp. Ophthalmol. 246 (2008) 729]. METHODS: Gaussian first-order calculations were used to determine the relationship between intra-operative aphakic refractive error and the IOL power required for emmetropia in a series of schematic eyes incorporating varying corneal powers, pre-operative crystalline lens powers, axial lengths and post-operative IOL positions. The three previously published formulae, based on empirical data, were then compared in terms of the IOL power errors that arose in the same schematic eye variants. RESULTS: An inverse relationship exists between the theoretical ratio and axial length. Corneal power and initial lens power have little effect on the calculated ratios, whilst final IOL position has a significant impact. None of the three empirically derived formulae is universally accurate, but each is able to predict IOL power precisely in certain theoretical scenarios. The formulae derived by Ianchulev et al. and Leccisotti are most accurate for posterior IOL positions, whereas the Mackool et al. formula is most reliable when the IOL is located more anteriorly. CONCLUSION: Final IOL position was found to be the chief determinant of IOL power errors. Although the A-constants of IOLs are known and may be accurate, a variety of factors can still influence the final IOL position and lead to undesirable refractive errors. Optimum results using these novel formulae would be achieved in myopic eyes.
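For a concrete sense of how the three published formulae compare, the short sketch below evaluates each of them for a given intra-operative aphakic refraction. It simply restates the formulae quoted above with illustrative input values; it is not a substitute for the Gaussian first-order analysis.

```python
def iol_power_ianchulev(aphakic_refraction_d):
    # Ianchulev et al. (2005): IOL power (D) = aphakic refraction x 2.01
    return 2.01 * aphakic_refraction_d

def iol_power_mackool(aphakic_refraction_d):
    # Mackool et al. (2006): IOL power (D) = aphakic refraction x 1.75
    return 1.75 * aphakic_refraction_d

def iol_power_leccisotti(aphakic_refraction_d):
    # Leccisotti (2008): IOL power (D) = 0.07x^2 + 1.27x + 1.22, x = aphakic refraction
    x = aphakic_refraction_d
    return 0.07 * x**2 + 1.27 * x + 1.22

# Example aphakic refractions in dioptres (illustrative values only).
for x in (8.0, 10.0, 12.0):
    print(f"x = {x:4.1f} D  ->  Ianchulev {iol_power_ianchulev(x):5.2f} D, "
          f"Mackool {iol_power_mackool(x):5.2f} D, Leccisotti {iol_power_leccisotti(x):5.2f} D")
```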
Abstract:
This study is concerned with the analysis of tear proteins, paying particular attention to the state of the tears (e.g. non-stimulated, reflex, closed-eye) created during sampling, and to the assessment of their interactions with hydrogel contact lenses. The work has involved the use of a variety of biochemical and immunological analytical techniques for the measurement of proteins (a) in tears, (b) on the contact lens, and (c) in the eluate of extracted lenses. Although a diverse range of tear components may contribute to contact lens spoilation, proteins were of particular interest in this study because of their theoretical potential for producing immunological reactions. Although normal host proteins in their natural state are generally not treated as dangerous or non-self, those which undergo denaturation or suffer a conformational change may provoke an excessive and unnecessary immune response. A novel on-lens cell-based assay has been developed and exploited in order to study the role of the ubiquitous cell adhesion glycoprotein vitronectin in tears and contact lens wear under various parameters. Vitronectin, whose levels are known to increase in the closed-eye environment and are shown here to increase during contact lens wear, is an important immunoregulatory protein and may be a prominent marker of inflammatory activity. Immunodiffusion assays were developed and optimised for use in tear analysis, and in a series of subsequent studies were used, for example, in the measurement of albumin, lactoferrin, IgA and IgG. The immunodiffusion assays were then applied to the assessment of the closed-eye environment, an environment which has been described as sustaining a state of sub-clinical inflammation. The role and presence of a less well understood and investigated protein, kininogen, were also estimated, in particular in relation to contact lens wear. Difficulties arise when attempting to extract proteins from the contact lens in order to examine the individual nature of the proteins involved. These problems were partly alleviated by the use of the on-lens cell assay and a UV spectrophotometry assay, which can analyse the lens surface and bulk respectively, the latter yielding only total protein values. Various lens extraction methods were investigated to remove protein from the lens, and the most efficient was employed in the analysis of lens extracts. Counter immunoelectrophoresis, an immunodiffusion assay, was then applied to the analysis of albumin, lactoferrin, IgA and IgG in the resultant eluates.
Abstract:
The immune system is able to produce antibodies, which have the capacity to recognize and bind to foreign molecules or pathogenic organisms. Currently, a diversity of diseases can be treated with antibodies such as immunoglobulin G (IgG). Therefore, the development of cost-efficient processes for their extraction and purification is an area of major interest in biotechnology. Aqueous biphasic systems (ABS) have been investigated for this purpose, since they allow a reduction of the costs and of the number of steps involved in the process when compared with conventional methods. Nevertheless, typical ABS have not proven to be selective, resulting in low purification factors and yields. In this context, the addition of ionic liquids (ILs) as adjuvants can be a viable and promising alternative to tailor the selectivity of these systems. In this work, ABS composed of polyethylene glycol (PEG) of different molecular weights and a biodegradable salt (potassium citrate), using ILs as adjuvants (5 wt%), were studied for the extraction and purification of IgG from a rabbit source. Initially, the extraction time, the effect of the molecular weight of PEG in a buffer solution of K3C6H5O7/C6H8O7 at pH≈7, and the effect of pH (5-9) on the yield (YIgG) and extraction efficiency (EEIgG%) of IgG were tested. The best results regarding EEIgG% were achieved with a centrifugation step at 1000 rpm for 10 min, to promote the separation of the phases, followed by 120 min of equilibration. This procedure was then applied to the remaining experiments. The results obtained in the study of PEGs with different molecular weights revealed a high affinity of IgG for the PEG-rich phase, particularly for PEGs of lower molecular weight (EEIgG% of 96% with PEG 400). On the other hand, the variation of pH in the buffer solution did not show a significant effect on EEIgG%. Finally, the influence of the addition of different ILs (5 wt%) on IgG extraction in ABS composed of PEG 400 at pH≈7 was evaluated. In these studies, it was possible to obtain an EEIgG% of 100% with the ILs composed of the anions [TOS]-, [CH3CO2]- and Cl-, although the YIgG% obtained were lower than 40%. On the other hand, the ILs composed of the anion Br-, as well as the cation [C10mim]+, although not leading to an EEIgG% of 100%, provided an increase in YIgG%. ABS composed of PEG, a biodegradable organic salt and ILs as adjuvants proved to be a promising alternative method to purify IgG. However, additional studies are still required in order to reduce the loss of IgG.
Abstract:
Plants frequently suffer contamination by toxigenic fungi, and their mycotoxins can be produced throughout the growth, harvest, drying and storage periods. The objective of this work was to validate a fast and highly sensitive method for the detection of toxins in medicinal and aromatic plants, optimizing the joint co-extraction of aflatoxins (AF: AFB1, AFB2, AFG1 and AFG2) and ochratoxin A (OTA), using Aloysia citrodora P. (lemon verbena) as a case study. For optimization purposes, samples were spiked (n=3) with standard solutions of a mix of the four AFs and OTA at 10 ng/g for AFB1, AFG1 and OTA, and at 6 ng/g for AFB2 and AFG2. Several extraction procedures were tested: i) ultrasound-assisted extraction in sodium chloride and methanol/water (80:20, v/v) [(OTA+AFs)1]; ii) maceration in methanol/1% NaHCO3 (70:30, v/v) [(OTA+AFs)2]; iii) maceration in methanol/1% NaHCO3 (70:30, v/v) (OTA1); and iv) maceration in sodium chloride and methanol/water (80:20, v/v) (AF1). AF and OTA were purified using the mycotoxin-specific immunoaffinity columns AflaTest WB and OchraTest WB (VICAM), respectively. Separation was performed with a Merck Chromolith Performance C18 column (100 x 4.6 mm) by reversed-phase HPLC coupled to a fluorescence detector (FLD) and a photochemical derivatization system (for AF). The recoveries obtained from the spiked samples showed that the single-extraction methods (OTA1 and AF1) performed better than the co-extraction methods. For in-house validation of the selected methods OTA1 and AF1, recovery and precision were determined (n=6). The recovery of OTA with method OTA1 was 81%, and the intermediate precision (RSDint) was 1.1%. The recoveries of AFB1, AFB2, AFG1 and AFG2 ranged from 64% to 110% for method AF1, with RSDint lower than 5%. Methods OTA1 and AF1 showed precision and recoveries within the legislated values and were found to be suitable for the extraction of OTA and AF from the matrix under study.
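A minimal sketch of how recovery and intermediate precision are typically computed from spiked replicates is shown below; the replicate values are made up for illustration and are not the study's data.

```python
import statistics

spiked_level = 10.0  # ng/g of OTA added to the blank matrix
measured = [8.3, 8.0, 8.2, 8.1, 8.0, 8.2]  # hypothetical replicate results (n = 6), ng/g

# Recovery of each replicate, as a percentage of the spiked amount.
recoveries = [100.0 * m / spiked_level for m in measured]
mean_recovery = statistics.mean(recoveries)

# Relative standard deviation of the replicate recoveries (intermediate precision).
rsd = 100.0 * statistics.stdev(recoveries) / mean_recovery

print(f"Mean recovery = {mean_recovery:.0f}%, RSD = {rsd:.1f}%")
```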
Abstract:
In this work, ionic liquids are evaluated for the first time as solvents for the extraction of, and as entrainers in separation processes involving, terpenes and terpenoids. For that purpose, activity coefficients at infinite dilution, γ13∞, of terpenes and terpenoids in the ionic liquids [C4mim]Cl, [C4mim][CH3SO3], [C4mim][(CH3)2PO4] and [C4mim][CF3SO3] were determined by gas-liquid chromatography at six temperatures in the range 398.15 to 448.15 K. On the basis of the experimental values, a correlation of γ13∞ with the solubility parameters is proposed. The infinite-dilution thermodynamic functions were calculated, showing that the entropic effect is dominant over the enthalpic effect. Gas-liquid partition coefficients give indications about the recovery and purification of terpenes and terpenoids from ionic liquid solutions. In a strongly innovative approach, COSMO-RS was evaluated for the description of the selectivities and capacities, proving to be a useful tool for the screening of ionic liquids in order to find suitable candidates for terpene and terpenoid extraction and separation. COSMO-RS predictions show that, in order to achieve the maximum separation efficiency, polar anions such as bis(2,4,4-trimethylpentyl)phosphinate or acetate should be used, whereas high capacities require nonpolar cations such as phosphonium.
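Infinite-dilution selectivities and capacities follow directly from the measured activity coefficients through the usual definitions, S12 = γ1∞/γ2∞ and k2 = 1/γ2∞. The sketch below applies these definitions with made-up numbers rather than the reported data.

```python
# Standard infinite-dilution screening quantities for a candidate solvent/entrainer:
#   selectivity  S12 = gamma1_inf / gamma2_inf
#   capacity     k2  = 1 / gamma2_inf
# The activity-coefficient values below are illustrative, not the measured ones.

gamma_inf = {"limonene": 55.0, "linalool": 4.2}  # hypothetical values in one ionic liquid

selectivity = gamma_inf["limonene"] / gamma_inf["linalool"]
capacity = 1.0 / gamma_inf["linalool"]

print(f"S12 = {selectivity:.1f}, k2 = {capacity:.2f}")
```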
Abstract:
PhD in Environmental Engineering - Instituto Superior de Agronomia - UL
Abstract:
A better method for the determination of shikimate in plant tissues is needed to monitor the exposure of plants to the herbicide glyphosate [N-(phosphonomethyl)glycine] and to screen the plant kingdom for high levels of this valuable phytochemical precursor of the pharmaceutical oseltamivir. A simple, rapid, and efficient method using microwave-assisted extraction (MWAE) with water as the extraction solvent was developed for the determination of shikimic acid in plant tissues. High-performance liquid chromatography was used for the separation of shikimic acid, and chromatographic data were acquired using photodiode array detection. The MWAE technique was successful in recovering shikimic acid from a series of fortified plant tissues at more than 90% efficiency with an interference-free chromatogram. It also allowed the use of smaller amounts of reagents and organic solvents, reducing the use of toxic and/or hazardous chemicals compared with currently used methodologies. The method was used to determine the level of endogenous shikimic acid in several species of Brachiaria and in sugarcane (Saccharum officinarum), and in B. decumbens and soybean (Glycine max) after treatment with glyphosate. The method was sensitive, rapid and reliable in all cases.
Abstract:
Measurement and modeling techniques were developed to improve over-water gaseous air-water exchange measurements for persistent, bioaccumulative and toxic chemicals (PBTs). Analytical methods were applied to atmospheric measurements of hexachlorobenzene (HCB), polychlorinated biphenyls (PCBs), and polybrominated diphenyl ethers (PBDEs). Additionally, the sampling and analytical methods are well suited to the study of semivolatile organic compounds (SOCs) in air, with applications related to secondary organic aerosol formation and to urban and indoor air quality. A novel gas-phase cleanup method is described for use with thermal desorption methods for the analysis of atmospheric SOCs using multicapillary denuders. The cleanup selectively removed hydrogen-bonding chemicals from samples, including much of the background matrix of oxidized organic compounds in ambient air, and thereby improved precision and method detection limits for nonpolar analytes. A model is presented that predicts gas collection efficiency and particle collection artifact for SOCs in multicapillary denuders using polydimethylsiloxane (PDMS) sorbent. An approach is presented to estimate the equilibrium PDMS-gas partition coefficient (Kpdms) from an Abraham solvation parameter model for any SOC. A high-flow-rate (300 L min-1) multicapillary denuder was designed for the measurement of trace atmospheric SOCs. Overall method precision and detection limits were determined using field duplicates and compared with the conventional high-volume sampler method. The high-flow denuder is an alternative to high-volume or passive samplers when separation of gas- and particle-associated SOCs upstream of a filter and a short sample collection time are advantageous. A Lagrangian internal boundary layer transport exchange (IBLTE) Model is described. The model predicts the near-surface variation of several quantities with fetch in coastal, offshore flow: 1) the modification in potential temperature and gas mixing ratio, 2) the surface fluxes of sensible heat, water vapor, and trace gases, using the NOAA COARE Bulk Algorithm and Gas Transfer Model, and 3) the vertical gradients in potential temperature and mixing ratio. The model was applied to interpret micrometeorological measurements of the air-water exchange flux of HCB and several PCB congeners in Lake Superior. The IBLTE Model can be applied to any scalar, including water vapor, carbon dioxide, dimethyl sulfide, and other scalar quantities of interest with respect to hydrology, climate, and ecosystem science.
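The Abraham solvation parameter approach mentioned above is a linear free-energy relationship of the general form log K = c + eE + sS + aA + bB + lL. The sketch below evaluates that form for a single compound; the PDMS-gas system constants and solute descriptors shown are placeholders for illustration, not the fitted coefficients or values used in this work.

```python
def log_k_abraham(descriptors, system):
    """Abraham LFER for gas-to-condensed-phase partitioning: log K = c + eE + sS + aA + bB + lL."""
    return (system["c"]
            + system["e"] * descriptors["E"]
            + system["s"] * descriptors["S"]
            + system["a"] * descriptors["A"]
            + system["b"] * descriptors["B"]
            + system["l"] * descriptors["L"])

# Placeholder PDMS-gas system constants and solute descriptors (illustrative only).
pdms_gas = {"c": 0.2, "e": -0.1, "s": 0.3, "a": 0.4, "b": 0.2, "l": 0.85}
solute = {"E": 1.5, "S": 1.2, "A": 0.0, "B": 0.2, "L": 8.0}

print(f"log Kpdms ~ {log_k_abraham(solute, pdms_gas):.2f}")
```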