948 results for Frontal Analysis Continuous Capillary Electrophoresis
Abstract:
Huntington's disease (HD) is an autosomal dominant neurodegenerative disorder affecting approximately 5-10 persons per 100,000 worldwide. The pathophysiology of HD is not fully understood, but the age of onset is known to be highly dependent on the number of CAG triplet repeats in the huntingtin gene. Using 1H NMR spectroscopy, this study biochemically profiled 39 brain metabolites in post-mortem striatum (n=14) and frontal lobe (n=14) from HD sufferers and controls (n=28). Striatal metabolites were more perturbed, with 15 significantly affected in HD cases, compared with only 4 in frontal lobe (P<0.05; q<0.3). The metabolite that changed most overall was urea, which decreased 3.25-fold in striatum (P<0.01). Four metabolites were consistently affected in both brain regions. These included the neurotransmitter precursors tyrosine and L-phenylalanine, which were significantly depleted 1.55-1.58-fold and 1.48-1.54-fold in striatum and frontal lobe, respectively (P=0.02-0.03). They also included L-leucine, which was reduced 1.54-1.69-fold (P=0.04-0.09), and myo-inositol, which was increased 1.26-1.37-fold (P<0.01). Logistic regression analyses performed with MetaboAnalyst demonstrated that data obtained from striatum produced models that were profoundly more sensitive and specific than those produced from frontal lobe. The brain metabolite changes uncovered in this first 1H NMR investigation of human HD offer new insights into the disease pathophysiology. Further investigation of striatal metabolite disturbances is clearly warranted.
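As a rough illustration of the kind of univariate screen reported above (per-metabolite fold changes with t-tests and false-discovery-rate q-values), here is a minimal sketch; the random data, function names and choice of t-test are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

def metabolite_screen(hd, ctrl, p_cut=0.05, q_cut=0.3):
    """Per-metabolite fold change, t-test P value, and Benjamini-Hochberg q.

    hd, ctrl: (samples x metabolites) intensity matrices (hypothetical data).
    Returns fold changes, P values, q values, and indices passing thresholds.
    """
    fold = ctrl.mean(axis=0) / hd.mean(axis=0)   # >1 means depleted in HD
    p = np.array([stats.ttest_ind(hd[:, j], ctrl[:, j]).pvalue
                  for j in range(hd.shape[1])])
    q = multipletests(p, method="fdr_bh")[1]     # FDR-adjusted q values
    hits = np.where((p < p_cut) & (q < q_cut))[0]
    return fold, p, q, hits

# Illustrative use with random data standing in for 39 metabolites
rng = np.random.default_rng(0)
hd, ctrl = rng.lognormal(size=(14, 39)), rng.lognormal(size=(14, 39))
fold, p, q, hits = metabolite_screen(hd, ctrl)
```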
Abstract:
Background: There is an urgent need to identify molecular signatures in small cell lung cancer (SCLC) that may select patients who are likely to respond to molecularly targeted therapies. In this study, we investigate the feasibility of undertaking focused molecular analyses on routine diagnostic biopsies in patients with SCLC.
Methods: A series of histopathologically confirmed formalin-fixed, paraffin-embedded SCLC specimens were analysed for epidermal growth factor receptor (EGFR), KRAS, NRAS and BRAF mutations, ALK gene rearrangements and MET amplification. EGFR and KRAS mutation testing was performed using real-time polymerase chain reaction (cobas® RT-PCR), BRAF and NRAS mutation testing using multiplex PCR with capillary electrophoresis-single-strand conformation analysis, and ALK and MET aberrations were assessed with fluorescence in situ hybridization. All genetic aberrations detected were validated independently.
Results: A total of 105 patients diagnosed with SCLC between July 1990 and September 2006 were included. Sixty (57%) patients had suitable tumour tissue for molecular testing, and 25 were successfully evaluated for all six pre-defined molecular aberrations. Eleven patients failed all molecular analyses. No mutations in EGFR, KRAS or NRAS were detected, and no ALK gene rearrangements or MET gene amplifications were identified. A V600E substitution in BRAF was detected in a Caucasian male smoker diagnosed with SCLC with squamoid and glandular features.
Conclusion: The paucity of patients with sufficient tumour tissue, the quality of the extracted DNA and the low frequency of aberrations detected indicate that alternative molecular characterisation approaches, such as the use of circulating plasma DNA, are necessary in patients with SCLC.
Abstract:
Companies face new challenges almost every day. In order to stay competitive, it is important that they strive for continuous development and improvement. Describing a company through its processes makes it possible to get a clear overview of the entire operation, which can contribute to a well-established overall understanding of the company. This is a case study of Stort AB, a small logistics company specialized in international transportation and logistics solutions. The purpose of this study is to perform value stream mapping in order to create a more efficient production process and to propose improvements that reduce processing time. After the value stream mapping, data envelopment analysis is used to calculate how lean Stort AB is today and how lean the company can become by implementing the proposed improvements. The results show that the production process can become more efficient by minimizing the waste caused by a poor workplace layout and by over-processing. The authors' suggested solution is to introduce standardized processes and invest in technical instruments in order to automate the process and reduce process time. According to the data envelopment analysis, the business is 41 percent lean at present, may soon become 55 percent lean, and could finally reach a fully (100 percent) lean mode if the process is automated.
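Data envelopment analysis reduces to one small linear program per decision-making unit. Below is a minimal sketch of the input-oriented CCR envelopment model using scipy; the units, inputs and outputs are hypothetical, not Stort AB's figures.

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o.

    X: (n_units x n_inputs), Y: (n_units x n_outputs).
    Returns theta in (0, 1]; theta = 1 means unit o is on the frontier.
    Variables are z = [theta, lambda_1, ..., lambda_n].
    """
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta
    # inputs: sum_j lambda_j * x_ij <= theta * x_io  (one row per input)
    A_in = np.c_[-X[o], X.T]
    b_in = np.zeros(X.shape[1])
    # outputs: sum_j lambda_j * y_rj >= y_ro  (one row per output)
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]
    b_out = -Y[o]
    res = linprog(c, A_ub=np.r_[A_in, A_out], b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

# Hypothetical units: inputs are (hours, cost), one output (shipments)
X = np.array([[8.0, 5.0], [6.0, 4.0], [10.0, 7.0]])
Y = np.array([[20.0], [20.0], [18.0]])
print([round(dea_efficiency(X, Y, o), 2) for o in range(3)])  # unit 1 is efficient
```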
Abstract:
Communication can be seen as one of the most important tools for managing conflict and the stress of work teams that operate in environments with strong pressure, complex operations and continuous risk, aspects that characterize a high-reliability organization. This article aims to highlight the importance of communication in high-reliability organizations, taking as its object of study accidents and incidents in civil aviation. It is a qualitative study, based on documental analysis of investigations conducted by the Federal Aviation Administration and the Center of Investigation and Prevention of Aeronautical Accidents. The results point out that human error accounts for 60 to 80 percent of accidents and incidents. Most of these occurrences are attributed to miscommunication between the professionals involved in air and ground operations, such as pilots, crew members, maintenance staff and flight controllers. Inappropriate tone of voice, difficulty understanding different accents between sender and receiver, and difficulty perceiving red flags between the lines of verbal and non-verbal communication are all elements that contribute to the lack of understanding between the people involved in the operation. As a limitation, this research found that the official agency reports lack a dedicated category for "interpersonal communication failures", so researchers must rely on the conceptual definition of "social ability", with communication implied, to classify behaviors and communication issues accordingly. Another finding indicates that communication is only superficially covered in the contents of air operations courses; addressing this gap could mitigate the lack of communication skills as a social ability. Part of the research findings concerns the communication-skills content that should be developed in programs for training professionals involved in air flight and ground operations. It is therefore expected that this article provides useful guidance for the improvement of flight operations training programs. Developing communication skills among work teams in high-reliability organizations can help mitigate stress, accidents and incidents in civil aviation. The original contribution of this article is the proposal of the main contents that should be developed in a Communication Skills Training Program specially addressed to civil aviation operations.
Abstract:
A high-resolution geochemical record of a 120 cm black shale interval deposited during the Coniacian-Santonian Oceanic Anoxic Event 3 (ODP Leg 207, Site 1261, Demerara Rise) has been constructed to provide detailed insight into rapid changes in deep ocean and sediment paleo-redox conditions. High contents of organic matter, sulfur and redox-sensitive trace metals (Cd, Mo, V, Zn), as well as continuous lamination, point to deposition under consistently oxygen-free and largely sulfidic bottom water conditions. However, rapid and cyclic changes in deep ocean redox are documented by short-term (~15-20 ka) intervals with decreased total organic carbon (TOC), S and redox-sensitive trace metal contents, and in particular pronounced phosphorus peaks (up to 2.5 wt% P) associated with elevated Fe oxide contents. Sequential iron and phosphate extractions confirm that P is dominantly bound to iron oxides and incorporated into authigenic apatite. Preservation of this Fe-P coupling in an otherwise sulfidic depositional environment (as indicated by Fe speciation and high amounts of sulfurized organic matter) may be unexpected, and provides evidence for temporarily non-sulfidic bottom waters. However, there is no evidence for deposition under oxic conditions. Instead, sulfidic conditions were punctuated by periods of anoxic, non-sulfidic bottom waters. During these periods, phosphate was effectively scavenged during precipitation of iron (oxyhydr)oxides in the upper water column, and was subsequently deposited and largely preserved at the sea floor. After ~15-25 ka, sulfidic bottom water conditions were re-established, leading to the initial precipitation of CdS, ZnS and pyrite. Subsequently, increasing concentrations of H2S in the water column led to extensive formation of sulfurized organic matter, which effectively scavenged particle-reactive Mo complexes (thiomolybdates). At Site 1261, sulfidic bottom waters lasted for ~90-100 ka, followed by another period of anoxic, non-sulfidic conditions lasting for ~15-20 ka. The observed cyclicity at the lower end of the redox scale may have been triggered by repeated incursions of more oxygenated surface- to mid-waters from the South Atlantic resulting in a lowering of the oxic-anoxic chemocline in the water column. Alternatively, sea water sulfate might have been stripped by long-lasting high rates of sulfate reduction, removing the ultimate source for HS⁻ production.
Abstract:
In the design or safety assessment of mechanical structures, the use of the Design by Analysis (DBA) route is a modern trend. However, to make it possible to apply DBA to structures under variable loads, two basic failure modes considered by the ASME and European standards must be precluded: alternating plasticity and incremental collapse (with instantaneous plastic collapse as a particular case). Shakedown theory is a tool that permits us to ensure that these kinds of failure are avoided. In practical applications, however, very large nonlinear optimization problems are generated. For this reason, only in recent years has it become possible to obtain algorithms sufficiently accurate, robust and efficient for dealing with this class of problems. In this paper, one of these shakedown algorithms, developed for elastic, ideally plastic structures, is enhanced to include limited kinematic hardening, a more realistic material behavior. This is done in the continuous model by using internal thermodynamic variables. A corresponding discrete model is obtained using an axisymmetric mixed finite element with an internal variable. A thick-walled sphere under variable thermal and pressure loads is used as an example to show the importance of considering limited kinematic hardening in shakedown calculations.
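For orientation, the classical static (Melan) shakedown condition that such algorithms discretize can be sketched as follows; this is the textbook elastic, ideally plastic form, not the paper's enhanced formulation.

```latex
% Melan's static shakedown theorem (elastic, ideally plastic sketch):
% the structure shakes down if there exist a factor m > 1 and a
% time-independent, self-equilibrated residual stress field \bar{\rho}
% such that the yield function f stays non-positive for all load states.
\[
\exists\, m > 1,\ \exists\, \bar{\rho}(x):\qquad
f\!\left(m\,\sigma^{E}(x,t) + \bar{\rho}(x)\right) \le 0
\quad \forall\, x \in \Omega,\ \forall\, t ,
\]
```

With limited kinematic hardening, the yield condition is checked on the stress shifted by a backstress whose magnitude is bounded by the hardening limit; the internal thermodynamic variables mentioned above play that role.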
Abstract:
The Frontal Assessment Battery (FAB) is a neuropsychological test composed of six subtests whose aim is to assess global executive dysfunction, namely functions related to the frontal lobe such as conceptualization, mental flexibility, motor programming, sensitivity to interference, inhibitory control and frontal environmental autonomy. In order to contribute to the advancement of normative studies in Portugal, this dissertation aims to evaluate the psychometric properties of the FAB in an adult sample of the Portuguese population. The protocol included the following battery of neuropsychological tests: Frontal Assessment Battery, Rey Complex Figure, Raven's Progressive Matrices and Clock Drawing Test. The sample in this study included 376 individuals, 155 male and 221 female. The results of this investigation suggest that FAB scores are influenced by some sociodemographic variables, namely age, education, profession and region. The correlational analysis showed only a moderate positive correlation between the FAB and Raven's Progressive Matrices; low positive correlations were also found between the FAB and the Rey Complex Figure and the Clock Drawing Test. Although the internal consistency of the FAB is low, it shows moderate temporal stability. In conclusion, we consider that the FAB meets the requirements to present itself as a useful and effective battery, demonstrating a reasonable degree of temporal stability but weak internal consistency, which suggests that the FAB is not indicated for non-clinical samples.
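The internal consistency reported here is conventionally Cronbach's alpha over the six FAB subtests. A minimal sketch of the computation, assuming a subjects-by-items score matrix (the data below are random placeholders, not the study sample):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (subjects x items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical: 376 subjects x 6 FAB subtests, each scored 0-3
rng = np.random.default_rng(1)
fab = rng.integers(0, 4, size=(376, 6))
print(round(cronbach_alpha(fab), 2))   # independent random items -> alpha near 0
```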
Abstract:
Goals: Stroke potentiates the development of executive dysfunction, leading to impairment in the performance of daily activities. Neuropsychological assessment of executive functions is important for developing adequate rehabilitation strategies. The objectives of this study are therefore to describe the normative data, diagnostic accuracy, psychometric properties and factor analysis of the Frontal Assessment Battery (FAB), a brief and easy-to-administer instrument, in a clinical sample of elderly people with stroke. Methods: Part of the Aging Trajectories of Institutionalized Elderly project, this research has a sample of 112 elderly people with a medical diagnosis of stroke and a control subgroup of 157 elderly people without stroke. Subjects were aged between 60 and 100 years (M = 78.20, SD = 7.57) and mostly female (n = 194). The assessment included interviews and neuropsychological tests grouped into executive functioning measures, cognitive reference measures and clinical control measures. Results: Age and education affected the scores obtained in the clinical subgroup, while gender had no impact. Using a cutoff score of 7, the FAB had a sensitivity of 83.4% and a specificity of 66.1% (AUC = 0.64); it showed a Cronbach's α of 0.79 and strong correlations with executive tests (Stroop test, Rey Complex Figure, the Attentional-Executive factor of the Montreal Cognitive Assessment, and Switching in the verbal fluency tests). Confirmatory factor analysis supported a one-factor structure. Conclusions: The FAB presents good internal consistency, convergent validity and construct validity when used with elderly people with stroke, and appears to be a useful scale for assessing executive deficit in this population. Given some limitations of the study, which may explain the FAB's weak diagnostic accuracy, further investigation is encouraged, as the FAB showed promising psychometric properties.
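A minimal sketch of how the cutoff-based figures above (sensitivity, specificity, AUC) are computed; the scores and labels are simulated, not the study data. The FAB is scored 0-18, with lower scores indicating impairment, so scores at or below the cutoff are flagged.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def cutoff_accuracy(scores, impaired, cutoff=7):
    """Sensitivity/specificity when scores <= cutoff flag impairment."""
    flagged = scores <= cutoff
    sens = (flagged & impaired).sum() / impaired.sum()
    spec = (~flagged & ~impaired).sum() / (~impaired).sum()
    # AUC: a higher FAB score should mean lower impairment risk, so negate
    auc = roc_auc_score(impaired, -scores)
    return sens, spec, auc

# Simulated FAB totals (0-18) for 112 stroke and 157 control subjects
rng = np.random.default_rng(2)
scores = np.r_[rng.normal(8, 4, 112), rng.normal(12, 3, 157)].clip(0, 18)
impaired = np.r_[np.ones(112, bool), np.zeros(157, bool)]
print(cutoff_accuracy(scores, impaired))
```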
Abstract:
Continuous delivery (CD) is a software engineering approach in which the focus lies on creating a short delivery cycle by automating parts of the deployment pipeline, which includes the build, deploy, test and release processes. CD is based on the idea that, at any point during development, it should be possible to automatically generate a release from the source code in its current state. One of CD's many advantages is that continuous releases provide a quick feedback loop, leading to faster and more efficient implementation of new functions while errors are fixed along the way. Although CD has many advantages, there are also several challenges a maintenance management project must manage in the transition to CD. These challenges may differ depending on the maturity level of the maintenance management project and on its particular strengths and weaknesses. Our research question was: "What challenges can a maintenance management project face in the transition to continuous delivery?" The purpose of this study is to describe continuous delivery and the challenges a maintenance management project may face during such a transition. A descriptive case study was carried out using interviews and documents as data collection methods. Based on the collected data, a situation analysis was created in the shape of a process model representing the maintenance management project's release process. The process model was then used as the basis for a SWOT analysis and for an analysis against Rehn et al.'s maturity model. From these analyses we identified challenges that a maintenance management project may face in the transition to CD. The challenges concern customers' and management's attitudes towards a transition to CD, but the biggest challenge is the automation of the deployment pipeline steps.
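As a sketch of what "automating the deployment pipeline" means in practice, the script below chains build, test and release stages and stops at the first failure; the `make` targets are placeholders for whatever build, test and release tools a given project actually uses.

```python
import subprocess
import sys

# Placeholder commands; a real pipeline would call the project's own tools.
PIPELINE = [
    ("build",   ["make", "build"]),
    ("test",    ["make", "test"]),
    ("release", ["make", "release"]),
]

def run_pipeline():
    """Run each stage in order; stop at the first failure.

    The CD idea: every commit can flow through these stages automatically,
    so a releasable artifact exists for the source in its current state.
    """
    for name, cmd in PIPELINE:
        print(f"--- stage: {name} ---")
        result = subprocess.run(cmd)
        if result.returncode != 0:
            sys.exit(f"stage '{name}' failed; release aborted")
    print("pipeline green: release candidate produced")

if __name__ == "__main__":
    run_pipeline()
```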
Abstract:
Failure analysis has been, throughout the years, a fundamental tool in the aerospace sector, supporting assessments performed by sustainment and design engineers, mainly related to failure modes and material suitability. The predicted service life of aircraft often exceeds 40 years, and the design assured life rarely accounts for all in-service loads and environmental menaces that aging aircraft must deal with throughout their service lives. From the most conservative safe-life conceptual design approaches to the most recent on-condition based design approaches, assessing the condition and predicting the failure modes of components and materials are essential for the development of adequate preventive and corrective maintenance actions, as well as for the accomplishment and optimization of scheduled aircraft maintenance programs. Moreover, as the operational conditions of aircraft may vary significantly from operator to operator (especially for military aircraft), it is necessary to assess whether the defined maintenance programs are adequate to guarantee the continuous reliability and safe usage of the aircraft, preventing catastrophic failures that bear significant maintenance and repair costs and that may lead to the loss of human lives. Thus, failure analysis and material investigations performed as part of aircraft accident and incident investigations are powerful tools of the utmost importance for safety assurance and cost reduction within the aeronautical and aerospace sectors. The Portuguese Air Force (PRTAF) has operated different aircraft throughout its long existence and, in some cases, has operated a particular type of aircraft for more than 30 years, gathering a great amount of expertise in assessing failure modes of aircraft materials; conducting aircraft accident and incident investigations (sometimes with the participation of the aircraft manufacturers and/or other operators); and developing design and repair solutions for in-service problems. This paper addresses several studies to support the thesis that failure analysis plays a key role in flight safety improvement within the PRTAF. It presents a short summary of developed
Abstract:
With the world of professional sports shifting towards better sport analytics, the demand for vision-based performance analysis has grown rapidly in recent years. In addition, the nature of many sports does not allow the use of sensors or other wearable markers attached to players for monitoring their performance during competitions. This opens a potential application for systematic observations, such as tracking information about the players, to help coaches develop the visual skills and perceptual awareness needed to make decisions about team strategy or training plans. My PhD project is part of a bigger ongoing project between sport scientists and computer scientists, also involving industry partners and sports organisations. The overall idea is to investigate the contribution technology can make to the analysis of sports performance, using the example of team sports such as rugby, football or hockey. A particular focus is on vision-based tracking, so that information about the location and dynamics of the players can be gained without any additional sensors on the players. To start with, prior approaches to visual tracking are extensively reviewed and analysed. In this thesis, methods are proposed to deal with the difficulties of visual tracking, namely handling target appearance changes caused by intrinsic factors (e.g. pose variation) and extrinsic factors such as occlusion. This analysis highlights the importance of the proposed visual tracking algorithms, which address these challenges and provide robust and accurate frameworks to estimate the target state in complex tracking scenarios such as sports scenes, thereby facilitating the tracking process. Next, a framework for continuously tracking multiple targets is proposed. Compared to single-target tracking, multi-target tracking, such as tracking the players on a sports field, poses an additional difficulty that needs to be addressed: data association. Here, the aim is to locate all targets of interest, infer their trajectories and decide which observation corresponds to which target trajectory. In this thesis, an efficient framework is proposed to handle this particular problem, especially in sports scenes, where the players of the same team tend to look similar and exhibit complex interactions and unpredictable movements, resulting in matching ambiguity between the players. The presented approach is evaluated on different sports datasets and shows promising results. Finally, information from the proposed tracking system is utilised as the basic input for higher-level performance analysis, such as tactics and team formations, which can help coaches design a better training plan. Due to the continuous nature of many team sports (e.g. soccer, hockey), it is not straightforward to infer high-level team behaviours such as player interactions. The proposed framework relies on two distinct levels of performance analysis: low-level analysis, such as identifying player positions on the field, and high-level analysis, where the aim is to estimate the density of player locations or to detect their possible interaction groups. The related experiments show that the proposed approach can effectively extract this high-level information, which has many potential applications.
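Frame-to-frame data association is commonly posed as an assignment problem. A minimal sketch using the Hungarian algorithm follows; the Euclidean cost and the gating threshold are illustrative choices, not the thesis's specific framework.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def associate(tracks, detections, gate=50.0):
    """Match predicted track positions to detections (both (n, 2) arrays).

    Cost = Euclidean distance; pairs farther than `gate` pixels are rejected,
    which leaves unmatched tracks (occlusion) and detections (new players).
    """
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)          # Hungarian algorithm
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]

tracks = np.array([[100.0, 50.0], [300.0, 80.0]])     # predicted positions
dets   = np.array([[305.0, 78.0], [102.0, 47.0], [600.0, 10.0]])
print(associate(tracks, dets))                        # [(0, 1), (1, 0)]
```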
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files; during this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the user's session, and highly secure authentication methods must therefore be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model captures sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operation, and large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in the web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We performed experiments on a large fielded system with web logs of approximately 4000 users, using two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grained (i.e., role) and fine-grained (i.e., individual) analysis, with a specific set of metrics providing valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types, and is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems, such as mobile devices, and for the analysis of network traffic.
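A minimal sketch of the n-gram idea described above: build a per-user bigram model over logged actions, then score new sessions by average log-likelihood so that large deviations stand out. The action names, add-alpha smoothing and comparison logic are illustrative assumptions, not Intruder Detector's actual implementation.

```python
import math
from collections import Counter

def train_bigrams(actions):
    """Bigram and unigram counts over a user's historical action sequence."""
    pairs = Counter(zip(actions, actions[1:]))
    unigrams = Counter(actions)
    return pairs, unigrams

def session_score(model, actions, vocab_size, alpha=1.0):
    """Average log P(a_i | a_{i-1}) with add-alpha smoothing."""
    pairs, unigrams = model
    logp = 0.0
    for prev, cur in zip(actions, actions[1:]):
        num = pairs[(prev, cur)] + alpha
        den = unigrams[prev] + alpha * vocab_size
        logp += math.log(num / den)
    return logp / max(len(actions) - 1, 1)

history = ["login", "search", "view", "search", "view", "logout"] * 50
model = train_bigrams(history)
vocab = len(set(history))
normal = session_score(model, ["login", "search", "view", "logout"], vocab)
odd    = session_score(model, ["login", "export", "export", "export"], vocab)
print(normal > odd)   # large deviation from normal -> possible intruder
```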
Abstract:
The current energy market urgently requires the introduction of renewable, less-polluting and inexpensive energy sources. Biohydrogen (bioH2) is considered one of the most appropriate options for this model shift, being easily produced through the anaerobic fermentation of carbohydrate-containing biomass. Ideally, the feedstock should be low-cost, widely available and convertible into a product of interest. Microalgae are considered to possess these properties and are also highly valued for their capability to assimilate CO2 [1]. The microalga Spirogyra sp. is able to accumulate high concentrations of intracellular starch, a preferential carbon source for some bioH2-producing bacteria such as Clostridium butyricum [2]. In the present work, Spirogyra biomass was submitted to acid hydrolysis to degrade polymeric components and increase the fermentability of the biomass. Initial tests of bioH2 production in 120 mL reactors with C. butyricum yielded a maximum volumetric productivity of 141 mL H2/L·h and an H2 production yield of 3.78 mol H2/mol of consumed sugars. Subsequently, a sequential batch reactor (SBR) was used for continuous H2 production from Spirogyra hydrolysate. After 3 consecutive batches, the fermentation achieved a maximum volumetric productivity of 324 mL H2/L·h, higher than most results obtained in similar production systems [3], and a potential H2 production yield of 10.4 L H2/L hydrolysate per day. The H2 yield achieved in the SBR was 2.59 mol H2/mol, a value comparable to those attained with several thermophilic microorganisms [3], [4]. In the present work, a detailed energy-consumption analysis of the microalgae value chain is presented and compared with previous results from the literature. The specific energy requirements were determined, with g H2 and MJ H2 considered as the functional units. It was possible to identify the process stages responsible for the highest energy consumption during bioH2 production from Spirogyra biomass, for further optimisation.
Abstract:
Evaluation of the quality of the environment is essential for human wellness, as pollutants in trace amounts can cause serious health problems. Nitrosamines are a group of compounds that are considered potential carcinogens and can be found in drinking water (as disinfection byproducts), foods, beverages and cosmetics. To monitor the levels of these compounds and minimize daily intake, fast and reliable analytical techniques are required. As these compounds are highly polar, their extraction and enrichment from environmental (aqueous) samples are challenging. Moreover, the trend in analytical techniques towards reduced sample sizes and minimal use of organic solvents demands new methods of analysis. To fulfil these requirements, a new method of online preconcentration tailored to electrokinetic chromatography is introduced. In this method, electroosmotic flow (EOF) is suppressed to increase the interaction time between the analyte and the micellar phase, so that the only force mobilizing the neutral analytes is their interaction with the moving micelles. In the absence of EOF, the polarity of the applied potential was switched (negative or positive) to force the (anionic or cationic) micelles to move toward the detector. To avoid excessive band broadening due to the longer analysis times caused by slow-moving micelles, auxiliary pressure was introduced to boost micelle movement toward the detector, using an apparatus designed and built in-house. Applying the external auxiliary pressure significantly reduced analysis times without compromising separation efficiency. Parameters such as the type of surfactant, the composition of the background electrolyte (BGE), the type of capillary, matrix effects and organic modifiers were evaluated in optimizing the method. The enrichment factors for the targeted analytes were impressive; in particular, cationic surfactants were shown to be suitable for the analysis of nitrosamines due to their ability to act as hydrogen-bond donors. Ammonium perfluorooctanoate (APFO) also showed remarkable results in terms of peak shapes and the number of theoretical plates. It was shown that separation results were best when a high-conductivity sample was paired with a BGE of lower conductivity. Using higher surfactant concentrations (up to 200 mM SDS) than is usual for micellar electrokinetic chromatography (50 mM SDS) improved the sweeping. Finally, a new method for the micro-extraction and enrichment of highly polar neutral analytes (N-nitrosamines in particular), based on three-phase drop micro-extraction, was introduced and its performance studied. For this method, a new device was fabricated using easy-to-find components, and its operation and application were demonstrated. Compared to conventional extraction methods (liquid-liquid extraction), the consumption of organic solvents and the operation times were significantly lower.
Abstract:
This dissertation investigates the connection between spectral analysis and frame theory. Considering the spectral properties of a frame, we present a few novel results relating to the spectral decomposition. We first show that scalable frames have the property that the inner product of the scaling coefficients and the eigenvectors must equal the inverse eigenvalues. From this, we prove a similar result when an approximate scaling is obtained. We then focus on the optimization problems inherent to scalable frames by first showing that there is an equivalence between scaling a frame and optimization problems with a non-restrictive objective function. Various objective functions are considered and an analysis of the solution type is presented. With linear objectives we can encourage sparse scalings, and with barrier objective functions we force dense solutions. We further consider frames in high dimensions and derive various solution techniques. From here, we restrict ourselves to particular frame classes to add more specificity to the results. Using frames generated from distributions allows the placement of probabilistic bounds on scalability. For discrete distributions (Bernoulli and Rademacher), we bound the probability of encountering an orthonormal basis (ONB), and for continuous symmetric distributions (uniform and Gaussian), we show that symmetry is retained in the transformed domain. We also prove several hyperplane-separation results. With the theory developed, we discuss graph applications of the scalability framework. We make a connection with graph conditioning and show the infeasibility of the problem in the general case; after a modification, we show that any complete graph can be conditioned. We then present a modification of standard PCA (robust PCA) developed by Candès, and give some background on Electron Energy-Loss Spectroscopy (EELS). We design a novel scheme for processing EELS data through robust PCA and least-squares regression, and test this scheme on biological samples. Finally, we take the idea of robust PCA and apply the technique of kernel PCA to perform robust manifold learning. We derive the problem and present an algorithm for its solution. We also discuss the differences from RPCA that make theoretical guarantees difficult.
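For reference, the standard notion underlying these scalability results (the general definition, not the dissertation's novel statements) can be written as:

```latex
% A frame is scalable when nonnegative weights turn it into a Parseval frame.
A frame $\{f_i\}_{i=1}^{N} \subset \mathbb{R}^{d}$ is \emph{scalable} if there
exist scalars $c_i \ge 0$ such that $\{c_i f_i\}_{i=1}^{N}$ is Parseval, i.e.
\[
x \;=\; \sum_{i=1}^{N} c_i^{2}\,\langle x, f_i\rangle\, f_i
\quad \text{for all } x \in \mathbb{R}^{d},
\qquad\text{equivalently}\qquad
\sum_{i=1}^{N} c_i^{2}\, f_i f_i^{\top} \;=\; I_d .
\]
```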