978 results for text message analysis and question-answering system
Abstract:
During the development of a new treatment space for the UK emergency ambulance, participatory observations with front-line clinicians revealed the need for an integrated patient monitoring, communication and navigation system. The research identified the different information touch-points and requirements through modes-of-use analysis, a day-in-the-life study and simulation workshops with clinicians. Emergency-scenario role-play with paramedics identified five distinct ambulance modes of use. Information flow diagrams were created and checked by paramedics, and digital User Interface (UI) wireframes were developed and evaluated by clinicians during clinical evaluations. Feedback from clinicians refined the UI design specification, leading to a final design proposal. This research was a further development of the 2007 EPSRC-funded “Smart Pods” project. The resulting interactive prototype was co-designed in collaboration with ambulance crews and provides a vision of what could be achieved by integrating well-proven IT technologies and protocols into a package relevant to the emergency medicine field. The system has been reviewed by over 40 ambulance crews and forms part of a newly co-designed ambulance treatment space.
Abstract:
One of the biggest challenges that contaminant hydrogeology faces is how to adequately address the uncertainty associated with model predictions. Uncertainty arises from multiple sources, such as interpretative error, calibration accuracy, parameter sensitivity and variability. This critical issue needs to be properly addressed in order to support environmental decision-making processes. In this study, we perform Global Sensitivity Analysis (GSA) on a contaminant transport model for the assessment of hydrocarbon concentration in groundwater. We provide a quantification of the environmental impact and, given the incomplete knowledge of hydrogeological parameters, we evaluate which are the most influential and therefore require greater accuracy in the calibration process. Parameters are treated as random variables, and a variance-based GSA is performed in an optimized numerical Monte Carlo framework. The Sobol indices are adopted as sensitivity measures and are computed by employing meta-models to characterize the migration process while reducing the computational cost of the analysis. The proposed methodology allows us to extend the number of Monte Carlo iterations, identify the influence of the uncertain parameters and achieve considerable savings in computational time while maintaining acceptable accuracy.
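As an illustration of the variance-based GSA described above (and not the authors' transport model), the following Python sketch estimates first-order and total-effect Sobol indices for a toy concentration function of three uncertain hydrogeological parameters; the function, the parameter names and the ranges are all hypothetical.

```python
import numpy as np

# Hypothetical stand-in for a contaminant transport model: peak hydrocarbon
# concentration as a function of hydraulic conductivity K, porosity n and
# source strength q (names, form and ranges are illustrative only).
def toy_concentration(K, n, q):
    return q * np.exp(-0.5 * K) / (n + 0.1)

rng = np.random.default_rng(42)
names  = ["K", "n", "q"]
bounds = np.array([[0.1, 10.0],   # hydraulic conductivity (m/d), assumed range
                   [0.05, 0.4],   # porosity (-), assumed range
                   [1.0, 5.0]])   # source strength (kg/d), assumed range
N, k = 4096, len(names)

def sample(n_rows):
    u = rng.random((n_rows, k))
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# Two independent sample matrices, as in the Saltelli sampling scheme.
A, B = sample(N), sample(N)
fA = toy_concentration(*A.T)
fB = toy_concentration(*B.T)
var = np.var(np.concatenate([fA, fB]))

for i, name in enumerate(names):
    ABi = A.copy()
    ABi[:, i] = B[:, i]               # A with column i taken from B
    fABi = toy_concentration(*ABi.T)
    S1 = np.mean(fB * (fABi - fA)) / var        # first-order index (Saltelli 2010 estimator)
    ST = 0.5 * np.mean((fA - fABi) ** 2) / var  # total-effect index (Jansen estimator)
    print(f"{name}: S1={S1:.3f}, ST={ST:.3f}")
```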
Abstract:
The work presented in my thesis addresses the two cornerstones of modern astronomy: Observation and Instrumentation. Part I deals with the observation of two nearby active galaxies, the Seyfert 2 galaxy NGC 1433 and the Seyfert 1 galaxy NGC 1566, both at a distance of $\sim10$ Mpc, which are part of the Nuclei of Galaxies (NUGA) sample. It is well established that every galaxy harbors a supermassive black hole (SMBH) at its center. Furthermore, there seems to be a fundamental correlation between the stellar bulge and SMBH masses. Simulations show that massive feedback, e.g., powerful outflows, in Quasi-Stellar Objects (QSOs) has an impact on the mutual growth of bulge and SMBH. Nearby galaxies follow this relation but accrete mass at much lower rates. This gives rise to the following questions: Which mechanisms allow feeding of nearby Active Galactic Nuclei (AGN)? Is this feeding triggered by events, e.g., star formation, nuclear spirals, or outflows, on $\sim500$ pc scales around the AGN? Does feedback on these scales play a role in quenching the feeding process? Does it have an effect on the star formation close to the nucleus? To answer these questions I have carried out observations with the Spectrograph for INtegral Field Observation in the Near Infrared (SINFONI) at the Very Large Telescope (VLT) situated on Cerro Paranal in Chile. I have reduced and analyzed the recorded data, which contain spatial and spectral information in the H-band ($1.45 \mic-1.85 \mic$) and K-band ($1.95 \mic-2.45 \mic$) on the central $10\arcsec\times10\arcsec$ of the observed galaxies. Additionally, Atacama Large Millimeter/Sub-millimeter Array (ALMA) data at $350$ GHz ($\sim0.87$ mm) as well as optical high-resolution Hubble Space Telescope (HST) images are used for the analysis. For NGC 1433 I deduce from a comparison of the distributions of gas, dust, and the intensity of highly ionized emission lines that the galaxy center lies $\sim70$ pc north-northwest of the prior estimate. A velocity gradient is observed at the new center, which I interpret as a bipolar outflow, a circumnuclear disk, or a combination of both. At least one dust and gas arm leads from a $r\sim200$ pc ring towards the nucleus and might feed the SMBH. Two bright warm H$_2$ gas spots are detected that indicate hidden star formation or a spiral arm-arm interaction. From the stellar velocity dispersion (SVD) I estimate an SMBH mass of $\sim1.74\times10^7$ \msol. For NGC 1566 I observe a nuclear gas disk of $\sim150$ pc in radius with a spiral structure. I estimate the total mass of this disk to be $\sim5.4\times10^7$ \msol. It is not clear which mechanisms excite the gas in the disk: neither can the existence of outflows be proven, nor is star formation detected over the whole disk. On one side of the spiral structure I detect a star-forming region with an estimated star formation rate of $\sim2.6\times10^{-3}$ \msol\ yr$^{-1}$. From broad Br$\gamma$ emission and the SVD I estimate a mean SMBH mass of $\sim5.3\times10^6$ \msol\ with an Eddington ratio of $\sim2\times10^{-3}$. Part II deals with the final tests, which I conducted, of the Fringe and Flexure Tracking System (FFTS) for the LBT INterferometric Camera and NIR/Visible Adaptive iNterferometer for Astronomy (LINC-NIRVANA) at the Large Binocular Telescope (LBT) in Arizona, USA. The FFTS is the subsystem that combines the two separate beams of the LBT and enables near-infrared interferometry with a large field of view.
The FFTS has a cryogenic system and an ambient-temperature system, which are separated by a baffle system. I redesigned this baffle to guarantee the functionality of the system after the final tests in the Cologne cryostat; the redesign did not affect the scientific performance of LINC-NIRVANA. In the final cooldown tests I show that the baffle fulfills the temperature requirement and stays $<110$ K, whereas the moving stages in the ambient system stay $>273$ K, which was not the case for the old baffle design. Additionally, I test the tilting flexure of the whole FFTS and show that accurate positioning of the detector and tracking during observation can be guaranteed.
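The SMBH masses quoted in Part I are estimated from the stellar velocity dispersion (SVD); for orientation, the conventional route is the M–σ relation in the form below. The calibration coefficients shown (Tremaine et al. 2002) are one common choice quoted from memory and are not necessarily the calibration adopted in the thesis.

```latex
\log_{10}\!\left(\frac{M_{\mathrm{BH}}}{M_{\odot}}\right)
  \;=\; \alpha \;+\; \beta\,\log_{10}\!\left(\frac{\sigma_{*}}{200\ \mathrm{km\,s^{-1}}}\right),
  \qquad (\alpha,\ \beta) \approx (8.13,\ 4.02)
```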
Abstract:
The analysis of system calls is one method employed by anomaly detection systems to recognise malicious code execution. Similarities can be drawn between this process and the behaviour of certain cells belonging to the human immune system, and these similarities can be applied to construct an artificial immune system. A recently developed hypothesis in immunology, the Danger Theory, states that our immune system responds to the presence of intruders through sensing molecules belonging to those invaders, plus signals generated by the host indicating danger and damage. We propose the incorporation of this concept into a responsive intrusion detection system, in which behavioural information about the system and its running processes is combined with information regarding individual system calls.
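To make the proposed combination concrete, the Python sketch below fuses a per-process system-call anomaly score with host-generated danger signals into a single alert decision. It is an illustrative toy rather than the detection algorithm proposed here; the signal names, weights and thresholds are all assumptions.

```python
from collections import Counter
from dataclasses import dataclass

# Toy "Danger Theory"-style fusion of two evidence streams:
#  1) how unusual the observed system-call profile is (antigen-like signal)
#  2) host-generated danger signals (damage indicators such as crashes or CPU spikes)
# All thresholds and weights are illustrative assumptions, not values from the paper.

@dataclass
class DangerSignals:
    cpu_spike: bool          # sudden CPU usage jump
    proc_crashes: int        # number of recently crashed processes
    unexpected_net: bool     # unexpected outbound connections

def syscall_anomaly(observed: list[str], baseline: Counter) -> float:
    """Fraction of observed system calls that are rare or unseen in the baseline profile."""
    total = sum(baseline.values()) or 1
    rare = sum(1 for s in observed if baseline[s] / total < 0.01)
    return rare / max(len(observed), 1)

def danger_score(sig: DangerSignals) -> float:
    """Weighted sum of host damage indicators, clipped to [0, 1] (weights assumed)."""
    score = 0.4 * sig.cpu_spike + 0.2 * min(sig.proc_crashes, 3) + 0.3 * sig.unexpected_net
    return min(score, 1.0)

def should_alert(observed, baseline, sig, threshold=0.5) -> bool:
    # Respond only when anomalous behaviour coincides with danger signals,
    # mirroring the Danger Theory idea of requiring both antigen and danger.
    return syscall_anomaly(observed, baseline) * danger_score(sig) > threshold

baseline = Counter({"read": 500, "write": 300, "open": 150, "close": 150})
observed = ["ptrace", "mprotect", "execve", "read", "socket"]
print(should_alert(observed, baseline, DangerSignals(True, 2, True)))
```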
Abstract:
Measurement of marine algal toxins has traditionally focussed on shellfish monitoring while, over the last decade, passive sampling has been introduced as a complementary tool for exploratory studies. Since 2011, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been adopted as the EU reference method (No.15/2011) for detection and quantitation of lipophilic toxins. Traditional LC-MS approaches have been based on low-resolution mass spectrometry (LRMS); however, advances in instrument platforms have led to a heightened interest in the use of high-resolution mass spectrometry (HRMS) for toxin detection. This work describes the use of HRMS in combination with passive sampling as a progressive approach to marine algal toxin surveys. Experiments focused on comparison of LRMS and HRMS for determination of a broad range of toxins in shellfish and passive samplers. Matrix effects are an important issue to address in LC-MS; therefore, this phenomenon was evaluated for mussels (Mytilus galloprovincialis) and passive samplers using LRMS (triple quadrupole) and HRMS (quadrupole time-of-flight and Orbitrap) instruments. Matrix-matched calibration solutions containing okadaic acid and dinophysistoxins, pectenotoxin, azaspiracids, yessotoxins, domoic acid, pinnatoxins, gymnodimine A and 13-desmethyl spirolide C were prepared. Similar matrix effects were observed on all instrument types. Most notably, there was ion enhancement for pectenotoxins and okadaic acid/dinophysistoxins on the one hand, and ion suppression for yessotoxins on the other. Interestingly, the ion selected for quantitation of PTX2 also influenced the magnitude of matrix effects, with the sodium adduct typically exhibiting less susceptibility to matrix effects than the ammonium adduct. As expected, mussel, as a biological matrix, produced significantly stronger matrix effects than passive sampler extracts, irrespective of toxin. Sample dilution was demonstrated to be an effective measure to reduce matrix effects for all compounds, and was found to be particularly useful for the non-targeted approach. Limits of detection and method accuracy were comparable between the systems tested, demonstrating the applicability of HRMS as an effective tool for screening and quantitative analysis. HRMS offers the advantage of untargeted analysis, meaning that datasets can be retrospectively analysed. HRMS (full scan) chromatograms of passive samplers yielded significantly less complex data sets than mussels, and were thus more easily screened for unknowns. Consequently, we recommend the use of HRMS in combination with passive sampling for studies investigating emerging or hitherto uncharacterised toxins.
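Matrix effects of the kind discussed here are commonly expressed as the percentage deviation of the matrix-matched calibration slope from the solvent-standard slope; the short Python sketch below shows that calculation on invented numbers (the compounds and slopes are illustrative, not results from this study).

```python
# Matrix effect (%) from calibration slopes: values > 0 indicate ion enhancement,
# values < 0 indicate ion suppression. All numbers below are made up for illustration.
def matrix_effect_pct(slope_matrix: float, slope_solvent: float) -> float:
    return (slope_matrix / slope_solvent - 1.0) * 100.0

examples = {
    "PTX2 (mussel extract)":  (1.30, 1.00),   # hypothetical enhancement
    "YTX (mussel extract)":   (0.70, 1.00),   # hypothetical suppression
    "OA (passive sampler)":   (1.05, 1.00),   # hypothetical near-neutral matrix
}
for name, (m, s) in examples.items():
    print(f"{name}: {matrix_effect_pct(m, s):+.0f}% matrix effect")
```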
Abstract:
Natural language processing has achieved great success in a wide range of applications, producing both commercial language services and open-source language tools. However, most methods take a static or batch approach, assuming that the model has all the information it needs and makes a one-time prediction. In this dissertation, we study dynamic problems where the input comes in a sequence instead of all at once, and the output must be produced while the input is arriving. In these problems, predictions are often made based only on partial information. We see this dynamic setting in many real-time, interactive applications. These problems usually involve a trade-off between the amount of input received (cost) and the quality of the output prediction (accuracy). Therefore, the evaluation considers both objectives (e.g., plotting a Pareto curve). Our goal is to develop a formal understanding of sequential prediction and decision-making problems in natural language processing and to propose efficient solutions. Toward this end, we present meta-algorithms that take an existing batch model and produce a dynamic model to handle sequential inputs and outputs. We build our framework upon theories of the Markov Decision Process (MDP), which allows learning to trade off competing objectives in a principled way. The main machine learning techniques we use are from imitation learning and reinforcement learning, and we advance current techniques to tackle problems arising in our settings. We evaluate our algorithm on a variety of applications, including dependency parsing, machine translation, and question answering. We show that our approach achieves a better cost-accuracy trade-off than the batch approach and heuristic-based decision-making approaches. We first propose a general framework for cost-sensitive prediction, where different parts of the input come at different costs. We formulate a decision-making process that selects pieces of the input sequentially, and the selection is adaptive to each instance. Our approach is evaluated on both standard classification tasks and a structured prediction task (dependency parsing). We show that it achieves similar prediction quality to methods that use all input, while incurring a much smaller cost. Next, we extend the framework to problems where the input is revealed incrementally in a fixed order. We study two applications: simultaneous machine translation and quiz bowl (incremental text classification). We discuss challenges in this setting and show that adding domain knowledge eases the decision-making problem. A central theme throughout the chapters is an MDP formulation of a challenging problem with sequential input/output and trade-off decisions, accompanied by a learning algorithm that solves the MDP.
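As a minimal illustration of the cost-accuracy trade-off cast as a sequential decision problem (not the dissertation's actual models), the Python sketch below runs a toy policy that, after each incoming token, chooses between waiting for more input (paying a per-token cost) and committing to a prediction; the confidence function, threshold and cost values are assumptions.

```python
import math
from typing import Callable

# Toy sequential decision problem: at each step the agent either WAITs for the
# next input token (paying a per-token cost) or PREDICTs immediately.
# Reward = accuracy proxy - cost * tokens_consumed. All numbers are illustrative.

WAIT, PREDICT = "WAIT", "PREDICT"

def threshold_policy(confidence: float, tokens_seen: int, total: int,
                     tau: float = 0.8) -> str:
    """Commit once the model is confident enough or the input is exhausted."""
    return PREDICT if confidence >= tau or tokens_seen == total else WAIT

def run_episode(tokens: list[str],
                confidence_fn: Callable[[list[str]], float],
                cost_per_token: float = 0.05) -> dict:
    seen: list[str] = []
    for tok in tokens:
        seen.append(tok)
        conf = confidence_fn(seen)
        if threshold_policy(conf, len(seen), len(tokens)) == PREDICT:
            break
    accuracy_proxy = confidence_fn(seen)          # stand-in for task accuracy
    return {"tokens_used": len(seen),
            "reward": accuracy_proxy - cost_per_token * len(seen)}

# Hypothetical confidence that grows with the amount of input consumed.
toy_confidence = lambda seen: 1 - math.exp(-0.5 * len(seen))
print(run_episode("which author wrote the novel moby dick ?".split(), toy_confidence))
```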
Abstract:
To automate means, in effect, to replace people in their job functions with a combined man-machine system of automatic mechanisms; in other words, documentation specialists trained in computing, together with computers themselves, are the cornerstone of any modern documentation and information system. From this point of view, the problem immediately arises of deciding which resources should be applied to solve the specific problem in each particular case. We do not intend to propose quick fixes or recipes for deciding what to do in every case; the solution must be worked out anew for each particular problem. What we want is to put forward some points that can serve as a basis for reflection and help to find the best possible solution once the problem has been correctly defined. The first thing to do before starting any automated system project is to define exactly the domain to be covered and to assess its importance as precisely as possible.
Abstract:
Schistosomiasis is still an endemic disease in many regions, with 250 million people infected with Schistosoma and about 500,000 deaths per year. Praziquantel (PZQ) is the drug of choice for schistosomiasis treatment; however, it is classified as Class II in the Biopharmaceutics Classification System, as its low solubility hinders its performance in biological systems. The use of cyclodextrins is a useful tool to increase the solubility and bioavailability of drugs. The aim of this work was to prepare an inclusion compound of PZQ and methyl-beta-cyclodextrin (MeCD), to perform its physico-chemical characterization, and to explore its in vitro cytotoxicity. SEM showed a change in the morphological characteristics of PZQ:MeCD crystals, and IR data supported this finding, with changes after interaction with MeCD including effects on the C-H of the aromatic ring, observed at 758 cm(-1). Differential scanning calorimetry measurements revealed that complexation occurred in a 1:1 molar ratio, as evidenced by the lack of a PZQ transition temperature after inclusion into the MeCD cavity. In solution, the PZQ UV spectrum profile in the presence of MeCD was comparable to the PZQ spectrum in a hydrophobic solvent. Phase solubility diagrams showed that there was a 5.5-fold increase in PZQ solubility and were indicative of a type A(L) isotherm, which was used to determine an association constant (K(a)) of 140.8 M(-1). No cytotoxicity of the PZQ:MeCD inclusion compound was observed in tests using 3T3 cells. The results suggest that the association of PZQ with MeCD could be a good alternative for the treatment of schistosomiasis.
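For an A(L)-type diagram with assumed 1:1 stoichiometry, the association constant is conventionally obtained from the slope of the phase-solubility plot and the intrinsic solubility S0 of the drug (the standard Higuchi-Connors treatment); the relation below is that textbook expression, not a derivation specific to this paper.

```latex
K_a \;=\; \frac{\text{slope}}{S_0\,\bigl(1-\text{slope}\bigr)}
```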
Abstract:
Background and aims: Seroclearance or seroconversion of hepatitis B surface antigen (HBsAg) is generally considered a clinical endpoint. The purpose of the present meta-analysis was to evaluate the effect of combined therapy with pegylated interferon alpha (PEG-IFNα) with or without lamivudine (LAM) or adefovir (ADV) on HBsAg seroclearance or seroconversion in subjects with chronic hepatitis B (CHB). Methods: Randomized controlled trials performed through May 30, 2015 in adults with CHB receiving PEG-IFNα and LAM or ADV combination therapy or monotherapy for 48-52 weeks were included. The Review Manager Software 5.2.0 was used for the meta-analysis. Results: No statistical differences in HBsAg seroclearance (9.9% vs. 7.1%, OR = 1.47, 95% CI: 0.75, 2.90; p = 0.26) or HBsAg seroconversion (4.2% vs. 3.7%, OR = 1.17, 95% CI: 0.57, 2.37; p = 0.67) rates were noticed between PEG-IFNα + LAM and PEG-IFNα + placebo during post-treatment follow-up for 24-26 weeks in subjects with hepatitis B e antigen (HBeAg)-positive CHB. No statistical differences in HBsAg clearance (10.5% vs. 6.4%, OR = 1.68, 95% CI: 0.75, 3.76; p = 0.21) were seen, but statistical differences in HBsAg seroconversion (6.3% vs. 0%, OR = 7.22, 95% CI: 1.23, 42.40; p = 0.03) were observed, between PEG-IFNα + ADV and PEG-IFNα for 48-52 weeks of treatment in subjects with HBeAg-positive CHB. A systematic evaluation showed no differences in HBsAg disappearance and seroconversion rates between PEG-IFNα + placebo and PEG-IFNα + LAM for 48-52 weeks in subjects with HBeAg-positive CHB. A systematic assessment found no differences in HBsAg disappearance and seroconversion rates between PEG-IFNα + placebo and PEG-IFNα + LAM during 24 weeks' to 3 years' follow-up after treatment in subjects with HBeAg-negative CHB. Conclusion: Combined therapy with PEG-IFNα and LAM or ADV was not superior to monotherapy with PEG-IFNα in terms of HBsAg seroclearance or seroconversion.
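The odds ratios and 95% confidence intervals quoted above are standard pooled 2×2 quantities; for reference, the short Python sketch below computes an OR and its Wald confidence interval from invented event counts (the numbers do not correspond to any trial in this meta-analysis).

```python
import math

# Odds ratio and 95% Wald confidence interval from a single 2x2 table.
# Counts below are invented for illustration and are not data from the meta-analysis.
def odds_ratio_ci(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
    a, b = events_trt, n_trt - events_trt      # treatment arm: events / non-events
    c, d = events_ctl, n_ctl - events_ctl      # control arm:   events / non-events
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se_log) for s in (-1, 1))
    return or_, lo, hi

print(odds_ratio_ci(events_trt=12, n_trt=120, events_ctl=8, n_ctl=115))
```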
Abstract:
The well-known degrees of freedom problem, originally introduced by Nikolai Bernstein (1967), results from the high abundance of degrees of freedom in the musculoskeletal system. Such abundance in motor control has two sides: i) because it is unlikely that the Central Nervous System controls each degree of freedom independently, the complexity of the control needs to be reduced, and ii) because there are many options to perform a movement, a repetition of a given movement is never the same. This leads to two main topics in motor control and biomechanics: motor coordination and motor variability. The present thesis aimed to understand how motor systems behave and adapt under specific conditions. It comprises three studies that focused on three topics of major interest in the field of sports sciences and medicine: expertise, injury risk and fatigue. The first study (expertise) focused on muscle coordination, to further investigate the effect of expertise on muscle synergistic organization, which ultimately may reflect the underlying neural strategies. Studies 2 (excessive medial knee displacement) and 3 (fatigue) both aimed to better understand the impact of these factors on dynamic local stability. The main findings of the present thesis suggest that: 1) there is great robustness in the muscle synergistic organization of swimmers at different levels of expertise (study 1, chapter II), which ultimately indicates that differences in muscle coordination are mainly explained by peripheral adaptations; 2) injury risk factors such as excessive medial knee displacement (study 2, chapter III) and fatigue (study 3, chapter IV) alter the dynamic local stability of the neuromuscular system towards a more unstable state. This change in dynamic local stability represents a loss of adaptability in the neuromuscular system, reducing its flexibility to adapt to a perturbation.
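Muscle synergies of the kind analysed in the first study are typically extracted by non-negative matrix factorisation of an EMG envelope matrix; the Python sketch below shows that generic procedure on random data. The matrix dimensions, the choice of four synergies and the preprocessing are assumptions, not details taken from the thesis.

```python
import numpy as np
from sklearn.decomposition import NMF

# Generic muscle-synergy extraction via non-negative matrix factorisation:
# EMG envelopes (time x muscles) ~= activations W (time x synergies) @ weights H (synergies x muscles).
# The data, dimensions and the choice of 4 synergies are illustrative assumptions.
rng = np.random.default_rng(0)
n_samples, n_muscles, n_synergies = 1000, 8, 4
emg = np.abs(rng.normal(size=(n_samples, n_muscles)))   # stand-in for rectified, filtered EMG

model = NMF(n_components=n_synergies, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(emg)       # temporal activation coefficients
H = model.components_              # muscle weighting vectors (the "synergies")

reconstruction = W @ H
vaf = 1 - np.sum((emg - reconstruction) ** 2) / np.sum(emg ** 2)  # variance accounted for
print(f"VAF with {n_synergies} synergies: {vaf:.2f}")
```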
Abstract:
In this study, we investigated the cellular and molecular mechanisms that regulate salt acclimation. The main objective was to obtain new insights into the molecular mechanisms that control salt acclimation. Therefore, we carried out a multidisciplinary study using proteomic, transcriptomic, subcellular and physiological techniques. We obtained a Nicotiana tabacum BY-2 cell line acclimated to grow at 258 mM NaCl as a model for this study. The proteomic and transcriptomic data indicate that the molecular response to stress (chaperones, defence proteins, etc.) is highly induced in these salt-acclimated cells. The subcellular results show that salt induces sodium compartmentalization in the cell vacuoles, which seems to be mediated by vesicle trafficking in salt-acclimated tobacco cells. Our results demonstrate that abscisic acid (ABA) and proline metabolism are crucial in the cellular signalling of salt acclimation, probably regulating reactive oxygen species (ROS) production in the mitochondria. ROS may act as a retrograde signal, regulating the cell response. The network of the endoplasmic reticulum and Golgi apparatus is highly altered in salt-acclimated cells. The molecular and subcellular analysis suggests that the unfolded protein response is induced in salt-acclimated cells. Finally, we propose that this mechanism may mediate cell death in salt-acclimated cells.
Abstract:
This study mainly aims to provide an inter-industry analysis through the subdivision of various industries in flow of funds (FOF) accounts. Combined with the Financial Statement Analysis data from 2004 and 2005, the Korean FOF accounts are reconstructed to form "from-whom-to-whom" basis FOF tables, which are composed of 115 institutional sectors and correspond to the tables and techniques of input–output (I–O) analysis. First, power-of-dispersion indices are obtained by applying the I–O analysis method. Most service and IT industries, construction, and the light industries within manufacturing are included in the first-quadrant group, whereas heavy and chemical industries are placed in the fourth quadrant, since their power indices in the asset-oriented system are comparatively smaller than those of other institutional sectors. Second, investments and savings induced by the central bank are calculated for monetary policy evaluation. Industries are divided into two groups to compare their features: the first group comprises industries whose power of dispersion in the asset-oriented system is greater than 1, whereas the second group comprises those whose index is less than 1. We found that the net induced investments (NII)-to-total-liabilities ratios of the first group show levels half those of the second group, since the former's induced savings are clearly greater than the latter's.
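As an illustration of the I–O technique referred to above (and not the Korean FOF data), the power-of-dispersion index of a sector is its column sum of the Leontief inverse normalised by the average column sum; the Python sketch below computes these indices for a made-up three-sector coefficient matrix.

```python
import numpy as np

# Power-of-dispersion indices from a (hypothetical) 3-sector input coefficient matrix A.
# index_j = n * (column sum j of the Leontief inverse) / (sum of all elements of the inverse);
# a value > 1 means sector j pulls on the rest of the system more than average.
A = np.array([[0.10, 0.20, 0.05],
              [0.15, 0.05, 0.25],
              [0.05, 0.10, 0.10]])           # made-up coefficients, not FOF data

n = A.shape[0]
leontief_inverse = np.linalg.inv(np.eye(n) - A)
col_sums = leontief_inverse.sum(axis=0)
power_of_dispersion = n * col_sums / leontief_inverse.sum()

for sector, idx in zip(["sector 1", "sector 2", "sector 3"], power_of_dispersion):
    print(f"{sector}: {idx:.3f}")
```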