956 results for Question-answering systems


Relevance:

30.00%

Publisher:

Abstract:

While some of the deepest results in nature are those that give explicit bounds between important physical quantities, some of the most intriguing and celebrated of such bounds come from fields where there is still a great deal of disagreement and confusion regarding even the most fundamental aspects of the theories. For example, in quantum mechanics, there is still no complete consensus as to whether the limitations associated with Heisenberg's Uncertainty Principle derive from an inherent randomness in physics, or rather from limitations in the measurement process itself, resulting from phenomena like back action. Likewise, the second law of thermodynamics makes a statement regarding the increase in entropy of closed systems, yet the theory itself has neither a universally accepted definition of equilibrium nor an adequate explanation of how a system with underlying microscopic (reversible) Hamiltonian dynamics settles into a fixed distribution.

Motivated by these physical theories, and perhaps their inconsistencies, in this thesis we use dynamical systems theory to investigate how even the very simplest of systems, with no physical constraints, are characterized by bounds that limit the ability to make measurements on them. Using an existing interpretation, we start by examining how dissipative systems can be viewed as high-dimensional lossless systems, and how taking this view necessarily implies the existence of a noise process that results from the uncertainty in the initial system state. This fluctuation-dissipation result plays a central role in a measurement model that we examine, in particular describing how noise is inevitably injected into a system during a measurement, noise that can be viewed as originating either from the randomness of the many degrees of freedom of the measurement device or from those of the environment. This noise constitutes one component of measurement back action, and ultimately imposes limits on measurement uncertainty. Depending on the assumptions we make about active devices and their limitations, this back action can be offset to varying degrees via control. It turns out that using active devices to reduce measurement back action leads to estimation problems that have non-zero uncertainty lower bounds, the most interesting of which arise when the observed system is lossless. One such lower bound, a main contribution of this work, can be viewed as a classical version of a Heisenberg uncertainty relation between the system's position and momentum. Finally, we also revisit the murky question of how macroscopic dissipation arises from lossless dynamics, and propose alternative approaches for framing the question using existing systematic methods of model reduction.
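For orientation only, the quantum relation that this classical bound parallels is the standard Heisenberg inequality between position and momentum uncertainties; the specific classical, estimation-theoretic bound derived in the thesis is not reproduced in this abstract.

```latex
% Standard quantum uncertainty relation (background only; the thesis derives a
% classical analogue whose exact form the abstract does not give):
\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}
```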

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents investigations in four areas of theoretical astrophysics: the production of sterile neutrino dark matter in the early Universe, the evolution of small-scale baryon perturbations during the epoch of cosmological recombination, the effect of primordial magnetic fields on the redshifted 21-cm emission from the pre-reionization era, and the nonlinear stability of tidally deformed neutron stars.

In the first part of the thesis, we study the asymmetry-driven resonant production of 7 keV-scale sterile neutrino dark matter in the primordial Universe at temperatures T ≳ 100 MeV. We report final dark matter (DM) phase-space densities that are robust to uncertainties in the nature of the quark-hadron transition. We give transfer functions for cosmological density fluctuations that are useful for N-body simulations. We also provide a public code for the production calculation.
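For context, such transfer functions are usually quoted relative to cold dark matter; one common convention, stated here as background and not necessarily the thesis's exact definition, is

```latex
% Relative transfer function of the sterile-neutrino component with respect to
% cold dark matter, as commonly tabulated for setting N-body initial conditions:
T(k) \;\equiv\; \left[\frac{P_{\nu_s}(k)}{P_{\mathrm{CDM}}(k)}\right]^{1/2}
```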

In the second part of the thesis, we study the instability of small-scale baryon pressure sound waves during cosmological recombination. We show that for relevant wavenumbers, inhomogeneous recombination is driven by the transport of ionizing continuum and Lyman-alpha photons. We find a maximum growth factor less than ≈ 1.2 in 10⁷ random realizations of initial conditions. The low growth factors are due to the relatively short duration of the recombination epoch.

In the third part of the thesis, we propose a method of measuring weak magnetic fields, of order 10⁻¹⁹ G (or 10⁻²¹ G if scaled to the present day), with large coherence lengths in the intergalactic medium prior to and during the epoch of cosmic reionization. The method utilizes the Larmor precession of spin-polarized neutral hydrogen in the triplet state of the hyperfine transition. We perform detailed calculations of the microphysics behind this effect, and take into account all the processes that affect the hyperfine transition, including radiative decays, collisions, and optical pumping by Lyman-alpha photons.
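As background, the precession in question is the ordinary Larmor precession of the F = 1 hyperfine level; the measurable signature arises because even this very slow precession competes with the processes listed above that set the lifetime of the spin alignment.

```latex
% Larmor precession frequency of the F = 1 (triplet) hyperfine level in a
% magnetic field B, with g_F the Landé factor of that level (generic textbook
% expression, not a result quoted from the thesis):
\omega_L \;=\; \frac{g_F \,\mu_B\, B}{\hbar}
```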

In the final part of the thesis, we study the non-linear effects of tidal deformations of neutron stars (NS) in a compact binary. We compute the largest three- and four-mode couplings among the tidal mode and high-order p- and g-modes of similar radial wavenumber. We demonstrate the near-exact cancellation of their effects, and resolve the question of the stability of the tidally deformed NS to leading order. This result is significant for the extraction of binary parameters from gravitational wave observations.

Relevance:

30.00%

Publisher:

Abstract:

Bistable dynamical switches are frequently encountered in mathematical modeling of biological systems because binary decisions are at the core of many cellular processes. Bistable switches present two stable steady-states, each corresponding to a distinct decision. In response to a transient signal, the system can flip back and forth between these two stable steady-states, switching between the two decisions. Understanding which parameters and states affect this switch between stable states may shed light on the mechanisms underlying the decision-making process. Yet answering such a question involves analyzing the global dynamical (i.e., transient) behavior of a nonlinear, possibly high-dimensional model. In this paper, we show how a local analysis at a particular equilibrium point of bistable systems is highly relevant to understanding the global properties of the switching system. The local analysis is performed at the saddle point, an often disregarded equilibrium point of bistable models which is nevertheless shown to be a key ruler of the decision-making process. Results are illustrated on three previously published models of biological switches: two models of apoptosis (programmed cell death) and one model of long-term potentiation (a phenomenon underlying synaptic plasticity). © 2012 Trotta et al.
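A minimal sketch, not taken from Trotta et al., of the kind of local analysis the abstract describes: a toy one-dimensional bistable system whose equilibria are located numerically and classified by the sign of the linearization. The model form and the parameter value alpha = 2.5 are illustrative assumptions; in a higher-dimensional model the same role is played by the Jacobian eigenvalues at the saddle, whose stable manifold separates the two basins of attraction.

```python
import numpy as np
from scipy.optimize import brentq

# Toy bistable switch: positive feedback with linear degradation (assumed model).
# dx/dt = f(x) = alpha * x**2 / (1 + x**2) - x
alpha = 2.5
f = lambda x: alpha * x**2 / (1 + x**2) - x
df = lambda x: alpha * 2 * x / (1 + x**2) ** 2 - 1   # analytic linearization

# Locate the three equilibria by bracketing sign changes of f.
equilibria = [0.0] + [brentq(f, a, b) for a, b in [(0.1, 1.0), (1.0, 3.0)]]

for xe in equilibria:
    slope = df(xe)
    kind = "stable" if slope < 0 else "saddle-like (unstable)"
    print(f"x* = {xe:.3f}  f'(x*) = {slope:+.3f}  -> {kind}")

# In an n-dimensional model one would instead evaluate the Jacobian J(x*) and
# its eigenvalues: the saddle has an unstable direction, and its stable manifold
# (the separatrix) divides state space into the two basins of attraction, which
# is what makes the local analysis at the saddle globally informative.
```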

Relevance:

30.00%

Publisher:

Abstract:

Cooper, J., Spink, S., Thomas, R. & Urquhart, C. (2005). Evaluation of the Specialist Libraries/Communities of Practice. Report for National Library for Health. Aberystwyth: Department of Information Studies, University of Wales Aberystwyth. Sponsorship: National Library for Health (NLH)

Relevance:

30.00%

Publisher:

Abstract:

In the first part of this paper we reviewed the fingerprint classification literature from two different perspectives: feature extraction and classifier learning. Aiming to answer the question of which of the reviewed methods would perform best in a real implementation, we ended up with a discussion that showed the difficulty of answering it: no previous comparison exists in the literature, and comparisons among papers are made with different experimental frameworks. Moreover, published methods are difficult to implement because of the lack of detail in their descriptions and parameters, and because no source code is shared. For this reason, in this paper we go through a thorough experimental study following the proposed double perspective. In order to do so, we have carefully implemented some of the most relevant feature extraction methods according to the explanations found in the corresponding papers, and we have tested their performance with different classifiers, including the specific proposals made by the original authors. Our aim is to develop an objective experimental study in a common framework, which has not been done before and which can serve as a baseline for future work on the topic. In this way, we not only test their quality but also their reusability by other researchers, and we are able to indicate which proposals could be considered for future developments. Furthermore, we show that combining different feature extraction models in an ensemble can lead to superior performance, significantly improving on the results obtained by the individual models.
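The concluding claim, that combining feature extraction models in an ensemble can outperform the individual models, can be sketched generically as follows. This is not the authors' implementation: the feature matrices below are synthetic stand-ins for the outputs of two different fingerprint feature extractors, and the soft-voting scheme is just one simple way to combine them.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-ins for two different feature-extraction outputs on the same
# prints (e.g., orientation-map features vs. filter-bank features).
n, classes = 600, 5
y = rng.integers(0, classes, n)
X_orient = rng.normal(y[:, None], 1.0, size=(n, 16))   # hypothetical feature set A
X_filter = rng.normal(y[:, None], 1.2, size=(n, 32))   # hypothetical feature set B

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)

# Train one classifier per feature representation and collect class probabilities.
probas = []
for X in (X_orient, X_filter):
    clf = SVC(probability=True, random_state=0).fit(X[idx_tr], y[idx_tr])
    p = clf.predict_proba(X[idx_te])
    probas.append(p)
    acc = (p.argmax(axis=1) == y[idx_te]).mean()
    print(f"individual accuracy: {acc:.3f}")

# Simple soft-voting ensemble: average the class-probability estimates.
ensemble_pred = np.mean(probas, axis=0).argmax(axis=1)
print(f"ensemble accuracy: {(ensemble_pred == y[idx_te]).mean():.3f}")
```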

Relevance:

30.00%

Publisher:

Abstract:

Consideration of how people respond to the question What is this? has suggested new problem frontiers for pattern recognition and information fusion, as well as neural systems that embody the cognitive transformation of declarative information into relational knowledge. In contrast to traditional classification methods, which aim to find the single correct label for each exemplar (This is a car), the new approach discovers rules that embody coherent relationships among labels which would otherwise appear contradictory to a learning system (This is a car, that is a vehicle, over there is a sedan). This talk will describe how an individual who experiences exemplars in real time, with each exemplar trained on at most one category label, can autonomously discover a hierarchy of cognitive rules, thereby converting local information into global knowledge. Computational examples are based on the observation that sensors working at different times, locations, and spatial scales, and experts with different goals, languages, and situations, may produce apparently inconsistent image labels, which are reconciled by implicit underlying relationships that the network’s learning process discovers. The ARTMAP information fusion system can, moreover, integrate multiple separate knowledge hierarchies, by fusing independent domains into a unified structure. In the process, the system discovers cross-domain rules, inferring multilevel relationships among groups of output classes, without any supervised labeling of these relationships. In order to self-organize its expert system, the ARTMAP information fusion network features distributed code representations which exploit the model’s intrinsic capacity for one-to-many learning (This is a car and a vehicle and a sedan) as well as many-to-one learning (Each of those vehicles is a car). Fusion system software, testbed datasets, and articles are available from http://cns.bu.edu/techlab.
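A minimal, non-ARTMAP sketch of the kind of rule the talk describes: if every exemplar labelled "sedan" by one source is also labelled "car" and "vehicle" by others, the apparently inconsistent labels are reconciled by inferring implication (is-a) rules from label co-occurrence. The exemplars and labels below are invented purely for illustration.

```python
from collections import defaultdict

# Each exemplar has been labelled by different sources at different times;
# individually the labels look contradictory, jointly they imply a hierarchy.
observations = [
    ("img1", "sedan"), ("img1", "car"), ("img1", "vehicle"),
    ("img2", "car"),   ("img2", "vehicle"),
    ("img3", "truck"), ("img3", "vehicle"),
    ("img4", "sedan"), ("img4", "car"), ("img4", "vehicle"),
]

exemplars = defaultdict(set)          # label -> set of exemplars carrying it
for item, label in observations:
    exemplars[label].add(item)

# Infer "A implies B" whenever every exemplar labelled A is also labelled B.
rules = [(a, b) for a in exemplars for b in exemplars
         if a != b and exemplars[a] <= exemplars[b]]

for a, b in sorted(rules):
    print(f"rule: {a} -> {b}")   # e.g. sedan -> car, sedan -> vehicle, car -> vehicle
```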

Relevance:

30.00%

Publisher:

Abstract:

Embedded wireless sensor network (WSN) systems have been developed and used in a wide variety of applications, such as local automatic environmental monitoring; medical applications analysing aspects of fitness and health; energy metering and management in the built environment; and traffic pattern analysis and control. While the purposes and functions of embedded wireless sensor networks have a myriad of applications and future possibilities, one particular implementation of these ambient sensors is in the area of wearable electronics incorporated into body area networks and everyday garments. Some of these systems will incorporate inertial sensing devices and other physical and physiological sensors, with a particular focus on the application areas of athlete performance monitoring and e-health. Important physical requirements for wearable antennas are that they are lightweight, small and robust, and that they use materials compatible with a standard manufacturing process, such as flexible polyimide or FR4, where low-cost, consumer-market-oriented products are being produced. The substrate material is required to be low loss and flexible, and often necessitates the use of thin dielectric and metallization layers. This paper describes the development of such a wearable, flexible antenna system for ISM-band wearable wireless sensor networks. The material selected for the development of the wearable system in question is DE104i, characterized by a dielectric constant of 3.8 and a loss tangent of 0.02. The antenna feed line is a 50 Ohm microstrip topology suitable for use with standard, high-performance and low-cost SMA-type RF connector technologies, widely used for these types of applications. The desired centre frequency lies in the 2.4 GHz ISM band, to be compatible with the IEEE 802.15.4 ZigBee communication protocols and the Bluetooth standard, which operate in this band.
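As a rough sanity check on the dimensions such an antenna would take, the standard transmission-line model for a rectangular microstrip patch can be evaluated for the stated substrate permittivity of 3.8 at 2.45 GHz. The rectangular-patch topology, the 2.45 GHz centre value within the ISM band, and the 1.6 mm substrate thickness are assumptions for illustration; the paper does not state them.

```python
import math

c = 299_792_458.0          # speed of light, m/s
f0 = 2.45e9                # assumed centre frequency in the 2.4 GHz ISM band, Hz
eps_r = 3.8                # DE104i relative permittivity (from the paper)
h = 1.6e-3                 # assumed substrate thickness, m (not stated in the paper)

# Standard transmission-line design equations for a rectangular patch.
W = c / (2 * f0) * math.sqrt(2 / (eps_r + 1))                       # patch width
eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h / W)
dL = 0.412 * h * ((eps_eff + 0.3) * (W / h + 0.264)
                  / ((eps_eff - 0.258) * (W / h + 0.8)))            # fringing extension
L = c / (2 * f0 * math.sqrt(eps_eff)) - 2 * dL                      # patch length

print(f"W ≈ {W*1e3:.1f} mm, L ≈ {L*1e3:.1f} mm, eps_eff ≈ {eps_eff:.2f}")
```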

Relevance:

30.00%

Publisher:

Abstract:

Modern information systems (ISs) are becoming increasingly complex. Simultaneously, organizational changes are occurring more often and more rapidly. Therefore, emergent behavior and organic adaptivity are key advantages of ISs. In this paper, a design science research (DSR) question for design-oriented information systems research (DISR) is proposed: Can the application of biomimetic principles to IS design result in the creation of value by innovation? Accordingly, the properties of biological IS are analyzed, and these insights are crystallized into a theoretical framework to address the three major aspects of biomimetic ISs: user experience, information processing, and management cybernetics. On this basis, the research question is elaborated together with a starting point for a research methodology in biomimetic information systems.

Relevance:

30.00%

Publisher:

Abstract:

The growth and proliferation of invasive bacteria in engineered systems is an ongoing problem. While there are a variety of physical and chemical processes to remove and inactivate bacterial pathogens, there are many situations in which these tools are no longer effective or appropriate for the treatment of a microbial target. For example, certain strains of bacteria are becoming resistant to commonly used disinfectants, such as chlorine and UV. Additionally, the overuse of antibiotics has contributed to the spread of antibiotic resistance, and there is concern that wastewater treatment processes are contributing to the spread of antibiotic-resistant bacteria.

Due to the continually evolving nature of bacteria, it is difficult to develop methods for universal bacterial control in a wide range of engineered systems, as many of our treatment processes are static in nature. Still, invasive bacteria are present in many natural and engineered systems where the application of broad-acting disinfectants is impractical, because their use may inhibit the original desired bioprocesses. Therefore, to better control the growth of treatment-resistant bacteria and to address limitations of the current disinfection processes, novel tools that are both specific and adaptable need to be developed and characterized.

In this dissertation, two possible biological disinfection processes were investigated for use in controlling invasive bacteria in engineered systems. First, antisense gene silencing, which is the specific use of oligonucleotides to silence gene expression, was investigated. This work was followed by the investigation of bacteriophages (phages), which are viruses that are specific to bacteria, in engineered systems.


For the antisense gene silencing work, a computational approach was used to quantify the number of off-targets and to determine the effects of off-targets in prokaryotic organisms. For Escherichia coli K-12 MG1655 and Mycobacterium tuberculosis H37Rv, the mean number of off-targets was found to be 15.0 ± 13.2 and 38.2 ± 61.4, respectively, which results in a reduction of greater than 90% in the effective oligonucleotide concentration. It was also demonstrated that there was high variability in the number of off-targets over the length of a gene, but that on average there was no general gene location that could be targeted to reduce off-targets. Therefore, this analysis needs to be performed for each gene in question. It was also demonstrated that the thermodynamic binding energy between the oligonucleotide and the mRNA accounted for 83% of the variation in the silencing efficiency, compared to the number of off-targets, which explained 43% of the variance. This suggests that optimizing thermodynamic parameters must be prioritized over minimizing the number of off-targets. In conclusion, for the antisense work these results suggest that off-target hybrids can account for a greater than 90% reduction in the concentration of the silencing oligonucleotides, and that the effective concentration can be increased through the rational design of silencing targets that minimize off-target hybrids.
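A much-simplified sketch of the off-target counting idea: slide the oligonucleotide's target sequence along a genome and count near-matches within a mismatch tolerance. The dissertation's analysis is hybridization- and thermodynamics-based; the random "genome", the 12-nt site, and the 2-mismatch threshold here are illustrative assumptions only.

```python
import random

random.seed(1)

def count_near_matches(genome: str, site: str, max_mismatch: int = 2) -> int:
    """Count genome positions whose window differs from `site` by at most
    `max_mismatch` bases (a crude proxy for potential off-target hybrids)."""
    k, hits = len(site), 0
    for i in range(len(genome) - k + 1):
        window = genome[i:i + k]
        mismatches = sum(a != b for a, b in zip(window, site))
        if mismatches <= max_mismatch:
            hits += 1
    return hits

# Hypothetical inputs: a random 200 kb "genome" and one 12-nt target site.
genome = "".join(random.choice("ACGT") for _ in range(200_000))
target_site = "ATGGCTAAGCTT"

total = count_near_matches(genome, target_site)
print(f"windows within 2 mismatches of the target: {total}")
# In a real genome, every near-match other than the intended site is a potential
# off-target that dilutes the effective oligonucleotide concentration.
```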

Regarding the work with phages, the disinfection rates of bacteria in the presence of phages were determined. The disinfection rates of E. coli K12 MG1655 in the presence of coliphage Ec2 ranged up to 2 h⁻¹, and were dependent on both the initial phage and bacterial concentrations. Increasing initial phage concentrations resulted in increasing disinfection rates, and generally, increasing initial bacterial concentrations resulted in increasing disinfection rates. However, disinfection rates were found to plateau at higher bacterial and phage concentrations. A multiple linear regression model was used to predict the disinfection rates as a function of the initial phage and bacterial concentrations, and this model was able to explain 93% of the variance in the disinfection rates. The disinfection rates were also modeled with a particle aggregation model. The results from these model simulations suggested that at lower phage and bacterial concentrations there are not enough collisions to support active disinfection, which therefore limits the conditions and systems where phage-based bacterial disinfection is possible. Additionally, the particle aggregation model overpredicted the disinfection rates at higher phage and bacterial concentrations of 10⁸ PFU/mL and 10⁸ CFU/mL, suggesting that other interactions were occurring at these higher concentrations. Overall, this work highlights the need for the development of alternative models to more accurately describe the dynamics of this system at a variety of phage and bacterial concentrations. Finally, the minimum required hydraulic residence time was calculated for a continuous stirred-tank reactor and a plug flow reactor (PFR) as a function of both the initial phage and bacterial concentrations, which suggested that phage treatment in a PFR is theoretically possible.
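The collision-limited picture behind the particle-aggregation argument can be sketched with simple mass-action kinetics: when phage are in excess, attack at rate constant k gives a pseudo-first-order disinfection rate of k·P0, which grows with the initial phage concentration but does not reproduce the plateau observed experimentally. The rate constant and concentrations below are illustrative, not values from the dissertation.

```python
import numpy as np

# Mass-action attack: dB/dt = -k * P * B. With phage in large excess, P ~ P0,
# so B(t) = B0 * exp(-k * P0 * t) and the observed disinfection rate is k * P0.
k = 1e-9            # assumed adsorption rate constant, mL/(PFU * h)
P0 = np.array([1e6, 1e7, 1e8, 1e9])   # initial phage concentrations, PFU/mL
B0 = 1e6            # initial bacterial concentration, CFU/mL (illustrative)

rates = k * P0                      # pseudo-first-order disinfection rates, 1/h
t = 4.0                             # contact time, h
B_t = B0 * np.exp(-rates * t)

for p, r, b in zip(P0, rates, B_t):
    print(f"P0 = {p:.0e} PFU/mL  rate = {r:.2e} 1/h  B(4 h) = {b:.2e} CFU/mL")

# This simple model predicts rates that keep increasing with P0; the plateau and
# over-prediction reported in the abstract around 1e8 PFU/mL and 1e8 CFU/mL are
# exactly where such a collision-only picture breaks down.
```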

In addition to determining disinfection rates, the long-term bacterial growth inhibition potential was determined for a variety of phages with both Gram-negative and Gram-positive bacteria. It was determined that, on average, phages can be used to inhibit bacterial growth for up to 24 h, and that this effect was concentration dependent for various phages at specific time points. Additionally, it was found that a phage cocktail was no more effective at inhibiting bacterial growth over the long term than the best-performing phage in isolation.

Finally, for an industrial application, the use of phages to inhibit invasive Lactobacilli in ethanol fermentations was investigated. It was demonstrated that phage 8014-B2 can achieve a greater than 3-log inactivation of Lactobacillus plantarum during a 48 h fermentation. Additionally, it was shown that phages can be used to protect final product yields and maintain yeast viability. By modeling the fermentation system with differential equations, it was determined that there was a 10 h window at the beginning of the fermentation run during which the addition of phages could protect final product yields, and that after 20 h no additional benefit of phage addition was observed.

In conclusion, this dissertation improved the current methods for designing antisense gene silencing targets for prokaryotic organisms, and characterized phages from an engineering perspective. First, the current design strategy for antisense targets in prokaryotic organisms was improved through the development of an algorithm that minimized the number of off-targets. For the phage work, a framework was developed to predict the disinfection rates in terms of the initial phage and bacterial concentrations. In addition, the long-term bacterial growth inhibition potential of multiple phages was determined for several bacteria. In regard to the phage application, phages were shown to protect both final product yields and yeast concentrations during fermentation. Taken together, this work suggests that the rational design of phage treatment is possible and further work is needed to expand on this foundation.

Relevance:

30.00%

Publisher:

Abstract:

Based on meetings of the Society for Research into Higher Education’s Student Experience Network over the past three years, the genuinely open research question is posed of whether there is one undergraduate student experience within English higher education, or more than one. Answering this question depends on whether what is taught or what is learnt is examined. If the latter, then a unitary student experience can be said to exist only in the narrowest of normative senses. What undergraduates actually learn – defined in the widest sense – is the $64,000 question of research on the student experience. Various ways to answer this question are proposed, including using students to research students. Conceptual tools to apply to the findings can be developed from youth studies and cognate disciplines, particularly in relation to student identities and aspirations. Lastly, these proposals are placed in the wider context of comparative models of the varieties of student experience, including those emerging in the UK’s national regions.

Relevance:

30.00%

Publisher:

Abstract:

This paper analyses the interaction between neoliberal-inspired reforms of public services and the mechanisms for achieving public accountability. Where once accountability was exercised through the ballot box, in the neoliberal age managerial and market-based forms of accountability now predominate. The analysis identifies resistance from civil society campaigns to the neoliberal restructuring of public services, which leads to public accountability (PA) becoming a contested arena. To develop this analysis, a re-theorisation of PA as a relationship in which civil society seeks to control the state is explored in the context of social housing in England over the past thirty years. Central to this analysis is a dialogical analysis of key documents from a social housing regulator and a civil society campaign. The analysis shows that current PA practices are an outcome of both reforms from the government and resistance from civil society (in the shape of tenants’ campaigns). The outcome is to tell the story of the changes in PA (and accountability), centring on an analysis of discourse. Thus, the paper moves towards answering the question – what has happened to PA during the neoliberal age?

Relevance:

30.00%

Publisher:

Abstract:

Multiuser diversity (MUDiv) is one of the central concepts in multiuser (MU) systems. In particular, MUDiv allows for scheduling among users in order to eliminate the negative effects of unfavorable channel fading conditions of some users on the system performance. Scheduling, however, consumes energy (e.g., for making users' channel state information available to the scheduler). This extra usage of energy, which could potentially be used for data transmission, can be very wasteful, especially if the number of users is large. In this paper, we answer the question of how much MUDiv is required for energy-limited MU systems. Focusing on uplink MU wireless systems, we develop MU scheduling algorithms which aim at maximizing the MUDiv gain. Toward this end, we introduce a new realistic energy model which accounts for scheduling energy and describes the distribution of the total energy between the scheduling and data transmission stages. Using the fact that such energy distribution can be controlled by varying the number of active users, we optimize this number by either i) minimizing the overall system bit error rate (BER) for a fixed total energy of all users in the system or ii) minimizing the total energy of all users for fixed BER requirements. We find that for a fixed number of available users, the achievable MUDiv gain can be improved by activating only a subset of users. Using asymptotic analysis and numerical simulations, we show that our approach benefits from MUDiv gains higher than those achievable by the generic greedy access algorithm, which is the optimal scheduling method for energy-unlimited systems. © 2010 IEEE.
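A toy Monte Carlo sketch of the trade-off the paper optimizes: activating more users improves the best channel seen by the scheduler (multiuser diversity) but spends more of a fixed energy budget on scheduling, so an intermediate number of active users minimizes BER. The per-user scheduling cost, noise level, Rayleigh fading model, and coherent-BPSK error expression are generic illustrative assumptions, not the paper's system model.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

N_total = 32          # users available
E_total = 1.0         # fixed total energy budget (normalized)
e_sched = 0.03        # assumed scheduling energy cost per active user
noise = 0.25          # assumed noise power (normalized)
trials = 200_000

for M in (1, 2, 4, 8, 16, 32):
    E_data = E_total - M * e_sched            # energy left for data transmission
    if E_data <= 0:
        continue
    # Rayleigh fading: power gains are exponential; the scheduler picks the best
    # of the M active users in each trial.
    gains = rng.exponential(1.0, size=(trials, M)).max(axis=1)
    snr = E_data * gains / noise
    ber = norm.sf(np.sqrt(2 * snr)).mean()    # coherent BPSK: Q(sqrt(2*SNR))
    print(f"M = {M:2d}  E_data = {E_data:.2f}  avg BER = {ber:.3e}")

# Too few active users forfeits the diversity gain, too many drains the budget
# on scheduling; the lowest BER typically appears at an intermediate subset size.
```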

Relevance:

30.00%

Publisher:

Abstract:

Objectives: Health policy directs the management of patients with chronic disease in a country, but evaluating nationwide policies is difficult, not least because of the absence of suitable comparators. This paper examines the management of patients with type 2 diabetes in two demographically comparable populations with different health care systems to see if this represents a viable approach to evaluation.

Methods: A secondary analysis of centralized prescribing databases for 2010 was undertaken to compare the levels and costs of care of patients with type 2 diabetes in Northern Ireland’s National Health Service (NHS) (NI, n = 1.8 million), which has structured care, financial incentives related to diabetes care and an emphasis on generic prescribing, with that of the Republic of Ireland (ROI, n = 4.3 million), where management of diabetes care is guided solely by clinical and other guidelines.

Results: The prevalence of treated type 2 diabetes was 3.59% in NI and 3.09% in ROI, but there were similar and high levels of prescribing of secondary cardiovascular medications. Medication costs per person for anti-diabetic, anti-obesity and cardiovascular medication were 46% higher in ROI than NI, due to differences in levels of generic prescribing.

Conclusions: These different health care systems appear to be producing similar levels of care for patients with type 2 diabetes, although at different levels of cost. The findings question the need for financial incentives in NI and highlight the large cost savings potentially accruing from a greater shift to generic prescribing in ROI. Cross-country comparison, though not without difficulties, may prove a useful adjunct to within-country analysis of policy impact.

Relevance:

30.00%

Publisher:

Abstract:

Purpose
This article aims to analyze the role of performance management systems (PMS) in supporting public value strategies.

Design/methodology/approach
This article draws on the public value dynamic model by Horner and Hutton (2010). It presents the results of a case study of implementation of a PMS model, the ‘Value Pyramid’ (VP).

Findings
The results stress the need for an improved conceptualization of PMS within public value strategy. Through experimentation using the VP, the case site was able to measure and visualize what it considered public value and reflect on the internal/external causes of both creation and destruction of public value.

Research limitations/implications
This article is limited to just one case study, although in-depth and longitudinal.

Originality/value
This article is one of the first attempting to understand the role of PMS within the public value strategy framework, answering the call of Benington and Moore (2010) to consider public value from an accounting perspective.

Relevance:

30.00%

Publisher:

Abstract:

Does bound entanglement naturally appear in quantum many-body systems? We address this question by showing the existence of bound-entangled thermal states for harmonic oscillator systems consisting of an arbitrary number of particles. By explicit calculations of the negativity for different partitions, we find a range of temperatures for which no entanglement can be distilled by means of local operations, despite the system being globally entangled. We offer an interpretation of this result in terms of entanglement-area laws, typical of these systems. Finally, we discuss generalizations of this result to other systems, including spin chains.
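As background for the quantity computed in the abstract, the standard textbook definitions of negativity and logarithmic negativity with respect to a bipartition A|B are given below; these are not expressions taken from the paper.

```latex
% Negativity and logarithmic negativity with respect to a bipartition A|B,
% defined via the trace norm of the partial transpose:
\mathcal{N}(\rho) \;=\; \frac{\lVert \rho^{T_A} \rVert_1 - 1}{2},
\qquad
E_{\mathcal{N}}(\rho) \;=\; \log_2 \lVert \rho^{T_A} \rVert_1 .
```

For Gaussian states these quantities can be evaluated from the symplectic eigenvalues of the partially transposed covariance matrix. A bipartition with positive partial transpose (zero negativity) admits no entanglement distillation across that cut, which is what makes a globally entangled state with this property bound entangled.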