25 results for Many-to-many assignment problem

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance:

100.00%

Publisher:

Abstract:

Many theoretical claims in the current literature about the folk concept of moral responsibility are indeterminate, because researchers do not clearly specify which folk concept of moral responsibility is in question. The article pursues a cognitive approach to folk concepts that pays special attention to this indeterminacy problem. After addressing the problem, the article provides evidence on folk attributions of moral responsibility in the case of a failed attempt to kill that goes against a specific claim in the current literature: that the dimension of causation is part of the structure of the folk concept of moral responsibility.

Abstract:

Theoretical and experimental values to date for the resistances of single molecules commonly disagree by orders of magnitude. By reformulating the transport problem using boundary conditions suitable for correlated many-electron systems, we approach electron transport across molecules from a new standpoint. Application of our correlated formalism to benzene-dithiol gives current-voltage characteristics close to experimental observations. The method can solve the open system quantum many-body problem accurately, treats spin exactly, and is valid beyond the linear response regime.

Abstract:

A computationally affordable method for introducing correlations between electrons and ions is described. The central assumption is that the ionic wavefunctions are narrow, which makes possible a moment expansion for the full density matrix. To make the problem tractable we reduce the remaining many-electron problem to a single-electron problem by performing a trace over all electronic degrees of freedom except one. This introduces both one- and two-electron quantities into the equations of motion. Quantities depending on more than one electron are removed by making a Hartree-Fock approximation. Using the first-moment approximation, we perform a number of tight-binding simulations of the effect of an electric current on a mobile atom. The classical contribution to the ionic kinetic energy exhibits cooling and is independent of the bias. The quantum contribution exhibits strong heating, with the heating rate proportional to the bias. However, increased scattering of electrons with increasing ionic kinetic energy is not observed. This effect requires the introduction of the second moment.

Abstract:

Recently, a single-symbol decodable transmit strategy based on preprocessing at the transmitter has been introduced to decouple quasi-orthogonal space-time block codes (QOSTBC) with reduced complexity at the receiver [9]. Unfortunately, it does not achieve full diversity and thus suffers a significant performance loss. To tackle this problem, we propose a full-diversity scheme with four transmit antennas in this letter. The proposed code is based on a class of restricted full-rank single-symbol decodable designs (RFSDD) and shares many characteristics with coordinate-interleaved orthogonal designs (CIODs), but with a lower peak-to-average ratio (PAR).

Abstract:

Resistance to antimicrobial agents undermines our ability to treat bacterial infections. It attracts intense media and political interest and impacts on personal health and costs to health infrastructures. Bacteria have developed resistance to all licensed antibacterial agents, and their ability to become resistant to unlicensed agents is often demonstrated during the development process. Conventional approaches to antimicrobial development, involving modification of existing agents or production of synthetic derivatives, are unlikely to deliver the range or type of drugs that will be needed to meet all future requirements. Although many companies are seeking novel targets, further radical approaches to both antimicrobial design and the reversal of resistance are now urgently required. In this article, we discuss ‘antisense’ (or ‘antigene’) strategies to inhibit resistance mechanisms at the genetic level. These offer an innovative approach to a global problem and could be used to restore the efficacy of clinically proven agents. Moreover, this strategy has the potential to overcome critical resistances, not only in the so-called ‘superbugs’ (methicillin-resistant Staphylococcus aureus, glycopeptide-resistant enterococci and multidrug-resistant strains of Acinetobacter baumannii, and Pseudomonas aeruginosa), but in resistant strains of any bacterial species.

Abstract:

In conditional probabilistic logic programming, given a query, the two most common forms of answer are either a probability interval or a precise probability obtained by using the maximum entropy principle. The former can be noninformative (e.g., the interval [0, 1]) and the reliability of the latter is questionable when the prior knowledge is imprecise. To address this problem, in this paper we propose methods to quantitatively measure whether a probability interval or a single probability is sufficient for answering a query. We first propose an approach to measuring the ignorance of a probabilistic logic program with respect to a query. The measure of ignorance (w.r.t. a query) reflects how reliable a precise probability for the query can be, and a high value of ignorance suggests that a single probability is not suitable for the query. We then propose a method to measure the probability that the exact probability of a query falls in a given interval, i.e., a second-order probability. We call it the degree of satisfaction. If the degree of satisfaction is high enough w.r.t. the query, then the given interval can be accepted as the answer to the query. We also prove that our measures satisfy many desirable properties, and we use a case study to demonstrate their significance. © Springer Science+Business Media B.V. 2012
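The degree of satisfaction described above is, in effect, a second-order probability. A minimal sketch of the idea (not the paper's actual measure: here the feasible set of probabilities is collapsed to a single interval with an assumed uniform second-order distribution, purely for illustration):

```python
import random

def degree_of_satisfaction(feasible, query, n_samples=100_000, seed=0):
    """Toy estimate of a 'degree of satisfaction': the chance that the
    exact query probability, assumed uniform over the feasible interval,
    falls inside the candidate answer interval."""
    rng = random.Random(seed)
    lo, hi = feasible
    qlo, qhi = query
    hits = sum(1 for _ in range(n_samples)
               if qlo <= rng.uniform(lo, hi) <= qhi)
    return hits / n_samples
```

For a maximally ignorant program (feasible interval [0, 1]), the interval [0.25, 0.75] is satisfied with probability about 0.5, while any interval covering the whole feasible set is satisfied with probability 1.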

Abstract:

African coastal regions are expected to experience the highest rates of population growth in coming decades. Fresh groundwater resources in the coastal zone of East Africa (EA) are highly vulnerable to seawater intrusion. Increasing water demand is leading to unsustainable and ill-planned well drilling and abstraction. Wells supplying domestic, industrial and agricultural needs have, in many areas, become too saline for use. Climate change, including weather changes and sea level rise, is expected to exacerbate this problem. The multiplicity of physical, demographic and socio-economic driving factors makes this a very challenging issue for management. At present the state and probable evolution of coastal aquifers in EA are not well documented. The UPGro project 'Towards groundwater security in coastal East Africa' brings together teams from Kenya, Tanzania, Comoros Islands and Europe to address this knowledge gap. An integrative multidisciplinary approach, combining the expertise of hydrogeologists, hydrologists and social scientists, is investigating selected sites along the coastal zone in each country. Hydrogeologic observatories have been established in different geologic and climatic settings representative of the coastal EA region, where focussed research will establish the current status of groundwater and identify future threats based on projected demographic and climate change scenarios. Researchers are also engaging with end users, local communities and stakeholder groups in each area in order to understand the issues most affecting the communities and to search for sustainable strategies to address them.

Abstract:

Why did banking compliance fail so badly in the recent financial crisis and why, according to many, does it continue to do so? Rather than point to the lack of oversight of individuals in bank compliance roles, as many commentators do, in this paper I examine in depth the organizational context that surrounded people in such roles. I focus on those compliance personnel who did speak out about risky practices in their banks, who were forced to escalate the problem and 'whistle-blow' to external parties, and who were punished for doing so. Drawing on recent empirical data from a wider study, I argue that the concept of dependence corruption is useful in this setting, and that it can be extended to encompass interpersonal attachments. This, in turn, problematises the concept of dependence corruption because interpersonal attachments in organisational settings are inevitable. The paper engages with recent debates on whether institutional corruption is an appropriate lens for studying private-sector organisations by arguing for a focus on roles, rather than remaining at the level of institutional fields or individual organisations. Finally, the paper contributes to studies on banking compliance in the context of the recent crisis; without a deeper understanding of those who were forced to extremes to simply do their jobs, reform of the banking sector will prove difficult.

Abstract:

This paper addresses the problem of learning Bayesian network structures from data based on score functions that are decomposable. It describes properties that strongly reduce the time and memory costs of many known methods without losing global optimality guarantees. These properties are derived for different score criteria such as Minimum Description Length (or Bayesian Information Criterion), Akaike Information Criterion and Bayesian Dirichlet Criterion. Then a branch-and-bound algorithm is presented that integrates structural constraints with data in a way to guarantee global optimality. As an example, structural constraints are used to map the problem of structure learning in Dynamic Bayesian networks into a corresponding augmented Bayesian network. Finally, we show empirically the benefits of using the properties with state-of-the-art methods and with the new algorithm, which is able to handle larger data sets than before.
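Decomposability is what makes such savings possible: the score of a whole network is a sum of per-node terms, each depending only on that node and its parent set, so local scores can be computed once and reused across candidate structures. A minimal sketch for the BIC case (the dict-based data layout and function names are illustrative, not the paper's implementation):

```python
import math
from collections import Counter

def local_bic(data, child, parents):
    """Local BIC score of one node given its parent set: the conditional
    log-likelihood of the child minus a complexity penalty. Decomposability
    means the network score is simply the sum of these local terms."""
    n = len(data)
    counts = Counter((tuple(row[p] for p in parents), row[child]) for row in data)
    parent_counts = Counter(tuple(row[p] for p in parents) for row in data)
    ll = sum(c * math.log(c / parent_counts[pa]) for (pa, _), c in counts.items())
    child_states = len({row[child] for row in data})
    q = max(len(parent_counts), 1)              # observed parent configurations
    penalty = 0.5 * math.log(n) * q * (child_states - 1)
    return ll - penalty

def network_bic(data, structure):
    """structure: dict mapping each node to a tuple of its parents."""
    return sum(local_bic(data, c, ps) for c, ps in structure.items())
```

A branch-and-bound search over structures can then bound the attainable score of a partial network by combining the best local score of each remaining node, which is what enables pruning without losing global optimality.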

Abstract:

Acid whey has become a major concern, especially in the dairy industry manufacturing Greek yoghurt. Proper disposal of acid whey is essential, as it not only increases the BOD of water but also increases the acidity when disposed of in landfill, rendering soil barren and unsuitable for cultivation. Effluent (acid-whey) treatment increases the cost of production, and the vast quantities of acid whey produced by the dairy industry make treatment and safe disposal of effluent very difficult. Hence an economical way to handle this problem is very important. Biogenic glycine betaine and trehalose have many applications in the food and confectionery industry, medicine, the bioprocess industry, agriculture, genetic engineering and animal feeds, so their production is of industrial importance. Here we used the extreme, obligate halophile Actinopolyspora halophila (MTCC 263) for fermentative production of glycine betaine and trehalose from acid whey. Maximum yields were obtained by implementing a sequential media optimization process, identifying and adding rate-limiting enzyme cofactors via a bioinformatics approach, and manipulating the nitrogen substrate supply. The implications of using glycine as a precursor were also investigated. The core factors affecting production were identified and then optimized using an orthogonal array design followed by response surface methodology. The maximum production achieved after complete optimization was 9.07 ± 0.25 g/L and 2.49 ± 0.14 g/L for glycine betaine and trehalose, respectively.
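The response-surface step fits a low-order polynomial to the observed yields and reads the optimum off the fitted surface. A one-factor sketch (the data points and factor are invented for illustration; the actual study optimized several media factors jointly):

```python
import numpy as np

def fit_quadratic_surface(x, y):
    """Fit y ~ b0 + b1*x + b2*x**2 by least squares (a one-factor
    response-surface sketch) and return the coefficients together with
    the stationary point -b1 / (2*b2) of the fitted parabola."""
    X = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    b0, b1, b2 = coef
    return coef, -b1 / (2 * b2)
```

With yields that peak at an interior factor level, the stationary point of the fitted quadratic recovers the level predicted to maximize production; in practice this is checked against a confirmation run.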

Abstract:

Efficient identification and follow-up of astronomical transients is hindered by the need for humans to manually select promising candidates from data streams that contain many false positives. These artefacts arise in the difference images that are produced by most major ground-based time-domain surveys with large format CCD cameras. This dependence on humans to reject bogus detections is unsustainable for next generation all-sky surveys and significant effort is now being invested to solve the problem computationally. In this paper, we explore a simple machine learning approach to real-bogus classification by constructing a training set from the image data of ~32 000 real astrophysical transients and bogus detections from the Pan-STARRS1 Medium Deep Survey. We derive our feature representation from the pixel intensity values of a 20 × 20 pixel stamp around the centre of the candidates. This differs from previous work in that it works directly on the pixels rather than catalogued domain knowledge for feature design or selection. Three machine learning algorithms are trained (artificial neural networks, support vector machines and random forests) and their performances are tested on a held-out subset of 25 per cent of the training data. We find the best results from the random forest classifier and demonstrate that by accepting a false positive rate of 1 per cent, the classifier initially suggests a missed detection rate of around 10 per cent. However, we also find that a combination of bright star variability, nuclear transients and uncertainty in human labelling means that our best estimate of the missed detection rate is approximately 6 per cent.
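The pipeline described above, raw pixel intensities as features feeding a random forest, with the decision threshold set to a 1 per cent false positive rate, can be sketched as follows on synthetic stamps (the blob model and all parameters are made up; this is not the Pan-STARRS1 data or the paper's trained classifier):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

# Synthetic stand-in for 20x20 difference-image stamps: "real" transients
# get a bright central blob, "bogus" detections are pure noise.
def make_stamps(n, real):
    stamps = rng.normal(0.0, 1.0, size=(n, 20, 20))
    if real:
        stamps[:, 8:12, 8:12] += 5.0  # central point source
    return stamps.reshape(n, -1)      # feature vector = raw pixel values

X = np.vstack([make_stamps(200, True), make_stamps(200, False)])
y = np.array([1] * 200 + [0] * 200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
scores = clf.predict_proba(X)[:, 1]   # real-bogus score in [0, 1]

# Choose the decision threshold that keeps the false positive rate at ~1%.
threshold = np.quantile(scores[y == 0], 0.99)
missed = np.mean(scores[y == 1] <= threshold)
```

On real survey data the classes overlap far more than in this toy, which is why thresholding at a 1 per cent false positive rate still leaves a non-trivial missed detection rate.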

Abstract:

As modern power grids move towards becoming a smart grid, there is an increasing reliance on the data that is transmitted and processed by ICT systems. This reliance introduces new digital attack vectors. Many of the proposed approaches that aim to address this problem largely focus on applying well-known ICT security solutions. However, what is needed are approaches that meet the complex concerns of the smart grid as a cyber-physical system. Furthermore, to support the automatic control loops that exist in a power grid, similarly automatic security and resilience mechanisms are needed that rely on minimal operator intervention. The research proposed in this paper aims to develop a framework that ensures resilient smart grid operation in light of successful cyber-attacks.

Abstract:

The poor educational outcomes of children in care are a significant concern internationally. Whilst there have been many interventions developed to address this problem, very few of these have been rigorously evaluated. This article presents the findings of a randomised controlled trial that sought to measure the effectiveness of a book-gifting programme (the Letterbox Club) that aims to improve literacy skills amongst children aged 7-11 years in foster care. The programme involves children receiving six parcels of books sent through the post over a six-month period. The trial, which ran between April 2013 and June 2014, involved a sample of 116 children in Northern Ireland (56 randomly allocated to the intervention group and 60 to a waiting list control group). Outcome measures focused on reading skills (reading accuracy, comprehension and rate) and attitudes to reading and school. The trial found no evidence that the book-gifting programme had any effect on any of the outcomes measured. Drawing upon some of the emergent themes from the accompanying qualitative process evaluation that sought to determine foster carer/child attitudes towards and engagement with the parcels, it is suggested that one plausible reason for the ineffectiveness of the Letterbox Club, as intimated by carers and children (rather than explicitly explored with them), is the lack of support provided to the carers/children in relation to the packs received. Reflecting an ecological model of children’s development, it is recommended that for book-gifting programmes to be effective they need to include a focus on encouraging the direct involvement of foster carers in shared literacy activities with the children using the books that are gifted.

Abstract:

Development of reliable methods for optimised energy storage and generation is one of the most imminent challenges in modern power systems. In this paper an adaptive approach to the load-levelling problem is proposed, using novel dynamic models based on Volterra integral equations of the first kind with piecewise continuous kernels. These integral equations efficiently solve the inverse problem, taking into account both the time-dependent efficiencies and the availability of generation/storage for each energy storage technology. In this analysis a direct numerical method is employed to find the least-cost dispatch of the available storages. The proposed collocation-type numerical method has second-order accuracy and enjoys self-regularization properties associated with confidence levels of system demand. This adaptive approach is suitable for energy storage optimisation in real time. The efficiency of the proposed methodology is demonstrated on the Single Electricity Market of the Republic of Ireland and Northern Ireland.
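A first-kind Volterra equation discretised by collocation yields a lower-triangular linear system, so the unknown storage schedule can be recovered step by step. A minimal midpoint-rule sketch (the constant kernel, right-hand side, horizon and grid size are toy choices, not the paper's piecewise-continuous kernels or its regularized scheme):

```python
import numpy as np

def solve_volterra_first_kind(K, f, T, n):
    """Solve the first-kind Volterra equation  int_0^t K(t,s) x(s) ds = f(t)
    on [0, T] by midpoint-rule collocation. The discretised system is
    lower triangular, so x is recovered by a triangular solve."""
    h = T / n
    s = (np.arange(n) + 0.5) * h          # quadrature midpoints
    t = (np.arange(n) + 1.0) * h          # collocation points
    A = np.array([[h * K(t[i], s[j]) if j <= i else 0.0
                   for j in range(n)] for i in range(n)])
    return s, np.linalg.solve(A, f(t))
```

For K(t, s) = 1 and f(t) = t the exact solution is x(s) = 1, which the midpoint scheme reproduces; in the load-levelling setting x would be the dispatch profile and K would encode time-dependent storage efficiencies.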