59 results for FITS
Abstract:
The conventional wisdom in the transitional justice literature is that there is no one-size-fits-all approach. This article suggests that this may also be true within a given state. The current paper reports on quantitative and qualitative data from 184 participants in a survey conducted on the Caribbean coast of Colombia. Results suggest widespread support for transitional justice mechanisms – such as perpetrator accountability, public acknowledgement and structural change – but dissatisfaction with national-level initiatives, specifically the 2005 Justice and Peace Law. Yet, despite distrust of the national government and protracted conflict, individuals report social trust, community cohesion and reliance on local government institutions. These attitudes and behaviours suggest that decentralised transitional justice mechanisms may be more effective in meeting victims' needs. Moreover, analyses indicate that individual preferences are influenced by community factors, such as the presence of demobilised paramilitaries, which can be addressed through more localised approaches to promote peacebuilding. The paper concludes with best practices derived from the findings.
Abstract:
How can we correlate the neural activity in the human brain, as it responds to typed words, with properties of those words (such as ‘edible’ or ‘fits in hand’)? In short, we want to find latent variables that jointly explain both the brain activity and the behavioral responses. This is one of many settings of the Coupled Matrix-Tensor Factorization (CMTF) problem.
Can we accelerate any CMTF solver so that it runs in a few minutes instead of tens of hours to a day, while maintaining good accuracy? We introduce Turbo-SMT, a meta-method capable of doing exactly that: it boosts the performance of any CMTF algorithm by up to 200x, along with an up to 65-fold increase in sparsity, at accuracy comparable to the baseline.
We apply Turbo-SMT to BrainQ, a dataset consisting of a (nouns, brain voxels, human subjects) tensor and a (nouns, properties) matrix, with coupling along the nouns dimension. Turbo-SMT is able to find meaningful latent variables, as well as to predict brain activity with competitive accuracy.
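To make the CMTF setting concrete, below is a minimal alternating-least-squares sketch of a plain (baseline) coupled matrix-tensor factorization in NumPy: a third-order tensor and a matrix share their first-mode factor, mirroring the coupling along the nouns dimension described above. This is an illustrative baseline only, not the Turbo-SMT method or the authors' code; the function names, shapes, and synthetic data are all placeholders.

```python
import numpy as np

def khatri_rao(U, V):
    """Column-wise Kronecker product of U (p x R) and V (q x R) -> (p*q x R)."""
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

def cmtf_als(X, Y, rank, n_iter=200, seed=0):
    """Plain ALS for a coupled matrix-tensor factorization.

    X : (I, J, K) tensor, approximated by sum_r a_r o b_r o c_r
    Y : (I, M) matrix, approximated by A @ D.T
    The factor A is shared between X and Y (coupled first mode).
    """
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    M = Y.shape[1]
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    D = rng.standard_normal((M, rank))

    # Mode-n unfoldings (C-order reshapes, consistent with khatri_rao above).
    X1 = X.reshape(I, J * K)
    X2 = np.moveaxis(X, 1, 0).reshape(J, I * K)
    X3 = np.moveaxis(X, 2, 0).reshape(K, I * J)

    for _ in range(n_iter):
        # The shared factor A sees both the tensor unfolding and the matrix.
        G = np.concatenate([khatri_rao(B, C), D], axis=0)
        A = np.concatenate([X1, Y], axis=1) @ np.linalg.pinv(G.T)
        B = X2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = X3 @ np.linalg.pinv(khatri_rao(A, B).T)
        D = Y.T @ np.linalg.pinv(A.T)
    return A, B, C, D

# Toy data with the same coupling structure (shapes are placeholders).
rng = np.random.default_rng(1)
A0, B0, C0, D0 = (rng.standard_normal((n, 3)) for n in (60, 50, 9, 20))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
Y = A0 @ D0.T
A, B, C, D = cmtf_als(X, Y, rank=3)
print("tensor rel. error:", np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - X) / np.linalg.norm(X))
print("matrix rel. error:", np.linalg.norm(A @ D.T - Y) / np.linalg.norm(Y))
```

Turbo-SMT is described as a meta-method that wraps any CMTF solver, which is why the baseline above is left deliberately generic.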
Abstract:
The comparator account holds that processes of motor prediction contribute to the sense of agency by attenuating incoming sensory information and that disruptions to this process contribute to misattributions of agency in schizophrenia. Over the last 25 years this simple and powerful model has gained widespread support not only as it relates to bodily actions but also as an account of misattributions of agency for inner speech, potentially explaining the etiology of auditory verbal hallucination (AVH). In this paper we provide a detailed analysis of the traditional comparator account for inner speech, pointing out serious problems with the specification of inner speech on which it is based and highlighting inconsistencies in the interpretation of the electrophysiological evidence commonly cited in its favor. In light of these analyses we propose a new comparator account of misattributed inner speech. The new account follows leading models of motor imagery in proposing that inner speech is not attenuated by motor prediction, but rather derived directly from it. We describe how failures of motor prediction would therefore directly affect the phenomenology of inner speech and trigger a mismatch in the comparison between motor prediction and motor intention, contributing to abnormal feelings of agency. We argue that the new account fits with the emerging phenomenological evidence that AVHs are both distinct from ordinary inner speech and heterogeneous. Finally, we explore the possibility that the new comparator account may extend to explain disruptions across a range of imagistic modalities, and outline avenues for future research.
Abstract:
Sylvia Townsend Warner was born in 1893 in Harrow and died in Dorset in 1978. Her writing career was both productive and diverse, spanning poems, short stories, novels, music reviews, a biography, translations of Proust, and a guide to Somerset. But this list, impressive as it is, does not do justice to the idiosyncrasy and heterogeneity of her work. While she is well known mostly for the seven novels she published, those works are all radically different in style and content. Indeed, Townsend Warner's singularity has, it could be argued, made it difficult to place her in the various fields and sub-fields of 20th-century literary studies. She shares as many similarities as differences with the high modernists who dominated the literary landscape of the interwar period. Likewise she fits, yet also resists, the more recent formulations of intermodernist and middlebrow scholarship that have attempted to interrogate and expand the horizons of mid-20th century literature.
Abstract:
BACKGROUND: The past three decades have seen rapid improvements in the diagnosis and treatment of most cancers and the most important contributor has been research. Progress in rare cancers has been slower, not least because of the challenges of undertaking research.
SETTINGS: The International Rare Cancers Initiative (IRCI) is a partnership which aims to stimulate and facilitate the development of international clinical trials for patients with rare cancers. It is focused on interventional (usually randomized) clinical trials with the clear goal of improving outcomes for patients. The key challenges are organisational and methodological. A multi-disciplinary workshop to review the methods used in IRCI portfolio trials was held in Amsterdam in September 2013. Other as-yet unrealised methods were also discussed.
RESULTS: The IRCI trials are each presented to exemplify possible approaches to designing credible trials in rare cancers. Researchers may consider these for use in future trials and understand the choices made for each design.
INTERPRETATION: Trials can be designed using a wide array of possibilities. There is no 'one size fits all' solution. To make progress in rare diseases, decisions to change practice will have to be based on less direct evidence from clinical trials than in more common diseases.
Abstract:
The A-level Mathematics qualification is based on a compulsory set of pure maths modules and a selection of applied maths modules with the pure maths representing two thirds of the assessment. The applied maths section includes mechanics, statistics and (sometimes) decision maths. A combination of mechanics and statistics tends to be the most popular choice by far. The current study aims to understand how maths teachers in secondary education make decisions regarding the curriculum options and offers useful insight to those currently designing the new A-level specifications.
Semi-structured interviews were conducted with A-level maths teachers representing 27 grammar schools across Northern Ireland. Teachers were generally in agreement regarding the importance of pure maths and the balance between pure and applied within the A-level maths curriculum. A wide variety of opinions existed concerning the applied options. While many believe that the basic mechanics-statistics (M1-S1) combination is most accessible, it was also noted that the M1-M2 combination fits neatly alongside A-level physics. Lack of resources, timetabling constraints and competition with other subjects in the curriculum hinder uptake of A-level Further Maths.
Teachers are very conscious of the need to obtain high grades to benefit both their pupils and the school’s reputation. The move to a linear assessment system in England while Northern Ireland retains the modular system is likely to cause some schools to review their choice of exam board, although there is disagreement as to whether a modular or linear system is more advantageous for pupils. The upcoming change in the specification also offers an opportunity to refresh the assessment and reduce the number of leading questions. However, teachers note that there are serious issues with GCSE maths and that these have implications for A-level.
Abstract:
The fabrication and electrical characterization of Schottky junction diodes have been extensively researched for three-quarters of a century since the original work of Schottky in 1938. This study breaks from the highly standardized regime of such research and provides an alternative methodology that prompts novel, more efficient applications of the adroit Schottky junction in areas such as chemical and thermal sensing. The core departure from the standard Schottky diode configuration is that the metal electrode is of comparable or higher resistance than the underlying semiconductor. Further, complete electrical characterization is accomplished by recording four-probe resistance-temperature (R_D-T) characteristics of the device, where electrical sourcing and sensing is done only via the metal electrode and not directly through the semiconductor. Importantly, this results in probing a nominally unbiased junction while eliminating the need for an Ohmic contact to the semiconductor. The characteristic R_D-T plot shows two distinct regions of high (metal) and low (semiconductor) resistance at low and high temperatures, respectively, connected by a crossover region of width ΔT, within which there is a large negative temperature coefficient of resistance. The R_D-T characteristic is highly sensitive to the Schottky barrier height; consequently, at a fixed temperature, R_D responds appreciably to small changes in barrier height such as that induced by absorption of a chemical species (e.g., H2) at the interface. A theoretical model is developed to simulate the R_D-T data and applied to Pd/p-Si and Pt/p-Si Schottky diodes with a range of metal electrode resistances. The analysis gives near-perfect fits to the experimental R_D-T characteristics, yielding the junction properties as fit parameters. The modelling not only helps elucidate the underlying physics but also helps in understanding the parameter space essential for the discussed applications. Although the primary regime of application is limited to a relatively narrow temperature range (ΔT) for a given type of diode, the alternative methodology is of universal applicability to all metal-semiconductor combinations forming Schottky contacts.
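The theoretical model referenced in the abstract is not reproduced here. Purely as an illustrative sketch of how such an R_D-T crossover can be fitted, the toy model below treats the measured resistance as a metal electrode in parallel with a junction-limited semiconductor path whose resistance falls roughly as exp(qΦ_B/kT); every parameter value and the synthetic data are assumptions, not the paper's results.

```python
import numpy as np
from scipy.optimize import curve_fit

K_B = 8.617e-5  # Boltzmann constant in eV/K

def r_toy(T, R_m0, alpha, R_s, R_j0, phi_b):
    """Toy two-channel resistance: metal electrode in parallel with a
    junction-limited (thermionic-emission-like) semiconductor path."""
    R_metal = R_m0 * (1.0 + alpha * (T - 300.0))    # metal: positive TCR
    R_junction = R_j0 * np.exp(phi_b / (K_B * T))   # falls steeply with T
    R_path = R_junction + R_s                       # junction + bulk semiconductor
    return R_metal * R_path / (R_metal + R_path)    # parallel combination

# Synthetic stand-in for a measured R_D-T characteristic (not real data).
T = np.linspace(100.0, 400.0, 60)
true_params = (200.0, 2e-3, 20.0, 1e-3, 0.25)       # R_m0, alpha, R_s, R_j0, phi_B [eV]
noise = 1.0 + 0.01 * np.random.default_rng(1).standard_normal(T.size)
R_meas = r_toy(T, *true_params) * noise

popt, _ = curve_fit(
    r_toy, T, R_meas,
    p0=(150.0, 1e-3, 10.0, 1e-2, 0.3),
    bounds=(0.0, [1e4, 0.1, 1e3, 1.0, 1.0]),
    maxfev=20000,
)
print("fitted barrier height (eV):", popt[-1])
```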
Abstract:
The increasingly popular disrupted Langmuir–adsorption (DLA) kinetic model of photocatalysis does not contain an explicit function for the dependence of rate on the irradiance, ρ, but instead has a term αρ^θ, where α is a constant of the system and θ is a constant equal to 1 or 0.5 at low or high ρ values, respectively. Several groups have recently replaced the latter term with an explicit function of the form χ1(−1 + (1 + χ2ρ)^(1/2)), where χ1 and χ2 are constants that can be related to a proposed reaction scheme. Here the latter schemes are investigated and revised to create a more credible form by assuming an additional hole-trapping step. The latter may be the oxidation of water or a surface saturated with O2−. Importantly, this revision suggests that it is only applicable to low quantum yield/efficiency processes. The revised disrupted Langmuir–adsorption model is used to provide good fits to the kinetic data reported for a number of different systems, including the photocatalytic oxidation of nitric oxide (NO), phenol (PhOH), and formic acid (FA).
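As a quick check on the quoted form (this is only the standard asymptotic expansion, not additional material from the paper), the explicit irradiance function recovers both limiting exponents of the original αρ^θ term:

```latex
r(\rho) \propto \chi_1\left(-1 + \sqrt{1 + \chi_2\rho}\right)
\;\approx\;
\begin{cases}
\tfrac{1}{2}\,\chi_1\chi_2\,\rho, & \chi_2\rho \ll 1 \quad (\theta = 1),\\[4pt]
\chi_1\sqrt{\chi_2\rho}, & \chi_2\rho \gg 1 \quad (\theta = 0.5).
\end{cases}
```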
Abstract:
We present the Pan-STARRS1 discovery of the long-lived and blue transient PS1-11af, which was also detected by Galaxy Evolution Explorer with coordinated observations in the near-ultraviolet (NUV) band. PS1-11af is associated with the nucleus of an early-type galaxy at redshift z = 0.4046 that exhibits no evidence for star formation or active galactic nucleus activity. Four epochs of spectroscopy reveal a pair of transient broad absorption features in the UV on otherwise featureless spectra. Despite the superficial similarity of these features to P-Cygni absorptions of supernovae (SNe), we conclude that PS1-11af is not consistent with the properties of known types of SNe. Blackbody fits to the spectral energy distribution are inconsistent with the cooling, expanding ejecta of a SN, and the velocities of the absorption features are too high to represent material in homologous expansion near a SN photosphere. However, the constant blue colors and slow evolution of the luminosity are similar to previous optically selected tidal disruption events (TDEs). The shape of the optical light curve is consistent with models for TDEs, but the minimum accreted mass necessary to power the observed luminosity is only 0.002 M_⊙, which points to a partial disruption model. A full disruption model predicts higher bolometric luminosities, which would require most of the radiation to be emitted in a separate component at high energies where we lack observations. In addition, the observed temperature is lower than that predicted by pure accretion disk models for TDEs and requires reprocessing to a constant, lower temperature. Three deep non-detections in the radio with the Very Large Array over the first two years after the event set strict limits on the production of any relativistic outflow comparable to Swift J1644+57, even if off-axis.
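For readers unfamiliar with the technique, the sketch below shows the generic form of a blackbody fit to broadband fluxes (Planck function times a free normalisation, fitted by least squares). The frequencies, flux values, and units are placeholders, not the PS1-11af photometry, and this is not the authors' fitting pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

H = 6.626e-27; C = 2.998e10; K_B = 1.381e-16   # cgs: erg s, cm/s, erg/K

def bb_flux(nu, T, norm):
    """Blackbody spectrum: free normalisation times the Planck function B_nu(T)."""
    return norm * (2.0 * H * nu**3 / C**2) / np.expm1(H * nu / (K_B * T))

# Placeholder broadband photometry (rest-frame frequencies in Hz, arbitrary flux units).
nu = np.array([1.9e15, 8.7e14, 6.3e14, 5.5e14, 4.8e14])
f_nu = bb_flux(nu, 2.0e4, 1e-23) * (1.0 + 0.05 * np.random.default_rng(2).standard_normal(nu.size))

(T_fit, norm_fit), _ = curve_fit(bb_flux, nu, f_nu, p0=(1.5e4, 1e-23))
print(f"best-fit blackbody temperature: {T_fit:.3g} K")
```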
Abstract:
This paper will explore, from a ‘child’s rights perspective’, the ‘right’ of children with autistic spectrum disorder (ASD) to appropriate and meaningful education. Human ‘rights’ principles within international law will be evaluated in relation to how they have been interpreted and applied in achieving this ‘right’. The Convention on the Rights of the Child (United Nations in Convention on the rights of the child, office of the high commissioner, United Nations, Geneva, 1989) and the Convention on the Rights of Persons with Disabilities (United Nations in Convention on the rights of persons with disabilities and optional protocol, office of the high commissioner, United Nations, Geneva, 2006), amongst others, will be utilised to argue the case for ‘inclusive’ educational opportunities to be a ‘right’ of every child on the autistic spectrum. The efficacy of mainstream inclusion is explored, identifying the position that a ‘one size fits all’ model of education is not appropriate for all children with ASD.
Abstract:
Libertarian paternalism, as advanced by Cass Sunstein, is seriously flawed, but not primarily for the reasons that most commentators suggest. Libertarian paternalism and its attendant regulatory implications are too libertarian, not too paternalistic, and as a result are in considerable tension with ‘thick’ conceptions of human dignity. We make four arguments. The first is that there is no justification for a presumption in favor of nudging as a default regulatory strategy, as Sunstein asserts. It is ordinarily less effective than mandates; such mandates rarely offend personal autonomy; and the central reliance on cognitive failures in the nudging program is more likely to offend human dignity than the mandates it seeks to replace. Secondly, we argue that nudging as a regulatory strategy fits both overtly and covertly, often insidiously, into a more general libertarian program of political economy. Thirdly, while we are on the whole more concerned to reject the libertarian than the paternalistic elements of this philosophy, Sunstein’s work, both in Why Nudge?, and earlier, fails to appreciate how nudging may be manipulative if not designed with more care than he acknowledges. Lastly, because of these characteristics, nudging might even be subject to legal challenges that would give us the worst of all possible regulatory worlds: a weak regulatory intervention that is liable to be challenged in the courts by well-resourced interest groups. In such a scenario, and contrary to the ‘common sense’ ethos contended for in Why Nudge?, nudges might not even clear the excessively low bar of doing something rather than nothing. Those seeking to pursue progressive politics, under law, should reject nudging in favor of regulation that is more congruent with principles of legality, more transparent, more effective, more democratic, and allows us more fully to act as moral agents. Such a system may have a place for (some) nudging, but not one that departs significantly from how labeling, warnings and the like already function, and nothing that compares with Sunstein’s apparent ambitions for his new movement.
Abstract:
Homomorphic encryption offers potential for secure cloud computing. However, due to the complexity of homomorphic encryption schemes, the performance of implemented schemes to date has been impractical. This work investigates the use of hardware, specifically Field Programmable Gate Array (FPGA) technology, for implementing the building blocks involved in somewhat and fully homomorphic encryption schemes in order to assess the practicality of such schemes. We concentrate on the selection of a suitable multiplication algorithm and hardware architecture for large integer multiplication, one of the main bottlenecks in many homomorphic encryption schemes. We focus on the encryption step of an integer-based fully homomorphic encryption (FHE) scheme. We target the DSP48E1 slices available on Xilinx Virtex 7 FPGAs to ascertain whether the large integer multiplier within the encryption step of an FHE scheme could fit on a single FPGA device. We find that, for toy-size parameters for the FHE encryption step, the large integer multiplier fits comfortably within the DSP48E1 slices, greatly improving the practicality of the encryption step compared to a software implementation. As multiplication is an important operation in other FHE schemes, a hardware implementation using this multiplier could also be used to improve the performance of these schemes.
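As a software-level illustration of the kind of large-integer multiplication that dominates such schemes (the specific algorithm and hardware architecture evaluated in the paper are not reproduced here), the sketch below shows Karatsuba's classic divide-and-conquer multiply, which trades one full-size product for three half-size ones:

```python
def karatsuba(x: int, y: int, threshold: int = 64) -> int:
    """Karatsuba multiplication of non-negative integers:
    three half-size products instead of four."""
    if x.bit_length() <= threshold or y.bit_length() <= threshold:
        return x * y                      # base case: native multiply
    n = max(x.bit_length(), y.bit_length()) // 2
    x_hi, x_lo = x >> n, x & ((1 << n) - 1)
    y_hi, y_lo = y >> n, y & ((1 << n) - 1)
    hi = karatsuba(x_hi, y_hi, threshold)
    lo = karatsuba(x_lo, y_lo, threshold)
    mid = karatsuba(x_hi + x_lo, y_hi + y_lo, threshold) - hi - lo
    return (hi << (2 * n)) + (mid << n) + lo

# Quick check against Python's built-in big-integer multiply.
a, b = (1 << 4096) - 12345, (1 << 4096) - 67890
assert karatsuba(a, b) == a * b
```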
Abstract:
Difficult-to-treat asthma affects up to 20% of patients with asthma and is associated with significant healthcare cost. It is an umbrella term that defines a heterogeneous clinical problem including incorrect diagnosis, comorbid conditions and treatment non-adherence; when these are effectively addressed, good symptom control is frequently achieved. However, in 3–5% of adults with difficult-to-treat asthma, the problem is severe disease that is unresponsive to currently available treatments. Current treatment guidelines advise the ‘stepwise’ increase of corticosteroids, but it is now recognised that many aspects of asthma are not corticosteroid responsive, and that this ‘one size fits all’ approach does not deliver clinical benefit in many patients and can also lead to side effects. The future of management of severe asthma will involve optimisation with currently available treatments, particularly corticosteroids, including addressing non-adherence and defining an ‘optimised’ corticosteroid dose, allied with the use of ‘add-on’ target-specific novel treatments. This review examines the current status of novel treatments and research efforts to identify novel targets in the era of stratified medicines in severe asthma.
Abstract:
In this work, the general framework into which our investigation fits is that of modeling the dynamics of dust grains in a dusty plasma (complex plasma) in the presence of electromagnetic fields. The generalized discrete complex Ginzburg-Landau equation (DCGLE) is thus obtained to model discrete dynamical structures in dusty plasma with Epstein friction. In the collisionless limit, the equation reduces to the modified discrete nonlinear Schrödinger equation (MDNLSE). The modulational instability phenomenon is studied; we present the instability criterion in both cases and show that high values of damping extend the instability region. The equations thus obtained highlight the presence of soliton-like excitations in dusty plasma. We study the generation of solitons in a dusty plasma, taking into account the effects of interaction between dust grains and their neighbours. Numerical simulations are carried out to show the validity of the analytical approach.
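For orientation only, a generic discrete complex Ginzburg-Landau form and its conservative (collisionless) discrete nonlinear Schrödinger limit are shown below; the coefficients, and the precise 'modified' variant derived in the paper for dusty plasma with Epstein friction, are not reproduced here, so this is a schematic rather than the paper's equation:

```latex
i\,\frac{d\psi_n}{dt}
+ (P + i\mu)\left(\psi_{n+1} - 2\psi_n + \psi_{n-1}\right)
+ (Q + i\nu)\,|\psi_n|^2\psi_n
= -\,i\gamma\,\psi_n,
\qquad
\text{collisionless limit } (\mu,\nu,\gamma \to 0):\quad
i\,\frac{d\psi_n}{dt}
+ P\left(\psi_{n+1} - 2\psi_n + \psi_{n-1}\right)
+ Q\,|\psi_n|^2\psi_n = 0.
```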