156 results for Task constraints
Abstract:
We present griz_P1 light curves of 146 spectroscopically confirmed Type Ia supernovae (SNe Ia; 0.03 < z < 0.65) discovered during the first 1.5 yr of the Pan-STARRS1 Medium Deep Survey. The Pan-STARRS1 natural photometric system is determined by a combination of on-site measurements of the instrument response function and observations of spectrophotometric standard stars. We find that the systematic uncertainties in the photometric system are currently 1.2% without accounting for the uncertainty in the Hubble Space Telescope Calspec definition of the AB system. A Hubble diagram is constructed with a subset of 113 out of 146 SNe Ia that pass our light curve quality cuts. The cosmological fit to 310 SNe Ia (113 PS1 SNe Ia + 222 light curves from 197 low-z SNe Ia), using only supernovae (SNe) and assuming a constant dark energy equation of state and flatness, yields w = -1.120 +0.360/-0.206 (Stat) +0.269/-0.291 (Sys). When combined with BAO+CMB(Planck)+H0, the analysis yields Ω_M = 0.280 +0.013/-0.012 and w = -1.166 +0.072/-0.069 including all identified systematics. The value of w is inconsistent with the cosmological constant value of -1 at the 2.3σ level. Tension endures after removing either the baryon acoustic oscillation (BAO) or the H0 constraint, though it is strongest when including the H0 constraint. If we include WMAP9 cosmic microwave background (CMB) constraints instead of those from Planck, we find w = -1.124 +0.083/-0.065, which diminishes the discord to <2σ. We cannot conclude whether the tension with flat ΛCDM is a feature of dark energy, new physics, or a combination of chance and systematic errors. The full Pan-STARRS1 SN sample with ~3 times as many SNe should provide more conclusive results.
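As background on the kind of fit quoted above, a flat, constant-w dark energy model is conventionally constrained by comparing SN Ia distance moduli with the model prediction. The standard relations are sketched below; this is general context only, not this paper's specific likelihood or light-curve standardization.

```latex
% Standard flat-wCDM relations behind an SN Ia Hubble-diagram fit (background only)
\mu(z) = 5\log_{10}\!\frac{d_L(z)}{10\,\mathrm{pc}}, \qquad
d_L(z) = (1+z)\,\frac{c}{H_0}\int_0^{z}\frac{\mathrm{d}z'}{E(z')}, \qquad
E(z) = \sqrt{\Omega_M(1+z)^3 + (1-\Omega_M)\,(1+z)^{3(1+w)}} .
```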
Abstract:
We present nebular-phase optical and near-infrared spectroscopy of the Type IIP supernova SN 2012aw combined with non-local thermodynamic equilibrium radiative transfer calculations applied to ejecta from stellar evolution/explosion models. Our spectral synthesis models generally show good agreement with the ejecta from an M_ZAMS = 15 M⊙ progenitor star. The emission lines of oxygen, sodium, and magnesium are all consistent with the nucleosynthesis in a progenitor in the 14-18 M⊙ range. We also demonstrate how the evolution of the oxygen cooling lines of [O I] λ5577, [O I] λ6300, and [O I] λ6364 can be used to constrain the mass of oxygen in the non-molecularly cooled ashes to < 1 M⊙, independent of the mixing in the ejecta. This constraint implies that any progenitor model of initial mass greater than 20 M⊙ would be difficult to reconcile with the observed line strengths. A stellar progenitor of around M_ZAMS = 15 M⊙ can consistently explain the directly measured luminosity of the progenitor star, the observed nebular spectra, and the inferred pre-supernova mass-loss rate. We conclude that there is still no convincing example of a Type IIP supernova showing the nucleosynthesis products expected from an M_ZAMS > 20 M⊙ progenitor. © 2014 The Author. Published by Oxford University Press on behalf of the Royal Astronomical Society.
The death of massive stars - II. Observational constraints on the progenitors of Type Ibc supernovae
Abstract:
The progenitors of many Type II core-collapse supernovae (SNe) have now been identified directly on pre-discovery imaging. Here, we present an extensive search for the progenitors of Type Ibc SNe in all available pre-discovery imaging since 1998. There are 12 Type Ibc SNe with no detections of progenitors in either deep ground-based or Hubble Space Telescope archival imaging. The deepest absolute BVR magnitude limits are between -4 and -5 mag. We compare these limits with the observed Wolf-Rayet population in the Large Magellanic Cloud and estimate a 16 per cent probability that we have failed to detect such a progenitor by chance. Alternatively, the progenitors evolve significantly before core-collapse or we have underestimated the extinction towards the progenitors. Reviewing the relative rates and ejecta mass estimates from light-curve modelling of Ibc SNe, we find both incompatible with Wolf-Rayet stars with initial masses >25 M⊙ being the only progenitors. We present binary evolution models that fit these observational constraints. Stars in binaries with initial masses ≲20 M⊙ lose their hydrogen envelopes in binary interactions to become low-mass helium stars. They retain a low-mass hydrogen envelope until ≈10^4 yr before core-collapse; hence, it is not surprising that Galactic analogues have been difficult to identify.
Abstract:
This paper is concerned with the analysis of the stability of delayed recurrent neural networks. In contrast to the widely used Lyapunov–Krasovskii functional approach, a new method is developed within the integral quadratic constraints framework. To achieve this, several lemmas are first established that propose integral quadratic separators characterizing the original delayed neural network. With these, the network is then reformulated as a special form of feedback-interconnected system by choosing proper integral quadratic constraints. Finally, new stability criteria are established based on the proposed approach. Numerical examples are given to illustrate the effectiveness of the new approach.
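As general background on the framework mentioned above (the standard Megretski–Rantzer form, not the specific separators or delay-dependent criteria derived in the paper), an integral quadratic constraint on the interconnection of a stable linear part G and an operator Δ with w = Δ(v) is typically stated and tested as follows.

```latex
% Generic IQC framework (background only): \Pi is the multiplier, G the nominal
% linear part, \Delta the operator capturing the delayed nonlinearity (w = \Delta(v)).
% (1) The operator satisfies the IQC defined by \Pi:
\int_{-\infty}^{\infty}
\begin{bmatrix}\hat v(j\omega)\\ \hat w(j\omega)\end{bmatrix}^{*}
\Pi(j\omega)
\begin{bmatrix}\hat v(j\omega)\\ \hat w(j\omega)\end{bmatrix}\,\mathrm{d}\omega \ \ge\ 0 .
% (2) Frequency-domain test implying stability of the interconnection (for some \varepsilon > 0):
\begin{bmatrix}G(j\omega)\\ I\end{bmatrix}^{*}
\Pi(j\omega)
\begin{bmatrix}G(j\omega)\\ I\end{bmatrix} \ \preceq\ -\varepsilon I
\quad \forall\,\omega\in\mathbb{R}.
```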
Abstract:
Realistic Evaluation of EWS and ALERT: factors enabling and constraining implementation
Background: The implementation of EWS and ALERT in practice is essential to the success of Rapid Response Systems but is dependent upon nurses utilising EWS protocols and applying ALERT best practice guidelines. To date there is limited evidence on the effectiveness of EWS or ALERT, as research has primarily focused on measuring patient outcomes (cardiac arrests, ICU admissions) following the implementation of a Rapid Response Team. Complex interventions in healthcare aimed at changing service delivery and the related behaviour of health professionals require a different research approach to evaluate the evidence. To understand how and why EWS and ALERT work, or might not work, research needs to consider the social, cultural and organisational influences that will impact on successful implementation in practice. This requires a research approach that considers both the processes and outcomes of complex interventions, such as EWS and ALERT, implemented in practice. Realistic Evaluation is such an approach and was used to explain the factors that enable and constrain the implementation of EWS and ALERT in practice [1].
Aim: The aim of this study was to evaluate the factors that enabled and constrained the implementation and service delivery of early warning systems (EWS) and ALERT in practice, in order to provide direction for enabling their success and sustainability.
Methods: The research design was a multiple case study approach of four wards in two hospitals in Northern Ireland. It followed the principles of realist evaluation research, which allowed empirical data to be gathered to test and refine RRS programme theory. This approach used a variety of mixed methods to test the programme theories, including individual and focus group interviews, observation and documentary analysis, in a two-stage process. A purposive sample of 75 key informants participated in individual and focus group interviews. Observation and documentary analysis of EWS compliance data and ALERT training records provided further evidence to support or refute the interview findings. Data were analysed using NVivo 8 to categorise interview findings and SPSS for ALERT documentary data. These findings were further synthesised by undertaking a within- and cross-case comparison to explain the factors enabling and constraining EWS and ALERT.
Results: A cross-case analysis highlighted similarities, differences and factors enabling or constraining successful implementation across the case study sites. Findings showed that personal (confidence; clinical judgement; personality), social (ward leadership; communication), organisational (workload and staffing issues; pressure from managers to complete EWS audits and targets), educational (constraints on training; no clinical educator on the ward) and cultural (routine tasks delegated) influences impact on EWS and acute care training outcomes. There were also differences noted between medical and surgical wards across both case sites.
Conclusions: Realist Evaluation allows refinement and development of the RRS programme theory to explain the realities of practice. These refined RRS programme theories are capable of informing the planning of future service provision and provide direction for enabling their success and sustainability.
References: 1. McGaughey J, Blackwood B, O'Halloran P, Trinder TJ & Porter S (2010) A realistic evaluation of Track and Trigger systems and acute care training for early recognition and management of deteriorating ward-based patients. Journal of Advanced Nursing 66(4), 923-932.
Type of submission: Concurrent session
Source of funding: Sandra Ryan Fellowship funded by the School of Nursing & Midwifery, Queen's University of Belfast
Abstract:
Approximate execution is a viable technique for environments with energy constraints, provided that applications are given the mechanisms to produce outputs of the highest possible quality within the available energy budget. This paper introduces a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows developers to structure the computation in different tasks, and to express the relative importance of these tasks for the quality of the end result. For non-significant tasks, the developer can also supply less costly, approximate versions. The target energy consumption for a given execution is specified when the application is launched. A significance-aware runtime system employs an application-specific analytical energy model to decide how many cores to use for the execution, the operating frequency for these cores, as well as the degree of task approximation, so as to maximize the quality of the output while meeting the user-specified energy constraints. Evaluation on a dual-socket 16-core Intel platform using 9 benchmark kernels shows that the proposed framework picks the optimal configuration with high accuracy. A comparison with loop perforation (a well-known compile-time approximation technique) also shows that the proposed framework yields significantly higher quality for the same energy budget.
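To make the division of labour between the programming model and the runtime more concrete, the following is a minimal, hypothetical sketch of a significance-aware configuration choice in Python. The task fields, the toy platform energy model, and every function name here are assumptions for illustration; the paper's actual programming model and analytical energy model are not reproduced.

```python
# Hypothetical sketch of a significance-aware runtime decision, loosely following the
# ideas in the abstract above. Fields, names and the energy model are illustrative.
from dataclasses import dataclass
from itertools import product

@dataclass
class Task:
    significance: float     # relative importance of the task for output quality
    energy_accurate: float  # estimated energy (J) for the accurate version
    energy_approx: float    # estimated energy (J) for the cheaper approximate version

def choose_configuration(tasks, energy_budget, core_counts=(4, 8, 16), freqs_ghz=(1.2, 2.0, 2.6)):
    """Pick core count, frequency and per-task approximation to maximize quality within the budget."""
    best = None
    for cores, freq in product(core_counts, freqs_ghz):
        # Toy platform model: energy per task grows with frequency and (slightly) with core count.
        scale = (freq / max(freqs_ghz)) * (1.0 + 0.02 * cores)
        approx = [False] * len(tasks)

        def total_energy():
            return sum(scale * (t.energy_approx if a else t.energy_accurate)
                       for t, a in zip(tasks, approx))

        # Approximate the least significant tasks first until the budget is met.
        for i in sorted(range(len(tasks)), key=lambda i: tasks[i].significance):
            if total_energy() <= energy_budget:
                break
            approx[i] = True
        if total_energy() > energy_budget:
            continue  # infeasible even with every task approximated
        quality = sum(t.significance for t, a in zip(tasks, approx) if not a)
        if best is None or quality > best["quality"]:
            best = {"quality": quality, "cores": cores, "freq_ghz": freq, "approximate": approx}
    return best

# Example: three tasks of increasing significance and a 10 J energy budget.
tasks = [Task(0.2, 4.0, 1.0), Task(0.5, 5.0, 2.0), Task(1.0, 6.0, 2.5)]
print(choose_configuration(tasks, energy_budget=10.0))
```

The greedy order, approximating the least significant tasks first, mirrors the intuition in the abstract that quality should be sacrificed where it matters least.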
Abstract:
In a number of species, individuals showing lateralized hand/paw usage (i.e. the preferential use of either the right or left paw) have been shown to be more proactive in novel situations than ambilateral individuals. In the current study we used an established test to assess preferential paw usage in dogs (the Kong test) and then compared the performance of ambilateral and lateralized dogs, as well as left- vs. right-pawed dogs, in a novel manipulative problem solving task. Results showed an equal proportion of ambilateral and lateralized dogs but, contrary to predictions, non-lateralized dogs were faster at accessing the apparatus in test trials. No differences emerged between right- and left-pawed dogs. Results are discussed in relation to previous studies on lateralization. © 2013 Elsevier B.V.
Abstract:
We explored the brain's ability to quickly prevent a pre-potent but unwanted motor response. To address this, transcranial magnetic stimulation was delivered over the motor cortex (hand representation) to probe excitability changes immediately after somatosensory cues prompted subjects to either move as fast as possible or withhold movement. Our results showed a difference in motor cortical excitability 90 ms post-stimulus contingent on cues to either promote or prevent movement. We suggest that our study design emphasizing response speed coupled with well-defined early probes allowed us to extend upon similar past investigations into the timing of response inhibition.
Abstract:
Scheduling jobs with deadlines, each of which defines the latest time by which a job must be completed, can be challenging on the cloud due to incurred costs and unpredictable performance. This problem is further complicated when there is not enough information to effectively schedule a job such that its deadline is satisfied and the cost is minimised. In this paper, we present an approach to schedule jobs whose performance is unknown before execution, with deadlines, on the cloud. By performing a sampling phase to collect the necessary information about those jobs, our approach delivers scheduling decisions within 10% cost and 16% violation rate when compared to the ideal setting, which has complete knowledge about each of the jobs from the beginning. Our proposed algorithm also outperforms existing approaches, which use a fixed amount of resources, reducing the violation cost by at least a factor of two.
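As a rough illustration of how a sampling phase can feed a deadline-aware provisioning decision, here is a minimal Python sketch. It is not the paper's algorithm; the function names, the hourly price, the safety factor, and the assumption that work scales near-linearly across instances are all assumptions made for illustration.

```python
# Hypothetical sketch of the sampling idea summarised above: estimate unknown task
# runtimes from a small sample, then rent the fewest instances that should still meet
# the deadline. Names, prices and the near-linear scaling assumption are illustrative.
import math
import random

def sample_runtime(run_task, tasks, sample_fraction=0.1, min_samples=3):
    """Run a small fraction of the tasks to estimate the mean per-task runtime (seconds)."""
    k = min(len(tasks), max(min_samples, int(len(tasks) * sample_fraction)))
    sampled = random.sample(tasks, k)
    times = [run_task(t) for t in sampled]
    return sum(times) / len(times), sampled

def plan_instances(n_remaining, mean_runtime_s, deadline_s, price_per_hour=0.10, safety=1.2):
    """Choose the smallest instance count whose estimated makespan fits within the deadline."""
    total_work = n_remaining * mean_runtime_s * safety      # padded total CPU-seconds
    instances = max(1, math.ceil(total_work / deadline_s))  # assumes near-linear scaling
    makespan = total_work / instances
    cost = instances * math.ceil(makespan / 3600) * price_per_hour  # hourly billing
    return instances, makespan, cost

# Example: 200 tasks whose true runtimes (20-40 s each) are unknown to the scheduler.
tasks = list(range(200))
estimate, done = sample_runtime(lambda t: random.uniform(20, 40), tasks)
print(plan_instances(len(tasks) - len(done), estimate, deadline_s=1800))
```

A real scheduler would also have to handle runtime variance, instance start-up delays and re-sampling as estimates drift.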
Abstract:
Evidence correlates physical activity, psychological restoration, and social health to proximity to parks and sites of recreation. The purpose of this study was to identify perceived constraints to park use in low-income communities facing significant health disparities, with access to underutilized parks. We used a series of focus groups with families, teens, and older adults in neighborhoods with similar demographic distribution and access to parks over 125 acres in size. Constraints to park use varied across age groups as well as across social ecological levels, with perceived constraints to individuals, user groups, communities, and society. Policies and interventions aimed at increasing park use must specifically address barriers across social ecological levels to be successful.
Abstract:
The speed of manufacturing processes today depends on a trade-off between the physical processes of production, the wider system that allows these processes to operate and the co-ordination of a supply chain in the pursuit of meeting customer needs. Could the speed of this activity be doubled? This paper explores this hypothetical question, starting with examination of a diverse set of case studies spanning the activities of manufacturing. This reveals that the constraints on increasing manufacturing speed have some common themes, and several of these are examined in more detail to identify absolute limits to performance. The physical processes of production are constrained by factors such as machine stiffness, actuator acceleration, heat transfer and the delivery of fluids, and for each of these, a simplified model is used to analyse the gap between current and limiting performance. The wider systems of production require the co-ordination of resources and push against human biophysical and cognitive limits. Evidence about these is explored and related to current practice. Out of this discussion, five promising innovations are explored to show examples of how manufacturing speed is increasing: line arrays of point actuators, parallel tools, tailored application of precision, hybridisation and task taxonomies. The paper addresses a broad question which could be pursued by a wider community and in greater depth, but even this first examination suggests the possibility of unanticipated innovations in current manufacturing practices.
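To illustrate the kind of "simplified model" of a physical limit referred to above (an example constructed here for illustration, not one of the paper's case studies), consider a positioning actuator that must traverse a stroke d under a peak-acceleration limit a. With a symmetric accelerate-then-decelerate (bang-bang) profile, the minimum move time is:

```latex
% Minimum point-to-point move time under a peak-acceleration limit (illustrative example)
t_{\min} = 2\sqrt{\frac{d}{a}},
\qquad\text{e.g. } d = 0.1\ \mathrm{m},\ a = 50\ \mathrm{m\,s^{-2}}
\ \Rightarrow\ t_{\min} \approx 0.09\ \mathrm{s}.
```

Comparing such a limiting time with move times achieved in practice is one way to quantify the gap between current and limiting performance.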
Abstract:
This paper presents an automated design framework for the development of individual part forming tools for a composite stiffener. The framework uses parametrically developed design geometries for both the part and its layup tool. The framework has been developed with a functioning user interface in which part/tool combinations are passed to a virtual environment for utility-based assessment of their features and assemblability characteristics. The work demonstrates clear benefits in process design methods, with conventional design timelines reduced from hours and days to minutes and seconds. The methods developed here were able to produce a digital mock-up of a component with its associated layup tool in less than 3 minutes. The virtual environment presenting the design to the designer for interactive assembly planning was generated in 20 seconds. Challenges still exist in determining the level of reality required to provide an effective learning environment in the virtual world. Fully representing physical phenomena such as gravity and part clashes, and providing standard build functions, will require further work to model real physical behaviour more accurately.