987 results for Personal computing
Abstract:
In the reinsurance market, the risks natural catastrophes pose to portfolios of properties must be quantified, so that they can be priced, and insurance offered. The analysis of such risks at a portfolio level requires a simulation of up to 800 000 trials with an average of 1000 catastrophic events per trial. This is sufficient to capture risk for a global multi-peril reinsurance portfolio covering a range of perils including earthquake, hurricane, tornado, hail, severe thunderstorm, wind storm, storm surge and riverine flooding, and wildfire. Such simulations are both computation and data intensive, making the application of high-performance computing techniques desirable.
In this paper, we explore the design and implementation of portfolio risk analysis on both multi-core and many-core computing platforms. Given a portfolio of property catastrophe insurance treaties, key risk measures, such as probable maximum loss, are computed by taking both primary and secondary uncertainties into account. Primary uncertainty is associated with whether or not an event occurs in a simulated year, while secondary uncertainty captures the uncertainty in the level of loss due to the use of simplified physical models and limitations in the available data. A combination of fast lookup structures, multi-threading and careful hand-tuning of numerical operations is required to achieve good performance. Experimental results are reported for multi-core processors and for systems using NVIDIA graphics processing units (GPUs) and Intel Xeon Phi many-core accelerators.
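The two-level uncertainty structure described above can be illustrated with a toy Monte Carlo sketch (not the paper's engine): each hypothetical event carries an annual occurrence probability (primary uncertainty) and a spread around its mean loss (secondary uncertainty), and the probable maximum loss is read off the sorted year-loss distribution. All event parameters, function names and the Gaussian loss model are illustrative assumptions.

```python
import random

def simulate_annual_losses(events, n_trials, seed=0):
    """Toy year-loss simulation.  `events` is a list of
    (annual_rate, mean_loss, spread) tuples: the rate drives primary
    uncertainty (does the event occur this year?), the mean/spread
    drive secondary uncertainty (how large is the loss?)."""
    rng = random.Random(seed)
    annual_losses = []
    for _ in range(n_trials):
        total = 0.0
        for rate, mean_loss, spread in events:
            if rng.random() < rate:                          # primary uncertainty
                total += max(0.0, rng.gauss(mean_loss, spread))  # secondary uncertainty
        annual_losses.append(total)
    return annual_losses

def probable_maximum_loss(losses, return_period):
    """PML at a return period = the loss exceeded roughly once every
    `return_period` simulated years, read off the sorted trial losses."""
    losses = sorted(losses)
    idx = int(len(losses) * (1 - 1.0 / return_period))
    return losses[min(idx, len(losses) - 1)]

# hypothetical event set: (annual rate, mean loss, spread)
events = [(0.02, 50.0, 20.0), (0.10, 5.0, 2.0), (0.01, 200.0, 80.0)]
losses = simulate_annual_losses(events, n_trials=100_000)
pml_250 = probable_maximum_loss(losses, return_period=250)
```

A production engine differs in scale (hundreds of thousands of trials, ~1000 events per trial) and in the loss model, which is where the fast lookup structures and hand-tuned numerics mentioned above come in.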
Abstract:
Approximate execution is a viable technique for environments with energy constraints, provided that applications are given the mechanisms to produce outputs of the highest possible quality within the available energy budget. This paper introduces a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows developers to structure the computation in different tasks, and to express the relative importance of these tasks for the quality of the end result. For non-significant tasks, the developer can also supply less costly, approximate versions. The target energy consumption for a given execution is specified when the application is launched. A significance-aware runtime system employs an application-specific analytical energy model to decide how many cores to use for the execution, the operating frequency for these cores, as well as the degree of task approximation, so as to maximize the quality of the output while meeting the user-specified energy constraints. Evaluation on a dual-socket 16-core Intel platform using 9 benchmark kernels shows that the proposed framework picks the optimal configuration with high accuracy. Also, a comparison with loop perforation (a well-known compile-time approximation technique), shows that the proposed framework results in significantly higher quality for the same energy budget.
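The runtime's planning decision can be sketched as a simple greedy heuristic, which is an assumption for illustration only (the paper uses an application-specific analytical energy model that also chooses core counts and frequencies): downgrade the least significant tasks to their approximate versions until the estimated energy fits the budget.

```python
def plan_execution(tasks, energy_budget):
    """Toy significance-aware planner.  `tasks` is a list of
    (name, significance, exact_cost, approx_cost) tuples; all numbers
    are illustrative assumptions, not the paper's model.  Starts from
    an all-exact plan and downgrades tasks in order of increasing
    significance until the estimated energy fits the budget."""
    plan = {name: "exact" for name, *_ in tasks}
    cost = sum(t[2] for t in tasks)
    for name, sig, exact_cost, approx_cost in sorted(tasks, key=lambda t: t[1]):
        if cost <= energy_budget:
            break                          # budget met; keep remaining tasks exact
        plan[name] = "approx"              # swap in the cheaper approximate version
        cost += approx_cost - exact_cost
    feasible = cost <= energy_budget
    return plan, cost, feasible

# hypothetical tasks: (name, significance, exact cost, approximate cost)
tasks = [("render", 0.9, 10.0, 4.0),
         ("refine", 0.2, 10.0, 2.0),
         ("filter", 0.5, 10.0, 3.0)]
plan, cost, feasible = plan_execution(tasks, energy_budget=25.0)
```

Downgrading by significance first preserves the tasks that matter most for output quality, which is the intuition behind the framework's quality-maximizing behaviour.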
Abstract:
This paper outlines a means of improving the employability skills of first-year university students through a closely integrated model of employer engagement within computer science modules. The outlined approach illustrates how employability skills, including communication, teamwork and time management skills, can be contextualised in a manner that directly relates to student learning but can still be linked forward into employment. The paper tests the premise that developing employability skills early within the curriculum will result in improved student engagement and learning within later modules. The paper concludes that embedding employer participation within first-year modules can help translate a distant notion of employability into something of more immediate relevance in terms of how students can best approach learning. Further, by enhancing employability skills early within the curriculum, it becomes possible to improve academic attainment within later modules.
Abstract:
The circumstances in Colombo, Sri Lanka, and in Belfast, Northern Ireland, which led to a) the generalization of luminescent PET (photoinduced electron transfer) sensing/switching as a design tool, b) the construction of a market-leading blood electrolyte analyzer and c) the invention of molecular logic-based computation as an experimental field, are delineated. Efforts to extend the philosophy of these approaches into issues of small object identification, nanometric mapping, animal visual perception and visual art are also outlined.
Abstract:
PURPOSE:
To determine the accuracy of a history of cataract and cataract surgery (self-report and for a sibling), and to determine which demographic, cognitive, and medical factors are predictive of an accurate history.
METHODS:
All participants in the Salisbury Eye Evaluation (SEE) project and their locally resident siblings were questioned about a personal and family history of cataract or cataract surgery. Lens grading at the slit lamp, using standardized photographs and a grading system, was performed for both SEE participants (probands) and their siblings. Cognitive testing and a history of systemic comorbidities were also obtained for all probands.
RESULTS:
Sensitivity of a history of cataract provided on behalf of a sibling was 32%, specificity 98%. The performance was better for a history of cataract surgery: sensitivity 90%, specificity 89%. For self-report of cataract, sensitivity was also low at 55%, with specificity at 77%. Self-report of cataract surgery gave a much better performance: sensitivity 94%, specificity 100%. Different cutoffs in the definition of cataract had little impact. Factors predicting a correct history of cataract included high school or greater education in the proband (odds ratio [OR] = 1.13, 95% confidence interval [CI] 1.02-1.25) and younger sibling (but not proband) age (OR = 0.94 for each year of age, 95% CI 0.90-0.99). Gender, race and Mini-Mental State Examination (MMSE) result were not predictive.
CONCLUSIONS:
Whereas accurate self and family histories for cataract surgery may be obtainable, it is difficult to ascertain cataract status accurately from history alone.
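The sensitivity and specificity figures above follow the standard confusion-matrix definitions; a minimal sketch, using illustrative counts rather than the study's raw data:

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Sensitivity = TP / (TP + FN): the fraction of true cataract cases
    correctly reported.  Specificity = TN / (TN + FP): the fraction of
    cataract-free cases correctly reported as such."""
    return tp / (tp + fn), tn / (tn + fp)

# illustrative counts only (chosen to reproduce the sibling-history
# cataract figures of 32% sensitivity and 98% specificity)
sens, spec = sensitivity_specificity(tp=32, fp=2, fn=68, tn=98)
```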
Abstract:
Personal response systems using hardware such as 'clickers' have been around for some time; however, their use is often restricted to multiple choice questions (MCQs), and they are therefore used as a summative assessment tool for the individual student. More recent innovations such as 'Socrative' have removed the need for specialist hardware, instead utilising web-based technology and devices common to students, such as smartphones, tablets and laptops. While improving the potential for use in larger classrooms, this also creates the opportunity to pose more engaging open-response questions to students, who can 'text in' their thoughts on questions posed in class. This poster will present two applications of the Socrative system in an undergraduate psychology curriculum which aimed to encourage interactive engagement with course content using real-time student responses and lecturer feedback. Data are currently being collected, and results will be presented at the conference.
The first application used Socrative to pose MCQs at the end of two modules (a level one Statistics module and a level two Individual Differences Psychology module, class size N≈100), with the intention of helping students assess their knowledge of the course. They were asked to rate their self-perceived knowledge of the course on a five-point Likert scale before and after completing the MCQs, as well as their views on the value of the revision session and any issues they had with using the app. The online MCQs remained open between the lecture and the exam, allowing students to revisit the questions at any time during their revision.
This poster will present data regarding the usefulness of the revision MCQs, the metacognitive effect of the MCQs on students' judgements of learning (pre vs post MCQ testing), as well as student engagement with the MCQs between the revision session and the examination. Student opinions on the use of the Socrative system in class will also be discussed.
The second application used Socrative to facilitate a flipped classroom lecture on a level two 'Conceptual Issues in Psychology' module (class size N≈100). The content of this module requires students to think critically about historical and contemporary conceptual issues in psychology and the philosophy of science. Students traditionally struggle with this module due to the emphasis on critical thinking skills, rather than simply the retention of concrete knowledge. To prepare students for the written examination, a flipped classroom lecture was held at the end of the semester. Students were asked to revise their knowledge of a particular area of psychology by assigned reading, and were told that the flipped lecture would involve them thinking critically about the conceptual issues found in this area. They were informed that questions would be posed by the lecturer in class, and that they would be asked to post their thoughts using the Socrative app for a class discussion. The level of preparation students engaged in for the flipped lecture was measured, as well as qualitative opinions on the usefulness of the session. This poster will discuss the level of student engagement with the flipped lecture, both in terms of preparation for the lecture and engagement with questions posed during the lecture, as well as the lecturer's experience in facilitating the flipped classroom using the Socrative platform.
Abstract:
Partially ordered preferences generally lead to choices that do not abide by standard expected utility guidelines; often such preferences are revealed by imprecision in probability values. We investigate five criteria for strategy selection in decision trees with imprecision in probabilities: “extensive” Γ-maximin and Γ-maximax, interval dominance, maximality and E-admissibility. We present algorithms that generate strategies for all these criteria; our main contribution is an algorithm for E-admissibility that runs over admissible strategies rather than over sets of probability distributions.
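Two of the five criteria are straightforward to sketch once each strategy's expected utility is summarised as an interval (a simplification of the paper's setting, where the criteria are evaluated over strategies in a decision tree): Γ-maximin picks the strategy with the best worst case, and interval dominance discards any strategy whose upper bound falls below another strategy's lower bound. The strategy names and intervals below are hypothetical.

```python
def gamma_maximin(strategies):
    """Pick the strategy with the highest lower expected utility.
    `strategies` maps name -> (lower, upper) expected-utility interval."""
    return max(strategies, key=lambda s: strategies[s][0])

def interval_dominance(strategies):
    """Keep strategies that are not interval-dominated: s survives
    unless some other strategy's lower bound exceeds s's upper bound."""
    return {s for s in strategies
            if not any(strategies[t][0] > strategies[s][1]
                       for t in strategies if t != s)}

strategies = {"A": (0.2, 0.6), "B": (0.3, 0.5), "C": (0.1, 0.25)}
best = gamma_maximin(strategies)          # worst-case optimal strategy
admissible = interval_dominance(strategies)  # strategies surviving dominance
```

Maximality and E-admissibility are stricter and cannot be decided from the intervals alone; they require reasoning over the underlying set of probability distributions, which is what the paper's E-admissibility algorithm avoids enumerating directly.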
Abstract:
Introduction
This report details the findings from research conducted across Northern Ireland’s Health and Social Care Trusts during 2015 which examines the current state of Personal and Public Involvement (PPI). This is about how service users, carers and patients engage with staff, management and directors of statutory health and social care organisations. Most statutory health and social care organisations must, under legislation, meet the requirements of PPI. PPI has been part of health and social care policy in Northern Ireland since 2007 and became law two years later with the introduction of the Health and Social Care Reform Act (2009). It is, therefore, timely that PPI is now assessed in this systematic way in order to both examine the aspects which are working well and to highlight those areas where improvements need to be made. As far as possible, this Summary Report is written in an accessible way, avoiding jargon and explaining key research terms, so as to ensure it is widely understood. This is in keeping with established good practice in service user involvement research. This summary, therefore, gives a picture of PPI in Northern Ireland currently. There is also a fuller report which gives a lot more details about the research and findings. Information on this is available from the Public Health Agency and/or the Patient and Client Council.
Abstract:
The first report of the disease (“pine wilt disease”) associated with the pinewood nematode goes back to 1905, when Yano reported an unusual decline of pines from Nagasaki. For a long time thereafter, the cause of the disease was sought, but without success. Because of the large number of insect species that were usually seen around and on infected trees, it had always been assumed that the causal agent would prove to be one of these. However, in 1971, Kiyohara and Tokushige found a nematode of the genus Bursaphelenchus in infected trees. The nematode found was multiplied on fungal culture, inoculated into healthy trees and then re-isolated from the resulting wilted trees. The subsequent published reports were impressive: this Bursaphelenchus species could kill fully-grown trees within a few months in the warmer areas of Japan, and could destroy complete forests of susceptible pine species within a few years. Pinus densiflora, P. thunbergii and P. luchuensis were particularly affected. In 1972, Mamiya and Kiyohara described the new species of nematode extracted from the wood of diseased pines; it was named Bursaphelenchus lignicolus. Since 1975, the species has spread to the north of Japan, with the exception of the most northerly prefectures. In 1977, the loss of wood in the west of the country reached 80%. Probably as a result of unusually high summer temperatures and reduced rainfall in the years 1978 and 1979, the losses were more than 2 million m3 per year. From the beginning, B. lignicolus was always considered by Japanese scientists to be an exotic pest. But where did it come from? That this nematode could also cause damage in the USA became clear in 1979, when B. lignicolus was isolated in great numbers from the wood of a 39-year-old pine tree (Pinus nigra) in Missouri which had suddenly died after the colour of its needles changed to a reddish-brown (Dropkin and Foudin, 1979). In 1981, B. lignicolus was synonymised by Nickle et al. with B. xylophilus, which had been found for the first time in the USA as far back as 1929, and reported by Steiner and Buhrer in 1934. It had originally been named Aphelenchoides xylophilus, the wood-inhabiting Aphelenchoides, but was recognised by Nickle, in 1970, to belong in the genus Bursaphelenchus. Its common name in the USA was the "pine wood nematode" (PWN). After its detection in Missouri, it became known that B. xylophilus was widespread throughout the USA and Canada. It occurred there on native species of conifers where, as a rule, it did not show the symptoms of pine wilt disease unless susceptible species were stressed, e.g. by high temperature. This fact was an illuminating piece of evidence that North America could be the homeland of the PWN. Dwinell (1993) later reported the presence of B. xylophilus in Mexico. The main vector of the PWN in Japan was shown to be the long-horned beetle Monochamus alternatus, belonging to the family Cerambycidae. This beetle lays its eggs in dead or dying trees, where the developing larvae then feed in the cambium layer. It was already known in Japan in the 19th century; in the 1930s, it was said to be present in most areas of Japan, but was generally uncommon. However, with the spread of pine wilt disease, and the resulting increase of weakened trees that could act as breeding sites for beetles, the populations of Monochamus spp. increased significantly. In North America, other Monochamus species transmit the PWN, and the main vector is M. carolinensis. In Japan, there are also other, less efficient vectors in the genus Monochamus. Possibly, all Monochamus species that breed in conifers can transmit the PWN. The occasional transmission by less efficient species of Monochamus, or by some of the many other beetle genera in the bark or wood, is of little significance. In Europe, M. galloprovincialis and M. sutor transmit the closely related species B. mucronatus.
Some speculate that these two insect species are “standing by”, waiting for the arrival of B. xylophilus. In 1982, the nematode was detected in China. It was first found in dead pines near the Zhongshan Monument of Nanjing (Cheng et al., 1983); 265 trees were then killed by pine wilt disease. Despite great efforts at eradication in China, the nematode spread further, and pine wilt disease has been reported from parts of the provinces of Jiangsu, Anhui, Guangdong, Shandong, Zhejiang and Hubei (Yang, 2003). In 1986, the spread of the PWN to Taiwan was discovered, and in 1989 the nematode was reported to be present in the Republic of Korea, where it had first been detected in Pinus thunbergii and P. densiflora. It was thought to have been introduced with packing material from Japan. The PWN was advancing. In 1984, B. xylophilus was found in wood chips imported into Finland from the USA and Canada, and this was the impetus to establish phytosanitary measures to prevent any possible spread into Europe. Finland prohibited the import of coniferous wood chips from these sources, and the other Nordic countries soon followed suit. EPPO (the European and Mediterranean Plant Protection Organization) made a recommendation to its member countries in 1986 to refuse wood imports from infested countries. With its Directive of 1989 (77/93 EEC), the European Community (later called the European Union, or EU) recognised the potential danger of B. xylophilus for European forests and imposed restrictions on imports into Europe. The PWN was placed on the quarantine list of the EU and also of other European countries. Later, in 1991, a dispensation was allowed by the Commission of the EU (92/13 EEC) for coniferous wood from North America, provided that certain specified requirements were fulfilled that would prevent introduction.
Abstract:
In modern society, communications and digital transactions are becoming the norm rather than the exception. As we allow networked computing devices into our everyday actions, we build a digital lifestyle where networks and devices enrich our interactions. However, as we move our information towards a connected digital environment, privacy becomes extremely important, as most of our personal information can be found in the network. This is especially relevant as we design and adopt next-generation networks that provide ubiquitous access to services and content, increasing the impact and pervasiveness of existing networks. The environments that provide widespread connectivity and services usually rely on network protocols that have few privacy considerations, compromising user privacy. The presented work focuses on the network aspects of privacy, considering how network protocols threaten user privacy, especially in next-generation network scenarios. We target the identifiers that are present in each network protocol and support its designed function. By studying how network identifiers can compromise user privacy, we explore how these threats can stem from the identifier itself and from relationships established between several protocol identifiers. Following the study focused on identifiers, we show that privacy in the network can be explored along two dimensions: a vertical dimension that establishes privacy relationships across several layers and protocols, reaching the user, and a horizontal dimension that highlights the threats exposed by individual protocols, usually confined to a single layer. With these concepts, we outline an integrated perspective on privacy in the network, embracing both vertical and horizontal interactions of privacy. This approach enables the discussion of several mechanisms to address privacy threats on individual layers, leading to architectural instantiations focused on user privacy.
We also show how the different dimensions of privacy can provide insight into the relationships that exist in a layered network stack, providing a potential path towards designing and implementing future privacy-aware network architectures.
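The vertical dimension described above can be pictured as identifier linkage across layers. A minimal sketch, using entirely hypothetical identifiers, treats observed co-occurrences of protocol identifiers as edges in a graph and computes everything an observer could link together; a vertical threat appears when a low-layer identifier becomes linkable to a user-level one.

```python
from collections import defaultdict

def reachable_identifiers(links, start):
    """Transitive closure over observed identifier co-occurrences:
    returns every identifier linkable from `start`.  Reaching a
    user-level identifier from a link-layer one illustrates a
    cross-layer (vertical) privacy threat."""
    graph = defaultdict(set)
    for a, b in links:
        graph[a].add(b)
        graph[b].add(a)
    seen, stack = {start}, [start]
    while stack:
        for nxt in graph[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

# hypothetical observations linking identifiers across layers
links = [("mac:aa:bb", "ip:10.0.0.5"),      # link layer <-> network layer
         ("ip:10.0.0.5", "cookie:sess42"),  # network <-> application layer
         ("cookie:sess42", "user:alice")]   # application <-> user identity
exposed = reachable_identifiers(links, "mac:aa:bb")
```

By contrast, a horizontal analysis would examine each edge's layer in isolation; the point of the vertical dimension is precisely that individually harmless per-layer identifiers compose into a chain reaching the user.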