32 results for techniques to develop formalisms


Relevance: 100.00%

Abstract:

Advances in hardware and software technology enable us to collect, store and distribute large quantities of data on a very large scale. Automatically discovering and extracting hidden knowledge in the form of patterns from these large data volumes is known as data mining. Data mining technology is not only a part of business intelligence, but is also used in many other application areas such as research, marketing and financial analytics. For example, medical scientists can use patterns extracted from historic patient data to determine whether a new patient is likely to respond positively to a particular treatment; marketing analysts can use patterns extracted from customer data for future advertisement campaigns; finance experts have an interest in patterns that forecast the development of certain stock market shares for investment recommendations. However, extracting knowledge in the form of patterns from massive data volumes imposes a number of computational challenges in terms of processing time, memory, bandwidth and power consumption. These challenges have led to the development of parallel and distributed data analysis approaches and the utilisation of Grid and Cloud computing. This chapter gives an overview of parallel and distributed computing approaches and how they can be used to scale up data mining to large datasets.
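
To make the scale-up idea concrete, below is a minimal sketch (not from the chapter) of the data-parallel pattern such approaches share: partition the records across workers, count patterns locally, then merge the partial counts, in the style of a map-reduce first pass of frequent-itemset mining. The transactions and support threshold are invented for illustration.

```python
from collections import Counter
from itertools import combinations
from multiprocessing import Pool

def count_pairs(transactions):
    """Count co-occurring item pairs in one data partition (the 'map' step)."""
    counts = Counter()
    for items in transactions:
        counts.update(combinations(sorted(items), 2))
    return counts

if __name__ == "__main__":
    data = [{"a", "b", "c"}, {"a", "c"}, {"b", "c"}, {"a", "b"}] * 1000
    partitions = [data[i::4] for i in range(4)]      # split across 4 workers
    with Pool(processes=4) as pool:
        partial_counts = pool.map(count_pairs, partitions)
    totals = sum(partial_counts, Counter())          # the 'reduce' step
    frequent = {pair: n for pair, n in totals.items() if n >= 1500}
    print(frequent)
```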

Relevance: 100.00%

Abstract:

We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data (SIMD) operations. The modified algorithm runs more than 50 times faster on the Cell's Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times compared with the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
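
The scheduling pattern the abstract describes — a task queue feeding a pool of workers, with four air columns packed into one data structure and computed simultaneously — can be sketched as follows. This is an illustration, not the FAMOUS code: NumPy vectorisation stands in for the SIMD operations, and radiation_kernel is a hypothetical placeholder for the radiation physics.

```python
from concurrent.futures import ThreadPoolExecutor
import numpy as np

N_LEVELS = 20  # vertical levels per air column

def radiation_kernel(pack):
    """Placeholder physics applied to a (4, N_LEVELS) pack of air columns;
    operating on all four rows at once mirrors the SIMD packing."""
    return np.cumsum(pack * 0.5, axis=1)

def run(columns, workers=4):
    # Pack four columns per task; the executor's internal queue plays the
    # role of the task queue, its threads the role of the thread pool.
    packs = [np.stack(columns[i:i + 4]) for i in range(0, len(columns), 4)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(radiation_kernel, packs))
    return np.concatenate(results)

columns = [np.random.rand(N_LEVELS) for _ in range(64)]
print(run(columns).shape)  # (64, N_LEVELS)
```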

Relevance: 100.00%

Abstract:

Older adult computer users often lose track of the mouse cursor and so resort to methods such as shaking the mouse or searching the entire screen to find the cursor again. Hence, this paper describes how a standard optical mouse was modified to include a touch sensor, activated by releasing and then touching the mouse again, which automatically centers the mouse cursor on the screen, potentially making it easier to find a ‘lost’ cursor. Six older adult computer users and six younger computer users were asked to compare the touch-sensitive mouse with cursor centering against two alternative techniques for locating the mouse cursor: manually shaking the mouse and using the Windows sonar facility. The time taken to click on a target after a distractor task was recorded, and the results show that centering the mouse was the fastest technique, with a 35% improvement over shaking the mouse. Five out of six older participants ranked the touch-sensitive mouse with cursor centering as the easiest to use.
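
In software terms, the centering behaviour amounts to warping the pointer to the middle of the screen whenever the touch sensor fires. A minimal sketch for Windows using ctypes is shown below; the paper used a modified hardware mouse, so the on_mouse_touched callback here is a hypothetical stand-in for the sensor event.

```python
import ctypes

user32 = ctypes.windll.user32  # Windows-only

def on_mouse_touched():
    """Called when the touch sensor fires: warp the cursor to screen centre."""
    width = user32.GetSystemMetrics(0)    # SM_CXSCREEN
    height = user32.GetSystemMetrics(1)   # SM_CYSCREEN
    user32.SetCursorPos(width // 2, height // 2)
```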

Relevance: 100.00%

Abstract:

Time series of global and regional mean Surface Air Temperature (SAT) anomalies are a common metric used to estimate recent climate change. Various techniques can be used to create these time series from meteorological station data. The degree of difference arising from using five different techniques, based on existing temperature anomaly dataset techniques, to estimate Arctic SAT anomalies over land and sea ice was investigated using reanalysis data as a testbed. Techniques which interpolated anomalies were found to result in smaller errors than non-interpolating techniques relative to the reanalysis reference. Kriging techniques provided the smallest errors in estimates of Arctic anomalies, and Simple Kriging was often the best kriging method in this study, especially over sea ice. A linear interpolation technique had, on average, Root Mean Square Errors (RMSEs) up to 0.55 K larger than the two kriging techniques tested. Non-interpolating techniques provided the least representative anomaly estimates. Nonetheless, they serve as useful checks for confirming whether estimates from interpolating techniques are reasonable. The interaction of meteorological station coverage with estimation techniques between 1850 and 2011 was simulated using an ensemble dataset comprising repeated individual years (1979-2011). All techniques were found to have larger RMSEs for earlier station coverages. This supports calls for increased data sharing and data rescue, especially in sparsely observed regions such as the Arctic.
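
As an illustration of the best-performing method, here is a minimal Simple Kriging sketch: anomalies are assumed to have a known mean (zero here) and an exponential covariance whose sill and range are invented for the example; a real application would fit these to the station data. This is a generic textbook formulation, not the study's exact configuration.

```python
import numpy as np

def simple_kriging(xy_obs, z_obs, xy_tgt, sill=1.0, rng=1000.0, mean=0.0):
    """Simple Kriging with known mean and exponential covariance."""
    def cov(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        return sill * np.exp(-d / rng)
    C = cov(xy_obs, xy_obs)             # station-station covariance
    c0 = cov(xy_obs, xy_tgt)            # station-target covariance
    w = np.linalg.solve(C, c0)          # kriging weights
    return mean + w.T @ (z_obs - mean)  # kriged anomaly estimates

# Hypothetical station coordinates (km) and SAT anomalies (K):
xy_obs = np.array([[0., 0.], [500., 100.], [200., 800.]])
z_obs = np.array([1.2, 0.8, 1.5])
xy_tgt = np.array([[300., 300.]])
print(simple_kriging(xy_obs, z_obs, xy_tgt))
```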

Relevance: 100.00%

Abstract:

Flexibility of information systems (IS) has been studied as a way to improve adaptation in support of business agility, understood as the set of capabilities to compete more effectively and adapt to rapid changes in market conditions (Glossary of business agility terms, 2003). However, most work on IS flexibility has been limited to systems architecture, ignoring the analysis of interoperability as a part of flexibility from the requirements stage. This paper reports a PhD project which proposes an approach to developing IS with flexibility features, considering challenges of flexibility in small and medium enterprises (SMEs) such as the lack of interoperability and the agility of their business. The motivations for this research are the high price of IS in developing countries and the usefulness of organizational semiotics in supporting the analysis of requirements for IS (Liu, 2005).

Relevance: 100.00%

Abstract:

The failing heart is characterized by complex tissue remodelling involving increased cardiomyocyte death, and impairment of sarcomere function, metabolic activity, endothelial and vascular function, together with increased inflammation and interstitial fibrosis. For years, therapeutic approaches for heart failure (HF) relied on vasodilators and diuretics, which relieve cardiac workload and HF symptoms. The introduction in the clinic of drugs interfering with beta-adrenergic and angiotensin signalling has ameliorated survival by interfering with the intimate mechanism of cardiac compensation. Current therapy, though, still has a limited capacity to restore muscle function fully, and the development of novel therapeutic targets remains an important medical need. Recent progress in understanding the molecular basis of myocardial dysfunction in HF is paving the way for the development of new treatments capable of restoring muscle function and targeting specific pathological subsets of left ventricular (LV) dysfunction. These include potentiating cardiomyocyte contractility, increasing cardiomyocyte survival and adaptive hypertrophy, increasing oxygen and nutrition supply by sustaining vessel formation, and reducing ventricular stiffness by favourable extracellular matrix remodelling. Here, we consider drugs such as omecamtiv mecarbil, nitroxyl donors, cyclosporin A, SERCA2a (sarcoplasmic/endoplasmic reticulum Ca(2+) ATPase 2a), neuregulin, and bromocriptine, all of which are currently in clinical trials as potential HF therapies, and discuss novel molecular targets with potential therapeutic impact that are in the pre-clinical phases of investigation. Finally, we consider conceptual changes in basic science approaches to improve their translation into successful clinical applications.

Relevance: 100.00%

Abstract:

There is a range of studies in the low carbon arena which use various ‘futures’-based techniques as ways of exploring uncertainties. These techniques range from ‘scenarios’ and ‘roadmaps’ through to ‘transitions’ and ‘pathways’, as well as ‘vision’-based techniques. The overall aim of the paper is therefore to compare and contrast these techniques to develop a simple working typology, with the further objective of identifying the implications of this analysis for RETROFIT 2050. Using recent examples of city-based and energy-based studies throughout, the paper compares and contrasts these techniques and finds that the distinctions between them have often been blurred in the field of low carbon. Visions, for example, have been used in both transition theory and futures/Foresight methods, and scenarios have also been used in transition-based studies as well as futures/Foresight studies. Moreover, Foresight techniques that capture expert knowledge and map existing knowledge into a set of scenarios and roadmaps, which can in turn inform the development of transitions and pathways, can not only help to overcome any ‘disconnections’ between the social and technical lenses through which such future trajectories are mapped, but can also promote a strongly ‘co-evolutionary’ content.

Relevance: 100.00%

Abstract:

SOA (Service Oriented Architecture), workflow, the Semantic Web, and Grid computing are key enabling information technologies in the development of increasingly sophisticated e-Science infrastructures and application platforms. While the emergence of Cloud computing as a new computing paradigm has provided new directions and opportunities for e-Science infrastructure development, it also presents some challenges. Scientific research is increasingly finding it difficult to handle “big data” using traditional data processing techniques. Such challenges demonstrate the need for a comprehensive analysis of how the above-mentioned informatics techniques can be used to develop appropriate e-Science infrastructures and platforms in the context of Cloud computing. This survey paper describes recent research advances in applying informatics techniques to facilitate scientific research, particularly from the Cloud computing perspective. Our particular contributions include identifying the associated research challenges and opportunities, presenting lessons learned, and describing our future vision for applying Cloud computing to e-Science. We believe our research findings can help indicate the future trend of e-Science and can inform funding and research directions on how to more appropriately employ computing technologies in scientific research. We point out open research issues in the hope of sparking new development and innovation in the e-Science field.

Relevance: 100.00%

Abstract:

1. The UK Biodiversity Action Plan (UKBAP) identifies invertebrate species in danger of national extinction. For many of these species, targets for recovery specify the number of populations that should exist by a specific future date, but offer no procedure to plan strategically to achieve the target for any species.
2. Here we describe techniques based upon geographic information systems (GIS) that produce conservation strategy maps (CSM) to assist with achieving recovery targets based on all available and relevant information.
3. The heath fritillary Mellicta athalia is a UKBAP species used here to illustrate the use of CSM. A phase 1 habitat survey was used to identify habitat polygons across the county of Kent, UK. These were systematically filtered using relevant habitat, botanical and autecological data to identify seven types of polygon, including those with extant colonies or in the vicinity of extant colonies, areas managed for conservation but without colonies, and polygons that had the appropriate habitat structure and may therefore be suitable for reintroduction.
4. Five clusters of polygons of interest were found across the study area. The CSM of two of them are illustrated here: the Blean Wood complex, which contains the existing colonies of heath fritillary in Kent, and the Orlestone Forest complex, which offers opportunities for reintroduction.
5. Synthesis and applications. Although the CSM concept is illustrated here for the UK, we suggest that CSM could be part of species conservation programmes throughout the world. CSM are dynamic and should be stored in electronic format, preferably on the world-wide web, so that they can be easily viewed and updated. CSM can be used to illustrate opportunities and to develop strategies with scientists and non-scientists, enabling the engagement of all communities in a conservation programme. CSM for different years can be presented to illustrate the progress of a plan or to provide continuous feedback on how a field scenario develops.
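
The polygon-filtering step behind a CSM can be sketched with geopandas as below. The file name, attribute columns and thresholds are hypothetical; the actual study filtered a phase 1 habitat survey using habitat, botanical and autecological criteria, and the buffer assumes a projected CRS in metres (e.g., British National Grid).

```python
import geopandas as gpd

# Hypothetical phase 1 habitat survey layer in British National Grid (metres).
polygons = gpd.read_file("kent_phase1_habitat.shp")

# Polygons holding extant colonies, and those within ~600 m of one.
extant = polygons[polygons["colony_status"] == "extant"]
near_colony = polygons[polygons.geometry.intersects(
    extant.geometry.unary_union.buffer(600))]

# Candidate reintroduction sites: right habitat structure, no colony.
candidates = polygons[(polygons["habitat"] == "coppiced woodland")
                      & (polygons["foodplant_cover"] > 0.1)
                      & (polygons["colony_status"] == "none")]

candidates.to_file("csm_reintroduction_candidates.shp")
```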

Relevance: 100.00%

Abstract:

Previous attempts to apply statistical models, which correlate nutrient intake with methane production, have been of limited value where predictions are obtained for nutrient intakes and diet types outside those used in model construction. Dynamic mechanistic models have proved more suitable for extrapolation, but they remain computationally expensive and are not applied easily in practical situations. The first objective of this research focused on employing conventional techniques to generate statistical models of methane production appropriate to United Kingdom dairy systems. The second objective was to evaluate these models, and a model published previously, using both United Kingdom and North American data sets. Thirdly, nonlinear models were considered as alternatives to the conventional linear regressions. The United Kingdom calorimetry data used to construct the linear models were also used to develop the three nonlinear alternatives, all of which were of modified Mitscherlich (monomolecular) form. Of the linear models tested, an equation from the literature proved most reliable across the full range of evaluation data (root mean square prediction error = 21.3%). However, the Mitscherlich models demonstrated the greatest degree of adaptability across diet types and intake levels. The most successful model for simulating the independent data was a modified Mitscherlich equation with the steepness parameter set to represent the dietary starch-to-ADF ratio (root mean square prediction error = 20.6%). However, when such data were unavailable, simpler Mitscherlich forms relating dry matter or metabolizable energy intake to methane production remained better alternatives relative to their linear counterparts.
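
As a sketch of the nonlinear approach, the monomolecular (Mitscherlich) form below is fitted with scipy; the steepness is scaled by the starch-to-ADF ratio to echo the most successful model, but the exact parameterisation, and all of the data, are invented for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(X, a, k):
    """CH4 = a * (1 - exp(-c * MEI)), with steepness c tied to diet
    composition via the starch-to-ADF ratio (illustrative form only)."""
    mei, starch_adf = X            # ME intake (MJ/d), starch:ADF ratio
    return a * (1.0 - np.exp(-k * starch_adf * mei))

mei = np.array([100., 150., 200., 250., 300.])
starch_adf = np.array([0.5, 0.8, 1.0, 1.2, 1.5])
ch4 = np.array([9.9, 16.8, 20.0, 21.4, 21.9])   # methane (MJ/d), synthetic

(a, k), _ = curve_fit(mitscherlich, (mei, starch_adf), ch4, p0=(25.0, 0.01))
pred = mitscherlich((mei, starch_adf), a, k)
rmspe = 100.0 * np.sqrt(np.mean(((ch4 - pred) / ch4) ** 2))
print(f"a={a:.1f}, k={k:.4f}, RMSPE={rmspe:.1f}%")
```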

Relevance: 100.00%

Abstract:

Methodology used to measure in vitro gas production is reviewed to determine impacts of sources of variation on resultant gas production profiles (GPP). Current methods include measurement of gas production at constant pressure (e.g., use of gas tight syringes), a system that is inexpensive, but may be less sensitive than others, thereby affecting its suitability in some situations. Automated systems that measure gas production at constant volume allow pressure to accumulate in the bottle, which is recorded at different times to produce a GPP, and may result in sufficiently high pressure that solubility of evolved gases in the medium is affected, thereby resulting in a recorded volume of gas that is lower than that predicted from stoichiometric calculations. Several other methods measure gas production at constant pressure and volume with either pressure transducers or sensors, and these may be manual, semi-automated or fully automated in operation. In these systems, gas is released as pressure increases, and vented gas is recorded. Agitating the medium does not consistently produce more gas with automated systems, and little or no effect of agitation was observed with manual systems. The apparatus affects GPP, but mathematical manipulation may enable effects of apparatus to be removed. The amount of substrate affects the volume of gas produced, but not rate of gas production, provided there is sufficient buffering capacity in the medium. Systems that use a very small amount of substrate are prone to experimental error in sample weighing. Effect of sample preparation on GPP has been found to be important, but further research is required to determine the optimum preparation that mimics animal chewing. Inoculum is the single largest source of variation in measuring GPP, as rumen fluid is variable and sampling schedules, diets fed to donor animals and ratios of rumen fluid/medium must be selected such that microbial activity is sufficiently high that it does not affect rate and extent of fermentation. Species of donor animal may also cause differences in GPP. End point measures can be mathematically manipulated to account for species differences, but rates of fermentation are not related. Other sources of inocula that have been used include caecal fluid (primarily for investigating hindgut fermentation in monogastrics), effluent from simulated rumen fermentation (e.g., 'Rusitec', which was as variable as rumen fluid), faeces, and frozen or freeze-dried rumen fluid (which were both less active than fresh rumen fluid). Use of mixtures of cell-free enzymes, or pure cultures of bacteria, may be a way of increasing GPP reproducibility, while reducing reliance on surgically modified animals. However, more research is required to develop these inocula. A number of media have been developed which buffer the incubation and provide relevant micro-nutrients to the microorganisms. To date, little research has been completed on relationships between the composition of the medium and measured GPP. However, comparing GPP from media either rich in N or N-free allows assessment of contributions of N containing compounds in the sample.
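
For the constant-volume transducer systems discussed above, the conversion from recorded headspace over-pressure to cumulative gas volume follows directly from the ideal gas law at fixed temperature and headspace volume; a minimal sketch, with illustrative bottle geometry and readings, is:

```python
def cumulative_gas_volume(pressures_kpa, headspace_ml=120.0, ambient_kpa=101.3):
    """Cumulative gas production (ml at ambient pressure) from over-pressure
    readings taken just before each venting of the bottle."""
    total = 0.0
    profile = []
    for p in pressures_kpa:                        # over-pressure per reading
        total += (p / ambient_kpa) * headspace_ml  # PV = nRT at fixed T and V
        profile.append(total)
    return profile

readings = [3.5, 6.1, 8.0, 7.2, 5.5, 3.1]          # kPa above ambient, synthetic
print(cumulative_gas_volume(readings))
```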

Relevance: 100.00%

Abstract:

Several in vitro and in vivo experiments were conducted to develop an effective technique for culturing potential fungal antagonists (isolates of Trichoderma harzianum, Dactylium dendroides, Chaetomium olivaceum and one unidentified fungus) selected for activity against Armillaria mellea. The antagonists were inoculated onto (1) live spawn of the oyster mushroom (Pleurotus ostreatus), (2) extra-moistened or sucrose-enriched mushroom composts containing living or autoclaved mycelia of P. ostreatus or Agaricus bisporus (button mushroom), (3) pasteurized compost with or without A. bisporus mycelium, wheat bran, wheat germ and (4) spent mushroom composts with living mycelia of A. bisporus, P. ostreatus or Lentinus edodes (the Shiitake mushroom). In one experiment, a representative antagonist (isolate Th2 of T. harzianum) was grown together with the A. bisporus mycelium, while in another the antagonist was first grown on wheat germ or wheat bran and then on mushroom compost with living mycelium of A. bisporus. Some of the carrier substrates were then added to the roots of potted strawberry plants in the glasshouse to evaluate their effectiveness against the disease. The antagonists failed to grow on the spawn of P. ostreatus even after reinoculations and prolonged incubation. Providing extra moisture or sucrose enrichment also did not improve the growth of Th2 on mushroom composts in the presence of living mycelia of A. bisporus or P. ostreatus. The antagonist, however, grew rapidly and extensively on mushroom compost with autoclaved mycelia, and also on wheat germ and wheat bran. Colonization of the substrates by the antagonist was positively correlated with its effectiveness in the glasshouse studies. Whereas only 33.3% of the inoculated control plants survived in one experiment monitored for 560 days, 100% survival was achieved when Th2 was applied on wheat germ or wheat bran. Growth of the antagonist alone on pasteurized or sterilized compost (without A. bisporus mycelia) and simultaneous growth of the antagonist and mushroom on pasteurized compost did not improve survival over the inoculated controls, but growth over mushroom compost with the living mycelium resulted in a 50% survival rate. C. olivaceum isolate Co was the most effective, resulting in an overall survival rate of 83.3%, compared with only 8.3% for the inoculated and 100% for the uninoculated (healthy) controls. This antagonist gave the highest survival rate of 100% on spent mushroom compost with L. edodes. T. harzianum isolate Th23, with a 75% survival rate, was the most effective on spent mushroom compost with P. ostreatus, while D. dendroides isolate SP resulted in equal survival rates of 50% on all three mushroom composts.

Relevance: 100.00%

Abstract:

The Earth-directed coronal mass ejection (CME) of 8 April 2010 provided an opportunity for space weather predictions from both established and developmental techniques to be made from near-real-time data received from the SOHO and STEREO spacecraft; the STEREO spacecraft provide a unique view of Earth-directed events from outside the Sun-Earth line. Although the near-real-time data transmitted by the STEREO Space Weather Beacon are significantly poorer in quality than the subsequently downlinked science data, using these data has the advantage that near-real-time analysis is possible, allowing actual forecasts to be made. The fact that such forecasts cannot be biased by any prior knowledge of the actual arrival time at Earth provides an opportunity for an unbiased comparison between several established and developmental forecasting techniques. We conclude that, for forecasts based on the STEREO coronagraph data, it is important to take account of the subsequent acceleration/deceleration of each CME through interaction with the solar wind, while predictions based on measurements of CMEs made by the STEREO Heliospheric Imagers would benefit from higher temporal and spatial resolution. Space weather forecasting tools must work with near-real-time data; such data, when provided by science missions, are usually highly compressed and/or reduced in temporal/spatial resolution and may also have significant gaps in coverage, making such forecasts more challenging.
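
One simple way to account for the acceleration/deceleration through solar-wind interaction highlighted above (though not necessarily the treatment used in the paper) is a drag-based model, in which dv/dt = -γ(v - w)|v - w| for an ambient solar wind speed w; integrating from the coronagraph observation out to 1 AU gives a transit-time forecast. All parameter values below are illustrative.

```python
import numpy as np

AU_KM = 1.496e8  # 1 astronomical unit in km

def arrival_time(r0_km, v0_kms, w_kms=400.0, gamma=0.2e-7, dt=60.0):
    """Integrate a drag-based CME model; returns transit time in hours.
    gamma is the drag parameter (1/km), w_kms the solar wind speed."""
    r, v, t = r0_km, v0_kms, 0.0
    while r < AU_KM:
        a = -gamma * (v - w_kms) * abs(v - w_kms)  # aerodynamic-style drag
        v += a * dt
        r += v * dt
        t += dt
    return t / 3600.0

# A fast CME first measured at 20 solar radii, moving at 800 km/s:
print(arrival_time(r0_km=20 * 6.96e5, v0_kms=800.0))
```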

Relevance: 100.00%

Abstract:

Real estate depreciation continues to be a critical issue for investors and the appraisal profession in the UK in the 1990s. Depreciation-sensitive cash flow models have been developed, but there is a real need to develop further empirical methodologies to determine rental depreciation rates for input into these models. Although building quality has been found to be an important explanatory variable in depreciation, it is very difficult to incorporate it into such models or to analyse it retrospectively. It is essential to examine previous depreciation research from real estate and economics in the USA and UK to understand the issues in constructing a valid and pragmatic way of calculating rental depreciation. Distinguishing between 'depreciation' and 'obsolescence' is important, and the pattern of depreciation in any study can be influenced by such factors as the type (longitudinal or cross-sectional) and timing of the study, and the state of the market. Longitudinal studies can analyse change more directly than cross-sectional studies. Any methodology for calculating rental depreciation rates should be formulated in the context of issues such as 'censored sample bias', 'lemons' and 'filtering', which have been highlighted in key US literature from the field of economic depreciation. Property depreciation studies in the UK have tended to overlook this literature, however. Although data limitations and constraints reduce the ability of empirical property depreciation work in the UK to consider these issues fully, 'averaging' techniques and ordinary least squares (OLS) regression can both provide a consistent way of calculating rental depreciation rates within a 'cohort' framework.
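
A minimal sketch of the OLS approach mentioned in closing: within a cohort, regressing log rent on building age yields an annual rental depreciation rate as the (negated) age coefficient. The data below are fabricated, and a real study would have to confront the censored-sample and 'lemons' biases discussed above.

```python
import numpy as np

age = np.array([0., 5., 10., 15., 20., 25., 30.])      # building age, years
rent = np.array([100., 93., 88., 81., 77., 71., 68.])  # rent index, synthetic

# OLS of log(rent) on age: slope approximates the annual depreciation rate.
X = np.column_stack([np.ones_like(age), age])
beta, *_ = np.linalg.lstsq(X, np.log(rent), rcond=None)
print(f"estimated rental depreciation: {-beta[1]:.2%} per year")
```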