774 results for Could computing


Relevance:

20.00%

Publisher:

Abstract:

Markowitz showed that assets can be combined to produce an 'Efficient' portfolio that will give the highest level of portfolio return for any level of portfolio risk, as measured by the variance or standard deviation. These portfolios can then be connected to generate what is termed an 'Efficient Frontier' (EF). In this paper we discuss the calculation of the Efficient Frontier for combinations of assets, again using the spreadsheet Optimiser. To illustrate the derivation of the Efficient Frontier, we use the data from the Investment Property Databank Long Term Index of Investment Returns for the period 1971 to 1993. Many investors might require a certain specific level of holding or a restriction on holdings in at least some of the assets. Such additional constraints may be readily incorporated into the model to generate a constrained EF with upper and/or lower bounds. This can then be compared with the unconstrained EF to see whether the reduction in return is acceptable. To see the effect that these additional constraints may have, we adopt a fairly typical pension fund profile, with no more than 20% of the total held in Property. The paper shows that it is now relatively easy to use the Optimiser available in at least one spreadsheet (EXCEL) to calculate efficient portfolios for various levels of risk and return, both constrained and unconstrained, so as to be able to generate any number of Efficient Frontiers.
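
A minimal sketch of the same mean-variance optimisation in Python (using scipy rather than the Excel Optimiser described in the paper); the asset names, expected returns and covariances are illustrative placeholders, not the IPD 1971-1993 data:

```python
# Minimal sketch of a constrained mean-variance optimisation, analogous to what the
# paper does with the spreadsheet Optimiser. The returns and covariances below are
# illustrative placeholders, not the IPD Long Term Index data used in the paper.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.09, 0.11, 0.08])          # expected returns: equities, gilts, property (hypothetical)
cov = np.array([[0.040, 0.006, 0.004],
                [0.006, 0.025, 0.003],
                [0.004, 0.003, 0.015]])     # covariance matrix (hypothetical)

def efficient_weights(target_return, upper_bounds=None):
    """Weights of the minimum-variance portfolio achieving a target return."""
    n = len(mu)
    bounds = [(0.0, ub) for ub in (upper_bounds or [1.0] * n)]
    constraints = [
        {"type": "eq", "fun": lambda w: w.sum() - 1.0},            # fully invested
        {"type": "eq", "fun": lambda w: w @ mu - target_return},   # hit the target return
    ]
    res = minimize(lambda w: w @ cov @ w, x0=np.full(n, 1.0 / n),
                   method="SLSQP", bounds=bounds, constraints=constraints)
    return res.x, np.sqrt(res.fun)

# Trace an efficient frontier, unconstrained and with property capped at 20%.
for r in np.linspace(mu.min(), mu.max(), 5):
    w_u, risk_u = efficient_weights(r)
    w_c, risk_c = efficient_weights(r, upper_bounds=[1.0, 1.0, 0.2])
    print(f"target {r:.3f}: unconstrained sd {risk_u:.3f}, constrained sd {risk_c:.3f}")
```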

Relevance:

20.00%

Publisher:

Abstract:

Experiments demonstrating human enhancement through the implantation of technology in healthy humans have been performed for over a decade by some academic research groups. More recently, technology enthusiasts have begun to realize the potential of implantable technology such as glass capsule RFID transponders. In this paper it is argued that implantable RFID devices have evolved to the point where we should consider the devices themselves as simple computers. Presented here is the infection of an RFID device, implanted in a human, with a computer virus. Coupled with our developing concept of what constitutes the human body and its boundaries, it is argued that this study has given rise to the world’s first human infected with a computer virus. It has taken the wider academic community some time to agree that meaningful discourse on the topic of implantable technology is of value. As developments in medical technologies point to greater possibilities for enhancement, this shift in thinking is not too soon in coming.

Relevance:

20.00%

Publisher:

Abstract:

At criminal trial, we demand that those accused of criminal wrongdoing be presumed innocent until proven guilty beyond any reasonable doubt. What are the moral and/or political grounds of this demand? One popular and natural answer to this question focuses on the moral badness or wrongness of convicting and punishing innocent persons, which I call the direct moral grounding. In this essay, I suggest that this direct moral grounding, if accepted, may well have important ramifications for other areas of the criminal justice process, and in particular those parts in which we (through our legislatures and judges) decide how much punishment to distribute to guilty persons. If, as the direct moral grounding suggests, we should prefer under-punishment to over-punishment under conditions of uncertainty, due to the moral seriousness of errors which inappropriately punish persons, then we should also prefer erring on the side of under-punishment when considering how much to punish those who may justly be punished. Some objections to this line of thinking are considered.

Relevance:

20.00%

Publisher:

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to determine what horizontal and vertical resolution is necessary in atmospheric and ocean models for more confident predictions at the regional and local level. Current limitations in computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.

Relevance:

20.00%

Publisher:

Abstract:

Pocket Data Mining (PDM) is our new term describing collaborative mining of streaming data in mobile and distributed computing environments. With sheer amounts of data streams now available for subscription on our smart mobile phones, using these data for decision making with data stream mining techniques has become achievable owing to the increasing power of these handheld devices. Wireless communication among these devices using Bluetooth and WiFi technologies has opened the door wide for collaborative mining among mobile devices within the same range that are running data mining techniques targeting the same application. This paper proposes a new architecture that we have prototyped for realizing significant applications in this area. We propose using mobile software agents in this application for several reasons. Most importantly, the autonomic, intelligent behaviour of agent technology has been the driving force for using it in this application. Other efficiency reasons are discussed in detail in this paper. Experimental results showing the feasibility of the proposed architecture are presented and discussed.
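
The following is a minimal, illustrative sketch of the general idea of collaborative mining across nearby devices: each device runs its own classifier over its local stream, and an agent combines their weighted votes. The class and method names are hypothetical and do not reflect the prototype's actual API:

```python
# Minimal sketch of collaborative stream mining: each mobile device runs its own
# classifier over its local stream, and a visiting agent combines their votes.
# Class and method names here are hypothetical illustrations only.
from collections import Counter
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class DeviceAgent:
    name: str
    classify: Callable[[Any], str]   # local stream mining model on this device
    weight: float = 1.0              # e.g. reflects the device's recent accuracy

def collaborative_predict(agents, instance):
    """Weighted majority vote over the classifiers reachable via Bluetooth/WiFi."""
    votes = Counter()
    for agent in agents:
        votes[agent.classify(instance)] += agent.weight
    return votes.most_common(1)[0][0]

# Toy usage: three devices with (trivial) local models voting on one instance.
agents = [
    DeviceAgent("phone-A", lambda x: "spam" if x["len"] > 100 else "ham", weight=0.9),
    DeviceAgent("phone-B", lambda x: "spam" if x["links"] > 2 else "ham", weight=0.7),
    DeviceAgent("phone-C", lambda x: "ham", weight=0.5),
]
print(collaborative_predict(agents, {"len": 140, "links": 3}))   # -> "spam"
```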

Relevance:

20.00%

Publisher:

Abstract:

The P-found protein folding and unfolding simulation repository is designed to allow scientists to perform analyses across large, distributed simulation data sets. There are two storage components in P-found: a primary repository of simulation data and a data warehouse. Here we demonstrate how grid technologies can support multiple, distributed P-found installations. In particular, we look at two aspects: first, how grid data management technologies can be used to access the distributed data warehouses; and second, how the grid can be used to transfer analysis programs to the primary repositories. This is an important and challenging aspect of P-found because the data volumes involved are too large to be centralised. The grid technologies we are developing with the P-found system will allow new large data sets of protein folding simulations to be accessed and analysed in novel ways, with significant potential for enabling new scientific discoveries.
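
As an illustration of the "move the analysis to the data" pattern described above (not the actual P-found or grid middleware interfaces), a small sketch in which an analysis function is shipped to each repository and only a compact summary is returned:

```python
# Illustrative sketch (not the P-found middleware) of shipping analysis code to the
# data: instead of copying very large trajectory sets to the analyst, a small
# analysis function runs at each primary repository and only the compact result
# comes back. Names below are hypothetical.
import statistics

REPOSITORIES = {
    "site-A": [1.2, 1.4, 1.1, 1.6],   # stands in for simulation data held at site A
    "site-B": [2.0, 1.9, 2.2],        # stands in for simulation data held at site B
}

def run_at_repository(site, analysis):
    """Pretend to submit 'analysis' as a grid job at 'site'; only the summary returns."""
    local_data = REPOSITORIES[site]    # in reality this never leaves the site
    return analysis(local_data)

def mean_metric(trajectory_values):
    return statistics.mean(trajectory_values)

# Cross-repository analysis without centralising the raw data.
summaries = {site: run_at_repository(site, mean_metric) for site in REPOSITORIES}
print(summaries)
```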

Relevance:

20.00%

Publisher:

Abstract:

Postglacial expansion of deciduous oak woodlands of the Zagros–Anti-Taurus Mountains, a major biome of the Near East, was delayed until the middle Holocene at ~6300 cal. yr BP. The current hypotheses explain this delay as a consequence of regional aridity during the early Holocene, slow migration rates of forest trees, and/or a long history of land use and agro-pastoralism in this region. In the present paper, support is given to a hypothesis that suggests different precipitation seasonalities during the early Holocene compared with the late Holocene. The oak species of the Zagros–Anti-Taurus Mts, particularly Quercus brantii Lindl., are strongly dependent on spring precipitation for regeneration and are sensitive to a long dry season. Detailed analysis of modern atmospheric circulation patterns in SW Asia during the late spring suggests that the Indian Summer Monsoon (ISM) intensification can modify the amount of late spring and/or early summer rainfall in western/northwestern Iran and eastern Anatolia, which could in turn have controlled the development of the Zagros–Anti-Taurus deciduous oak woodlands. During the early Holocene, the northwestward shift of the Inter-Tropical Convergence Zone (ITCZ) could have displaced the subtropical anticyclonic belt or associated high pressure ridges to the northwest. The latter could, in turn, have prevented the southeastward penetration of low pressure systems originating from the North Atlantic and Black Sea regions. Such an atmospheric configuration could have reduced or eliminated the spring precipitation, creating a typical Mediterranean continental climate characterized by winter-dominated precipitation. This scenario highlights the complexity of biome response to climate system interactions in transitional climatic and biogeographical regions.

Relevance:

20.00%

Publisher:

Abstract:

Purpose: This paper aims to design an evaluation method that enables an organization to assess its current IT landscape and provide a readiness assessment prior to Software as a Service (SaaS) adoption. Design/methodology/approach: The research employs a mix of quantitative and qualitative approaches for conducting an IT application assessment. Quantitative data, such as end users’ feedback on the IT applications, capture the technical impact on efficiency and productivity. Qualitative data such as business domain, business services and IT application cost drivers are used to determine the business value of the IT applications in an organization. Findings: The assessment of IT applications leads to decisions on the suitability of each IT application for migration to a cloud environment. Research limitations/implications: The evaluation of how a particular IT application impacts on a business service is based on logical interpretation. A data mining method is suggested in order to derive patterns of IT application capabilities. Practical implications: This method has been applied in a local council in the UK, helping the council decide the future status of its IT applications for cost-saving purposes.
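
Purely as an illustration of how quantitative user feedback and a qualitative business-value rating might be combined into a migration recommendation (the weights, thresholds and application names below are invented, not those used in the paper):

```python
# Illustrative sketch of combining quantitative end-user feedback with a qualitative
# business-value rating to flag IT applications as SaaS migration candidates.
# Thresholds and application names are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Application:
    name: str
    user_satisfaction: float   # 0-1, from end-user feedback surveys
    business_value: float      # 0-1, from business domain/service/cost analysis
    annual_cost: float         # yearly running cost

def migration_recommendation(app, cost_threshold=50_000):
    if app.business_value < 0.3:
        return "retire"                      # low business value regardless of cost
    if app.user_satisfaction < 0.5 and app.annual_cost > cost_threshold:
        return "migrate to SaaS"             # costly and underperforming
    return "retain on-premises"

apps = [
    Application("HR-system", user_satisfaction=0.4, business_value=0.8, annual_cost=80_000),
    Application("legacy-GIS", user_satisfaction=0.7, business_value=0.2, annual_cost=20_000),
]
for app in apps:
    print(app.name, "->", migration_recommendation(app))
```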

Relevance:

20.00%

Publisher:

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in parallel data mining algorithms and, in particular, in the k-means algorithm for cluster analysis. In the straightforward parallel formulation of the k-means algorithm, data and computation loads are uniformly distributed over the processing nodes. This approach has excellent load balancing characteristics that may suggest it could scale up to large and extreme-scale parallel computing systems. However, at each iteration step the algorithm requires a global reduction operation, which hinders the scalability of the approach. This work studies a different parallel formulation of the algorithm in which the requirement of global communication is removed, while maintaining the same deterministic nature of the centralised algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
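
To make the communication bottleneck concrete, the sketch below simulates the straightforward parallel formulation: each node computes partial sums and counts for its local block, and every iteration ends with a global reduction that combines them. Here the nodes are simulated in a loop and the reduction is a simple sum across them; on a real machine this would be an MPI-style allreduce, which is exactly the step the proposed non-uniform formulation removes:

```python
# Minimal sketch of the straightforward parallel k-means formulation: each node
# computes partial sums for its local block of points, and every iteration ends with
# a global reduction that combines them across all nodes.
import numpy as np

def parallel_kmeans(blocks, centroids, iterations=10):
    k, dim = centroids.shape
    for _ in range(iterations):
        partial_sums = np.zeros((len(blocks), k, dim))
        partial_counts = np.zeros((len(blocks), k))
        for node, block in enumerate(blocks):          # runs concurrently on P nodes
            nearest = np.argmin(
                ((block[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2), axis=1)
            for j in range(k):
                members = block[nearest == j]
                partial_sums[node, j] = members.sum(axis=0)
                partial_counts[node, j] = len(members)
        # Global reduction: every node needs the same new centroids, so partial sums
        # and counts are combined across all nodes at every iteration.
        totals = partial_sums.sum(axis=0)
        counts = partial_counts.sum(axis=0)
        centroids = np.where(counts[:, None] > 0,
                             totals / np.maximum(counts, 1)[:, None], centroids)
    return centroids

rng = np.random.default_rng(0)
blocks = [rng.normal(loc=c, size=(100, 2)) for c in (0.0, 5.0, 10.0)]  # one block per node
print(parallel_kmeans(blocks, centroids=rng.normal(size=(3, 2))))
```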

Relevance:

20.00%

Publisher:

Abstract:

It is common practice to freeze-dry probiotic bacteria to improve their shelf life. However, the freeze-drying process itself can be detrimental to their viability. The viability of probiotics could be maintained if they are administered within a microbially produced biodegradable polymer - poly-γ-glutamic acid (γ-PGA) - matrix. Although the antifreeze activity of γ-PGA is well known, it has not been used for maintaining the viability of probiotic bacteria during freeze drying. The aim of this study was to test the effect of γ-PGA (produced by B. subtilis natto ATCC 15245) on the viability of probiotic bacteria during freeze drying and to test the toxigenic potential of B. subtilis natto. 10% γ-PGA was found to protect Lactobacillus paracasei significantly better than 10% sucrose, whereas it showed comparable cryoprotectant activity to sucrose when it was used to protect Bifidobacterium breve and Bifidobacterium longum. Although γ-PGA is known to be non-toxic, it is crucial to ascertain the toxigenic potential of its source, B. subtilis natto. The presence of six genes known to encode toxins was investigated: three-component hemolysin (hbl D/A), three-component non-haemolytic enterotoxin (nheB), B. cereus enterotoxin T (bceT), enterotoxin FM (entFM), sphingomyelinase (sph) and phosphatidylcholine-specific phospholipase (piplc). None of these six genes was present in B. subtilis natto. Moreover, haemolytic and lecithinase activities were found to be absent. Our work contributes a biodegradable polymer from a non-toxic source for the cryoprotection of probiotic bacteria, thus improving their survival during the manufacturing process.

Relevance:

20.00%

Publisher:

Abstract:

We have optimised the atmospheric radiation algorithm of the FAMOUS climate model on several hardware platforms. The optimisation involved translating the Fortran code to C and restructuring the algorithm around the computation of a single air column. Instead of the existing MPI-based domain decomposition, we used a task queue and a thread pool to schedule the computation of individual columns on the available processors. Finally, four air columns are packed together in a single data structure and computed simultaneously using Single Instruction Multiple Data operations. The modified algorithm runs more than 50 times faster on the CELL’s Synergistic Processing Elements than on its main PowerPC processing element. On Intel-compatible processors, the new radiation code runs 4 times faster. On the tested graphics processor, using OpenCL, we find a speed-up of more than 2.5 times as compared to the original code on the main CPU. Because the radiation code takes more than 60% of the total CPU time, FAMOUS executes more than twice as fast. Our version of the algorithm returns bit-wise identical results, which demonstrates the robustness of our approach. We estimate that this project required around two and a half man-years of work.
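
To illustrate the column-packing idea, here is a minimal sketch in which four columns are laid out side by side so that one vectorised operation processes them together; numpy vectorisation stands in for the SIMD intrinsics of the real C code, and the toy per-level computation is a placeholder, not the FAMOUS radiation scheme:

```python
# Minimal sketch of the column-packing idea: instead of computing one air column at a
# time, four columns are laid out side by side so a single vectorised operation
# touches all four at once. The per-level computation below is a placeholder.
import numpy as np

N_LEVELS = 20      # vertical levels per air column
SIMD_WIDTH = 4     # columns packed and computed together

def toy_column_physics(packed):
    """Placeholder per-level computation applied to 4 columns at once, shape (levels, 4)."""
    return np.cumsum(packed * 0.5, axis=0)          # one operation covers all 4 columns

def compute_radiation(columns):
    """columns: array of shape (n_columns, N_LEVELS); returns the same shape."""
    n = len(columns)
    out = np.empty_like(columns)
    for start in range(0, n, SIMD_WIDTH):
        block = columns[start:start + SIMD_WIDTH]
        packed = block.T                             # (N_LEVELS, up to 4): packed layout
        out[start:start + SIMD_WIDTH] = toy_column_physics(packed).T
    return out

columns = np.random.default_rng(1).random((16, N_LEVELS))
result = compute_radiation(columns)
print(result.shape)                                  # (16, 20)
```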