374 results for OpenFlow, SDN, Software-Defined Networking, Cloud
Abstract:
Hedging against tail events in equity markets has been forcefully advocated in the aftermath of the recent global financial crisis. Whether this is beneficial to long-horizon investors like employees enrolled in defined contribution (DC) plans, however, has been subject to criticism. We conduct historical simulations from 1928 onwards to examine the effectiveness of active and passive tail risk hedging using out-of-the-money put options for hypothetical equity portfolios of DC plan participants with 20 years to retirement. Our findings show that the cost of tail hedging exceeds the benefits for a majority of the plan participants during the sample period. However, for a significant number of simulations, hedging results in superior outcomes relative to an unhedged position. Active tail hedging is more effective when employees confront several panic-driven periods characterized by short, sharp market swings over the investment horizon. Passive hedging, on the other hand, proves beneficial when they encounter an extremely rare event like the Great Depression, when equity markets go into a deep and prolonged decline.
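A minimal sketch of the kind of comparison such a historical simulation involves, assuming a simple passive overlay that buys a one-year 10% out-of-the-money put priced with Black-Scholes; the volatility, rate and return figures below are illustrative placeholders, not the paper's data or methodology.

```python
# Illustrative sketch (not the paper's exact methodology): compare an unhedged
# equity portfolio with one that buys a one-year 10% out-of-the-money put each
# year, priced with Black-Scholes under assumed volatility and rate inputs.
import numpy as np
from scipy.stats import norm

def bs_put(spot, strike, rate, vol, t=1.0):
    """Black-Scholes price of a European put."""
    d1 = (np.log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * np.sqrt(t))
    d2 = d1 - vol * np.sqrt(t)
    return strike * np.exp(-rate * t) * norm.cdf(-d2) - spot * norm.cdf(-d1)

def simulate(annual_returns, rate=0.03, vol=0.18, otm=0.10):
    """Grow $1 over a horizon, unhedged vs. passively put-hedged each year."""
    unhedged = hedged = 1.0
    for r in annual_returns:
        strike = 1.0 - otm                      # 10% OTM put on $1 of equity
        premium = bs_put(1.0, strike, rate, vol)
        payoff = max(strike - (1.0 + r), 0.0)   # put payoff at year end
        unhedged *= 1.0 + r
        hedged *= 1.0 + r - premium + payoff
    return unhedged, hedged

# Example window with one crash year; real runs would roll 20-year windows since 1928.
print(simulate([0.08, -0.35, 0.12, 0.10, 0.06]))
```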
Abstract:
Today’s information systems log vast amounts of data. These collections of data (implicitly) describe events (e.g. placing an order or taking a blood test) and, hence, provide information on the actual execution of business processes. The analysis of such data provides an excellent starting point for business process improvement. This is the realm of process mining, an area which has provided a repertoire of many analysis techniques. Despite the impressive capabilities of existing process mining algorithms, dealing with the abundance of data recorded by contemporary systems and devices remains a challenge. Of particular importance is the capability to guide the meaningful interpretation of “oceans of data” by process analysts. To this end, insights from the field of visual analytics can be leveraged. This article proposes an approach where process states are reconstructed from event logs and visualised in succession, leading to an animated history of a process. This approach is customisable in how a process state, partially defined through a collection of activity instances, is visualised: one can select a map and specify a projection of events on this map based on the properties of the events. This paper describes a comprehensive implementation of the proposal. It was realised using the open-source process mining framework ProM. Moreover, this paper also reports on an evaluation of the approach conducted with Suncorp, one of Australia’s largest insurance companies.
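A minimal sketch of the state-reconstruction idea, not the ProM implementation: from a toy event log, it derives, for successive points in time, the activity instances that are currently active, which is the kind of process state that could then be projected onto a map and animated. All event data and field names are hypothetical.

```python
# Minimal sketch: reconstruct process states from an event log as counts of
# active activity instances per activity at successive points in time; each
# state would form one frame of the animated process history.
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical event log: (case id, activity, start, complete)
log = [
    ("c1", "Place order", datetime(2023, 1, 1, 9),     datetime(2023, 1, 1, 10)),
    ("c1", "Ship goods",  datetime(2023, 1, 1, 11),    datetime(2023, 1, 1, 15)),
    ("c2", "Place order", datetime(2023, 1, 1, 9, 30), datetime(2023, 1, 1, 12)),
]

def state_at(log, t):
    """Activity instances that have started but not yet completed at time t."""
    return Counter(act for _, act, start, end in log if start <= t < end)

def animate(log, step=timedelta(hours=1)):
    t = min(start for _, _, start, _ in log)
    horizon = max(end for _, _, _, end in log)
    while t <= horizon:
        yield t, state_at(log, t)   # one animation frame per step
        t += step

for t, state in animate(log):
    print(t, dict(state))
```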
Abstract:
Software as a Service (SaaS) can provide significant benefits to small and medium enterprises (SMEs) due to advantages like ease of access, 24/7 availability, and utility pricing. However, underlying the SaaS delivery model is often the assumption that SMEs will directly interact with the SaaS vendor and use a self-service approach. In practice, we see the rise of SaaS intermediaries who can support SMEs with sourcing and leveraging SaaS. This paper reports on the roles of intermediaries and how they support SMEs in using SaaS. We conducted an empirical study of two SaaS intermediaries and analysed their business models, in particular their value propositions. We identified orientation (technology or customer) and alignment (operational or strategic) as themes for understanding their roles. The contributions of this paper include: (1) the identification and description of SaaS intermediaries for SMEs based on an empirical study and (2) an understanding of the different roles of SaaS intermediaries, in particular a more basic role based on technology orientation and operational alignment and a more value-adding role based on customer orientation and strategic alignment. We propose that SaaS intermediaries can address the SaaS adoption and implementation challenges of SMEs by playing the basic role, and can also aim to support SMEs in creating business value with SaaS-based solutions by playing the value-adding role.
Abstract:
This thesis examines perceptions of advertising on social networking sites (SNS), in particular consumers' privacy concerns, advertising engagement and advertising avoidance. It contributes to the understanding of social media by providing the results of a longitudinal investigation of consumer perceptions of advertising, a topography of engagement and avoidance triggers, and a three-dimensional model of advertising avoidance on SNS. This research used a mixed methodology, employing the Critical Incident Technique, in-depth interviews and online surveys.
Abstract:
Purpose The aim of the study was to determine the association, agreement, and detection capability of manual, semiautomated, and fully automated methods of corneal nerve fiber length (CNFL) quantification of the human corneal subbasal nerve plexus (SNP). Methods Thirty-three participants with diabetes and 17 healthy controls underwent laser scanning corneal confocal microscopy. Eight central images of the SNP were selected for each participant and analyzed using manual (CCMetrics), semiautomated (NeuronJ), and fully automated (ACCMetrics) software to quantify the CNFL. Results For the entire cohort, mean CNFL values quantified by CCMetrics, NeuronJ, and ACCMetrics were 17.4 ± 4.3 mm/mm2, 16.0 ± 3.9 mm/mm2, and 16.5 ± 3.6 mm/mm2, respectively (P < 0.01). CNFL quantified using CCMetrics was significantly higher than that obtained by NeuronJ and ACCMetrics (P < 0.05). The 3 methods were highly correlated (correlation coefficients 0.87–0.98, P < 0.01). The intraclass correlation coefficients were 0.87 for ACCMetrics versus NeuronJ and 0.86 for ACCMetrics versus CCMetrics. Bland–Altman plots showed good agreement between the manual, semiautomated, and fully automated analyses of CNFL. A small underestimation of CNFL by ACCMetrics was observed as the amount of nerve tissue increased. All 3 methods were able to detect CNFL depletion in diabetic participants (P < 0.05) and in those with peripheral neuropathy as defined by the Toronto criteria, compared with healthy controls (P < 0.05). Conclusions Automated quantification of CNFL provides neuropathy detection ability comparable to manual and semiautomated methods. Because of its speed, objectivity, and consistency, fully automated analysis of CNFL might be advantageous in studies of diabetic neuropathy.
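A small sketch of the Bland–Altman agreement calculation reported above, using hypothetical CNFL values rather than the study data.

```python
# Sketch of a Bland-Altman agreement analysis between two CNFL measurement
# methods (values in mm/mm2 are hypothetical, not the study data).
import numpy as np

manual    = np.array([17.1, 18.4, 15.2, 19.0, 16.3])   # e.g. CCMetrics
automated = np.array([16.5, 17.6, 14.9, 18.1, 15.8])   # e.g. ACCMetrics

diff = automated - manual
bias = diff.mean()                   # mean difference (systematic offset)
loa  = 1.96 * diff.std(ddof=1)       # 95% limits-of-agreement half-width

print(f"bias = {bias:.2f}, limits of agreement = "
      f"[{bias - loa:.2f}, {bias + loa:.2f}] mm/mm2")
```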
Abstract:
Cloud computing, based on early virtual computing concepts and technologies, is now itself a maturing technology in the marketplace; it has revolutionized the IT industry and is the powerful platform onto which many businesses are choosing to migrate their on-premises IT services. Cloud solutions have the potential to reduce the capital and operational expenses associated with deploying IT services in-house. In this study, we have implemented our own private cloud solution, infrastructure as a service (IaaS), using the OpenStack platform with high availability and a dynamic resource allocation mechanism. In addition, we have hosted unified communication as a service (UCaaS) on the underlying IaaS and successfully tested voice over IP (VoIP), video conferencing, voice mail and instant messaging (IM) with clients located at a remote site. The proposed solution has been developed to guide businesses that want to build their own cloud environment and IaaS, and to host cloud services and applications in the cloud. This paper also aims at providing an alternative to proprietary cloud solutions for service providers to consider.
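As a hedged illustration of IaaS provisioning on OpenStack, the sketch below creates a single VM with the openstacksdk Python client; the cloud, image, flavor and network names are placeholders, and the paper's actual deployment (high availability, dynamic resource allocation, UCaaS hosting) involves considerably more than this.

```python
# Hedged sketch: provisioning one VM on an OpenStack IaaS with the openstacksdk
# client. Cloud, image, flavor and network names are placeholders.
import openstack

conn = openstack.connect(cloud="private-cloud")        # entry in clouds.yaml

image   = conn.compute.find_image("ubuntu-22.04")
flavor  = conn.compute.find_flavor("m1.medium")
network = conn.network.find_network("tenant-net")

server = conn.compute.create_server(
    name="ucaas-node-1",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)          # block until ACTIVE
print(server.name, server.status)
```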
Abstract:
Critical to the research of urban morphologists is the availability of historical records that document the urban transformation of the study area. However, thus far little work has been done towards an empirical approach to the validation of archival data in this field. Outlined in this paper, therefore, is a new methodology for validating the accuracy of archival records and mapping data, accrued through the process of urban morphological research, so as to establish a reliable platform from which analysis can proceed. The paper particularly addresses the problems of inaccuracies in existing curated historical information, as well as errors in archival research by student assistants, which together give rise to unacceptable levels of uncertainty in the documentation. The paper discusses the problems relating to the reliability of historical information, demonstrates the importance of data verification in urban morphological research, and proposes a rigorous method for objective testing of collected archival data through the use of qualitative data analysis software.
Abstract:
Intramedullary nailing is the standard fixation method for displaced diaphyseal fractures of the tibia. An optimal nail design should both facilitate insertion and anatomically fit the bone geometry at its final position in order to reduce the risk of stress fractures and malalignments. Due to the nonexistence of suitable commercial software, we developed a software tool for the automated fit assessment of nail designs. Furthermore, we demonstrated that an optimised nail, which fits better at the final position, is also easier to insert. Three-dimensional models of two nail designs and 20 tibiae were used. The fitting was quantified in terms of surface area, maximum distance, sum of surface areas and sum of maximum distances by which the nail was protruding into the cortex. The software was programmed to insert the nail into the bone model and to quantify the fit at defined increment levels. On average, the misfit during the insertion in terms of the four fitting parameters was smaller for the Expert Tibial Nail Proximal bend (476.3 mm2, 1.5 mm, 2029.8 mm2, 6.5 mm) than the Expert Tibial Nail (736.7 mm2, 2.2 mm, 2491.4 mm2, 8.0 mm). The differences were statistically significant (p ≤ 0.05). The software could be used by nail implant manufacturers for the purpose of implant design validation.
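A rough sketch of how the four fitting parameters could be computed, assuming per-increment arrays of protrusion distances and surface patch areas are already available from the geometric comparison; the data and the exact aggregation are assumptions, not the published tool's implementation.

```python
# Illustrative sketch of the four fit parameters: at each insertion increment we
# assume per-node protrusion distances (mm) of the nail surface into the cortex
# and per-node patch areas (mm^2); all numbers below are made up.
import numpy as np

def fit_metrics(increments):
    """increments: list of (distances, areas) array pairs, one per insertion step."""
    areas_per_step, max_per_step = [], []
    for dist, area in increments:
        protruding = dist > 0.0                              # nodes inside the cortex
        areas_per_step.append(float(area[protruding].sum())) # protruding surface area
        max_per_step.append(max(float(dist.max()), 0.0))     # deepest protrusion
    return {
        "surface_area_final": areas_per_step[-1],  # fit at the final position
        "max_distance_final": max_per_step[-1],
        "sum_surface_areas": sum(areas_per_step),  # accumulated over the insertion
        "sum_max_distances": sum(max_per_step),
    }

rng = np.random.default_rng(0)
steps = [(rng.normal(-0.5, 0.8, 200), np.full(200, 0.7)) for _ in range(10)]
print(fit_metrics(steps))
```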
Abstract:
Neu-Model, an ongoing project aimed at developing a neural simulation environment that is extremely computationally powerful and flexible, is described. It is shown that the use of good Software Engineering techniques in Neu-Model’s design and implementation is resulting in a high performance system that is powerful and flexible enough to allow rigorous exploration of brain function at a variety of conceptual levels.
Abstract:
For the past few years, research on the topic of secure outsourcing of cryptographic computations has drawn significant attention from academics in the security and cryptology disciplines as well as from information security practitioners. One main reason for this interest is its application to resource-constrained devices such as RFID tags. While there has been significant progress in this domain since Hohenberger and Lysyanskaya provided formal security notions for secure computation delegation, some interesting challenges remain to be solved whose resolution would be useful towards a wider deployment of cryptographic protocols that enable secure outsourcing of cryptographic computations. This position paper brings out these challenging problems, with RFID technology as the use case, together with our ideas, where applicable, that can provide a direction towards solving them.
Abstract:
Iterative computational models have been used to investigate the regulation of bone fracture healing by local mechanical conditions. Although their predictions replicate some mechanical responses and histological features, they do not typically reproduce the predominantly radial hard callus growth pattern observed in larger mammals. We hypothesised that this discrepancy results from an artefact of the models’ initial geometry. Using axisymmetric finite element models, we demonstrated that pre-defining a field of soft tissue in which callus may develop introduces high deviatoric strains in the periosteal region adjacent to the fracture. These bone-inhibiting strains are not present when the initial soft tissue is confined to a thin periosteal layer. As observed in previous healing models, tissue differentiation algorithms regulated by deviatoric strain predicted hard callus forming remotely and growing towards the fracture. While dilatational strain regulation allowed early bone formation closer to the fracture, hard callus still formed initially over a broad area, rather than expanding over time. Modelling callus growth from a thin periosteal layer successfully predicted the initiation of hard callus growth close to the fracture site. However, these models were still susceptible to elevated deviatoric strains in the soft tissues at the edge of the hard callus. Our study highlights the importance of the initial soft tissue geometry used for finite element models of fracture healing. If this cannot be defined accurately, alternative mechanisms for the prediction of early callus development should be investigated.
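An illustrative sketch of a strain-regulated tissue differentiation rule of the kind these iterative models apply at each healing iteration; the threshold values below are placeholders, not the calibrated limits of any published algorithm.

```python
# Minimal sketch of a strain-regulated tissue differentiation rule used in
# iterative fracture-healing models; thresholds are illustrative placeholders.
def differentiate(deviatoric_strain, dilatational_strain):
    """Return the tissue type predicted for one callus element this iteration."""
    if deviatoric_strain > 0.15:          # high distortion: bone formation inhibited
        return "fibrous tissue"
    if deviatoric_strain > 0.05:          # moderate distortion
        return "cartilage"
    if abs(dilatational_strain) < 0.05:   # low strains permit ossification
        return "bone"
    return "cartilage"

# One iteration over a (hypothetical) field of element strains:
elements = [(0.20, 0.01), (0.08, 0.02), (0.02, 0.01)]
print([differentiate(dev, dil) for dev, dil in elements])
```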
Abstract:
Porosity is one of the key parameters of the macroscopic structure of porous media, generally defined as the ratio of the free space (the volume occupied by air) within the material to the total volume of the material. Porosity is determined by measuring the skeletal volume and the envelope volume. The solid displacement method is an inexpensive and easy way to determine the envelope volume of a sample with an irregular shape. In this method, glass beads are generally used as the displacement solid due to their uniform size, compactness and fluidity. Because of their small size, however, the glass beads can enter open pores whose diameter is larger than that of the beads. Although extensive research has been carried out on porosity determination using the displacement method, no study exists which adequately reports micro-level observation of the sample during measurement. This study set out to assess the accuracy of the solid displacement method for bulk density measurement of dried foods through micro-level observation. The solid displacement method of porosity determination was conducted using a cylindrical vial (a cylindrical plastic container) and 57 µm glass beads in order to measure the bulk density of apple slices at different moisture contents. A scanning electron microscope (SEM), a profilometer and ImageJ software were used to investigate the penetration of glass beads into the surface pores during the determination of the porosity of dried food. A helium pycnometer was used to measure the particle density of the sample. The results show that a significant number of pores were large enough to allow the glass beads to enter, thereby causing erroneous results. It was also found that coating the dried sample with an appropriate coating material prior to measurement can resolve this problem.
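The density relations behind the method can be sketched as follows: the envelope volume comes from the glass-bead displacement, the particle (skeletal) density from the helium pycnometer, and porosity follows as one minus the ratio of bulk to particle density. The numbers in the example are hypothetical, not measurements from the study.

```python
# Sketch of the porosity calculation: envelope (bulk) volume from glass-bead
# displacement, particle (skeletal) density from the pycnometer,
# porosity = 1 - bulk density / particle density. Numbers are hypothetical.
def envelope_volume(vial_volume_cm3, bead_volume_cm3):
    """Sample envelope volume = vial volume minus the volume taken up by the beads."""
    return vial_volume_cm3 - bead_volume_cm3

def porosity(sample_mass_g, envelope_volume_cm3, particle_density_g_cm3):
    bulk_density = sample_mass_g / envelope_volume_cm3
    return 1.0 - bulk_density / particle_density_g_cm3

v_env = envelope_volume(vial_volume_cm3=25.0, bead_volume_cm3=22.8)
print(f"porosity = {porosity(1.1, v_env, 1.45):.2f}")
```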
Abstract:
This article provides a general review of the literature on the nature and role of empathy in social interaction for information professionals working in a variety of information and knowledge environments. Relational agency theory (Edwards, 2005) is used as a framework to re-conceptualize education for empathic social interaction between information professionals and their clients. Past, present and future issues relevant to empathic interaction in information and knowledge management are discussed in the context of three shifts identified from the literature: (a) the continued increase in communication channels, both physical and virtual, for reference, information and research services, (b) the transition from the information age to the conceptual age, and (c) the growing need for understanding of the affective paradigm in the information and knowledge professions. Findings from the literature review on the relationships between empathy and information behavior, social networking, knowledge management, and information and knowledge services are presented. The findings are discussed in relation to the development of guidelines for the affective education and training of information and knowledge professionals and the potential use of virtual learning software such as Second Life in developing empathic communication skills.
Abstract:
Aim: To quantify the consequences of major threats to biodiversity, such as climate and land-use change, it is important to use explicit measures of species persistence, such as extinction risk. The extinction risk of metapopulations can be approximated through simple models, providing a regional snapshot of the extinction probability of a species. We evaluated the extinction risk of three species under different climate change scenarios in three different regions of the Mexican cloud forest, a highly fragmented habitat that is particularly vulnerable to climate change. Location: Cloud forests in Mexico. Methods: Using Maxent, we estimated the potential distribution of cloud forest for three different time horizons (2030, 2050 and 2080) and their overlap with protected areas. Then, we calculated the extinction risk of three contrasting vertebrate species for two scenarios: (1) climate change only (all suitable areas of cloud forest through time) and (2) climate and land-use change (only suitable areas within a currently protected area), using an explicit patch-occupancy approximation model and calculating the joint probability of all populations becoming extinct when the number of remaining patches was less than five. Results: Our results show that the extent of environmentally suitable areas for cloud forest in Mexico will sharply decline in the next 70 years. We discovered that if all habitat outside protected areas is transformed, then only species with small area requirements are likely to persist. With habitat loss through climate change only, high dispersal rates are sufficient for persistence, but this requires protection of all remaining cloud forest areas. Main conclusions: Even if high dispersal rates mitigate the extinction risk of species due to climate change, the synergistic impacts of changing climate and land use further threaten the persistence of species with higher area requirements. Our approach for assessing the impacts of threats on biodiversity is particularly useful when there is little time or data for detailed population viability analyses. © 2013 John Wiley & Sons Ltd.
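A minimal sketch of the joint extinction calculation described above, assuming independent patches and hypothetical per-patch extinction probabilities; in this approximation the regional risk is evaluated only when fewer than five suitable patches remain.

```python
# Sketch: joint probability of all remaining populations becoming extinct,
# triggered when fewer than five suitable patches remain. Independence is
# assumed and the per-patch probabilities are hypothetical.
from math import prod

def regional_extinction_risk(patch_extinction_probs, patch_threshold=5):
    if len(patch_extinction_probs) >= patch_threshold:
        return 0.0                       # enough patches remain: risk not evaluated
    return prod(patch_extinction_probs)  # all remaining populations go extinct

print(regional_extinction_risk([0.6, 0.7, 0.8]))   # 3 patches left -> 0.336
print(regional_extinction_risk([0.6] * 6))         # 6 patches left -> 0.0
```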
Abstract:
The growth of APIs and Web services on the Internet, especially through larger enterprise systems increasingly being leveraged for Cloud and software-as-a-service opportunities, poses challenges for improving the efficiency of integration with these services. Compared with the fine-grained operations of contemporary interfaces, the interfaces of enterprise systems are typically larger, more complex and overloaded: single operations carry multiple data entities and parameter sets, support varying requests, and reflect versioning across different system releases. We propose a technique to support the refactoring of service interfaces by deriving business entities and their relationships. In this paper, we focus on the behavioural aspects of service interfaces, aiming to discover the sequential dependencies of operations (otherwise known as protocol extraction) based on the entities and relationships derived. Specifically, we propose heuristics based on these relationships and, in turn, derive permissible orders in which operations are invoked. As a result, service operations can be refactored along business entity CRUD lines, with explicit behavioural protocols as part of an interface definition. This supports flexible service discovery, composition and integration. A prototypical implementation and an analysis of existing Web services, including those of commercial logistics systems (FedEx), are used to validate the algorithms proposed in the paper.
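A hedged sketch of one kind of CRUD-ordering heuristic described above: for each derived business entity, creation must precede any other operation, nothing may follow deletion, and a child entity cannot be created before its parent. The entity names and relationships are hypothetical.

```python
# Sketch of a CRUD-ordering protocol check over derived business entities;
# entity names and the parent/child relationship below are hypothetical.
PARENT_OF = {"ShipmentItem": "Shipment"}   # derived entity relationships

def permissible(sequence):
    """Check whether a sequence of (operation, entity) calls respects the heuristics."""
    created, deleted = set(), set()
    for op, entity in sequence:
        if entity in deleted:
            return False                   # no operation after delete
        parent = PARENT_OF.get(entity)
        if op == "create" and parent is not None and parent not in created:
            return False                   # parent entity must be created first
        if op != "create" and entity not in created:
            return False                   # create must precede read/update/delete
        if op == "create":
            created.add(entity)
        elif op == "delete":
            deleted.add(entity)
    return True

print(permissible([("create", "Shipment"), ("create", "ShipmentItem"),
                   ("update", "Shipment"), ("delete", "ShipmentItem")]))  # True
print(permissible([("update", "Shipment")]))                              # False
```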