1000 results for Application virtualization


Relevance:

30.00%

Publisher:

Abstract:

Wireless Sensor Networks (WSNs) are generally used to collect information from the environment. The gathered data are delivered mainly to sinks or gateways, which become the endpoints where applications can retrieve and process such data. However, applications also expect an event-driven operational model from a WSN, so that they can be notified whenever specific environmental changes occur instead of continuously analyzing periodically reported data. In either operational model, WSNs represent a collection of interconnected objects, as outlined by the Internet of Things. Additionally, in order to fulfill the Internet of Things principles, Wireless Sensor Networks must have a virtual representation that allows indirect access to their resources, a model that should also include the virtualization of event sources in a WSN. Thus, this paper proposes a model for the virtual representation of event sources in a WSN. Event sources are modeled as internet resources accessible by any internet application, following an Internet of Things approach. The model has been tested in a real implementation in which a WSN was deployed in an open neighborhood environment. Different event sources were identified in the proposed scenario and represented following the proposed model.
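To make the proposed representation concrete, the following is a minimal sketch of how a WSN event source could be exposed as an internet resource with a subscription interface. It assumes a Python gateway using Flask; the resource name, payload, and callback mechanism are illustrative assumptions, not the interface defined in the paper.

```python
# Sketch: a virtual WSN event source exposed as an internet resource (assumed names).
from flask import Flask, jsonify, request
import requests

app = Flask(__name__)

# One virtual event source, e.g. "temperature above threshold at node 12" (hypothetical).
EVENT_SOURCE = {"id": "node12-temp-high", "description": "Temperature above 30 C at node 12"}
SUBSCRIBERS = []  # callback URLs registered by internet applications

@app.route("/events/node12-temp-high", methods=["GET"])
def describe_event_source():
    """Allow any internet application to discover the event source."""
    return jsonify(EVENT_SOURCE)

@app.route("/events/node12-temp-high/subscriptions", methods=["POST"])
def subscribe():
    """Register a callback URL to be notified when the event fires."""
    SUBSCRIBERS.append(request.json["callback"])
    return "", 201

def on_wsn_event(payload):
    """Called by the gateway when the sink reports the event; push it to all subscribers."""
    for url in SUBSCRIBERS:
        requests.post(url, json=payload, timeout=5)
```

An internet application would then POST its callback URL to the subscriptions resource and receive event notifications without interacting with the WSN directly.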

Relevance:

30.00%

Publisher:

Abstract:

The FIWARE initiative offers a set of powerful APIs that provide the foundation for fast and efficient innovation in the Future Internet. These APIs are key to developing applications that use very recent and innovative technologies, such as the Internet of Things or Identity Management in security modules. This document presents the development of a FIWARE web application using components virtualized in virtual machines. The web application is based on "Willy Wonka's chocolate factory" as a metaphorical implementation of a security and IoT application in an industrial environment. The main component is a node.js web server that connects to several FIWARE components, known as "Generic Enablers". The implementation consists of two main modules: the IoT module and the security module. The IoT module manages the sensors installed by Willy Wonka in the factory rooms to monitor parameters such as temperature, pressure, or occupancy. The IoT module creates and receives context information from the virtual sensors. This context information is managed and stored in a FIWARE component known as the Context Broker. The Context Broker is based on subscription mechanisms that push sensor data to the application in real time, whenever the data change. The connection with the client is made through Web Sockets (socket.io). The security module manages user accounts and information, authenticates users in the application using a FIWARE account, and checks their authorization to access different resources. Different roles are created with different permissions assigned. For example, Willy Wonka may have access to all resources, whereas an Oompa Loompa in charge of the chocolate room should only have access to the resources of his own room. This module consists of three components: the Identity Manager, the PEP Proxy, and the PDP AuthZForce. The Identity Manager stores the users' FIWARE accounts and enables Single Sign-On authentication using the OAuth2 protocol. After logging in, authenticated users receive an authentication token that is later used by AuthZForce to check the user's role and associated permissions. The PEP Proxy acts as a proxy server that forwards permitted requests and blocks unauthorized ones.
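As an illustration of the subscription mechanism described above, the following is a minimal sketch of how a client could register an NGSI v2 subscription with the Orion Context Broker so that sensor updates are pushed to the web server. The broker URL, notification endpoint, entity type, and attribute names are assumptions for this sketch, not values taken from the project.

```python
# Sketch: create an NGSI v2 subscription in the Orion Context Broker (assumed endpoints).
import requests

ORION_URL = "http://localhost:1026/v2/subscriptions"  # assumed Orion Context Broker endpoint
NOTIFY_URL = "http://webserver:3000/notify"           # hypothetical node.js callback endpoint

subscription = {
    "description": "Notify the app when a factory room's readings change",
    "subject": {
        "entities": [{"idPattern": ".*", "type": "Room"}],  # illustrative entity type
        "condition": {"attrs": ["temperature", "pressure", "occupancy"]},
    },
    "notification": {
        "http": {"url": NOTIFY_URL},
        "attrs": ["temperature", "pressure", "occupancy"],
    },
    "throttling": 1,  # at most one notification per second
}

resp = requests.post(ORION_URL, json=subscription)
resp.raise_for_status()
print("Subscription created:", resp.headers.get("Location"))
```

On each matching context update, the Context Broker would POST the changed attributes to the notification URL, from which the node.js server can relay them to browsers over socket.io.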

Relevance:

30.00%

Publisher:

Abstract:

This dissertation presents and evaluates a methodology for scheduling medical application workloads in virtualized computing environments. Such environments are being widely adopted by providers of "cloud computing" services. In the context of provisioning resources for medical applications, such environments allow users to deploy applications on distributed computing resources while keeping their data secure. Furthermore, higher-level services that further abstract the infrastructure-related issues can be built on top of such infrastructures. For example, a medical imaging service can allow medical professionals to process their data in the cloud, relieving them of the burden of having to deploy and manage these resources themselves. In this work, we focus on issues related to scheduling scientific workloads on virtualized environments. We build upon the knowledge base of traditional parallel job scheduling to address the specific case of medical applications while harnessing the benefits afforded by virtualization technology. To this end, we provide the following contributions: (1) an in-depth analysis of the execution characteristics of the target applications when run in virtualized environments; (2) a performance prediction methodology applicable to the target environment; (3) a scheduling algorithm that harnesses application knowledge and virtualization-related benefits to provide strong scheduling performance and quality-of-service guarantees. In the process of addressing these pertinent issues for our target user base (i.e., medical professionals and researchers), we provide insight that benefits a large community of scientific application users in industry and academia. Our execution time prediction and scheduling methodologies are implemented and evaluated on a real system running popular scientific applications. We find that we are able to predict the execution time of a number of these applications with an average error of 15%. Our scheduling methodology, which is tested with medical image processing workloads, is compared to two baseline scheduling solutions and outperforms them in terms of both the number of jobs processed and resource utilization by 20–30%, without violating any deadlines. We conclude that our solution is a viable approach to supporting the computational needs of medical users, even if the cloud computing paradigm is not widely adopted in its current form.
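The abstract does not detail the scheduling algorithm; the following is only a generic sketch of how execution-time predictions (here padded to absorb the roughly 15% prediction error) can drive deadline-aware admission and placement decisions on a pool of VMs. The job and VM structures and the slack factor are hypothetical.

```python
# Generic sketch: deadline-aware placement driven by predicted runtimes (not the
# dissertation's algorithm; names and the slack factor are illustrative assumptions).
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Job:
    name: str
    predicted_runtime: float  # seconds, from a performance-prediction model
    deadline: float           # absolute time by which the job must finish

@dataclass
class VM:
    name: str
    free_at: float = 0.0               # time at which the VM becomes idle
    schedule: List[str] = field(default_factory=list)

def schedule(jobs: List[Job], vms: List[VM], slack: float = 1.15) -> Tuple[List[str], List[str]]:
    """Order jobs by earliest deadline, place each on the VM that frees up first,
    and reject jobs whose padded runtime would miss the deadline."""
    accepted, rejected = [], []
    for job in sorted(jobs, key=lambda j: j.deadline):
        vm = min(vms, key=lambda v: v.free_at)
        finish = vm.free_at + job.predicted_runtime * slack  # pad for prediction error
        if finish <= job.deadline:
            vm.free_at = finish
            vm.schedule.append(job.name)
            accepted.append(job.name)
        else:
            rejected.append(job.name)
    return accepted, rejected
```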

Relevance:

30.00%

Publisher:

Abstract:

Due to shrinking budgets and new demands for technology, the Scottsdale Community College (SCC) IT department needed an effective, sustainable solution that would provide ubiquitous access to technology for students, faculty, and staff, both on and off campus. This paper explores how SCC implemented a complete virtualized computing environment.

Relevance:

30.00%

Publisher:

Abstract:

The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying service-level agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources that they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients pay exactly for the performance they actually experience, while administrators are able to maximize their total revenue by using application performance models and SLAs. This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications. Moreover, we suggested and evaluated the modeling optimizations necessary to improve prediction accuracy when using these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue for a data center.
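The abstract does not specify how the performance models are built; the sketch below shows, under assumed features and data, how an SVM regressor (one of the two tools named above) can map resource allocations to an application performance metric using scikit-learn. The feature set, measurements, and hyperparameters are placeholders.

```python
# Sketch: SVM-based performance model of a virtualized application (placeholder data).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Each row: [cpu_cap_percent, memory_mb, io_share]; target: measured throughput (req/s).
X = np.array([[25, 1024, 100], [50, 2048, 200], [75, 4096, 400], [100, 8192, 800]])
y = np.array([120.0, 310.0, 545.0, 760.0])

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, y)

# Predict the performance of a candidate VM size before renting or resizing it.
candidate = np.array([[60, 3072, 300]])
print("Predicted throughput:", model.predict(candidate)[0])
```

A model of this kind is what allows VM sizing and revenue-driven allocation to reason about delivered performance rather than configured capacity.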

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this thesis work is the study and creation of a harness modelling system. The model needs to simulate faithfully the physical behaviour of the harness, without any instability or incorrect movements. Since there are various simulation engines that attempt to model wiring systems, this thesis work focused on the creation and testing of a 3D environment with wiring and other objects using the PyChrono simulation engine. The simulation parameters were fine-tuned during testing to achieve the most stable and correct simulation possible, but the tests showed the intrinsic limits of the engine regarding collision detection between the various parts of the cables, whereas collisions between cables and other physical objects, such as the pavement, walls, and others, are handled well by the simulator. Finally, since the main purpose of the model is to be used to train artificial intelligence through Reinforcement Learning techniques, we designed, using the OpenAI Gym APIs, the general structure of the learning environment, defining its basic functions and an initial framework.
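To show what the learning environment's structure can look like, the following is a minimal skeleton of an OpenAI Gym environment; the observation and action spaces, the reward, and the state update are placeholders standing in for calls into the PyChrono simulation, not the thesis implementation.

```python
# Skeleton of a Gym environment wrapping a cable/harness simulation (placeholder physics).
import gym
import numpy as np
from gym import spaces

class HarnessEnv(gym.Env):
    """Toy cable-manipulation environment; a real version would step the PyChrono solver."""

    def __init__(self, n_cable_points: int = 10):
        super().__init__()
        # Action: a small 3D displacement applied to the grasped end of the cable.
        self.action_space = spaces.Box(low=-0.01, high=0.01, shape=(3,), dtype=np.float32)
        # Observation: 3D positions of sampled points along the cable.
        self.observation_space = spaces.Box(low=-np.inf, high=np.inf,
                                            shape=(n_cable_points, 3), dtype=np.float32)
        self.n_cable_points = n_cable_points

    def reset(self):
        self.state = np.zeros((self.n_cable_points, 3), dtype=np.float32)
        return self.state

    def step(self, action):
        # A real implementation would apply `action` in PyChrono and advance the simulation.
        self.state[-1] += action
        reward = -float(np.linalg.norm(self.state[-1] - np.array([1.0, 0.0, 0.0])))
        done = reward > -0.05
        return self.state, reward, done, {}
```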

Relevance:

20.00%

Publisher:

Abstract:

A rapid, sensitive and specific method for quantifying propylthiouracil in human plasma, using methylthiouracil as the internal standard (IS), is described. The analyte and the IS were extracted from plasma by liquid-liquid extraction using an organic solvent (ethyl acetate). The extracts were analyzed by high-performance liquid chromatography coupled with electrospray tandem mass spectrometry (HPLC-MS/MS) in negative mode (ES-). Chromatography was performed using a Phenomenex Gemini C18 5 μm analytical column (4.6 mm × 150 mm i.d.) and a mobile phase consisting of methanol/water/acetonitrile (40/40/20, v/v/v) + 0.1% formic acid. For propylthiouracil and the IS, the optimized declustering potential, collision energy and collision exit potential were -60 V, -26 eV and -5 V, respectively. The method had a chromatographic run time of 2.5 min and a linear calibration curve over the range 20–5000 ng/mL. The limit of quantification was 20 ng/mL. The stability tests indicated no significant degradation. This HPLC-MS/MS procedure was used to assess the bioequivalence of two propylthiouracil 100 mg tablet formulations in healthy volunteers of both sexes in the fasted and fed states. The geometric means and 90% confidence intervals (CI) of the Test/Reference percent ratios were, without and with food, respectively: 109.28% (103.63–115.25%) and 115.60% (109.03–122.58%) for Cmax, and 103.31% (100.74–105.96%) and 103.40% (101.03–105.84%) for AUClast. This method offers advantages over those previously reported, in terms of both a simple liquid-liquid extraction without clean-up procedures and a faster run time (2.5 min). The LOQ of 20 ng/mL is well suited for pharmacokinetic studies. The assay performance results indicate that the method is precise and accurate enough for the routine determination of propylthiouracil in human plasma. The test formulation, with and without food, was bioequivalent to the reference formulation. Food administration increased the Tmax and decreased the bioavailability (Cmax and AUC).
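For reference, average bioequivalence in studies of this kind is conventionally concluded when the 90% confidence interval of the Test/Reference geometric mean ratio for both Cmax and AUC lies within the 80.00–125.00% acceptance range, a criterion the abstract applies implicitly:

\[
80.00\% \;\le\; \mathrm{CI}_{90\%}\!\left(\frac{\mathrm{GM}_{\text{Test}}}{\mathrm{GM}_{\text{Reference}}}\right) \;\le\; 125.00\%
\]

All four reported intervals (e.g., 103.63–115.25% for fasted Cmax) fall within this range, which is consistent with the bioequivalence conclusion stated above.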

Relevance:

20.00%

Publisher:

Abstract:

Although several treatments for tendon lesions have been proposed, successful tendon repair remains a great challenge for orthopedics, especially considering the high incidence of re-rupture of injured tendons. Our aim was to evaluate the pharmacological potential of Aloe vera on the content and arrangement of glycosaminoglycans (GAGs) during tendon healing, building on the effectiveness of A. vera on collagen organization previously observed by our group. In rats, a partial calcaneal tendon transection was performed with subsequent topical application of A. vera at the injury site. The tendons were treated with A. vera ointment for 7 days and excised on the 7th, 14th, or 21st day post-surgery. Control rats received ointment without A. vera. A higher content of GAGs and a lower amount of dermatan sulfate were detected in the A. vera-treated group on the 14th day compared with the control. Also at 14 days post-surgery, a lower dichroic ratio in toluidine blue-stained sections was observed in A. vera-treated tendons compared with the control. No differences were observed in the chondroitin-6-sulfate and TGF-β1 levels between the groups, and a higher amount of non-collagenous proteins was detected in the A. vera-treated group on the 21st day compared with the control group. No differences were observed in the numbers of fibroblasts, inflammatory cells and blood vessels between the groups. The application of A. vera during tendon healing modified the arrangement of GAGs and increased the content of GAGs and non-collagenous proteins.

Relevance:

20.00%

Publisher:

Abstract:

We describe herein a general method for the controlled Heck arylation of allylated malonates. Both electron-rich and electron-poor aryldiazonium salts were readily employed as the aryl-transfer agents, giving good yields and high chemo-, regio-, and stereoselectivity without formation of decarboxylated byproducts. Reaction monitoring via ESI-MS was used to support the formation of chelated Pd species throughout the catalytic cycle. Additionally, some Heck adducts were successfully used in the total synthesis of pharmacologically active γ-lactones.

Relevance:

20.00%

Publisher:

Abstract:

Streptococcus mutans is specifically suppressed by intensive treatment with chlorhexidine gel, but the time to recolonization and the effect on other oral bacteria are not totally clear. In this study, recolonization of mutans streptococci was evaluated in nine healthy adult volunteers who were highly colonized with this microorganism. Stimulated saliva was collected before (baseline) and at 1, 7, 14, 21 and 28 days after application of 1% chlorhexidine gel to the volunteers' teeth for two consecutive days. On each day, the gel was applied using disposable trays for 3 × 5 min with intervals of 5 min between applications. Saliva was plated on blood agar to determine total microorganisms (TM), on mitis salivarius agar to determine total streptococci (TS), and on mitis salivarius agar plus bacitracin to determine mutans streptococci (MS). Chlorhexidine was capable of reducing the counts of MS and the proportion of MS relative to total microorganisms (%MS/TM) (p<0.05), but these values no longer differed statistically from baseline (p>0.05) after 14 days for MS and 21 days for %MS/TM. The counts of TM and TS and the proportion of MS to total streptococci did not differ statistically from baseline (p>0.05) after chlorhexidine treatment. The results suggest that the effect of chlorhexidine gel treatment on the suppression of mutans streptococci is limited to less than a month in highly colonized individuals.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to estimate barite mortar attenuation curves using X-ray spectra weighted by a workload distribution. A semi-empirical model was used to evaluate the transmission properties of this material. Since the ambient dose equivalent, H*(10), is the radiation quantity adopted by the IAEA for dose assessment, the variation of H*(10) as a function of barite mortar thickness was calculated using primary experimental spectra. A CdTe detector was used for the measurement of these spectra. The resulting spectra were adopted for estimating the optimal thickness of the protective barrier needed to shield an area in an X-ray imaging facility.
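The abstract does not give the model's equations; one common way to express a workload-weighted transmission factor for a barrier of thickness x, neglecting scatter and build-up, is sketched below. This is an illustrative formulation, not necessarily the paper's semi-empirical model.

\[
T(x) \;=\; \frac{\sum_i w_i \int \Phi_i(E)\, e^{-\mu(E)\,x}\, h^{*}(E)\, \mathrm{d}E}
               {\sum_i w_i \int \Phi_i(E)\, h^{*}(E)\, \mathrm{d}E}
\]

Here w_i are the workload weights (e.g., per tube voltage), Φ_i(E) the measured primary spectra, μ(E) the linear attenuation coefficient of barite mortar, and h*(E) the fluence-to-H*(10) conversion coefficient; the required barrier thickness is the smallest x for which the transmitted H*(10) meets the design dose constraint.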

Relevance:

20.00%

Publisher:

Abstract:

To examine the methods used by clinical trials that assessed the effect of tactile/kinesthetic stimulation on weight gain in preterm infants and to highlight the similarities and differences among such studies. This review collected studies from two databases, PEDro and PubMed, in July 2014, in addition to bibliographies. Two researchers assessed the relevant titles independently and then chose, by consensus, which studies to read in full and include in this review. Clinical trials were included if they studied tactile stimulation or massage therapy, with or without kinesthetic stimulation, in preterm infants; assessed weight gain after the intervention; had a control group; and were written in English, Portuguese, or Spanish. A total of 520 titles were found and 108 were selected for full-text reading. Duplicate studies were excluded, resulting in 40 different studies, of which 31 met all the inclusion criteria. There were many differences in the application of tactile/kinesthetic stimulation techniques among the studies, which hindered accurate reproduction of the procedure. In addition, many studies did not describe the adverse events that occurred during stimulation, the course of action taken when such events occurred, or their effect on the outcome. These studies made a relevant contribution by indicating tactile/kinesthetic stimulation as a promising tool; nevertheless, there was no standard of application among them. Future studies should raise the level of methodological rigor and describe adverse events. This would allow other researchers to be more aware of the expected outcomes, and a standard technique could be established.

Relevance:

20.00%

Publisher:

Abstract:

Mapping of elements in biological tissue by laser-induced mass spectrometry is a fast-growing analytical methodology in the life sciences. This method provides a wealth of useful information on metal, nonmetal, metalloid and isotopic distributions at major, minor and trace concentration ranges, usually with a lateral resolution of 12–160 µm. Selected applications in medical research require an improved lateral resolution of laser-induced mass spectrometric techniques, at the low micrometre scale and below. The present work demonstrates the applicability of a recently developed analytical methodology, laser microdissection coupled to inductively coupled plasma mass spectrometry (LMD ICP-MS), for obtaining elemental images of different solid biological samples at high lateral resolution. LMD ICP-MS images of mouse brain tissue samples, both stained with uranium and native (unstained), are shown, and a direct comparison of the LMD and laser ablation (LA) ICP-MS imaging methodologies, in terms of elemental quantification, is performed.

Relevance:

20.00%

Publisher:

Abstract:

Efforts made by the scientific community in recent years towards the development of numerous green chemical processes and wastewater treatment technologies are presented and discussed. In the light of these approaches, environmentally friendly technologies, as well as the key role played by the well-known advanced oxidation processes, are discussed, with special attention given to those involving ozone. Fundamentals and applied aspects of ozone technology and its applications are also presented.