897 results for end-to-end testing, javascript, application web, single-page application


Relevance:

100.00%

Publisher:

Abstract:

The recent emergence of intelligent agent technology and advances in information gathering have been important steps towards efficiently managing and using the vast amount of information now available on the Web to make informed decisions. Many problems, however, still need to be overcome in the information gathering research arena before the relevant information required by end users can be delivered. Good decisions cannot be made without sufficient, timely, and correct information. Traditionally it is said that knowledge is power; nowadays, however, sufficient, timely, and correct information is power. Gathering relevant information to meet user information needs is therefore the crucial step in making good decisions. The ideal goal of information gathering is to obtain only the information that users need (no more and no less). However, the volume of information available, the diverse formats of information, the uncertainty of information, and the distributed locations of information (e.g. the World Wide Web) hinder the process of gathering the right information to meet user needs. Specifically, two fundamental issues regarding the efficiency of information gathering are mismatch and overload: mismatch means that some information that meets user needs has not been gathered (it is missed), whereas overload means that some gathered information is not what users need.

Traditional information retrieval has developed well over the past twenty years, and the introduction of the Web has changed people's perception of it. The task of information retrieval is usually considered to be leading the user to those documents that are relevant to his or her information needs; the complementary function is to filter out the irrelevant documents (information filtering). Research into traditional information retrieval has provided many retrieval models and techniques to represent documents and queries. Nowadays, information is becoming highly distributed and increasingly difficult to gather, and user information needs contain many uncertainties. These observations motivate research into agent-based information gathering, and agent-based information systems have arisen in response. In these kinds of systems, intelligent agents take commitments from their users and act on the users' behalf to gather the required information; thanks to their intelligence, autonomy and distribution, they can retrieve relevant information from highly distributed, uncertain environments. Current research on agent-based information gathering systems is divided into single-agent and multi-agent gathering systems. In both areas there are still open problems to be solved before agent-based information gathering systems can retrieve uncertain information effectively from highly distributed environments.

The aim of this thesis is to develop a theoretical framework for intelligent agents that gather information from the Web. This research integrates the areas of information retrieval and intelligent agents. The specific research areas in this thesis are the development of an information filtering model for single-agent systems, and the development of a dynamic belief model for information fusion in multi-agent systems.
The research results are also supported by the construction of real information gathering agents (e.g. a Job Agent) for the Internet that help users gather useful information stored in Web sites. In this framework, information gathering agents are able to describe (or learn) the user's information needs and to act like users to retrieve, filter, and/or fuse information. A rough set based information filtering model is developed to address the problem of overload. The new approach allows users to describe their information needs in terms of user concept spaces rather than document spaces, and it views a user information need as a rough set over the document space. Rough set decision theory is used to classify new documents into three regions: a positive region, a boundary region, and a negative region. Two experiments are presented to verify this model, and they show that the rough set based model provides an efficient approach to the overload problem.

A dynamic belief model for information fusion in multi-agent environments is also developed. This model has polynomial time complexity, and it is proven that the fusion results are belief (mass) functions. Using this model, a collection fusion algorithm for information gathering agents is presented. The difficult problem here is the case where collections may be used by more than one agent; the algorithm uses cooperation between agents to provide a solution to this problem in distributed information retrieval systems. This thesis thus presents solutions to theoretical problems in agent-based information gathering systems, including information filtering models, agent belief modelling, and collection fusion. It also presents solutions to some of the technical problems in agent-based information systems, such as document classification, the architecture of agent-based information gathering systems, and decision making in multi-agent environments. Information gathering agents of this kind will gather relevant information from highly distributed, uncertain environments.
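
To make the three-region idea concrete, here is a minimal TypeScript sketch of rough-set-style (three-way decision) document filtering. The document shape, the scoring function and the thresholds alpha/beta are illustrative assumptions, not the thesis's actual model.

```typescript
// Minimal sketch of three-region (rough set / three-way decision) filtering:
// a document is accepted, rejected, or deferred according to how well it
// matches the user's concept space.

type Region = "positive" | "boundary" | "negative";

interface Doc {
  id: string;
  terms: Set<string>;
}

// Stand-in relevance score: fraction of the user's concept terms the
// document covers.
function relevance(doc: Doc, concepts: Set<string>): number {
  let hits = 0;
  for (const c of concepts) if (doc.terms.has(c)) hits++;
  return concepts.size === 0 ? 0 : hits / concepts.size;
}

// Three-way decision: scores >= alpha fall in the positive region (deliver),
// scores <= beta in the negative region (discard), anything in between lands
// in the boundary region (defer until more evidence arrives).
function classify(doc: Doc, concepts: Set<string>,
                  alpha = 0.7, beta = 0.3): Region {
  const p = relevance(doc, concepts);
  if (p >= alpha) return "positive";
  if (p <= beta) return "negative";
  return "boundary";
}

// Usage: filter incoming documents against one user's concept space.
const concepts = new Set(["agent", "retrieval", "filtering"]);
const docs: Doc[] = [
  { id: "d1", terms: new Set(["agent", "retrieval", "filtering", "web"]) },
  { id: "d2", terms: new Set(["agent", "sports"]) },
];
for (const d of docs) console.log(d.id, classify(d, concepts));
```

For the multi-agent fusion side, a companion sketch of Dempster's rule of combination over a two-element frame may help; the mass values are invented for illustration, and the thesis's dynamic belief model is a polynomial-time variant of this idea rather than the textbook rule shown here.

```typescript
// Dempster's rule of combination over the frame {R (relevant), I (irrelevant)};
// "R|I" denotes the whole frame (total ignorance).

type Mass = Map<string, number>; // focal element -> mass

function combine(m1: Mass, m2: Mass): Mass {
  const out: Mass = new Map();
  let conflict = 0;
  for (const [a, ma] of m1) {
    for (const [b, mb] of m2) {
      // Intersection on this small frame: "R|I" acts as the identity element.
      const meet = a === "R|I" ? b : b === "R|I" ? a : a === b ? a : "";
      if (meet === "") conflict += ma * mb; // empty intersection
      else out.set(meet, (out.get(meet) ?? 0) + ma * mb);
    }
  }
  // Dempster normalization (assumes the sources are not in total conflict).
  for (const [k, v] of out) out.set(k, v / (1 - conflict));
  return out;
}

// Two agents' mass functions over the relevance of one document:
const agent1: Mass = new Map([["R", 0.6], ["R|I", 0.4]]);
const agent2: Mass = new Map([["R", 0.5], ["I", 0.2], ["R|I", 0.3]]);
console.log(combine(agent1, agent2)); // fused result is again a mass function
```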

Relevance:

100.00%

Publisher:

Abstract:

Objective: The purpose of this study was to evaluate the sealing ability of castor oil polymer (COP), mineral trioxide aggregate (MTA) and glass ionomer cement (GIC) as root-end filling materials. Forty-five single-rooted human teeth were cleaned and prepared using a step-back technique. The apical third of each root was resected perpendicular to the long axis. All teeth were obturated with gutta-percha and an endodontic sealer. Afterwards, a root-end cavity 1.25 mm deep was prepared using a diamond bur. The specimens were randomly divided into three experimental groups (n = 15), according to the root-end filling material used: G1) COP; G2) MTA; G3) GIC. The external surfaces of the specimens were covered with epoxy adhesive, except for the root-end filling. The teeth were immersed in rhodamine B dye for 24 hours. Then, the roots were sectioned longitudinally and the linear dye penetration at the dentin/material interface was determined using a stereomicroscope. ANOVA and Tukey's tests were used to compare the three groups. The G1 group (COP) presented less dye penetration, a statistically significant difference from the G2 (MTA) and G3 (GIC) groups (p < 0.05). No statistically significant difference in microleakage was observed between the G2 and G3 groups (p > 0.05). The results of this study indicate that COP presented efficient sealing ability when used as a root-end filling material, with results significantly better than MTA and GIC.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To evaluate the bond strength of indirect restorations to dentin using self-adhesive cements with and without the application of adhesive systems.

Material and Methods: Seventy-two bovine incisors were used, in which the buccal surfaces were ground down to expose an area of dentin measuring at least 4 x 4 mm. The indirect resin composite Resilab was used to make 72 blocks, which were cemented onto the dentin surface of the teeth and divided into 4 groups (n = 18): group 1: self-adhesive resin cement BiFix SE, applied according to the manufacturer's recommendations; group 2: self-adhesive resin cement RelyX Unicem, used according to the manufacturer's recommendations; group 3: etch-and-rinse Solobond M adhesive system + BiFix SE; group 4: etch-and-rinse Single Bond 2 adhesive system + RelyX Unicem. The specimens were sectioned into sticks and subjected to microtensile testing in a universal testing machine (EMIC DL-200MF). Data were subjected to one-way ANOVA and Tukey's test (alpha = 5%).

Results: The mean values (± standard deviation) obtained for the groups were: group 1: 15.28 ± 8.17 (a); group 2: 14.60 ± 5.21 (a); group 3: 39.20 ± 9.98 (c); group 4: 27.59 ± 6.57 (b). Different letters indicate significant differences (ANOVA; p = 0.0000).

Conclusion: The application of adhesive systems before self-adhesive cements significantly increased the bond strength to dentin. RelyX Unicem combined with the Single Bond 2 adhesive system (group 4) showed a significantly lower mean tensile bond strength than BiFix SE combined with the etch-and-rinse Solobond M adhesive system (group 3).

Relevance:

100.00%

Publisher:

Abstract:

The reaction of living anionic polymers with 2,2,5,5-tetramethyl-1-(3-bromopropyl)-1-aza-2,5-disilacyclopentane (1) was investigated using coupled thin layer chromatography and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. The structures of the byproducts as well as the major product were determined. The anionic initiator bearing a protected primary amine functional group, 2,2,5,5-tetramethyl-1-(3-lithiopropyl)-1-aza-2,5-disilacyclopentane (2), was synthesized using all-glass high-vacuum techniques, which allows the long-term stability of this initiator to be maintained. The use of 2 in the preparation of well-defined aliphatic primary amine α-end-functionalized polystyrene and poly(methyl methacrylate) was investigated. Primary amine α-end-functionalized poly(methyl methacrylate) can be obtained near-quantitatively by reacting 2 with 1,1-diphenylethylene in tetrahydrofuran at room temperature prior to polymerizing methyl methacrylate at -78 °C. When 2 is used to initiate styrene at room temperature in benzene, an additive such as N,N,N',N'-tetramethylethylenediamine is necessary to activate the polymerization. However, although the resulting polymers have narrow molecular weight distributions and well-controlled molecular weights, our mass spectral data suggest that the yield of primary amine α-end-functionalized polystyrene from these syntheses is very low. The majority of the products are methyl α-end-functionalized polystyrene.

Relevance:

100.00%

Publisher:

Abstract:

The Ph chromosome is the most frequent cytogenetic aberration associated with adult ALL, and it represents the single most significant adverse prognostic marker. Although imatinib has led to significant improvements in the treatment of patients with Ph+ ALL, in the majority of cases resistance develops quickly and the disease progresses. Some mechanisms of resistance have been widely described, but full knowledge of the contributing factors driving both the disease and resistance remains to be established. The observation that lymphoblastic leukemia develops rapidly in mice expressing altered Ikaros (Ik) isoforms formed the background of this study. Ikaros is a zinc finger transcription factor required for normal hemopoietic differentiation and proliferation, particularly in the lymphoid lineages. By means of alternative splicing, Ikaros encodes several proteins that differ in their ability to bind to a consensus DNA-binding site. Shorter, non-DNA-binding isoforms exert a dominant negative effect, inhibiting the ability of longer heterodimer partners to bind DNA. The differential expression pattern of Ik isoforms in Ph+ ALL patients was analyzed in order to determine whether molecular abnormalities involving the Ik gene are associated with resistance to imatinib and dasatinib. Bone marrow and peripheral blood samples from 46 adult patients (median age 55 years, range 18-76) with Ph+ ALL were collected at diagnosis and during treatment with imatinib (16 patients) or dasatinib (30 patients). We set up a fast, high-throughput method based on capillary electrophoresis technology to detect and quantify splice variants. 41% of Ph+ ALL patients expressed high levels of the non-DNA-binding, dominant negative Ik6 isoform, which lacks the critical N-terminal zinc fingers and displays an abnormal subcellular compartmentalization pattern. Nuclear extracts from patients expressing Ik6 failed to bind DNA in a mobility shift assay using a DNA probe containing an Ikaros-specific DNA-binding sequence. In 59% of Ph+ ALL patients, many splice variants, corresponding to the Ik1, Ik2, Ik4, Ik4A, Ik5A, Ik6 and Ik8 isoforms, coexisted in the same PCR sample at the same time. In these patients, aberrant full-length Ikaros isoforms characterized by a 60-bp insertion immediately downstream of exon 3 and a recurring 30-bp in-frame deletion at the end of exon 7, most frequently involving the Ik2 and Ik4 isoforms, were also identified. Both the insertion and the deletion were due to the selection of alternative splice donor and acceptor sites. Molecular monitoring of minimal residual disease showed for the first time in vivo that Ik6 expression strongly correlates with BCR-ABL transcript levels, suggesting that this alteration may depend on Bcr-Abl activity. Patient-derived leukaemia cells expressed dominant-negative Ik6 at diagnosis and at the time of relapse, but never during remission. To demonstrate mechanistically whether overexpression of Ik6 impairs the response to tyrosine kinase inhibitors (TKIs) in vitro and contributes to resistance, an imatinib-sensitive, Ik6-negative Ph+ ALL cell line (SUP-B15) was transfected with the complete Ik6 DNA coding sequence. Expression of Ik6 strongly increased proliferation and inhibited apoptosis in TKI-sensitive cells, establishing a previously unknown link between specific molecular defects involving the Ikaros gene and resistance to TKIs in Ph+ ALL patients.

Amplification and genomic sequence analysis of the exon splice junction regions showed the presence of two single nucleotide polymorphisms (SNPs): rs10251980 [A/G] in the exon 2/3 splice junction and rs10262731 [A/G] in the exon 7/8 splice junction, in 50% and 36% of patients, respectively. A variant, rs11329346 [-/C], was also found in 16% of patients. Two other single nucleotide substitutions not recognized as SNPs were observed. Some mutations were predicted by computational analyses (RESCUE approach) to alter cis-splicing elements. In conclusion, these findings demonstrate that the post-transcriptional regulation of alternative splicing of the Ikaros gene is defective in the majority of Ph+ ALL patients treated with TKIs. The overexpression of Ik6, by blocking B-cell differentiation, could contribute to resistance, opening a time frame during which leukaemia cells acquire secondary transforming events that confer definitive resistance to imatinib and dasatinib.

Relevance:

100.00%

Publisher:

Abstract:

In-situ measurement of large hydrometeors by means of in-line holography. This dissertation describes the development and testing of an apparatus for the holographic in-situ measurement of large hydrometeors (HODAR). To this end, a three-dimensional snapshot of a measurement volume of about 500 dm³ of the free atmosphere is recorded by in-line holography. In this recording, the size and shape of individual hydrometeors, as well as their positions in the measurement volume, can be measured, from which size and spacing distributions of the hydrometeors can be determined. With the help of double exposures, their velocities can also be obtained. The thesis first introduces hydrometeors. The theoretical capabilities of an in-situ measurement apparatus are then developed from the properties of the holographic image, and the realized HODAR setup is explained. Calibration and testing, as well as measurements demonstrating the capabilities of the HODAR, are described.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this work is to find a methodology that makes it possible to recycle the fines (0-4 mm) in the Construction and Demolition Waste (CDW) process. At the moment this fraction is an unwanted by-product: it has a high contaminant content, and it has to be separated from the coarse fraction because its high water absorption can affect the properties of the concrete. In fact, in some countries the use of fine recycled aggregates is highly restricted or even banned. This work is part of the European project C2CA (from Concrete to Cement and Clean Aggregates) and was carried out at the Faculty of Civil Engineering and Geosciences of the Technical University of Delft, in particular in the Resources And Recycling laboratory. This research proposes procedures to close the loop of the entire recycling process. After classification by ADR (Advanced Dry Recovery), the two fractions "airknife" and "rotor" (which together constitute the 0-4 mm fraction) are fed into a new machine that works at high temperatures. The temperatures analysed in this research are 600 °C and 750 °C, because at these temperatures the cement bonds are expected to become very weak. The final goal is to "clean" the coarse fraction (0.250-4 mm) of the cement still attached to the sand and to concentrate the cement paste in the 0-0.250 mm fraction. This new set-up is able to dry the material in a few seconds, divide it into two fractions (coarse and fine) by means of air, and increase the amount of fines (0-0.250 mm) by promoting attrition between the particles through a vibration device. The coarse fraction is then processed in a ball mill to improve the result and reach the final goal. Thanks to the high temperature, the milling time can be markedly reduced. The 0-2 mm sand, after being heated and milled, was used to replace 100% of the norm sand in mortar production. The results are very promising: the mortar made with recycled sand reaches an early strength, with an increment with respect to the mortar made with norm sand of 20% after three days and 7% after seven days. This research demonstrates that once the temperature is increased it is possible to obtain a clean coarse fraction (0.250-4 mm), free from the cement paste, which is concentrated in the fine 0-0.250 mm fraction. The milling time and the drying time can be greatly reduced, and the recycled sand shows better mechanical performance than the natural one.

Relevance:

100.00%

Publisher:

Abstract:

Dentinal cracks are occasionally observed at the cut root face after root-end resection in apical surgery. The objective of this ex vivo study was to evaluate and compare the efficiency of visual aids to identify root-end dentinal cracks.

Relevance:

100.00%

Publisher:

Abstract:

The importance of E2F transcription factors in the processes of proliferation and apoptosis is well established. E2F1, but not other E2F family members, is also phosphorylated and stabilized in response to various forms of DNA damage to regulate the expression of cell cycle and pro-apoptotic genes. E2F1 also relocalizes and forms foci at sites of DNA double-strand breaks, but the function of E2F1 at sites of damage is still unknown. Here I reveal that E2F1 deficiency leads to increased spontaneous DNA breaks and impaired recovery following exposure to ionizing radiation. In response to DNA double-strand breaks, NBS1 phosphorylation and foci formation are defective in cells lacking E2F1, but NBS1 expression levels are unaffected. Moreover, the association between NBS1 and E2F1 increases in response to DNA damage, suggesting that E2F1 may promote NBS1 foci formation through a direct or indirect interaction at sites of DNA breaks. E2F1-deficient cells also display impaired foci formation of RPA and Rad51, which suggests a defect in DNA end resection and in the formation of single-stranded DNA at DNA double-strand breaks. I also found that E2F1 status affects foci formation of the histone acetyltransferase GCN5 in response to DNA double-strand breaks. E2F1 is phosphorylated at serine 31 (serine 29 in mouse) by the ATM kinase as part of the DNA damage response. To investigate the importance of this event, our lab developed an E2F1 serine 29 mutant mouse model. I find that E2F1 serine 29 mutant cells show loss of E2F1 foci formation in response to DNA double-strand breaks. Furthermore, DNA repair and NBS1 foci formation are impaired in E2f1^(S29A/S29A) cells. Taken together, my results indicate novel roles for E2F1 in the DNA damage response, which may directly promote DNA repair and genome maintenance.

Relevance:

100.00%

Publisher:

Abstract:

Detrital K-feldspars and muscovites from Ocean Drilling Program Leg 116 cores with depositional ages from 0 to 18 Ma have been dated by the 40Ar/39Ar technique. Four to thirteen individual K-feldspars have been dated from each of seven stratigraphic levels, each of which shows a very large range of ages, up to 1660 Ma. At each level investigated, at least one K-feldspar yielded a minimum age which is, within uncertainty, identical to the age of deposition. One to twelve single muscovite crystals from each of six levels have also been studied. The range of muscovite ages is smaller than that of the K-feldspars and, with one exception, reveals only a 20-Ma spread in ages. As with the K-feldspars, each level investigated contains muscovites with mineral ages essentially identical to depositional ages. These results indicate that a significant portion of the material in the Bengal Fan is first-cycle detritus derived from the Himalayas. Therefore, the significant proportion of sediment deposited in the distal fan in the early to mid Miocene can be ascribed to a significant pulse of uplift and erosion in the collision zone. Moreover, these data indicate that during the entire Neogene some portion of the Himalayan orogen was experiencing rapid erosion (≤ uplift). The lack of granulite facies rocks in the eastern Himalayas and Tibetan Plateau suggests that very rapid uplift must have been distributed in brief pulses at different places in the mountain belt. We suggest that the great majority of the crystals with young apparent ages were derived from the southern slope of the Himalayas, predominantly from near the main central thrust zone. These data provide further evidence against tectonic models in which the Himalayas and the Tibetan Plateau are uplifted either uniformly during the past 40 m.y. or mostly within the last 2 to 5 m.y.

Relevance:

100.00%

Publisher:

Abstract:

Pteropods are an important component of the zooplankton community and hence of the food web in the Fram Strait. They have a calcareous (aragonite) shell and are thus particularly sensitive to the effects of the increasing CO2 concentration in the atmosphere and the associated changes of pH and temperature in the ocean. In the eastern Fram Strait, two species of thecosome pteropods occur, the cold-water-adapted Limacina helicina and the subarctic boreal species Limacina retroversa. Both species were regularly observed in year-round moored sediment traps at ~ 200-300 m depth in the deep-sea long-term observatory HAUSGARTEN (79°N, 4°E). The flux of all pteropods found in the trap samples varied from < 20 to ~ 870 specimens/m²/d in the years 2000-2009, being lower during the period 2000-2006. At the beginning of the time series, pteropods were dominated by the cold-water-adapted L. helicina, whereas the subarctic boreal L. retroversa was only occasionally found in large quantities (> 50/m²/d). This picture completely changed after 2005/6, when L. retroversa became dominant and total pteropod numbers in the trap samples increased significantly. Concomitant with this shift in species composition, a warming event occurred in 2005/6 and persisted until the end of the study in 2009, despite a slight cooling in the upper water layer after 2007/8. Sedimentation of pteropods showed a strong seasonality, with elevated fluxes of L. helicina from August to November. Numbers of L. retroversa usually increased later, during September/October, with a maximum at the end of the season during December/January. In terms of carbonate export, aragonite shells of pteropods contributed 11-77% of the annual total CaCO3 flux in the Fram Strait. The highest share was found in the period 2007 to 2009, predominantly during sedimentation events at the end of the year. Results obtained from sediment traps occasionally installed on a benthic lander revealed that pteropods also arrive at the seafloor (~ 2550 m) almost simultaneously with their occurrence in the shallower traps. This indicates a rapid downward transport of calcareous shells, which provides food particles for the deep-sea benthos during winter, when other production in the upper water column is shut down. The results of our study highlight the great importance of pteropods for the biological carbon pump as well as for the carbonate system in the Fram Strait at present, and indicate modifications within the zooplankton community. The results further emphasize the importance of long-term investigations to disclose such changes.

Relevance:

100.00%

Publisher:

Abstract:

Multi-user videoconferencing systems offer communication between more than two users, who are able to interact through their webcams, microphones and other components. The use of these systems has increased recently due, on the one hand, to improvements in Internet access for companies, universities and homes, whose available bandwidth has increased while the delay in sending and receiving packets has decreased. On the other hand, the advent of Rich Internet Applications (RIA) means that a large part of web application logic and control has started to be implemented in web browsers. This has allowed developers to create web applications with a level of complexity comparable to traditional desktop applications running on top of operating systems. More recently, the use of Cloud Computing systems has improved application scalability and reduced the cost of backend systems. This offers the possibility of implementing web services on the Internet without spending large amounts of money on deploying infrastructures and resources, both hardware and software. Nevertheless, there are not many initiatives that aim to implement videoconferencing systems taking advantage of Cloud systems. This dissertation proposes a set of techniques, interfaces and algorithms for the implementation of videoconferencing systems in public and private Cloud Computing infrastructures. The mechanisms proposed here are based on the implementation of a basic videoconferencing system that runs in the web browser without any prior installation requirements. To this end, the development of this thesis starts from an RIA application using current technologies that allow users to access their webcams and microphones from the browser, and to send the captured data through their Internet connections. Furthermore, interfaces have been implemented to allow end users to participate in videoconferencing rooms that are managed on different Cloud providers' servers. To do so, this dissertation builds on the results obtained with the previous techniques, and the backend resources were implemented in the Cloud. A traditional videoconferencing service that had been implemented in the department was modified to meet typical Cloud Computing infrastructure requirements. This allowed us to validate whether public Cloud Computing infrastructures are suitable for the traffic generated by this kind of system. This analysis focused on the network level and on the processing capacity and stability of the Cloud Computing systems. To strengthen this validation, several more general considerations were taken into account in order to cover additional cases, such as multimedia data processing in the Cloud, an area in which research activity has increased in recent years. The last stage of this dissertation is the design of a new methodology for implementing these kinds of applications in hybrid clouds, reducing the cost of videoconferencing systems. Finally, this dissertation opens up a discussion about the conclusions obtained throughout this study, resulting in useful information from the different stages of the implementation of videoconferencing systems in Cloud Computing systems.
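
The browser-side capture step described above can be illustrated with a short TypeScript sketch using the standard getUserMedia API. The dissertation's RIA-era implementation may have used different technology; the element id and media constraints here are assumptions, and sending the stream to a Cloud-hosted room (e.g. over WebRTC) is out of scope.

```typescript
// Sketch: access the user's webcam and microphone from the page, with no
// installation, and attach the stream to a <video> element for self-view.

async function startLocalCapture(): Promise<MediaStream> {
  // Ask the browser for audio and video; the user must grant permission.
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { width: { ideal: 640 }, height: { ideal: 480 } },
    audio: true,
  });
  // "local-preview" is a hypothetical <video> element in the page.
  const video = document.getElementById("local-preview") as HTMLVideoElement;
  video.srcObject = stream; // local self-view
  await video.play();
  return stream; // hand the tracks to the transport layer (e.g. RTCPeerConnection)
}

startLocalCapture().catch((err) => console.error("capture failed:", err));
```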

Relevance:

100.00%

Publisher:

Abstract:

This dissertation introduces an approach to generating tests for fail-safe behavior in web applications, and applies the approach to a commercial web application. We build models for both behavioral and mitigation requirements. We create mitigation tests from an existing functional black-box test suite by determining the failure types and points of failure in the test suite and weaving in the required mitigation, based on weaving rules, to generate a test suite that tests proper mitigation of failures. A genetic algorithm (GA) is used to determine the points of failure and the types of failure that need to be tested. Mitigation test paths are woven into the behavioral test at the point of failure, based on failure-specific weaving rules. A simulator was developed to evaluate the choice of parameters for the genetic algorithm. We show how to tune the fitness function and performed tuning experiments for the GA to determine what values to use for the exploration weight and the prospecting weight. We found that higher defect densities make prospecting and mining more successful, while lower mitigation-defect densities need more exploration. We compare the efficiency and effectiveness of the approach. First, the GA approach is compared to random selection; the results show that the GA performed better than random selection and that the approach remained robust as the search space grew. Second, we compare the GA against four coverage criteria. The results show that test requirements generated by the GA are more efficient than three of the four coverage criteria for large search spaces, and equally effective. For small search spaces, the genetic algorithm is less effective than three of the four coverage criteria; the fourth coverage criterion is too weak and unable to find all defects in almost all cases. We also present a large case study of a mortgage system at one of our industrial partners and show how we formalize the approach. We evaluate the use of a GA to create test requirements; the evaluation includes the choice of initial population, multiplicity of runs, and a discussion of the cost of evaluating fitness. Finally, we build a selective regression testing approach based on the types of changes (add, delete, or modify) that can occur in the behavioral model, the fault model, the mitigation models, the weaving rules, and the state-event matrix, and we provide a systematic method by showing the formalization steps for each type of change to the various models.
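
As a concrete illustration of the selection step, here is a minimal TypeScript sketch of a GA over (test, point-of-failure, failure-type) triples. The encoding, the fitness function and the exploration/prospecting weights are hypothetical stand-ins for the dissertation's tuned versions, not its actual implementation.

```typescript
// Each individual encodes one failure to inject: which test, at which step
// (point of failure), and which failure type.
interface Individual { test: number; step: number; failureType: number; }

const NUM_TESTS = 50, MAX_STEPS = 20, NUM_FAILURE_TYPES = 5;
const rand = (n: number) => Math.floor(Math.random() * n);

function randomIndividual(): Individual {
  return { test: rand(NUM_TESTS), step: rand(MAX_STEPS), failureType: rand(NUM_FAILURE_TYPES) };
}

// Hypothetical fitness: reward failure points not yet covered (exploration)
// and neighbourhoods where defects were already found (prospecting).
function fitness(ind: Individual, covered: Set<string>, defects: Set<string>,
                 wExplore = 1.0, wProspect = 2.0): number {
  const key = `${ind.test}:${ind.step}:${ind.failureType}`;
  const near = `${ind.test}:${ind.step}`;
  let f = 0;
  if (!covered.has(key)) f += wExplore;  // unexplored point of failure
  if (defects.has(near)) f += wProspect; // near a known defect
  return f;
}

// Simple one-point crossover and per-gene mutation over the three genes.
function crossover(a: Individual, b: Individual): Individual {
  return { test: a.test, step: b.step,
           failureType: Math.random() < 0.5 ? a.failureType : b.failureType };
}

function mutate(ind: Individual, rate = 0.1): Individual {
  return {
    test: Math.random() < rate ? rand(NUM_TESTS) : ind.test,
    step: Math.random() < rate ? rand(MAX_STEPS) : ind.step,
    failureType: Math.random() < rate ? rand(NUM_FAILURE_TYPES) : ind.failureType,
  };
}

// One generation: tournament selection, then crossover and mutation.
function evolve(pop: Individual[], covered: Set<string>, defects: Set<string>): Individual[] {
  const score = (i: Individual) => fitness(i, covered, defects);
  const pick = () => {
    const a = pop[rand(pop.length)], b = pop[rand(pop.length)];
    return score(a) >= score(b) ? a : b;
  };
  return pop.map(() => mutate(crossover(pick(), pick())));
}

// Usage: evolve a small population for a few generations.
let pop = Array.from({ length: 30 }, randomIndividual);
const covered = new Set<string>(), defects = new Set<string>(["3:7"]);
for (let g = 0; g < 10; g++) pop = evolve(pop, covered, defects);
```

In this shape, raising wProspect concentrates the search near known defects, which matches the reported observation that prospecting pays off when defect density is high, while lower defect densities call for more weight on exploration.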

Relevance:

100.00%

Publisher:

Abstract:

Since Vladimir Putin returned to the Kremlin as President in May 2012, the Russian system of power has become increasingly authoritarian, and has evolved towards a model of extremely personalised rule that derives its legitimacy from aggressive decisions in internal and foreign policy, escalates the use of force, and interferes increasingly assertively in the spheres of politics, history, ideology or even public morals. Putin’s power now rests on charismatic legitimacy to a much greater extent than it did during his first two presidential terms; currently the President is presented not only as an effective leader, but also as the sole guarantor of Russia’s stability and integrity. After 15 years of Putin’s rule, Russia’s economic model based on revenue from energy resources has exhausted its potential, and the country has no new model that could ensure continued growth for the economy. The Putinist system of power is starting to show symptoms of agony – it has been unable to generate new development projects, and has been compensating for its ongoing degradation by escalating repression and the use of force. However, this is not equivalent to its imminent collapse.

Relevance:

100.00%

Publisher:

Abstract:

In a communication to the Parliament and the Council entitled "Towards a modern, more European copyright framework", dated 9 December 2015 [1], the European Commission confirmed its intention to progressively remove the main obstacles to the functioning of the Digital Single Market for copyrighted works. The first step of this long-term plan, which was first announced in Juncker's Political Guidelines [2] and the Communication on "A Digital Single Market Strategy for Europe" [3], is a proposal for a regulation aimed at ensuring the so-called 'cross-border portability' of online services giving access to content such as music, games, films and sporting events [4]. In a nutshell, the proposed regulation seeks to enable consumers with legal access to such online content services in their country of residence to use the same services also when they are in another member state for a limited period of time. On the one hand, this legislative proposal has the full potential to resolve the (limited) issue of portability, which stems from the national dimension of copyright and the persisting territorial licensing and distribution of copyrighted content [5]. On the other hand, as this commentary shows, the ambiguity of certain important provisions in the proposed regulation might affect its scope and effectiveness and contribute to the erosion of the principle of copyright territoriality.