992 results for Cryptography, Data Encryption Standard, DES, Java, Logic, SAT solver, CNF, Logical Cryptanalysis


Abstract:

Most basaltic volcanoes are affected by recurrent lateral instabilities during their evolution. Numerous factors have been shown to be involved in the process of flank destabilization, occurring over long periods of time or through instantaneous failures. However, the role of these factors in the mechanical behaviour and stability of volcanic edifices is poorly constrained, as lateral failure usually results from the combined effects of several parameters. Our study focuses on the morphological and structural comparison of two end-member basaltic systems, La Réunion (Indian Ocean, France) and Stromboli (southern Tyrrhenian Sea, Italy). We show that despite major differences in their volumes and geodynamic settings, both systems present similarities: they are characterized by intense intrusive activity along well-developed rift zones and by recurrent flank collapse during their evolution. Among the factors of instability, the examples of La Réunion and Stromboli evidence the major contribution of intrusive complexes to volcano growth and destruction, as attested by field observations and the monitoring of these active volcanoes. Classical models consider the relationship between vertical intrusions of magma and flank movements along a preexisting sliding surface. A set of published and new field data from Piton des Neiges volcano (La Réunion) allowed us to recognize the role of subhorizontal intrusions in the process of flank instability and to characterize the geometry of both subvertical and subhorizontal intrusions within basaltic edifices. This study compares the results of numerical modelling of the displacements associated with high-angle and low-angle intrusions within basaltic volcanoes. We use a Mixed Boundary Element Method to investigate the mechanical response of an edifice to the injection of magmatic intrusions in different stress fields. Our results indicate that the anisotropy of the stress field favours slip along the intrusions due to co-intrusive shear stress, generating flank-scale displacements of the edifice, especially in the case of subhorizontal intrusions, which are capable of triggering large-scale flank collapses on basaltic volcanoes. Applications of our theoretical results to real cases of flank displacement on basaltic volcanoes (such as the 2007 eruptive crises at La Réunion and Stromboli) reveal that the earlier model of collapse related to subvertical intrusions is a likely mechanism for small, steeply sloping basaltic volcanoes like Stromboli. Furthermore, our field study combined with modelling results confirms the importance of shallow-dipping intrusions in the morpho-structural evolution of large, gently sloping basaltic volcanoes like Piton de la Fournaise, Etna and Kilauea, with particular regard to flank instability, which can cause catastrophic tsunamis.
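
Where the abstract links co-intrusive shear stress to slip along an intrusion, the underlying criterion can be illustrated with a simple Mohr-Coulomb check on a plane. This is only a minimal sketch of the slip condition, not the Mixed Boundary Element Method used in the study, and every number in it is hypothetical:

```java
/**
 * Minimal Mohr-Coulomb slip check on an intrusion plane.
 * Illustrative only: the study itself uses a Mixed Boundary
 * Element Method; all values below are hypothetical.
 */
public class IntrusionSlip {

    /** Normal stress on a plane whose normal is at angle beta (rad) to sigma1. */
    static double normalStress(double s1, double s3, double beta) {
        return 0.5 * (s1 + s3) + 0.5 * (s1 - s3) * Math.cos(2 * beta);
    }

    /** Shear stress on the same plane. */
    static double shearStress(double s1, double s3, double beta) {
        return 0.5 * (s1 - s3) * Math.sin(2 * beta);
    }

    public static void main(String[] args) {
        double s1 = 30e6;      // max principal stress, Pa (hypothetical)
        double s3 = 10e6;      // min principal stress, Pa: anisotropic field
        double magmaP = 8e6;   // magma overpressure lowers effective normal stress
        double mu = 0.6;       // friction coefficient on the intrusion plane
        double cohesion = 2e6; // Pa

        // Plane whose normal makes 45 deg with sigma1 (near-optimal for shear).
        double beta = Math.toRadians(45);
        double sn = normalStress(s1, s3, beta);
        double tau = shearStress(s1, s3, beta);
        double strength = cohesion + mu * (sn - magmaP); // Coulomb resistance

        System.out.printf("tau = %.2f MPa, strength = %.2f MPa -> slip: %b%n",
                tau / 1e6, strength / 1e6, tau > strength);
    }
}
```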

Abstract:

The aim of this work is to analyse the programming model proposed by Android. Attention is focused, in particular, on the mechanisms provided for handling asynchronous events generated by the system to notify changes in the operating context: from the way these events are intercepted, to how the application's behaviour can be modified in reaction to the newly acquired information. The elements of novelty introduced in the Android APIs are evaluated against the classic means available in standard Java programming, intended to solve a new category of problems arising from the context-aware nature of the applications. A more general analysis of the quality of the proposed model, in terms of extensibility and modularity of the code, is also carried out; to this end, the application SMS Backup+ is examined as a case study and possible extensions are proposed to verify their feasibility.
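
A minimal sketch of the event-interception mechanism discussed above, using the classic BroadcastReceiver API from the Android SDK (the dynamic-registration pattern shown predates the API 28 restrictions on implicit broadcasts; the receiver class and its usage are illustrative, not code from the thesis):

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.net.ConnectivityManager;

// A BroadcastReceiver intercepts an asynchronous system event
// (here, a connectivity change) so the app can react to the new context.
public class ConnectivityReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        // Called by the system on the main thread when the event fires.
        boolean noConnection = intent.getBooleanExtra(
                ConnectivityManager.EXTRA_NO_CONNECTIVITY, false);
        // React to the new context, e.g. pause network work while offline.
    }

    // Dynamic registration, typically done from an Activity or Service:
    public static ConnectivityReceiver register(Context context) {
        ConnectivityReceiver receiver = new ConnectivityReceiver();
        context.registerReceiver(receiver,
                new IntentFilter(ConnectivityManager.CONNECTIVITY_ACTION));
        return receiver; // caller must unregisterReceiver() when done
    }
}
```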

Abstract:

RESTful services have gained a lot of attention recently, even in the enterprise world, which is traditionally more web-service centric. Data-centric RESTful services, previously known mainly from web environments, have established themselves as a second paradigm complementing functional WSDL-based SOA. In the Internet of Things, and in particular when talking about sensor motes, the Constrained Application Protocol (CoAP) is currently the focus of both research and industry. In the enterprise world, a protocol called OData (Open Data Protocol) is becoming the future RESTful data access standard. To integrate sensor motes seamlessly into enterprise networks, an embedded OData implementation on top of CoAP is desirable, as it requires no intermediary gateway device. In this paper we introduce and evaluate an embedded OData implementation. We evaluate the OData protocol in terms of performance and energy consumption, considering different data encodings, and compare it to a pure CoAP implementation. We were able to demonstrate that the additional resources needed for an OData/JSON implementation are reasonable when aiming for enterprise interoperability, where OData is suggested to solve both the semantic and technical interoperability problems we have today when connecting systems.
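
To make the access pattern concrete, here is a minimal sketch of a client requesting an OData-style resource from a mote over CoAP. It assumes the Eclipse Californium CoAP library; the mote address and the /odata/Measurements resource path are hypothetical, and this is not the paper's embedded implementation:

```java
import org.eclipse.californium.core.CoapClient;
import org.eclipse.californium.core.CoapResponse;
import org.eclipse.californium.core.coap.MediaTypeRegistry;

// An OData-style entity set exposed by a sensor mote over CoAP,
// requested in the JSON encoding evaluated in the paper.
public class ODataOverCoap {
    public static void main(String[] args) {
        CoapClient client =
                new CoapClient("coap://[aaaa::1]:5683/odata/Measurements");
        // Ask for the JSON representation of the OData payload.
        CoapResponse response = client.get(MediaTypeRegistry.APPLICATION_JSON);
        if (response != null && response.isSuccess()) {
            System.out.println(response.getResponseText());
        }
        client.shutdown();
    }
}
```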

Abstract:

Results from a search for supersymmetry in events with four or more leptons, including electrons, muons and taus, are presented. The analysis uses a data sample corresponding to 20.3 fb⁻¹ of proton-proton collisions delivered by the Large Hadron Collider at √s = 8 TeV and recorded by the ATLAS detector. Signal regions are designed to target supersymmetric scenarios that can be either enriched in or depleted of events involving the production of a Z boson. No significant deviations from standard model predictions are observed in data, and the results are used to set upper limits on the event yields from processes beyond the standard model. Exclusion limits at the 95% confidence level on the masses of relevant supersymmetric particles are obtained. In R-parity-violating simplified models with decays of the lightest supersymmetric particle to electrons and muons, limits of 1350 and 750 GeV are placed on gluino and chargino masses, respectively. In R-parity-conserving simplified models with heavy neutralinos decaying to a massless lightest supersymmetric particle, heavy neutralino masses up to 620 GeV are excluded. Limits are also placed on other supersymmetric scenarios.

Abstract:

Importance In treatment-resistant schizophrenia, clozapine is considered the standard treatment. However, clozapine use has restrictions owing to its many adverse effects. Moreover, an increasing number of randomized clinical trials (RCTs) of other antipsychotics have been published. Objective To integrate all the randomized evidence from the available antipsychotics used for treatment-resistant schizophrenia by performing a network meta-analysis. Data Sources MEDLINE, EMBASE, Biosis, PsycINFO, PubMed, Cochrane Central Register of Controlled Trials, World Health Organization International Trial Registry, and clinicaltrials.gov were searched up to June 30, 2014. Study Selection At least 2 independent reviewers selected published and unpublished single- and double-blind RCTs in treatment-resistant schizophrenia (any study-defined criterion) that compared any antipsychotic (at any dose and in any form of administration) with another antipsychotic or placebo. Data Extraction and Synthesis At least 2 independent reviewers extracted all data into standard forms and assessed the quality of all included trials with the Cochrane Collaboration's risk-of-bias tool. Data were pooled using a random-effects model in a Bayesian setting. Main Outcomes and Measures The primary outcome was efficacy as measured by overall change in symptoms of schizophrenia. Secondary outcomes included change in positive and negative symptoms of schizophrenia, categorical response to treatment, dropouts for any reason and for inefficacy of treatment, and important adverse events. Results Forty blinded RCTs with 5172 unique participants (71.5% men; mean [SD] age, 38.8 [3.7] years) were included in the analysis. Few significant differences were found in all outcomes. In the primary outcome (reported as standardized mean difference; 95% credible interval), olanzapine was more effective than quetiapine (-0.29; -0.56 to -0.02), haloperidol (-0.29; -0.44 to -0.13), and sertindole (-0.46; -0.80 to -0.06); clozapine was more effective than haloperidol (-0.22; -0.38 to -0.07) and sertindole (-0.40; -0.74 to -0.04); and risperidone was more effective than sertindole (-0.32; -0.63 to -0.01). A pattern of superiority for olanzapine, clozapine, and risperidone was seen in other efficacy outcomes, but results were not consistent and effect sizes were usually small. In addition, relatively few RCTs were available for antipsychotics other than clozapine, haloperidol, olanzapine, and risperidone. The most surprising finding was that clozapine was not significantly better than most other drugs. Conclusions and Relevance Insufficient evidence exists on which antipsychotic is more efficacious for patients with treatment-resistant schizophrenia, and blinded RCTs, in contrast to unblinded randomized effectiveness studies, provide little evidence of the superiority of clozapine compared with other second-generation antipsychotics. Future clozapine studies with high doses and patients with extremely treatment-refractory schizophrenia might be most promising to change the current evidence.
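
The primary outcome above is reported as a standardized mean difference. As a point of reference, here is a minimal sketch of the per-trial effect size (Cohen's d) that feeds such an analysis; the full model is a Bayesian random-effects network meta-analysis, and the numbers below are hypothetical:

```java
// Standardized mean difference between two trial arms:
// d = (mean1 - mean2) / pooled standard deviation.
public class SmdExample {

    static double cohensD(double mean1, double sd1, int n1,
                          double mean2, double sd2, int n2) {
        double pooledVar = ((n1 - 1) * sd1 * sd1 + (n2 - 1) * sd2 * sd2)
                / (n1 + n2 - 2);
        return (mean1 - mean2) / Math.sqrt(pooledVar);
    }

    public static void main(String[] args) {
        // Hypothetical symptom-change scores (negative = more improvement).
        double d = cohensD(-22.0, 18.0, 120, -17.0, 19.0, 115);
        System.out.printf("SMD = %.2f%n", d); // negative favours arm 1
    }
}
```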

Abstract:

Mineralogic, petrographic, and geochemical analyses of sediments recovered from two Leg 166 Ocean Drilling Program cores on the western slope of Great Bahama Bank (308 m and 437 m water depth) are used to characterize early marine diagenesis of these shallow-water, periplatform carbonates. The most pronounced diagenetic products are well-lithified intervals found almost exclusively in glacial lowstand deposits and interpreted to have formed at or near the seafloor (i.e., hardgrounds). Hardground cements are composed of high-Mg calcite (~14 mol% MgCO3), and exhibit textures typically associated with seafloor cementation. Geochemically, hardgrounds are characterized by increased δ18O and Mg contents and decreased δ13C, Sr, and Na contents relative to their less lithified counterparts. Despite being deposited in shallow waters that are supersaturated with the common carbonate minerals, it is clear that these sediments are also undergoing shallow subsurface diagenesis. Calculation of saturation states shows that pore waters become undersaturated with aragonite within the upper 10 m at both sites. Dissolution, and likely recrystallization, of metastable carbonates is manifested by increases in interstitial water Sr and Sr/Ca profiles with depth. We infer that the reduction in mineral saturation states and subsequent dissolution are being driven by the oxidation of organic matter in this Fe-poor carbonate system. Precipitation of burial diagenetic phases is indicated by the down-core appearance of dolomite and corresponding decrease in interstitial water Mg, and the presence of low-Mg calcite cements observed in scanning electron microscope photomicrographs.
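
The saturation-state calculation mentioned above follows the standard definition Ω = [Ca²⁺][CO₃²⁻]/Ksp, with Ω < 1 indicating undersaturation. A minimal sketch with placeholder values (not the Leg 166 pore-water data):

```java
// Saturation state of a pore water with respect to aragonite:
// Omega < 1 means dissolution is thermodynamically favoured.
public class SaturationState {

    static double omega(double ca, double co3, double ksp) {
        return (ca * co3) / ksp; // ion activity product over solubility product
    }

    public static void main(String[] args) {
        double ca  = 1.03e-2;  // mol/kg, roughly seawater Ca2+
        double co3 = 8.0e-5;   // mol/kg, CO3 2- lowered by organic-matter oxidation
        double kspAragonite = 1.0e-6; // illustrative stoichiometric constant

        System.out.printf("Omega(aragonite) = %.2f%n",
                omega(ca, co3, kspAragonite)); // < 1: undersaturated
    }
}
```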

Abstract:

This thesis contrasts and develops methodologies to improve the estimation of the design and extreme floods used in assessing the hydrologic safety of dams.

First, it addresses the fitting of frequency laws of annual maximum peak flows and their extrapolation to high return periods. This question is highly relevant, since the adoption of increasingly demanding hydrologic safety standards for dams requires very high design return periods whose estimation carries great uncertainty; it is therefore important to incorporate all available techniques into the estimation of design flows in order to reduce that uncertainty. It is equally important to select the statistical model (distribution function and fitting procedure) so that it both describes the behaviour of the sample and predicts high-return-period quantiles robustly. Studies were carried out at the national scale to determine the regionalization scheme that gives the best results for the hydrologic characteristics of Spanish basins with respect to annual maximum flows, taking into account the number of available data. The methodology starts by identifying homogeneous regions, whose boundaries were determined from the physiographic and climatic characteristics of the basins and the variability of their statistics, and subsequently testing their homogeneity. A statistical model of annual maximum flows was then selected with the best behaviour in the different zones of peninsular Spain, both for describing the sample data and for extrapolating to the highest return periods. The selection process was based, among other things, on the synthetic generation of data series by Monte Carlo simulation and on the statistical analysis of the results obtained from fitting distribution functions to these series under different hypotheses.

Second, the thesis addresses the peak-volume relationship and the definition of design hydrographs based on it, which can be very important for dams with large reservoir volumes. Commonly applied hydrologic procedures, however, do not account for the statistical dependence between the two variables. A simple and robust procedure was developed to characterize this dependence, representing the joint distribution function of peak flow and volume through the marginal distribution of the peak flow and the conditional distribution of the volume given the peak; the latter is modelled by a log-normal distribution fitted with a regional procedure. Its practical application is proposed through a probabilistic procedure based on the stochastic generation of a large number of hydrographs. Applying this procedure to dam hydrologic safety requires a correct interpretation of the return period concept for bivariate hydrologic variables, and such an interpretation is proposed: the return period is understood as the inverse of the probability of exceeding a given reservoir level. When this return period is related to the hydrologic variables, the design hydrograph of the dam is no longer a single hydrograph but a family of hydrographs that produce the same maximum reservoir level, represented by a curve in the peak-volume plane. This family of design hydrographs depends on the dam itself, the peak-volume curves varying with, for example, the reservoir volume or the spillway length. The proposed procedure is illustrated by its application to two case studies.

Finally, the thesis addresses the estimation of seasonal floods, a fundamental issue for establishing reservoir operation rules that can also be relevant for the hydrologic safety assessment of existing dams. The estimation of these floods is complex and not fully resolved today, and commonly used procedures can present problems. Estimation based on the partial duration series, or peaks-over-threshold, method can be a valid alternative in those cases where the floods in the different seasons are generated by the same type of event. A study was carried out to verify whether the hypothesis of statistical homogeneity of flood peak data from different seasons of the year is appropriate in Spain. The seasonal periods most suitable for the analysis were also examined, a question of great relevance for guaranteeing correct results, and a simple procedure was developed to determine the data selection threshold so that the independence of the selected peaks is guaranteed, one of the main difficulties in the practical application of partial duration series. Practical application of seasonal frequency laws also requires a correct interpretation of the seasonal return period; a criterion is proposed to determine seasonal return periods consistently with the annual return period and with an adequate distribution of probability among the seasons. Lastly, a procedure for estimating seasonal flood peaks is presented and illustrated with a case study.
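
Two ideas above lend themselves to a short sketch: the return period as the inverse of the exceedance probability, so the T-year quantile of an annual-maximum law F is x_T = F⁻¹(1 − 1/T), and the Monte Carlo generation of synthetic series from a fitted law. A Gumbel distribution with hypothetical parameters stands in here for the regionally selected model:

```java
import java.util.Random;

// T-year design quantile and synthetic annual maxima from a Gumbel law.
public class DesignFlood {

    // Gumbel quantile: x = mu - beta * ln(-ln(p)), with p = 1 - 1/T.
    static double gumbelQuantile(double mu, double beta, double T) {
        double p = 1.0 - 1.0 / T;
        return mu - beta * Math.log(-Math.log(p));
    }

    // One synthetic annual maximum via inverse-transform sampling.
    static double gumbelSample(double mu, double beta, Random rng) {
        return mu - beta * Math.log(-Math.log(rng.nextDouble()));
    }

    public static void main(String[] args) {
        double mu = 350.0, beta = 120.0; // hypothetical location/scale (m3/s)

        System.out.printf("x(T=1000)  = %.0f m3/s%n", gumbelQuantile(mu, beta, 1000));
        System.out.printf("x(T=10000) = %.0f m3/s%n", gumbelQuantile(mu, beta, 10000));

        // First values of a synthetic series, as used to compare fitting hypotheses.
        Random rng = new Random(42);
        for (int year = 0; year < 5; year++) {
            System.out.printf("synthetic max: %.0f m3/s%n", gumbelSample(mu, beta, rng));
        }
    }
}
```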

Abstract:

The design of fault-tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in search of the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm which can fulfil many design goals simultaneously thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
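
The trade-off search described above rests on Pareto dominance among candidate hardened versions. Below is a minimal sketch of the dominance filter over three minimized objectives; this is only the selection criterion at the heart of NSGA-II, not the full genetic algorithm or the paper's tool, and the objective values are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;

// Keep the non-dominated (Pareto-optimal) hardened versions among
// candidates scored on objectives that are all minimized.
public class ParetoFilter {

    // a dominates b if it is no worse on all objectives and better on one.
    static boolean dominates(double[] a, double[] b) {
        boolean strictlyBetter = false;
        for (int i = 0; i < a.length; i++) {
            if (a[i] > b[i]) return false;
            if (a[i] < b[i]) strictlyBetter = true;
        }
        return strictlyBetter;
    }

    static List<double[]> nonDominated(List<double[]> candidates) {
        List<double[]> front = new ArrayList<>();
        for (double[] c : candidates) {
            boolean dominated = false;
            for (double[] other : candidates) {
                if (other != c && dominates(other, c)) { dominated = true; break; }
            }
            if (!dominated) front.add(c);
        }
        return front;
    }

    public static void main(String[] args) {
        // {fault coverage loss, code size overhead, runtime overhead}
        List<double[]> versions = List.of(
                new double[]{0.02, 0.60, 0.45},  // heavily hardened
                new double[]{0.10, 0.25, 0.20},  // selectively hardened
                new double[]{0.12, 0.30, 0.25}); // dominated by the previous one
        System.out.println(nonDominated(versions).size() + " Pareto-optimal versions");
    }
}
```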

Abstract:

Information about the comparative magnitude of the burden from various diseases and injuries is a critical input into building the evidence base for health policies and programmes. Such information should be based on a critical evaluation of all available epidemiological data using standard and comparable procedures across diseases and injuries, including information on the age at death and on the incidence, duration and severity of cases that do not die prematurely from the disease. A summary measure, disability-adjusted life years (DALYs), has been developed to simultaneously measure the amount of disease burden due to premature mortality and the amount due to the nonfatal consequences of disease.
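
The summary measure decomposes as DALY = YLL + YLD, where YLL counts years of life lost to premature mortality and YLD counts years lived with disability weighted by severity. A minimal sketch of the simple undiscounted form, with hypothetical cohort numbers:

```java
// DALYs = years of life lost (YLL) + years lived with disability (YLD),
// in the simple undiscounted, non-age-weighted form.
public class DalyExample {

    /** YLL = deaths x standard life expectancy at age of death. */
    static double yll(double deaths, double lifeExpectancyAtDeath) {
        return deaths * lifeExpectancyAtDeath;
    }

    /** YLD = incident cases x disability weight x average duration (years). */
    static double yld(double cases, double disabilityWeight, double duration) {
        return cases * disabilityWeight * duration;
    }

    public static void main(String[] args) {
        double fatal = yll(1_000, 35.0);         // 1,000 deaths, ~35 years remaining
        double nonFatal = yld(20_000, 0.2, 5.0); // 20,000 cases, weight 0.2, 5 years
        System.out.printf("DALYs = %.0f (YLL %.0f + YLD %.0f)%n",
                fatal + nonFatal, fatal, nonFatal);
    }
}
```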

Abstract:

Metabolic control is central to positive clinical outcomes in patients with diabetes, and empowerment has been linked to metabolic control in this clinical group. The current study sought to determine key psychometric properties of the Chinese version of the Diabetes Empowerment Scale (C-DES) and to explore the relationship of the C-DES sub-scales to metabolic control in 189 patients with a diagnosis of diabetes. Confirmatory factor analysis established that the five sub-scales of the C-DES offered a highly satisfactory fit to the data. Furthermore, the C-DES sub-scales were found to have generally acceptable internal consistency and divergent reliability. However, convergent reliability of the C-DES sub-scales could not be established against metabolic control. It is concluded that future research needs to address ambiguities in the relationship between empowerment and metabolic control in order to afford patients an evidence-based treatment package that assures optimal metabolic control.
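
Internal consistency of the kind reported above is conventionally assessed with Cronbach's alpha, α = k/(k−1) · (1 − Σσᵢ²/σ²_total). A minimal sketch with hypothetical item scores (not C-DES responses):

```java
// Cronbach's alpha from a respondents-by-items score matrix.
public class CronbachAlpha {

    static double variance(double[] x) {
        double mean = 0;
        for (double v : x) mean += v;
        mean /= x.length;
        double ss = 0;
        for (double v : x) ss += (v - mean) * (v - mean);
        return ss / (x.length - 1); // sample variance
    }

    /** rows = respondents, columns = scale items. */
    static double alpha(double[][] scores) {
        int n = scores.length, k = scores[0].length;
        double itemVarSum = 0;
        for (int j = 0; j < k; j++) {
            double[] item = new double[n];
            for (int i = 0; i < n; i++) item[i] = scores[i][j];
            itemVarSum += variance(item);
        }
        double[] totals = new double[n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < k; j++) totals[i] += scores[i][j];
        return (k / (k - 1.0)) * (1.0 - itemVarSum / variance(totals));
    }

    public static void main(String[] args) {
        double[][] scores = { // 4 respondents x 3 items (hypothetical)
                {4, 5, 4}, {3, 3, 4}, {5, 5, 5}, {2, 3, 2} };
        System.out.printf("alpha = %.2f%n", alpha(scores));
    }
}
```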

Abstract:

In order to survive in the increasingly customer-oriented marketplace, continuous quality improvement has become the mark of the fastest-growing quality organizations' success. In recent years, attention has focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within a specified process rather than to foster cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenge of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature that gives employees at all levels the ability to understand the relationships between processes, especially when any aspect of a process is about to degrade or fail. An example of generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
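
The mining step above builds on the standard association-rule measures, shown here in their crisp form (the paper generalizes them to fuzzy rules): a rule X → Y has support P(X and Y) and confidence P(Y | X). The process records below are hypothetical:

```java
import java.util.List;
import java.util.Set;

// Support and confidence of a process rule such as
// {temp_high} -> {defect} over a set of process records.
public class RuleMining {

    static double support(List<Set<String>> records, Set<String> items) {
        long hits = records.stream().filter(r -> r.containsAll(items)).count();
        return (double) hits / records.size();
    }

    public static void main(String[] args) {
        List<Set<String>> records = List.of(
                Set.of("temp_high", "speed_low", "defect"),
                Set.of("temp_high", "defect"),
                Set.of("temp_high", "speed_low"),
                Set.of("speed_low"));

        Set<String> x = Set.of("temp_high");
        Set<String> xy = Set.of("temp_high", "defect");

        double supp = support(records, xy);
        double conf = supp / support(records, x); // P(defect | temp_high)
        System.out.printf("support = %.2f, confidence = %.2f%n", supp, conf);
    }
}
```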

Abstract:

A new general linear model (GLM) beamformer method is described for processing magnetoencephalography (MEG) data. A standard nonlinear beamformer is used to determine the time course of neuronal activation for each point in a predefined source space. A Hilbert transform gives the envelope of oscillatory activity at each location in any chosen frequency band (not necessary in the case of sustained (DC) fields), enabling the general linear model to be applied and a volumetric T-statistic image to be determined. The new method is illustrated by a two-source simulation (sustained field and 20 Hz) and is shown to provide accurate localization. The method is also shown to accurately localize increasing and decreasing gamma activity to the temporal and frontal lobes, respectively, in the case of a scintillating scotoma. The new method brings the advantages of the general linear model to the analysis of MEG data and should prove useful for the localization of changing patterns of activity across all frequency ranges, including DC (sustained fields).
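
At each source-space location the method regresses the envelope time course on the experimental design and forms a T statistic. A minimal sketch of the single-regressor case with a boxcar design and a hypothetical envelope (the actual method applies this volumetrically after beamforming):

```java
// Per-location GLM statistic: regress the envelope y on a task
// regressor x and form t = beta / SE(beta), with n - 2 dof.
public class GlmTStat {

    static double tStatistic(double[] y, double[] x) {
        int n = y.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;

        double sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sxx += (x[i] - mx) * (x[i] - mx);
            sxy += (x[i] - mx) * (y[i] - my);
        }
        double beta = sxy / sxx;       // effect of the regressor
        double alpha = my - beta * mx; // intercept

        double rss = 0;                // residual sum of squares
        for (int i = 0; i < n; i++) {
            double r = y[i] - alpha - beta * x[i];
            rss += r * r;
        }
        double sigma2 = rss / (n - 2);         // residual variance
        return beta / Math.sqrt(sigma2 / sxx); // t statistic
    }

    public static void main(String[] args) {
        // Boxcar design (off/on) and a hypothetical envelope that rises on-task.
        double[] x = {0, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1};
        double[] y = {1.0, 1.2, 0.9, 1.1, 2.0, 2.2, 1.9, 2.1, 1.0, 1.1, 2.0, 2.3};
        System.out.printf("t = %.2f%n", tStatistic(y, x));
    }
}
```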

Abstract:

This thesis investigates the pricing-to-market (PTM) behaviour of the UK export sector. Unlike previous studies, which seasonally adjusted the data automatically, this study econometrically tests for seasonal unit roots in export prices prior to estimating PTM behaviour. The results show that monthly export prices contain very few seasonal unit roots, implying that there is a loss of information about the data generating process of the series when estimating PTM from seasonally adjusted data. Prior studies have also ignored the econometric properties of the data despite the existence of ARCH effects, the standard approach being to estimate PTM models using Ordinary Least Squares (OLS). For this reason, both EGARCH and GJR-EGARCH (hereafter GJR) estimation methods are used to estimate both a standard and an Error Correction Model (ECM) of PTM. The results indicate that PTM behaviour varies across UK sectors. The variables used in the PTM models are co-integrated, and an ECM is a valid representation of pricing behaviour. The study also finds that price adjustment is slower when the analysis is performed on real prices, i.e., data adjusted for inflation. There is strong evidence of autoregressive conditional heteroscedasticity (ARCH) effects, meaning that the PTM parameter estimates of prior studies were inefficiently estimated. Surprisingly, there is very little evidence of asymmetry, which suggests that exporters appear to price to market at a relatively constant rate. This finding might also explain the failure of prior studies to find evidence of asymmetric exposure to foreign exchange (FX) rates. The study also provides a cross-sectional analysis of the implications for PTM of producers' marginal cost, market share and product differentiation. The cross-sectional regressions are estimated using OLS, the Generalised Method of Moments (GMM) and Logit estimations. Overall, the results suggest that market share affects PTM positively. Exporters with smaller market shares are more likely to operate PTM. Conversely, product differentiation is negatively associated with PTM, so industries with highly differentiated products are less likely to adjust their prices. Marginal costs, however, seem not to be significantly associated with PTM. Exporters perform PTM to limit the pass-through of FX rate movements to their foreign customers, but they also avoid exploiting PTM to the full, since doing so can substantially reduce their profits.
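
A common way to quantify PTM is the pass-through regression of export-price changes on exchange-rate changes, Δp = α + βΔe, where a slope well below one indicates that exporters stabilize destination prices by absorbing FX movements in their margins. A minimal sketch using plain OLS (not the thesis's EGARCH/ECM estimators; the data are hypothetical):

```java
// Pass-through slope from log changes in export prices and FX rates.
public class PassThroughOls {

    /** OLS slope of y on x. */
    static double slope(double[] y, double[] x) {
        int n = y.length;
        double mx = 0, my = 0;
        for (int i = 0; i < n; i++) { mx += x[i]; my += y[i]; }
        mx /= n; my /= n;
        double sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            sxx += (x[i] - mx) * (x[i] - mx);
            sxy += (x[i] - mx) * (y[i] - my);
        }
        return sxy / sxx;
    }

    public static void main(String[] args) {
        // Hypothetical monthly log changes: exchange rate and export price.
        double[] dE = { 0.02, -0.01, 0.03, -0.02, 0.01, 0.04, -0.03, 0.02 };
        double[] dP = { 0.008, -0.004, 0.013, -0.009, 0.004, 0.017, -0.012, 0.009 };
        System.out.printf("pass-through slope = %.2f (below 1 -> PTM)%n",
                slope(dP, dE));
    }
}
```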