883 results for crittografia, mixnet, EasyCrypt, game-based proofs, sequence of games, computation-aided proofs


Relevance: 100.00%

Abstract:

BACKGROUND AND AIMS: The genus Olea (Oleaceae) includes approx. 40 taxa of evergreen shrubs and trees classified in three subgenera, Olea, Paniculatae and Tetrapilus, the first of which has two sections (Olea and Ligustroides). Olive trees (the O. europaea complex) have been the subject of intensive research, whereas little is known about the phylogenetic relationships among the other species. To clarify the biogeographical history of this group, a molecular analysis of Olea and related genera of Oleaceae is thus necessary. METHODS: A phylogeny was built of Olea and related genera based on sequences of the nuclear ribosomal internal transcribed spacer-1 and four plastid regions. Lineage divergence and the evolution of abaxial peltate scales, the latter character linked to drought adaptation, were dated using a Bayesian method. KEY RESULTS: Olea is polyphyletic, with O. ambrensis and subgenus Tetrapilus not sharing a most recent common ancestor with the main Olea clade. Partial incongruence between nuclear and plastid phylogenetic reconstructions suggests a reticulation process in the evolution of subgenus Olea. Estimates of divergence times for major groups of Olea during the Tertiary were obtained. CONCLUSIONS: This study indicates the necessity of revising current taxonomic boundaries in Olea. The results also suggest that main lines of evolution were promoted by major Tertiary climatic shifts: (1) the split between subgenera Olea and Paniculatae appears to have taken place at the Miocene-Oligocene boundary; (2) the separation of sections Ligustroides and Olea may have occurred during the Early Miocene following the Mi-1 glaciation; and (3) the diversification within these sections (and the origin of dense abaxial indumentum in section Olea) was concomitant with the aridification of Africa in the Late Miocene.

Relevance: 100.00%

Abstract:

US Geological Survey (USGS) elevation data are the most commonly used data source for highway hydraulic analysis; however, given their vertical accuracy, USGS data may be too "coarse" to adequately describe the surface profiles of watershed areas or drainage patterns. Additionally, hydraulic design requires delineation of much smaller drainage areas (watersheds) than other hydrologic applications, such as environmental, ecological, and water resource management. This research study investigated whether higher-resolution LIDAR-based surface models would provide better delineation of watersheds and drainage patterns than surface models created from standard USGS elevation data. Differences in runoff values were the metric used to compare the data sets. The two data sets were compared for a pilot study area along the Iowa 1 corridor between Iowa City and Mount Vernon. Given the limited breadth of the analysis corridor, areas of particular emphasis were the location of drainage area boundaries and flow patterns parallel to and intersecting the road cross section. Traditional highway hydrology does not appear to be significantly impacted, or benefited, by the increased terrain detail that LIDAR provided for the study area. In fact, hydrologic outputs, such as streams and watersheds, may be too sensitive to the increased horizontal resolution and/or errors in the data set. However, a true comparison of LIDAR and USGS-based data sets of equal size and encompassing entire drainage areas could not be performed in this study. Differences may also arise in areas with much steeper slopes or significant changes in terrain. LIDAR may provide valuable detail in areas of modified terrain, such as roads. Better representation of channel and terrain detail in the vicinity of the roadway may be useful in modeling problem drainage areas and evaluating structural surety during and after significant storm events.
Furthermore, LIDAR may be used to verify the intended/expected drainage patterns at newly constructed highways. LIDAR will likely provide the greatest benefit for highway projects in flood plains and areas with relatively flat terrain, where slight changes in terrain may have a significant impact on drainage patterns.
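Watershed and stream delineation from an elevation grid, whether USGS- or LIDAR-derived, typically starts from a flow-routing step such as the classic D8 scheme: each cell drains to its steepest-descent neighbour, and accumulated flow traces the drainage pattern. The sketch below is a generic textbook illustration of that step, not the study's actual GIS workflow, and all function names are our own; a finer (e.g. LIDAR) grid simply feeds the same algorithm more terrain detail, which is exactly the sensitivity the study discusses.

```python
# Generic D8 flow routing on a small DEM (illustrative sketch only).
import numpy as np

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_downstream(dem):
    """For each cell, coordinates of the steepest downslope neighbour (or None)."""
    h, w = dem.shape
    down = {}
    for r in range(h):
        for c in range(w):
            best, best_drop = None, 0.0
            for dr, dc in NEIGHBOURS:
                rr, cc = r + dr, c + dc
                if 0 <= rr < h and 0 <= cc < w:
                    dist = (dr * dr + dc * dc) ** 0.5
                    drop = (dem[r, c] - dem[rr, cc]) / dist
                    if drop > best_drop:
                        best, best_drop = (rr, cc), drop
            down[(r, c)] = best  # None marks a pit or the outlet
    return down

def flow_accumulation(dem):
    """Number of cells (itself included) draining through each cell."""
    down = d8_downstream(dem)
    h, w = dem.shape
    acc = np.ones((h, w), dtype=int)
    # process cells from highest to lowest so upstream counts are final
    for r, c in sorted(np.ndindex(h, w), key=lambda rc: -dem[rc]):
        if down[(r, c)] is not None:
            acc[down[(r, c)]] += acc[r, c]
    return acc

# A tilted 3x3 plane: everything drains toward the lowest corner
dem = np.array([[3.0, 2.9, 2.8],
                [2.0, 1.9, 1.8],
                [1.0, 0.9, 0.8]])
acc = flow_accumulation(dem)
```

Cells with high accumulation values form the stream network, and the set of cells draining through a given outlet is its watershed; small elevation errors can reroute a cell's steepest-descent choice, which is how higher-resolution noise perturbs delineation.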

Relevance: 100.00%

Abstract:

ABSTRACT: The pollution of air, soil and water by heavy metals through anthropogenic activities has long been an object of numerous environmental studies. A number of natural processes, such as volcanic activity, hydrothermal fluid circulation and the weathering of metal-rich deposits, may lead to an additional and potentially important input and accumulation of heavy metals in the environment. In the Swiss and French Jura Mountains, anomalously high cadmium (Cd) concentrations (up to 16 ppm) in certain soils are related to the presence of underlying Cd-enriched (up to 21 ppm) carbonate rocks of Middle to Late Jurassic age. The aim of this study is to understand the processes controlling Cd incorporation into carbonate rocks of Middle and Late Jurassic age and to reconstruct the sedimentary and environmental conditions that led to Cd enrichment in these sedimentary rocks. Cd concentrations in the studied hemipelagic sections in France vary between 0.1 and 0.5 ppm (mean 0.15 ppm). Trace-element behavior and high Mn concentrations suggest that sediment accumulation occurred in a well-oxygenated environment. Increases in Cd contents in the bulk-rock carbonate sediments may be related to increases in surface-water productivity under oxic conditions and to substantial remineralization of organic matter within the water column. In the platform settings preserved in the Swiss Jura Mountains, no correlation is observed between Cd contents and the evolution of environmental conditions. Cd concentrations in these platform sections are often below the detection limit, with isolated peaks of up to 21 ppm. These important Cd enrichments are associated with peaks in Zn concentrations and are present in carbonate rocks independently of facies and age. The high Cd contents in these shallow-water carbonate rocks are partly related to the presence of disseminated, Cd-rich (up to 1.8%) sphalerite (ZnS) mineralization.
The basement rocks are considered to be the source of metals for sulfide mineralization in the overlying Jurassic strata, as the sphalerite Pb isotope pattern is comparable to that of granitic rocks from the nearby southern Black Forest crystalline massif. The Rb-Sr ages of sphalerite samples indicate that a main phase of sphalerite formation occurred near the boundary between the late Middle and early Late Jurassic, at around 162 Ma, as a result of enhanced tectonic and hydrothermal activity in Europe, related to the opening of the Central Atlantic and to tectonic/thermal subsidence during spreading of the Alpine Tethys. I therefore propose using unusually high Cd concentrations in carbonates as a tracer of tectonic activity in the Jura Mountains area, especially when important Zn enrichments co-occur. Paleoproductivity reconstructions based on records of authigenic Cd may be compromised not only by post-depositional redistribution, but also by the incorporation of additional Cd from hydrothermal solutions circulating in the rock. The circulation of metal-rich hydrothermal fluids through the sediment sequence, in addition to specific environmental conditions during sedimentation, contributed to the incorporation of Cd into the carbonate rocks. However, only hydrothermal activity has led to the unusually high concentrations of Cd in carbonate rocks of Bajocian-Oxfordian age, through the formation of sphalerite mineralization.

Relevance: 100.00%

Abstract:

BACKGROUND: Sunitinib (a VEGFR/PDGFR inhibitor) and everolimus (an mTOR inhibitor) are both approved for advanced renal cell carcinoma (RCC), as first-line and second-line therapy, respectively. In the clinic, sunitinib treatment is limited by the emergence of acquired resistance, leading to a switch to second-line treatment at progression, often based on everolimus. No data have yet been generated on programmed alternating sequential strategies that alternate sunitinib and everolimus before progression. Such a strategy is expected to delay the emergence of acquired resistance and improve tumour control. The aim of our study was to assess the changes in tumours induced by three different sequential administration schedules of sunitinib and everolimus. METHODS: In a human Caki-1 RCC xenograft model, sunitinib was alternated with everolimus every week, every 2 weeks, or every 3 weeks. Effects on necrosis, hypoxia, angiogenesis, and EMT status were assessed by immunohistochemistry and immunofluorescence. RESULTS: Programmed sequential regimens of sunitinib and everolimus before progression yielded a longer median time to tumour progression than sunitinib and everolimus monotherapies. In each treatment group, tumour growth control was associated with inhibition of the mTOR pathway and a shift from a mesenchymal towards an epithelial phenotype, with a decrease in vimentin and an increase in E-cadherin expression. The sequential combinations of these two agents in an RCC mouse clinical trial induced antiangiogenic effects, leading to tumour necrosis. CONCLUSIONS: In summary, our study showed that alternating sequences of sunitinib and everolimus mitigated the development of a mesenchymal phenotype compared with sunitinib as a single agent.

Relevance: 100.00%

Abstract:

The complete amino acid sequence of mature C8 beta has been derived from the DNA sequence of a cDNA clone identified by expression screening of a human liver cDNA library. Comparison with the amino acid sequence of C9 shows an overall homology with few deletions and insertions. In particular, the cysteine-rich domains and membrane-inserting regions of C9 are well conserved. These findings are discussed in relation to a possible mechanism of membrane attack complex formation.

Relevance: 100.00%

Abstract:

The purpose of this thesis is to present a new approach to the lossy compression of multispectral images. The proposed algorithm is based on a combination of quantization and clustering: clustering was investigated for compression of the spatial dimension, and vector quantization was applied for compression of the spectral dimension. The algorithm compresses multispectral images in two passes. During the first pass, the class etalons (prototypes) are defined; in other words, each uniform area located inside the image is given a class number. Any pixels not yet assigned to a cluster are processed during the second pass and assigned to the closest etalon. Finally, the compressed image is represented as a flat index image pointing into a codebook of etalons. Decompression is likewise immediate. The proposed method has been tested on satellite multispectral images from different sources. Numerical results and illustrative examples of the method are also presented.
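The index-image-plus-codebook representation described above can be illustrated with a minimal vector-quantization sketch. This is an illustrative reconstruction under our own naming, not the thesis's code; in particular, the codebook-building routine (a few k-means iterations seeded from distinct spectral vectors) is an assumption standing in for the thesis's etalon-definition stage.

```python
# Minimal codebook-based (vector-quantization) compression sketch for a
# multispectral image; names and the k-means initialisation are assumptions.
import numpy as np

def build_codebook(pixels, n_etalons, iters=10):
    """Define etalons (codebook vectors) with a few k-means iterations."""
    # seed from distinct spectral vectors found in the image
    uniq = np.unique(pixels, axis=0)
    codebook = uniq[np.linspace(0, len(uniq) - 1, n_etalons).astype(int)].astype(float)
    for _ in range(iters):
        # assign each spectral vector to its nearest etalon
        d = np.linalg.norm(pixels[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_etalons):
            if np.any(labels == k):
                codebook[k] = pixels[labels == k].mean(axis=0)
    return codebook

def compress(image, n_etalons=4):
    """Return (flat index image, codebook) for an (H, W, bands) array."""
    h, w, bands = image.shape
    pixels = image.reshape(-1, bands)
    codebook = build_codebook(pixels, n_etalons)
    d = np.linalg.norm(pixels[:, None, :] - codebook[None, :, :], axis=2)
    index_image = d.argmin(axis=1).reshape(h, w)
    return index_image, codebook

def decompress(index_image, codebook):
    """Decompression is a single lookup into the codebook."""
    return codebook[index_image]

# Tiny synthetic 3-band image with two uniform areas
img = np.zeros((4, 4, 3))
img[:, 2:] = [10.0, 20.0, 30.0]
idx, cb = compress(img, n_etalons=2)
recon = decompress(idx, cb)
```

The compressed form stores one small integer per pixel plus a codebook of a few spectral vectors, which is where the lossy savings come from; decompression is a single table lookup, matching the "instant" decompression the abstract claims.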

Relevance: 100.00%

Abstract:

Vertebral fracture assessments (VFAs) using dual-energy X-ray absorptiometry increase vertebral fracture detection in clinical practice and are highly reproducible. Measures of reproducibility are dependent on the frequency and distribution of the event. The aim of this study was to compare 2 reproducibility measures, reliability and agreement, in VFA readings in both a population-based and a clinical cohort. We measured agreement and reliability by uniform kappa and Cohen's kappa for vertebral reading and fracture identification: 360 VFAs from a population-based cohort and 85 from a clinical cohort. In the population-based cohort, 12% of vertebrae were unreadable. Vertebral fracture prevalence ranged from 3% to 4%. Inter-reader and intrareader reliability with Cohen's kappa was fair to good (0.35-0.71 and 0.36-0.74, respectively), with good inter-reader and intrareader agreement by uniform kappa (0.74-0.98 and 0.76-0.99, respectively). In the clinical cohort, 15% of vertebrae were unreadable, and vertebral fracture prevalence ranged from 7.6% to 8.1%. Inter-reader reliability was moderate to good (0.43-0.71), and the agreement was good (0.68-0.91). In clinical situations, the levels of reproducibility measured by the 2 kappa statistics are concordant, so that either could be used to measure agreement and reliability. However, if events are rare, as in a population-based cohort, we recommend evaluating reproducibility using the uniform kappa, as Cohen's kappa may be less accurate.
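The two statistics compared in the study can be sketched as follows: Cohen's kappa estimates chance agreement from each reader's marginal rating rates, while the uniform kappa (the Brennan-Prediger or free-marginal form) fixes chance agreement at 1/k for k categories. The function names and the toy data are our own; the example shows why the two statistics diverge when the event (a fracture) is rare.

```python
# Two chance-corrected agreement statistics (illustrative sketch).
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance agreement from the readers' marginal rates."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    po = np.mean(r1 == r2)  # observed agreement
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (po - pe) / (1 - pe)

def uniform_kappa(r1, r2, n_categories):
    """Uniform kappa: chance agreement fixed at 1/k, stable for rare events."""
    po = np.mean(np.asarray(r1) == np.asarray(r2))
    pe = 1.0 / n_categories
    return (po - pe) / (1 - pe)

# Rare-event toy data: 2 fractures among 20 vertebrae, one disagreement
reader1 = [0] * 18 + [1, 1]
reader2 = [0] * 18 + [1, 0]
ck = cohens_kappa(reader1, reader2)              # pulled down by rare events
uk = uniform_kappa(reader1, reader2, n_categories=2)
```

With 95% raw agreement, the uniform kappa stays high (0.90) while Cohen's kappa drops to about 0.64 because the skewed marginals inflate its chance-agreement term, mirroring the population-cohort pattern reported in the abstract.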

Relevance: 100.00%

Abstract:

In recent years, Business Model Canvas design has evolved from a paper-based activity to one that involves dedicated computer-aided business model design tools. We propose a set of guidelines to help design more coherent business models. When combined with the functionalities offered by CAD tools, these guidelines show great potential to improve business model design as an ongoing activity. However, before building more complex solutions, it is necessary to compare how basic business model design tasks are performed with a CAD system versus its paper-based counterpart. To this end, we carried out an experiment to measure user perceptions of both solutions. Performance was evaluated by applying our guidelines to both solutions and then comparing the resulting business model designs. Although CAD did not outperform paper-based design, the results are very encouraging for the future of computer-aided business model design.

Relevance: 100.00%

Abstract:

[eng] We propose two generalizations of the Banzhaf value for partition function form games. In both cases, our approach is based on probability distributions over the set of possible coalition structures that may arise for any given set of agents. First, we introduce a family of values, one for each collection of the latter probability distributions, defined as the Banzhaf value of an expected coalitional game. Then, we provide two characterization results for this new family of values within the framework of all partition function games. Both results rely on a property of neutrality with respect to amalgamation of players. Second, as this collusion transformation fails to be meaningful for simple games in partition function form, we propose another generalization of the Banzhaf value which also builds on probability distributions of the above type. This latter family is characterized by means of a neutrality property which uses an amalgamation transformation of players for which simple games are closed.
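For reference, the classical Banzhaf value that the abstract generalises assigns player i the average of their marginal contributions v(S ∪ {i}) − v(S) over all coalitions S not containing i. The sketch below covers only ordinary coalitional (characteristic-function) games; the paper's extension additionally averages over coalition structures via the probability distributions it introduces. Function names are our own.

```python
# Classical Banzhaf value of a TU game (illustrative sketch).
from itertools import combinations

def banzhaf_value(players, v):
    """Average marginal contribution of each player over all coalitions."""
    values = {}
    for i in players:
        others = [j for j in players if j != i]
        total = 0.0
        for r in range(len(others) + 1):
            for S in combinations(others, r):
                # marginal contribution of i to coalition S
                total += v(frozenset(S) | {i}) - v(frozenset(S))
        values[i] = total / 2 ** len(others)  # 2^(n-1) coalitions without i
    return values

# 3-player simple majority game: a coalition wins (value 1) iff it has >= 2 players
players = [1, 2, 3]
v = lambda S: 1.0 if len(S) >= 2 else 0.0
bz = banzhaf_value(players, v)
```

In the majority game each player is pivotal in exactly 2 of the 4 coalitions excluding them, so each receives a Banzhaf value of 0.5; note that, unlike the Shapley value, these values need not sum to v of the grand coalition.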

Relevance: 100.00%

Abstract:

This study presents an innovative pedagogical approach in which teachers become game designers and engage in creative teaching practices. Within co-design training workshops, 21 Spanish primary and secondary school teachers developed their own Game-Based Learning (GBL) scenarios, specially tailored to their teaching contexts and student profiles. In total, teachers developed 13 GBL scenarios and put them into practice in real teaching contexts. The present paper analyses the impact of this learner-centred game design approach on teachers' creativity from three different points of view: the GBL design process, the GBL scenario, and the teaching processes at stake.

Relevance: 100.00%

Abstract:

Interactions between the acoustic features of stimuli and experience-based internal models of the environment enable listeners to compensate for the disruptions in auditory streams that are regularly encountered in noisy environments. However, whether auditory gaps are filled in predictively or restored a posteriori remains unclear. The current lack of positive statistical evidence that internal models can actually shape brain activity as real sounds do precludes accepting predictive accounts of the filling-in phenomenon. We investigated the neurophysiological effects of internal models by testing whether single-trial electrophysiological responses to omitted sounds in a rule-based sequence of tones with varying pitch could be decoded from the responses to real sounds, and by analyzing the ERPs to the omissions with data-driven electrical neuroimaging methods. The decoding of the brain responses to different expected, but omitted, tones in both passive and active listening conditions was above chance based on the responses to the real sounds in active listening conditions. Topographic ERP analyses and electrical source estimations revealed that, in the absence of any stimulation, experience-based internal models elicit electrophysiological activity different from noise and that the temporal dynamics of this activity depend on attention. We further found that the expected change in pitch direction of omitted tones modulated the activity of left posterior temporal areas 140-200 msec after the onset of the omissions. Collectively, our results indicate that, even in the absence of any stimulation, internal models modulate brain activity as real sounds do, indicating that auditory filling-in can be accounted for by predictive activity.

Relevance: 100.00%

Abstract:

Despite moderate improvements in the outcome of glioblastoma after first-line treatment with chemoradiation, recent clinical trials have failed to improve the prognosis of recurrent glioblastoma. In the absence of a standard of care, we aimed to investigate institutional treatment strategies to identify similarities and differences in the pattern of care for recurrent glioblastoma. We investigated re-treatment criteria and therapeutic pathways for recurrent glioblastoma at eight neuro-oncology centres in Switzerland, each with an established multidisciplinary tumour-board conference. Decision algorithms, differences and consensus were analysed using the objective consensus methodology. A total of 16 different treatment recommendations were identified, based on combinations of eight different decision criteria. Both the set of criteria implemented and the set of treatments offered differed between centres. For specific situations, up to 6 different treatment recommendations were provided by the eight centres. The only wide-ranging consensus identified was to offer best supportive care to unfit patients. A majority recommendation was identified for fit patients with non-operable large early recurrence and unmethylated MGMT promoter status: here, bevacizumab was offered. In fit patients with late recurrent, non-operable, MGMT-promoter-methylated glioblastoma, temozolomide was recommended by most. No other majority recommendations were present. In the absence of strong evidence, we identified few consensus recommendations for the treatment of recurrent glioblastoma. This contrasts with the limited availability of single drugs and treatment modalities. Clinical situations of greatest heterogeneity may be most suitable to address in clinical trials, and second-opinion referrals are likely to yield diverging recommendations.

Relevance: 100.00%

Abstract:

BACKGROUND: The diagnosis of pulmonary embolism (PE) in the emergency department (ED) is crucial. As emergency physicians fear missing this potentially life-threatening condition, PE tends to be over-investigated, exposing patients to unnecessary risks for uncertain benefit in terms of outcome. The Pulmonary Embolism Rule-out Criteria (PERC) is an eight-item block of clinical criteria that can identify patients who can safely be discharged from the ED without further investigation for PE. The endorsement of this rule could markedly reduce the number of irradiative imaging studies, ED length of stay, and the rate of adverse events resulting from both diagnostic and therapeutic interventions. Several retrospective and prospective studies have shown the safety and benefits of the PERC rule for PE diagnosis in low-risk patients, but the validity of this rule is still controversial. We hypothesize that in European patients with a low gestalt clinical probability who are PERC-negative, PE can be safely ruled out and the patient discharged without further testing. METHODS/DESIGN: This is a controlled, cluster-randomized trial in 15 centers in France. Each center will be randomized for the sequence of intervention periods: a 6-month intervention period (PERC-based strategy) followed by a 6-month control period (usual care), or the reverse order, with 2 months of "wash-out" between the 2 periods. Adult patients presenting to the ED with a suspicion of PE and a low pre-test probability estimated by clinical gestalt will be eligible. The primary outcome is the failure rate of the diagnostic strategy, defined as diagnosed venous thromboembolic events at 3-month follow-up among patients for whom PE was initially ruled out. DISCUSSION: The PERC rule has the potential to decrease the number of irradiative imaging studies in the ED, and is reported to be safe. However, no randomized study has ever validated the safety of PERC.
Furthermore, some studies have challenged the safety of a PERC-based strategy to rule out PE, especially in Europe, where the prevalence of PE diagnosed in the ED is high. The PROPER study should provide high-quality evidence to settle this issue. If it confirms the safety of the PERC rule, physicians will be able to reduce the number of investigations, associated adverse events, costs, and ED length of stay for patients with a low clinical probability of PE. TRIAL REGISTRATION: NCT02375919.
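The eight PERC items are not listed in the abstract; the sketch below reproduces them from the published rule (Kline et al.), with function and parameter names that are our own. A patient is "PERC-negative" only if all eight criteria are met, and, per the trial hypothesis, the rule is applied only when the gestalt clinical probability is already low.

```python
# Sketch of the eight-item PERC rule (criteria from the published rule,
# Kline et al.; names and thresholds encoding are ours).
def perc_negative(age, heart_rate, sao2_percent, hemoptysis,
                  estrogen_use, prior_vte, unilateral_leg_swelling,
                  recent_surgery_or_trauma):
    """Return True if the patient meets all eight PERC criteria."""
    return (age < 50
            and heart_rate < 100
            and sao2_percent >= 95
            and not hemoptysis
            and not estrogen_use
            and not prior_vte
            and not unilateral_leg_swelling
            and not recent_surgery_or_trauma)

# Example: a 35-year-old with normal vitals and no risk factors is PERC-negative
ok = perc_negative(35, 80, 98, False, False, False, False, False)
# A 62-year-old with the same findings fails on the age criterion alone
not_ok = perc_negative(62, 80, 98, False, False, False, False, False)
```

Because the rule is a pure conjunction, a single positive item mandates further work-up; this all-or-nothing structure is what makes the failure rate of a PERC-based discharge strategy the natural primary outcome for the trial.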