194 results for security, exploit, XSS, Beef, browser
Abstract:
Secure protocols for password-based user authentication are well-studied in the cryptographic literature but have failed to see widespread adoption on the Internet; most proposals to date require extensive modifications to the Transport Layer Security (TLS) protocol, making deployment challenging. Recently, a few modular designs have been proposed in which a cryptographically secure password-based mutual authentication protocol is run inside a confidential (but not necessarily authenticated) channel such as TLS; the password protocol is bound to the established channel to prevent active attacks. Such protocols are useful in practice for a variety of reasons: security no longer relies on users' ability to validate server certificates, and such designs can potentially be implemented with no modifications to the secure channel protocol library. We provide a systematic study of such authentication protocols. Building on recent advances in modelling TLS, we give a formal definition of the intended security goal, which we call password-authenticated and confidential channel establishment (PACCE). We show generically that combining a secure channel protocol, such as TLS, with a password authentication protocol, where the two protocols are bound together using either the transcript of the secure channel's handshake or the server's certificate, results in a secure PACCE protocol. Our prototype based on TLS is available as a cross-platform client-side Firefox browser extension and a server-side web application which can easily be installed on deployed web browsers and servers.
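The binding idea above can be sketched in a few lines: a key derived from the password MACs a channel-specific value (the handshake transcript hash or the server's certificate), so the resulting tag verifies only on the channel the client actually used. This is an illustrative sketch under invented names, not the paper's protocol; a real PACCE construction would run a cryptographic password-authentication protocol rather than a plain password hash.

```python
import hashlib
import hmac

def bind_password_auth(password: bytes, channel_binding: bytes) -> bytes:
    """Derive an authentication tag bound to the secure channel.

    channel_binding stands in for either the hash of the TLS handshake
    transcript or the server's certificate, the two bindings described
    in the abstract. (Illustrative names, not the paper's API.)
    """
    # Derive a key from the password; a real PACCE protocol would run a
    # secure password-based protocol here instead of a plain hash.
    pw_key = hashlib.sha256(password).digest()
    # MAC the channel-binding value so the tag is useless on any other channel.
    return hmac.new(pw_key, channel_binding, hashlib.sha256).digest()

# An active attacker who terminates TLS sees a different transcript,
# so the tag from the client's channel fails to verify on the attacker's.
tag_client = bind_password_auth(b"hunter2", b"transcript-hash-A")
tag_mitm = bind_password_auth(b"hunter2", b"transcript-hash-B")
assert tag_client != tag_mitm
```

The same value verifies on both ends only when both ends saw the same handshake, which is exactly the property that defeats a man-in-the-middle holding a fraudulent certificate.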
Abstract:
AUTHENTIC IN ALL CAPS is an award-winning web audio adventure for the iPad and Chrome browser. The app combines audio drama, audio tours, and online storytelling. You travel across the web with characters who face ridiculous obstacles to being themselves. It's about identity, mortality, and pizza toppings. It's an audio drama for people who live on the Internet.
Abstract:
Enterohaemorrhagic Escherichia coli (EHEC) are a subgroup of Shiga toxin-producing E. coli that cause gastrointestinal disease with the potential for life-threatening sequelae. Cattle serve as the natural reservoir for EHEC, and outbreaks occur sporadically as a result of contaminated beef and other farming products. While certain EHEC virulence mechanisms have been extensively studied, the factors that mediate host colonization are poorly defined. Previously, we identified four proteins (EhaA, EhaB, EhaC and EhaD) from the prototypic EHEC strain EDL933 that belong to the autotransporter (AT) family. Here we characterize the EhaB AT protein. EhaB was shown to be located at the cell surface, and overexpression in E. coli K-12 resulted in significant biofilm formation under continuous flow conditions. Overexpression of EhaB in E. coli K-12 and EDL933 backgrounds also promoted adhesion to the extracellular matrix proteins collagen I and laminin. An EhaB-specific antibody revealed that EhaB is expressed in E. coli EDL933 following in vitro growth. EhaB also cross-reacted with serum IgA from cattle challenged with E. coli O157:H7, indicating that EhaB is expressed in vivo and elicits a host IgA immune response.
Abstract:
In this paper new online adaptive hidden Markov model (HMM) state estimation schemes are developed, based on extended least squares (ELS) concepts and recursive prediction error (RPE) methods. The best of the new schemes exploits the idempotent nature of Markov chains and works with a least squares prediction error index, using a posteriori estimates, which is more suited to Markov models than the indices traditionally used in identification of linear systems.
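The recursive, a posteriori flavour of such schemes builds on the standard HMM filtering recursion, which can be sketched as follows. This is the basic forward filter only, shown for orientation; it is not the paper's ELS/RPE estimators, and the matrices below are toy values.

```python
def hmm_filter_step(prior, A, b_obs):
    """One recursive update of the HMM state posterior.

    prior : posterior over states after the previous observation
    A     : transition matrix, A[i][j] = P(next = j | current = i)
    b_obs : likelihood of the new observation under each state
    """
    n = len(prior)
    # Predict: propagate the posterior through the Markov chain.
    predicted = [sum(prior[i] * A[i][j] for i in range(n)) for j in range(n)]
    # Correct: weight by the observation likelihoods and renormalise,
    # giving the a posteriori state estimate the online schemes refine.
    unnorm = [predicted[j] * b_obs[j] for j in range(n)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Two sticky states; the new observation strongly favours state 1.
A = [[0.9, 0.1], [0.1, 0.9]]
posterior = hmm_filter_step([0.5, 0.5], A, [0.2, 0.8])
assert abs(sum(posterior) - 1.0) < 1e-12
assert posterior[1] > posterior[0]
```

Running this update once per observation gives the online posterior; adaptive schemes additionally re-estimate A from the prediction errors as data arrive.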
Abstract:
Purpose – The purpose of this paper is to examine empirically an industry development paradox, using embryonic literature in the area of strategic supply chain management together with innovation management literature. This study seeks to understand how forming strategic supply chain relationships and developing strategic supply chain capability influence the beneficial supply chain outcomes expected from utilizing industry-led innovation, in the form of electronic business solutions using the internet, in the Australian beef industry. Findings should add valuable insights for both academics and practitioners in the fields of supply chain innovation management and strategic supply chain management, and expand the current literature. Design/methodology/approach – This is a quantitative study comparing innovative and non-innovative supply chain operatives in the Australian beef industry, through factor analysis and structural equation modeling using PASW Statistics V18 and AMOS V18 to analyze survey data from 412 respondents from the Australian beef supply chain. Findings – Key findings are that both innovative and non-innovative supply chain operators regard supply chain synchronization as only a minor indicator of strategic supply chain capability, contrary to the literature; they also indicate that strategic supply chain capability has only a minor influence in achieving beneficial outcomes from utilizing industry-led innovation. These results suggest a lack of coordination between supply chain operatives in the industry. They also suggest a lack of understanding of the benefits of developing a strategic supply chain management competence, particularly in relation to innovation agendas, and provide valuable insights as to why an industry paradox exists in terms of the level of investment in industry-led innovation vs the level of corresponding benefit achieved.
Research limitations/implications – Results are not generalizable, owing to the single agribusiness industry studied and the single research method employed. However, this provides opportunity for further agribusiness studies in this area, and for studies using alternate methods, such as qualitative, in-depth analysis of these factors and their relationships, which may confirm these results or produce different ones. Further, this study empirically extends existing theoretical contributions and insights into the roles of strategic supply chain management and innovation management in improving supply chain, and ultimately industry, performance, while providing practical insights to supply chain practitioners in this and other similar agribusiness industries. Practical implications – These findings confirm results from a 2007 study (Ketchen et al., 2007) which suggests that supply chain practice and teaching need to take a strategic direction in the twenty-first century. To date, competence in supply chain management has been built up from functional and process orientations rather than from a strategic perspective. This study confirms that there is a need for more generalists who can integrate with various disciplines, particularly those who can understand and implement strategic supply chain management. Social implications – Possible social implications accrue through the development of responsible government policy in terms of industry supply chains. Strategic supply chain management and supply chain innovation management have impacts on the social fabric of nations through the sustainability of their industries, especially agribusiness industries which deal with food safety and security. If supply chains are now the competitive weapon of nations, then funding innovation and managing supply chain competitiveness in global markets requires a strategic approach from everyone, not just the industry participants.
Originality/value – This is original empirical research, seeking to add value to the embryonic but important developing literature concerned with adopting a strategic approach to supply chain management. It also seeks to add to the existing literature in the area of innovation management, particularly through greater understanding of the implications of nations developing industry-wide, industry-led innovation agendas, and their ramifications for industry supply chains.
Abstract:
The prospect of economically producing useful biologics in plants has greatly increased with the advent of viral vectors. The ability of viral vectors to amplify transgene expression has seen them develop into robust transient platforms for the high-level, rapid production of recombinant proteins. To adapt these systems to stably transformed plants, new ways of deconstructing the virus machinery and linking its expression and replication to chemically controlled promoters have been developed. The more advanced of these stable, inducible hyper-expression vectors provide both activated and amplified heterologous transgene expression. Such systems could be deployed in broad acre crops and provide a pathway to fully exploit the advantages of plants as a platform for the manufacture of a wide spectrum of products.
Abstract:
High-Order Co-Clustering (HOCC) methods have attracted considerable attention in recent years because of their ability to cluster multiple types of objects simultaneously using all available information. During the clustering process, HOCC methods exploit object co-occurrence information, i.e., inter-type relationships amongst different types of objects, as well as object affinity information, i.e., intra-type relationships amongst objects of the same type. However, it is difficult to learn accurate intra-type relationships in the presence of noise and outliers. Existing HOCC methods consider the p nearest neighbours based on Euclidean distance for the intra-type relationships, which leads to incomplete and inaccurate intra-type relationships. In this paper, we propose a novel HOCC method that incorporates multiple subspace learning with a heterogeneous manifold ensemble to learn complete and accurate intra-type relationships. Multiple subspace learning reconstructs the similarity between any pair of objects that belong to the same subspace. The heterogeneous manifold ensemble is created based on two types of intra-type relationships, learnt using the p-nearest-neighbour graph and multiple subspace learning. Moreover, to ensure the robustness of the clustering process, we introduce a sparse error matrix into the matrix decomposition and develop a novel iterative algorithm. Empirical experiments show that the proposed method achieves improved results over state-of-the-art HOCC methods in terms of FScore and NMI.
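The p-nearest-neighbour intra-type relationship that the abstract identifies as noise-sensitive can be sketched as a simple binary affinity graph. This is an illustrative baseline in plain Python, not the paper's subspace/manifold-ensemble method; the points and p value are toy inputs.

```python
import math

def pnn_affinity(points, p):
    """Build the p-nearest-neighbour intra-type affinity matrix.

    Returns a symmetric 0/1 matrix W with W[i][j] = 1 when j is among
    the p Euclidean-nearest neighbours of i (or vice versa). A single
    outlier can distort these links, which motivates the paper's
    subspace-based reconstruction of intra-type similarity.
    """
    n = len(points)
    dist = [[math.dist(points[i], points[j]) for j in range(n)] for i in range(n)]
    W = [[0] * n for _ in range(n)]
    for i in range(n):
        # Sort the other points by distance and link the p closest.
        order = sorted((j for j in range(n) if j != i), key=lambda j: dist[i][j])
        for j in order[:p]:
            W[i][j] = W[j][i] = 1  # symmetrise the graph
    return W

# Two well-separated pairs of objects of the same type.
pts = [(0, 0), (0, 1), (5, 5), (5, 6)]
W = pnn_affinity(pts, 1)
assert W[0][1] == 1 and W[2][3] == 1 and W[0][2] == 0
```

Each pair links only to its nearby partner, never across the gap; noise that moves a point can flip these links, which is the failure mode the proposed method addresses.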
Abstract:
Background: In vision, there is a trade-off between sensitivity and resolution, and any eye which maximises information gain at low light levels needs to be large. This imposes exacting constraints upon vision in nocturnal flying birds. Eyes are essentially heavy, fluid-filled chambers, and in flying birds their increased size is countered by selection for both reduced body mass and the distribution of mass towards the body core. Freed from these mass constraints, it would be predicted that in flightless birds nocturnality should favour the evolution of large eyes and reliance upon visual cues for the guidance of activity. Methodology/Principal Findings: We show that in Kiwi (Apterygidae), flightlessness and nocturnality have, in fact, resulted in the opposite outcome. Kiwi show minimal reliance upon vision, as indicated by eye structure, visual field topography, and brain structures, and increased reliance upon tactile and olfactory information. Conclusions/Significance: This lack of reliance upon vision and increased reliance upon tactile and olfactory information in Kiwi is markedly similar to the situation in nocturnal mammals that exploit the forest floor. That Kiwi and mammals evolved to exploit these habitats quite independently provides evidence for convergent evolution in their sensory capacities, tuned to a common set of perceptual challenges found in forest floor habitats at night which cannot be met by the vertebrate visual system. We propose that the Kiwi visual system has undergone adaptive regressive evolution driven by the trade-off between the relatively low rate of gain of visual information that is possible at low light levels and the metabolic costs of extracting that information.
Abstract:
We describe an investigation into how Massey University’s Pollen Classifynder can accelerate the understanding of pollen and its role in nature. The Classifynder is an imaging microscopy system that can locate, image and classify slide-based pollen samples. Given the laboriousness of purely manual image acquisition and identification, it is vital to exploit assistive technologies like the Classifynder to enable acquisition and analysis of pollen samples. It is also vital that we understand the strengths and limitations of automated systems so that they can be used (and improved) to complement the strengths and weaknesses of human analysts to the greatest extent possible. This article reviews some of our experiences with the Classifynder system and our exploration of alternative classifier models to enhance both accuracy and interpretability. Our experiments in the pollen analysis problem domain have been based on samples from the Australian National University’s pollen reference collection (2,890 grains, 15 species) and images bundled with the Classifynder system (400 grains, 4 species). These samples have been represented using the Classifynder image feature set. We additionally work through a real-world case study where we assess the ability of the system to determine the pollen make-up of samples of New Zealand honey. In addition to the Classifynder’s native neural network classifier, we have evaluated linear discriminant, support vector machine, decision tree and random forest classifiers on these data with encouraging results. Our hope is that our findings will help enhance the performance of future releases of the Classifynder and other systems for accelerating the acquisition and analysis of pollen samples.
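As a flavour of the classical classifiers compared above, a minimal nearest-centroid classifier (the simplest discriminant-style baseline) can be sketched in a few lines. The feature values and species names below are invented toy data, not Classifynder features, and this is not the system's actual code.

```python
def nearest_centroid_fit(X, y):
    """Fit a nearest-centroid classifier: one mean feature vector per species."""
    centroids = {}
    for label in set(y):
        rows = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = [sum(col) / len(rows) for col in zip(*rows)]
    return centroids

def nearest_centroid_predict(centroids, x):
    """Assign a sample to the class with the closest mean (squared Euclidean)."""
    def sq_dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))

# Toy "image features" for two hypothetical pollen species.
X = [[1.0, 1.0], [1.2, 0.9], [8.0, 8.0], [7.8, 8.2]]
y = ["manuka", "manuka", "clover", "clover"]
model = nearest_centroid_fit(X, y)
assert nearest_centroid_predict(model, [1.1, 1.0]) == "manuka"
assert nearest_centroid_predict(model, [8.1, 7.9]) == "clover"
```

The classifiers evaluated in the article (linear discriminant, SVM, decision tree, random forest) all generalise this idea of partitioning the feature space learned from labelled grains.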
Abstract:
We describe an investigation into how Massey University's Pollen Classifynder can accelerate the understanding of pollen and its role in nature. The Classifynder is an imaging microscopy system that can locate, image and classify slide-based pollen samples. Given the laboriousness of purely manual image acquisition and identification, it is vital to exploit assistive technologies like the Classifynder to enable acquisition and analysis of pollen samples. It is also vital that we understand the strengths and limitations of automated systems so that they can be used (and improved) to complement the strengths and weaknesses of human analysts to the greatest extent possible. This article reviews some of our experiences with the Classifynder system and our exploration of alternative classifier models to enhance both accuracy and interpretability. Our experiments in the pollen analysis problem domain have been based on samples from the Australian National University's pollen reference collection (2890 grains, 15 species) and images bundled with the Classifynder system (400 grains, 4 species). These samples have been represented using the Classifynder image feature set. In addition to the Classifynder's native neural network classifier, we have evaluated linear discriminant, support vector machine, decision tree and random forest classifiers on these data with encouraging results. Our hope is that our findings will help enhance the performance of future releases of the Classifynder and other systems for accelerating the acquisition and analysis of pollen samples. © 2013 AIP Publishing LLC.
Abstract:
We describe a sequence of experiments investigating the strengths and limitations of Fukushima's neocognitron as a handwritten digit classifier. Using the results of these experiments as a foundation, we propose and evaluate improvements to Fukushima's original network in an effort to obtain higher recognition performance. The neocognitron's performance is shown to be strongly dependent on the choice of selectivity parameters and we present two methods to adjust these variables. Performance of the network under the more effective of the two new selectivity adjustment techniques suggests that the network fails to exploit the features that distinguish different classes of input data. To avoid this shortcoming, the network's final layer cells were replaced by a nonlinear classifier (a multilayer perceptron) to create a hybrid architecture. Tests of Fukushima's original system and the novel systems proposed in this paper suggest that it may be difficult for the neocognitron to achieve the performance of existing digit classifiers due to its reliance upon the supervisor's choice of selectivity parameters and training data. These findings pertain to Fukushima's implementation of the system and should not be seen as diminishing the practical significance of the concept of hierarchical feature extraction embodied in the neocognitron. © 1997 IEEE.
Abstract:
Androgens regulate biological pathways to promote proliferation, differentiation, and survival of benign and malignant prostate tissue. Androgen receptor (AR) targeted therapies exploit this dependence and are used in advanced prostate cancer to control disease progression. Contemporary treatment regimens involve sequential use of inhibitors of androgen synthesis or AR function. Although targeting the androgen axis has clear therapeutic benefit, its effectiveness is temporary, as prostate tumor cells adapt to survive and grow. The removal of androgens (androgen deprivation) has been shown to activate both epithelial-to-mesenchymal transition (EMT) and neuroendocrine transdifferentiation (NEtD) programs. EMT has established roles in promoting biological phenotypes associated with tumor progression (migration/invasion, tumor cell survival, cancer stem cell-like properties, resistance to radiation and chemotherapy) in multiple human cancer types. NEtD in prostate cancer is associated with resistance to therapy, visceral metastasis, and aggressive disease. Thus, activation of these programs via inhibition of the androgen axis provides a mechanism by which tumor cells can adapt to promote disease recurrence and progression. Brachyury, Axl, MEK, and Aurora kinase A are molecular drivers of these programs, and inhibitors are currently in clinical trials to determine therapeutic applications. Understanding tumor cell plasticity will be important in further defining the rational use of androgen-targeted therapies clinically and provides an opportunity for intervention to prolong survival of men with metastatic prostate cancer.
Abstract:
SIMON is a family of 10 lightweight block ciphers published by Beaulieu et al. from the United States National Security Agency (NSA). A cipher in this family with a K-bit key and N-bit block is called SIMON N/K. We present several linear characteristics for reduced-round SIMON32/64 that can be used for a key-recovery attack, and extend them further to attack other variants of SIMON. Moreover, we provide results of key-recovery analysis using several impossible differential characteristics, starting from 14 out of 32 rounds for SIMON32/64 up to 22 out of 72 rounds for SIMON128/256. In some cases the presented observations do not directly yield an attack, but provide a basis for further analysis of the specific SIMON variant. Finally, we exploit a connection between linear and differential characteristics for SIMON to construct linear characteristics for different variants of reduced-round SIMON. Our attacks extend to all variants of SIMON, covering more rounds than any known results using linear cryptanalysis. We present a key-recovery attack against SIMON128/256 which covers 35 out of 72 rounds with data complexity 2^123. We have implemented our attacks for small-scale variants of SIMON, and our experiments confirm the theoretical bias presented in this work.
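For context, the publicly specified SIMON round function, whose AND of two rotated copies of the word is what links its linear and differential characteristics, can be sketched for the 16-bit-word SIMON32 as follows. The round keys below are arbitrary placeholders, not the real SIMON key schedule, so this shows only the Feistel round and its inverse.

```python
def rotl16(x, r):
    """Rotate a 16-bit word left by r bits (SIMON32 uses 16-bit words)."""
    return ((x << r) | (x >> (16 - r))) & 0xFFFF

def simon_round(x, y, k):
    """One SIMON Feistel round: f(x) = (x<<<1 & x<<<8) ^ (x<<<2)."""
    f = (rotl16(x, 1) & rotl16(x, 8)) ^ rotl16(x, 2)
    return y ^ f ^ k, x

def simon_round_inv(x, y, k):
    """Invert one round; the Feistel structure makes this essentially free."""
    f = (rotl16(y, 1) & rotl16(y, 8)) ^ rotl16(y, 2)
    return y, x ^ f ^ k

# Placeholder round keys (NOT the SIMON key schedule).
keys = [0x0123, 0x4567, 0x89AB]
x, y = 0x6565, 0x6877
for k in keys:
    x, y = simon_round(x, y, k)
for k in reversed(keys):
    x, y = simon_round_inv(x, y, k)
assert (x, y) == (0x6565, 0x6877)
```

The bitwise AND is the cipher's only nonlinear operation, which is why linear approximations and differential characteristics of SIMON can be converted into one another, as the abstract exploits.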
Abstract:
Current approaches for purifying plasmids from bacterial production systems exploit the physicochemical properties of nucleic acids in non-specific capture systems. In this study, an affinity system for plasmid DNA (pDNA) purification has been developed utilizing the interaction between the lac operon (lacO) sequence contained in the pDNA and a 64-mer synthetic peptide representing the DNA-binding domain of the lac repressor protein, LacI. Two plasmids were evaluated, the native pUC19 and pUC19 with dual lacO3/lacOs operators (pUC19lacO3/lacOs), where the lacOs operator is perfectly symmetrical. The DNA-protein affinity interaction was evaluated by surface plasmon resonance using a Biacore system. The affinity capture of DNA in a chromatography system was evaluated using LacI peptide that had been immobilized on Streamline™ adsorbent. The KD values for double-stranded DNA (dsDNA) fragments containing lacO1 and lacO3, and lacOs and lacO3, were 5.7 ± 0.3 × 10⁻¹¹ M and 4.1 ± 0.2 × 10⁻¹¹ M respectively, which compare favorably with literature reports of 5 × 10⁻¹⁰ to 1 × 10⁻⁹ M for native lacO1 and 1–1.2 × 10⁻¹⁰ M for lacO1 in a saline buffer. Densitometric analysis of the gel bands from the affinity chromatography run clearly showed a significant preference for capture of the supercoiled fraction from the feed pDNA sample. The results indicate the feasibility of the affinity approach for pDNA capture and purification using a native protein-DNA interaction.
Abstract:
In this paper we present research adapting a state-of-the-art condition-invariant robotic place recognition algorithm to the role of automated inter- and intra-image alignment of sensor observations of environmental and skin change over time. The approach inverts the typical criteria placed upon navigation algorithms in robotics: we exploit, rather than attempt to fix, the limited camera viewpoint invariance of such algorithms, showing that approximate viewpoint repetition is realistic in a wide range of environments and medical applications. We demonstrate the algorithms automatically aligning challenging visual data from a range of real-world applications: ecological monitoring of environmental change; aerial observation of natural disasters including flooding, tsunamis and bushfires; and tracking wound recovery and sun damage over time. We also present a prototype active guidance system for enforcing viewpoint repetition. We hope to provide an interesting case study of how traditional research criteria in robotics can be inverted to provide useful outcomes in applied situations.