996 results for "weak solutions"
Removal of endotoxin from human serum albumin solutions by hydrophobic and cationic charged membrane
Abstract:
A novel macroporous cellulose membrane matrix was prepared by chemical grafting, and cationic charged groups were immobilized on it as affinity ligands. The prepared membrane can be used for the removal of endotoxin from human serum albumin (HSA) solutions. With a cartridge of 20 sheets of 47 mm diameter affinity membrane, the endotoxin level in HSA solution can be reduced to 0.027 EU/mL. Recovery of HSA was over 95%.
Abstract:
A unique matching is a stated objective of most computational theories of stereo vision. This report describes situations where humans perceive a small number of surfaces carried by non-unique matching of random dot patterns, although a unique solution exists and is observed unambiguously in the perception of isolated features. We find both cases where non-unique matchings compete and suppress each other and cases where they are all perceived as transparent surfaces. The circumstances under which each behavior occurs are discussed and a possible explanation is sketched. It appears that matching reduces many false targets to a few, but may still yield multiple solutions in some cases through a (possibly different) process of surface interpolation.
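The false-target phenomenon described above can be sketched with a toy 1-D random-dot example (illustrative only, not the authors' model): within a disparity limit, each left-eye dot admits several right-eye partners, so matching alone does not force a unique solution.

```python
# Toy illustration: in a 1-D random-dot stereogram, each left-eye dot may
# match several right-eye dots within the disparity limit, so the
# correspondence problem admits multiple global solutions ("false targets").

def candidate_matches(left, right, max_disparity):
    """For each left dot position, list the right dots within the disparity limit."""
    return {
        l: [r for r in right if abs(r - l) <= max_disparity]
        for l in left
    }

left = [2, 5, 8]
right = [3, 6, 9]
cands = candidate_matches(left, right, max_disparity=4)
# Every left dot has more than one admissible partner, even though the
# uniform shift +1 is the unique globally coherent matching.
```

A matching stage then prunes these candidates, but, as the abstract notes, several mutually consistent surfaces may survive.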
Abstract:
Malicious software (malware) has significantly increased in number and effectiveness over the past years. Until 2006, such software was mostly used to disrupt network infrastructures or to show off coders' skills. Nowadays, malware constitutes a very important source of economic profit, and is very difficult to detect. Thousands of novel variants are released every day, and modern obfuscation techniques are used to ensure that signature-based anti-malware systems are not able to detect such threats. This tendency has also appeared on mobile devices, with Android being the most targeted platform. To counteract this phenomenon, the scientific community has developed many approaches that attempt to increase the resilience of anti-malware systems. Most of these approaches rely on machine learning, and have also become very popular in commercial applications. However, attackers are now knowledgeable about these systems, and have started preparing their countermeasures. This has led to an arms race between attackers and developers: novel systems are progressively built to tackle attacks that get more and more sophisticated. For this reason, there is a growing need for developers to anticipate the attackers' moves. This means that defense systems should be built proactively, i.e., by introducing security design principles into their development. The main goal of this work is to show that such a proactive approach can be employed in a number of case studies. To do so, I adopted a global methodology that can be divided into two steps: first, understanding the vulnerabilities of current state-of-the-art systems (this anticipates the attackers' moves); then, developing novel systems that are robust to these attacks, or suggesting research guidelines with which current systems can be improved. This work presents two main case studies, concerning the detection of PDF and Android malware.
The idea is to show that a proactive approach can be applied to both the x86 and the mobile world. The contributions provided in these two case studies are manifold. With respect to PDF files, I first develop novel attacks that can empirically and optimally evade current state-of-the-art detectors. Then, I propose possible solutions with which it is possible to increase the robustness of such detectors against known and novel attacks. With respect to the Android case study, I first show how current signature-based tools and academically developed systems are weak against empirical obfuscation attacks, which can be easily employed without particular knowledge of the targeted systems. Then, I examine a possible strategy to build a machine learning detector that is robust against both empirical obfuscation and optimal attacks. Finally, I show how proactive approaches can also be employed to develop systems that are not aimed at detecting malware, such as mobile fingerprinting systems. In particular, I propose a methodology to build a powerful mobile fingerprinting system, and examine possible attacks with which users might be able to evade it, thus preserving their privacy. To provide the aforementioned contributions, I co-developed (in cooperation with researchers at PRALab and Ruhr-Universität Bochum) various systems: a library to perform optimal attacks against machine learning systems (AdversariaLib), a framework for automatically obfuscating Android applications, a system for the robust detection of JavaScript malware inside PDF files (LuxOR), a robust machine learning system for the detection of Android malware, and a system to fingerprint mobile devices. I also contributed to the development of Android PRAGuard, a dataset containing a large number of empirical obfuscation attacks against the Android platform. Finally, I entirely developed Slayer NEO, an evolution of a previous system for the detection of PDF malware.
The results attained using the aforementioned tools show that it is possible to proactively build systems that predict possible evasion attacks. This suggests that a proactive approach is crucial to building systems that provide concrete security against both general and evasion attacks.
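The kind of optimal evasion attack discussed above can be sketched as follows. This is a minimal illustration against a linear detector, not the thesis' AdversariaLib implementation; the weights, sample, and budget are made up for the example.

```python
# Minimal sketch of a gradient-style evasion attack on a linear malware
# detector g(x) = w.x + b: move the most influential feature values against
# the sign of their weights, subject to a feature-modification budget.

def evade_linear(x, w, b, budget, step=1.0):
    """Greedily modify the 'budget' most influential features to lower w.x + b."""
    x = list(x)
    # Rank features by the magnitude of their weight (most influential first).
    order = sorted(range(len(w)), key=lambda i: -abs(w[i]))
    for i in order[:budget]:
        # Step each selected feature against its weight's sign to lower the score.
        x[i] -= step if w[i] > 0 else -step
    return x

def score(x, w, b):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

w, b = [2.0, -1.0, 0.5], 0.0
x = [1.0, 0.0, 1.0]                 # classified malicious: score = 2.5 > 0
x_adv = evade_linear(x, w, b, budget=2)
# With this budget the score drops below the decision boundary (to -0.5),
# i.e. the modified sample evades the detector.
```

A robust detector, in the spirit of the thesis, would be designed so that no small-budget modification of this kind can cross the boundary.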
Abstract:
Aqueous solutions of amphiphilic polymers usually comprise inter- and intramolecular associations of hydrophobic groups, often leading to the formation of a rheologically significant reversible network at low concentrations that can be identified using techniques such as static light scattering and rheometry. However, in most studies published to date comparing water-soluble polymers with their respective amphiphilic derivatives, it has been very difficult to distinguish between the effects of molecular mass and of hydrophobic associations on hydrodynamic (intrinsic viscosity [η]) and thermodynamic (second virial coefficient A2) parameters, owing to the differences between their degrees of polymerization. This study focuses on the dilute and semi-dilute solutions of hydroxyethyl cellulose (HEC) and its amphiphilic derivatives (hmHEC) of the same molecular mass, along with other samples having a different molecular mass, using capillary viscometry, rheometry and static light scattering. The weight-average molecular masses (MW) and their distributions for the non-associative HEC were determined using size exclusion chromatography. Various empirical approaches developed by past authors to determine [η] from dilute solution viscometry data are discussed. hmHEC with a sufficiently high degree of hydrophobic modification was found to form a rheologically significant network in dilute solutions at very low concentrations, as opposed to hmHEC with a much lower degree of hydrophobic modification, which instead enveloped the hydrophobic groups inside the supramolecular cluster, as shown by their [η] and A2. The ratio A2MW/[η], which takes into account hydrodynamic as well as thermodynamic parameters, was observed to be lower for the associative polymers than for the non-associative polymers.
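One classical empirical approach of the kind alluded to above is the Huggins extrapolation, ηsp/c = [η] + kH[η]²c, whose intercept at zero concentration gives the intrinsic viscosity. A minimal sketch with synthetic data (illustrative values, not this study's measurements):

```python
# Hedged sketch: estimating intrinsic viscosity [eta] from dilute-solution
# viscometry data via the Huggins extrapolation
#   eta_sp / c = [eta] + kH * [eta]^2 * c
# The intercept of a least-squares line through (c, eta_sp/c) estimates [eta].

def huggins_intercept(c, eta_sp):
    """Fit eta_sp/c vs c by least squares; the intercept estimates [eta]."""
    y = [e / ci for ci, e in zip(c, eta_sp)]
    n = len(c)
    cbar = sum(c) / n
    ybar = sum(y) / n
    slope = sum((ci - cbar) * (yi - ybar) for ci, yi in zip(c, y)) / \
            sum((ci - cbar) ** 2 for ci in c)
    return ybar - slope * cbar          # intercept = [eta]

# Synthetic data generated with [eta] = 5.0 dL/g and kH = 0.4:
c = [0.02, 0.04, 0.06, 0.08]                          # concentrations, g/dL
eta_sp = [ci * (5.0 + 0.4 * 25.0 * ci) for ci in c]   # specific viscosities
eta_intrinsic = huggins_intercept(c, eta_sp)          # recovers ~5.0 dL/g
```

For associative polymers such as hmHEC, the abstract's point is precisely that this extrapolation becomes delicate: network formation distorts the dilute-regime linearity the fit assumes.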
Abstract:
Plakhov, A.Y. (2004) 'Precise solutions of the one-dimensional Monge–Kantorovich problem', Sbornik: Mathematics 195(9), pp. 1291–1307. RAE2008
Abstract:
A learning-based framework is proposed for estimating human body pose from a single image. Given a differentiable function that maps from pose space to image feature space, the goal is to invert the process: estimate the pose given only image features. The inversion is an ill-posed problem, as the inverse mapping is one-to-many. Hence multiple solutions exist, and it is desirable to restrict the solution space to a smaller subset of feasible solutions. For example, not all human body poses are feasible, due to anthropometric constraints. Since the space of feasible solutions may not admit a closed-form description, the proposed framework seeks to exploit machine learning techniques to learn an approximation that is smoothly parameterized over such a space. One such technique is Gaussian Process Latent Variable Modelling. Scaled conjugate gradient is then used to find the best matching pose in the space of feasible solutions given an input image. The formulation allows easy incorporation of various constraints, e.g. temporal consistency and anthropometric constraints. The performance of the proposed approach is evaluated on the task of upper-body pose estimation from silhouettes and compared with the Specialized Mapping Architecture. The estimation accuracy of the Specialized Mapping Architecture is at least one standard deviation worse than that of the proposed approach in experiments with synthetic data. In experiments with real video of humans performing gestures, the proposed approach produces qualitatively better estimation results.
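The core idea of searching a smooth low-dimensional space of feasible solutions can be sketched with a toy 1-D latent model. The map f below is hypothetical (standing in for a learned GPLVM), and plain gradient descent stands in for scaled conjugate gradient:

```python
# Toy analogue: restrict solutions to a 1-D manifold z -> f(z) ("feasible
# poses") and minimize the image-feature error over the latent variable z.

def f(z):
    """Hypothetical smooth map from a latent value to a 2-D 'pose'."""
    return (z, z * z)

def feature_error(z, target):
    """Squared distance between the pose generated from z and the observation."""
    px, py = f(z)
    tx, ty = target
    return (px - tx) ** 2 + (py - ty) ** 2

def descend(target, z0=0.0, lr=0.05, steps=2000, h=1e-5):
    """Gradient descent in latent space (scaled conjugate gradient stand-in)."""
    z = z0
    for _ in range(steps):
        # Central-difference gradient of the error w.r.t. the latent variable.
        g = (feature_error(z + h, target) - feature_error(z - h, target)) / (2 * h)
        z -= lr * g
    return z

target = (1.0, 1.0)        # observed features; the generating latent value is z = 1
z_hat = descend(target)    # converges to ~1.0
```

Because the search runs over z rather than over raw pose coordinates, every candidate it visits lies on the feasible manifold by construction, which is the point of the latent-variable formulation.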
Abstract:
Weak references are references that do not prevent the object they point to from being garbage collected. Most realistic languages, including Java, SML/NJ, and OCaml to name a few, have some facility for programming with weak references. Weak references are used in implementing idioms like memoizing functions and hash-consing in order to avoid potential memory leaks. However, the semantics of weak references in many languages are not clearly specified. Without a formal semantics for weak references, it becomes impossible to prove the correctness of implementations making use of this feature. Previous work by Hallett and Kfoury extends λgc, a language for modeling garbage collection, to λweak, a similar language with weak references. Using this previously formalized semantics for weak references, we consider two issues related to the well-behavedness of programs. First, we provide a new, simpler proof of the well-behavedness of the syntactically restricted fragment of λweak defined previously. Second, we give a natural semantic criterion for well-behavedness, much broader than the syntactic restriction, which is useful as a principle for programming with weak references. Furthermore, we extend a result proved previously for λgc, which allows one to use type inference to collect some reachable objects that are never used. We prove that this result holds for our language, and we extend it to allow the collection of weakly-referenced reachable garbage without incurring the computational overhead sometimes associated with collecting weak bindings (e.g. the need to recompute a memoized function). Lastly, we extend the semantic framework to model the key/value weak references found in Haskell, and we prove that the Haskell semantics is equivalent to a simpler semantics, owing to the lack of side effects in our language.
Abstract:
A weak reference is a reference to an object that is not followed by the pointer tracer when garbage collection is called. That is, a weak reference cannot prevent the object it references from being garbage collected. Weak references remain a troublesome programming feature largely because there is not an accepted, precise semantics that describes their behavior (in fact, we are not aware of any formalization of their semantics). The trouble is that weak references allow reachable objects to be garbage collected, therefore allowing garbage collection to influence the result of a program. Despite this difficulty, weak references continue to be used in practice for reasons related to efficient storage management, and are included in many popular programming languages (Standard ML, Haskell, OCaml, and Java). We give a formal semantics for a calculus called λweak that includes weak references and is derived from Morrisett, Felleisen, and Harper’s λgc. λgc formalizes the notion of garbage collection by means of a rewrite rule. Such a formalization is required to precisely characterize the semantics of weak references. However, the inclusion of a garbage-collection rewrite-rule in a language with weak references introduces non-deterministic evaluation, even if the parameter-passing mechanism is deterministic (call-by-value in our case). This raises the question of confluence for our rewrite system. We discuss natural restrictions under which our rewrite system is confluent, thus guaranteeing uniqueness of program result. We define conditions that allow other garbage collection algorithms to co-exist with our semantics of weak references. We also introduce a polymorphic type system to prove the absence of erroneous program behavior (i.e., the absence of “stuck evaluation”) and a corresponding type inference algorithm. We prove the type system sound and the inference algorithm sound and complete.
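The defining property stated above, that a weak reference cannot keep its referent alive, can be observed directly in Python (a concrete analogue, not the λweak calculus itself), including the point where garbage collection becomes visible to the program:

```python
# A weak reference does not keep its referent alive: once the last strong
# reference is dropped and collection runs, dereferencing yields None.
# This observability is exactly how gc can influence a program's result.

import gc
import weakref

class Node:
    pass

obj = Node()
wr = weakref.ref(obj)
assert wr() is obj        # referent still reachable via a strong reference

del obj                   # drop the last strong reference
gc.collect()              # CPython's refcounting has already reclaimed it
assert wr() is None       # the program can now observe that gc ran
```

Whether `wr()` returns the object or `None` depends on when the collector runs, which is the source of the nondeterminism the abstract's rewrite system must tame.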
Abstract:
Weak references provide the programmer with limited control over the process of memory management. By using them, a programmer can make decisions based on previous actions taken by the garbage collector. Although this is often helpful, the outcome of a program using weak references is less predictable due to the nondeterminism they introduce into program evaluation. It is therefore desirable to have a framework of formal tools to reason about weak references and programs that use them. We present several calculi that formalize various aspects of weak references, inspired by their implementation in Java. We provide a calculus to model multiple levels of non-strong references, where a different garbage collection policy is applied to each level. We consider different collection policies, such as eager collection and lazy collection. Following their implementation in Java, we give the semantics of eager collection to weak references and the semantics of lazy collection to soft references. Moreover, we condition garbage collection on the availability of time and space resources: while time constraints are used to restrict garbage collection, space constraints are used to trigger it. Finalizers are a problematic feature in Java, especially when they interact with weak references. We provide a calculus to model finalizer evaluation. Since finalizers have little meaning in a language without side effects, we introduce a limited form of side effect into the calculus. We discuss determinism and the separate notion of uniqueness of (evaluation) outcome. We show that in our calculus, finalizer evaluation does not affect uniqueness of outcome.
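The finalizer interaction described above has a hedged Python analogue (Python, unlike the thesis' calculi, does not expose Java-style soft references, so only the finalizer side is shown): `weakref.finalize` registers a side-effecting callback that runs when its referent is collected.

```python
# Finalizers as side effects tied to collection: weakref.finalize runs a
# callback once the referent is reclaimed. WHEN that happens is up to the
# collector -- the nondeterminism the calculus is designed to reason about.

import weakref

log = []

class Resource:
    pass

r = Resource()
fin = weakref.finalize(r, log.append, "finalized")  # register the finalizer

assert fin.alive and log == []   # registered, not yet run
del r                            # drop the last strong reference
# CPython reclaims the object immediately and fires the finalizer here;
# other collectors may defer it, changing when the side effect is observed.
```
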
Abstract:
This thesis describes a broad range of experiments based on an aerosol flow-tube system to probe the interactions between atmospherically relevant aerosols and trace gases. The apparatus was used to obtain simultaneous optical and size distribution measurements, using FTIR and SMPS respectively, as a function of relative humidity and aerosol chemical composition. Heterogeneous reactions between various ratios of ammonia gas and acidic aerosols were studied in aerosol form, as opposed to bulk solutions. The apparatus is unique in that it employed two aerosol generation methods to follow the size evolution of the aerosol while allowing detailed spectroscopic investigation of its chemical content. A novel chemiluminescence apparatus was also used to measure [NH₄⁺]. SO₂·H₂O is an important species, as it represents the first intermediate in the overall atmospheric oxidation of sulfur dioxide to sulfuric acid. This complex was produced within gaseous, aqueous and aerosol SO₂ systems. The addition of ammonia gave mainly hydrogen sulfite tautomers and disulfite ions. These species were prevalent at high humidities, enhancing the aqueous nature of the sulfur(IV) species. Their weak acidity is evident from the low [NH₄⁺] produced. An increasing recognition that dicarboxylic acids may contribute significantly to the total acid burden in polluted urban environments is evident in the literature. It was observed that speciation within the oxalic, malonic and succinic systems shifted towards the most ionised form as the relative humidity was increased, owing to complete deprotonation. The addition of ammonia produced ammonium dicarboxylate ions. Less reaction of ammonia with the malonic and succinic species was observed in comparison to the oxalic acid system. This observation coincides with the decreasing acidity of these organic species. The interaction between dicarboxylic acids and 'sulfurous'/sulfuric acid has not been previously investigated.
The results presented here are therefore original to the field of tropospheric chemistry. HSO₃⁻, S₂O₅²⁻, HSO₄⁻ and SO₄²⁻, together with the hydrogen dicarboxylate (H₁,₃,₅C₂,₃,₄O₄⁻) and dicarboxylate (C₂,₃,₄O₄²⁻) ions, were the main components found in the complex inorganic-organic systems investigated here. The introduction of ammonia produced ammonium dicarboxylate as well as ammonium disulfite/sulfate ions, and increasing the acid concentrations increased the total [NH₄⁺].