26 results for Freedom of choice
Abstract:
To exploit the popularity of TCP, still the dominant protocol of choice for transporting data reliably across the heterogeneous Internet, this thesis explores end-to-end performance issues and behaviours of TCP senders when transferring data to wireless end-users. The focus throughout is on end-users located within IEEE 802.11 WLANs at the edges of the Internet, a largely untapped area of work. To serve researchers wanting to study the performance of TCP accurately under heterogeneous conditions, this thesis proposes a flexible wired-to-wireless experimental testbed that better reflects real-world conditions. Building on the transparent interworking between TCP in the wired domain and the IEEE 802.11 WLAN protocols, it proposes a more accurate methodology for gauging the transmission and error characteristics of real-world 802.11 WLANs, and aims to correlate the findings with the behaviour of fixed TCP senders. Given the prevalence of Linux as the operating system of many of the Internet's data servers, this thesis studies and evaluates various sender-side TCP congestion control implementations within the recent Linux v2.6. A selection of these implementations is put under systematic testing in real-world wired-to-wireless conditions in order to screen and present viable candidates for further development and usage in the modern-day heterogeneous Internet. Overall, this thesis comprises a set of systematic evaluations of TCP senders over 802.11 WLANs, incorporating measurements from simulations, emulations, and a real-world-like experimental testbed. The goal of the work is to investigate all aspects concerned comprehensively in order to establish rules that help decide under which circumstances the deployment of TCP is optimal, i.e., a set of paradigms for advancing the state of the art in data transport across the Internet.
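As a minimal illustration of the kind of sender-side choice such evaluations involve (this sketch is not from the thesis; it assumes a Linux kernel with the named algorithm compiled in), a TCP sender's congestion control algorithm can be selected per socket via the TCP_CONGESTION socket option, which appeared in Linux 2.6.13:

import socket

# Create a TCP socket and request a specific congestion control
# algorithm for it. Linux-only (socket.TCP_CONGESTION requires
# Python 3.6+); available algorithms are listed in
# /proc/sys/net/ipv4/tcp_available_congestion_control.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_CONGESTION, b"cubic")

# Confirm which algorithm the kernel actually assigned to the socket.
print(sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_CONGESTION, 16))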
Abstract:
The oxidation of lipids has long been a topic of interest in the biological and food sciences, and the fundamental principles of non-enzymatic free radical attack on phospholipids are well established, although questions about the details of the mechanisms remain. The number of end products formed following the initiation of phospholipid peroxidation is large and continually growing as new structures of oxidized phospholipids are elucidated. Common products are phospholipids with esterified isoprostane-like structures and chain-shortened products containing hydroxy, carbonyl or carboxylic acid groups; the carbonyl-containing compounds are reactive and readily form adducts with proteins and other biomolecules. Phospholipids can also be attacked by reactive nitrogen and chlorine species, further expanding the range of products to nitrated and chlorinated phospholipids. Key to understanding the mechanisms of oxidation is the development of advanced and sensitive technologies that enable structural elucidation. Tandem mass spectrometry has proved invaluable in this respect and is generally the method of choice for structural work. A number of studies have investigated whether individual oxidized phospholipid products occur in vivo, and mass spectrometry techniques have been instrumental in detecting a variety of oxidation products in biological samples such as atherosclerotic plaque material, brain tissue, intestinal tissue and plasma, although relatively few have achieved an absolute quantitative analysis. The level of oxidized phospholipids in vivo is a critical question, as there is now substantial evidence that many of these compounds are bioactive and could contribute to pathology. The challenges for the future will be to adopt lipidomic approaches to map the profile of oxidized phospholipid formation in different biological conditions, and to relate this to effects in vivo. This article is part of a Special Issue entitled: Oxidized phospholipids - their properties and interactions with proteins.
Abstract:
The purpose of the present study is to make a comparative evaluation of the legislative controls on unfairness in the context of B2B, B2C and small business contracts in England and Brazil. This work focuses on the examination of statutes and relevant case law which regulate exemption clauses and terms on the basis of their 'unfairness'. The approach adopted by legislation and courts towards these controls may vary according to the type of contract. Business contracts are more in line with the classical model of contract law, according to which parties are presumed equal and able to negotiate terms; as a consequence, interventions should be avoided for the sake of freedom of contract, even if harmful terms were included. Such an assumption of equality, however, is not applicable to small business contracts, because SMEs are often in a disadvantageous position in relation to their larger counterparties. Consumer contracts, in turn, are more closely regulated by the English and Brazilian legal systems, which recognise that vulnerable parties are more exposed to unfair terms imposed by the stronger party as a result of the inequality of bargaining power. For this reason, those jurisdictions have adopted a more interventionist approach to provide special protection to consumers, in line with the modern law of contract. The contribution of this work therefore consists of comparing how the law of England and Brazil tackles the problem of 'unfairness' in the above types of contracts. This study examines the differences and similarities between the rules and concepts of both jurisdictions, with reference to the law of their respective regional trade agreements (the EU and Mercosul). Moreover, it identifies existing issues in the English and Brazilian legislation and recommends lessons that one system can learn from the other.
Abstract:
Mergers and acquisitions (M&As) are increasingly becoming a strategy of choice for companies attempting to achieve and sustain competitive advantage. However, not all M&As are a success. In this paper, we examine the three main causes of M&A failure highlighted in the literature (clashing corporate cultures, absence of clear communication, and lack of employee involvement) in three Indian pharmaceutical companies, and we analyze the role played by the HR function in addressing them. We also discuss the importance of gaining the commitment and focus of the workforce during the acquisition process through employee involvement.
Abstract:
Mechanical physiological pulsations are movements of a body surface induced by the movements of muscles in organs inside the body. Here we demonstrate the use of long-period grating sensors in the detection of cardiovascular pulsations (CVP), in particular apex and carotid pulsations. To calibrate the sensors, we use a mechanical tool designed specifically to measure the sensor response to a localized perturbation at different grating curvatures as working points. From these data we infer the amplitude of the CVP. Together with electrophysiological signals, the CVP signals obtained from the sensors can provide significant information on heart function that is inaccessible to the electrocardiogram. The low cost and easy handling of the fibre sensors increase their prospects of becoming the sensors of choice for novel diagnostic devices. © 2013 The Royal Swedish Academy of Sciences.
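The calibration idea can be illustrated generically (this sketch and its numbers are hypothetical, not the authors' data): record the sensor response to known perturbation amplitudes at a fixed working point, fit a response curve, and invert it to estimate pulsation amplitude from a measured response.

import numpy as np

# Hypothetical calibration data at one grating-curvature working
# point: known perturbation amplitudes (um) and sensor responses.
amplitude = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
response = np.array([0.21, 0.43, 0.62, 0.85, 1.04])

# Assume a locally linear response: response = a * amplitude + b.
a, b = np.polyfit(amplitude, response, 1)

# Invert the fit to estimate the amplitude of an unknown pulsation.
measured = 0.70
estimated_amplitude = (measured - b) / a
print(f"estimated amplitude: {estimated_amplitude:.1f} um")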
Abstract:
This thesis presents an investigation of a two-dimensional water model and the development of a multiscale method for the modelling of large systems, such as a virus in water or a peptide immersed in solvent. We have implemented a two-dimensional 'Mercedes Benz' (MB, also known as BN2D) water model using Molecular Dynamics and studied how its dynamical and structural properties depend on the model's parameters. For the first time, we derive formulas to calculate thermodynamic properties of the MB model in the microcanonical (NVE) ensemble. We also derive equations of motion in the isothermal-isobaric (NPT) ensemble, and analyse the rotational degree of freedom of the model in both ensembles. We have developed and implemented a self-consistent multiscale method that is able to communicate between micro- and macroscales. This method assumes that matter consists of two phases, one related to the microscale and the other to the macroscale. We simulate the macroscale using Landau-Lifshitz Fluctuating Hydrodynamics, while the microscale is described using Molecular Dynamics. We have demonstrated that communication between the disparate scales is possible without the introduction of a fictitious interface or of approximations that reduce the accuracy of the information exchange between the scales. We have investigated the control parameters introduced to govern the contribution of each phase to the behaviour of the matter, and have shown that the microscale inherits dynamical properties of the macroscale and vice versa, depending on the concentration of each phase. We have shown that the radial distribution function is not altered, and that velocity autocorrelation functions are gradually transformed from a Molecular Dynamics to a Fluctuating Hydrodynamics description, as the phase balance is changed. In this work we test our multiscale method on liquid argon, the BN2D model, and the SPC/E water model. For the SPC/E water model we investigate microscale fluctuations, which are computed using an advanced technique for mapping the small scales onto the large scales, developed by Voulgarakis et al.
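For reference, the two observables used in that comparison have standard textbook definitions, given here in LaTeX notation (generic definitions, not specific to this thesis): the radial distribution function and the normalised velocity autocorrelation function,

g(r) = \frac{V}{N^2} \Big\langle \sum_{i=1}^{N} \sum_{j \neq i} \delta\big(\mathbf{r} - \mathbf{r}_{ij}\big) \Big\rangle,
\qquad
C_v(t) = \frac{\big\langle \mathbf{v}_i(0) \cdot \mathbf{v}_i(t) \big\rangle}{\big\langle |\mathbf{v}_i(0)|^2 \big\rangle},

where \mathbf{r}_{ij} is the separation between particles i and j and the angle brackets denote an ensemble average.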
Abstract:
Questions whether the focus on freedom of expression under the Defamation Act 2013 could undermine the value of corporate reputation as a commercial asset.
Abstract:
Multiple transformative forces target marketing, many of which derive from new technologies that allow us to sample thinking in real time (i.e., brain imaging) or to look at large aggregations of decisions (i.e., big data). There has been an inclination to refer to the intersection of these technologies with the general topic of marketing as "neuromarketing", but there has not yet been a serious effort to frame the field, which is the goal of this paper. Neuromarketing can be compared to neuroeconomics: whereas neuroeconomics generally focuses on how individuals make "choices" and represent distributions of choices, neuromarketing focuses on how a distribution of choices can be shifted or "influenced", which can occur at multiple "scales" of behavior (e.g., individual, group, or market/society). Given that influence can affect choice through many cognitive modalities, not just the valuation of choice options, a science of influence also implies the need to develop a model of cognitive function integrating attention, memory, and reward/aversion function. The paper concludes with a brief description of three domains of neuromarketing application for studying influence, and their caveats.
Abstract:
Introduction - The Dutch implementation of the black border provision in the 2001 European Union Tobacco Products Directive (TPD) is studied to examine the implications of tobacco industry involvement in the implementation phase of the policy process. Methods - A qualitative analysis was conducted of Dutch government documents obtained through Freedom of Information Act requests, triangulated with in-depth interviews with key informants and secondary data sources (publicly available government documents, scientific literature, and news articles). Results - Tobacco manufacturers’ associations were given the opportunity to set implementation specifications via a fast-track deal with the government. The offer of early implementation of the labelling section of the TPD was used as political leverage by the industry, and underpinned by threats of litigation and arguments highlighting the risks of additional public costs and the benefits to the government of expediency and speed. Ultimately, the government agreed to the industry's interpretation, against the advice of the European Commission. Conclusions - The findings highlight the policy risks associated with corporate actors’ ability to use interactions over technical product specifications to influence the implementation of health policy and illustrate the difficulties in limiting industry interference in accordance with Article 5.3 of the Framework Convention on Tobacco Control (FCTC). The implementation phase is particularly vulnerable to industry influence, where negotiation with industry actors may be unavoidable and the practical implications of relatively technical considerations are not always apparent to policymakers. During the implementation of the new TPD 2014/40/EU, government officials are advised to take a proactive role in stipulating technical specifications.
Abstract:
This paper considers the impact of new media on freedom of expression and media freedom within the context of the European Convention on Human Rights and European Court of Human Rights jurisprudence. Through comparative analysis of US jurisprudence and scholarship, this paper deals with the following three issues. First, it explores the traditional purpose of the media and how media freedom, as opposed to freedom of expression, has been subject to privileged protection, within an ECHR context at least. Second, it considers the emergence of new media and how it can be differentiated from the traditional media. Finally, it analyses the philosophical justifications for freedom of expression, and how they enable a workable definition of the media based upon the concept of the media-as-a-constitutional-component.
Abstract:
Saturation mutagenesis is a powerful tool in modern protein engineering which permits key residues within a protein to be targeted in order to potentially enhance specific functionalities. However, the creation of large libraries using conventional saturation mutagenesis with degenerate codons (NNN or NNK/S) has inherent redundancy and consequent disparities in codon representation. Therefore, both chemical (trinucleotide phosphoramidites) and biological (sequential, enzymatic single-codon additions) methods of non-degenerate saturation mutagenesis have been developed to combat these issues and so improve library quality. Large libraries with multiple saturated positions can be limited by the method used to screen them: although cell-dependent methods such as phage display have traditionally been the screening methods of choice, they are limited by the need for transformation. A number of cell-free screening methods, such as CIS display, which link the screened phenotype with the encoded genotype, are capable of screening libraries with up to 10¹⁴ members. This thesis describes the further development of ProxiMAX technology to reduce library codon bias, and its integration with CIS display to screen the resulting library. Synthetic MAX oligonucleotides are ligated to an acceptor base sequence, amplified and digested, thereby adding a randomised codon to the acceptor; this forms an iterative cycle in which the digested product of one cycle serves as the base sequence for the next. Initial use of ProxiMAX highlighted areas of the process where changes could be implemented to improve codon representation in the final library. The refined process was used to construct a monomeric anti-NGF peptide library based on two proprietary dimeric peptides (Isogenica) that bind NGF. The resulting library showed greatly improved codon representation, equating to a theoretical diversity of ~69%. The library was subsequently screened using CIS display, and the discovered peptides were assessed for inhibition of NGF-TrkA binding by ELISA. Despite binding to TrkA, these peptides showed lower levels of inhibition of the NGF-TrkA interaction than the parental dimeric peptides, highlighting the importance of dimerisation for inhibition of NGF-TrkA binding.
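To make the redundancy problem concrete, here is a small sketch (not from the thesis) that enumerates the 32 NNK codons and tallies how unevenly they cover the 20 amino acids, the codon bias that non-degenerate methods such as ProxiMAX are designed to remove:

from collections import Counter
from itertools import product

# Standard genetic code as a 64-character string in TCAG codon order
# (a common compact encoding; '*' marks stop codons).
BASES = "TCAG"
AMINO_ACIDS = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODON_TABLE = {"".join(codon): aa
               for codon, aa in zip(product(BASES, repeat=3), AMINO_ACIDS)}

# NNK degeneracy: N = A/C/G/T at positions 1-2, K = G/T at position 3.
nnk_codons = [a + b + k for a in "ACGT" for b in "ACGT" for k in "GT"]
counts = Counter(CODON_TABLE[codon] for codon in nnk_codons)

print(len(nnk_codons), "NNK codons")   # 32
for aa, n in sorted(counts.items()):
    print(aa, n)  # L, R and S get 3 codons each; several others get only 1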