62 results for forward and backward secrecy


Relevance:

30.00%

Publisher:

Abstract:

The development of genetically modified (GM) crops has led the European Union (EU) to put forward the concept of 'coexistence' to give farmers the freedom to plant both conventional and GM varieties. Should a premium for non-GM varieties emerge in the market, 'contamination' by GM pollen would generate a negative externality to conventional growers. It is therefore important to assess the effect of different 'policy variables' on the magnitude of the externality to identify suitable policies to manage coexistence. In this paper, taking GM herbicide tolerant oilseed rape as a model crop, we start from the model developed in Ceddia et al. [Ceddia, M.G., Bartlett, M., Perrings, C., 2007. Landscape gene flow, coexistence and threshold effect: the case of genetically modified herbicide tolerant oilseed rape (Brassica napus). Ecol. Modell. 205, pp. 169-180], use a Monte Carlo experiment to generate data, and then estimate the effect of the number of GM and conventional fields, the width of buffer areas and the degree of spatial aggregation (i.e. the 'policy variables') on the magnitude of the externality at the landscape level. To represent realistic conditions in agricultural production, we assume that detection of GM material in conventional produce might occur at the field level (no grain mixing occurs) or at the silo level (where grain from different fields in the landscape is mixed). In the former case, the magnitude of the externality will depend on the number of conventional fields with average transgenic presence above a certain threshold. In the latter case, the magnitude of the externality will depend on whether the average transgenic presence across all conventional fields exceeds the threshold. In order to quantify the effect of the relevant 'policy variables', we compute the marginal effects and the elasticities. Our results show that when relying on marginal effects to assess the impact of the different 'policy variables', spatial aggregation is far more important when transgenic material is detected at field level, corroborating previous research. However, when elasticity is used, the effectiveness of spatial aggregation in reducing the externality is almost identical whether detection occurs at field level or at silo level. Our results also show that the area planted with GM is the most important 'policy variable' affecting the externality to conventional growers, and that buffer areas on conventional fields are more effective than those on GM fields. The implications of the results for coexistence policies in the EU are discussed. (C) 2008 Elsevier B.V. All rights reserved.
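
The distinction the abstract draws between marginal effects and elasticities can be made concrete with a short sketch: fit a response surface to hypothetical Monte Carlo output and evaluate both quantities at the sample means. The variable ranges, the linear functional form and the use of ordinary least squares below are illustrative assumptions, not the specification used in the paper.

```python
import numpy as np

# Hypothetical Monte Carlo output: externality magnitude y as a function of
# 'policy variables' x (e.g. number of GM fields, buffer width, aggregation index).
rng = np.random.default_rng(0)
n = 500
X = rng.uniform(low=[1, 0, 0], high=[50, 20, 1], size=(n, 3))   # assumed ranges
beta_true = np.array([0.8, -0.3, -5.0])                         # assumed coefficients
y = 2.0 + X @ beta_true + rng.normal(scale=1.0, size=n)         # simulated externality

# Fit a linear response surface by ordinary least squares.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
beta_hat = coef[1:]

# Marginal effect: change in the externality per unit change of each policy variable.
marginal_effects = beta_hat

# Elasticity at the sample means: percentage change in the externality per
# one percent change in the policy variable, i.e. (dy/dx) * (x_bar / y_bar).
elasticities = beta_hat * X.mean(axis=0) / y.mean()

print("marginal effects:", marginal_effects)
print("elasticities:   ", elasticities)
```

Because an elasticity rescales the marginal effect by the ratio of means, the two measures can rank the same policy variables differently, which is the contrast the abstract reports for spatial aggregation.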

Relevance:

30.00%

Publisher:

Abstract:

The area of soil disturbed by a single tine is well documented. However, modern strip tillage implements using a tine and disc design have not been assessed in the UK or in mainland Europe. Using a strip tillage implement has potential benefits for European agriculture, where economic returns and sustainability are key issues. In a strip tillage system a narrow zone is cultivated, leaving most of the straw residue on the soil surface. Small field plot experiments were undertaken on three soil types, and the operating parameters of forward speed, tine depth and tine design were investigated together with measurements of seedbed tilth and crop emergence. The type of tine used was found to be the primary factor in achieving the required volume of disturbance within a narrow zone whilst maintaining an area of undisturbed soil with straw residue on the surface. The winged tine produced greater disturbance at a given depth than the knife tine. Increasing forward speed did not consistently increase the volume of disturbance. In a sandy clay loam the tilth created and the emergence of sugar beet were similar under strip tillage and ploughing, but on a sandy loam the strip tillage treatments generally gave a finer tilth yet poorer emergence, particularly at greater working depth.

Relevance:

30.00%

Publisher:

Abstract:

The tridentate Schiff base ligand, 7-amino-4-methyl-5-aza-3-hepten-2-one (HAMAH), prepared by the mono-condensation of 1,2-diaminoethane and acetylacetone, reacts with Cu(BF₄)₂·6H₂O to produce initially a dinuclear Cu(II) complex, [{Cu(AMAH)}₂(μ-4,4'-bipy)](BF₄)₂ (1), which undergoes hydrolysis in the reaction mixture and finally produces a linear polymeric chain compound, [Cu(acac)₂(μ-4,4'-bipy)]ₙ (2). The geometry around the copper atom in compound 1 is distorted square planar, while that in compound 2 is essentially an elongated octahedron. On the other hand, the ligand HAMAH reacts with Cu(ClO₄)₂·6H₂O to yield a polymeric zigzag chain, [{Cu(acac)(CH₃OH)(μ-4,4'-bipy)}(ClO₄)]ₙ (3). The geometry of the copper atom in 3 is square pyramidal, with the two bipyridine molecules in the cis equatorial positions. All three complexes have been characterized by elemental analysis, IR and UV-Vis spectroscopy and single crystal X-ray diffraction studies. A probable explanation for the different size and shape of the reported polynuclear complexes formed by copper(II) and 4,4'-bipyridine has been put forward by taking into account the denticity and crystal field strength of the blocking ligand as well as the Jahn-Teller effect in copper(II). (c) 2007 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

An exploratory model for cutting is presented which incorporates fracture toughness as well as the commonly considered effects of plasticity and friction. The periodic load fluctuations seen in cutting force dynamometer tests are predicted, and considerations of chatter and surface finish follow. A non-dimensional group is put forward to classify different regimes of material response to machining. It leads to tentative explanations for the difficulties of cutting materials such as ceramics and brittle polymers, and also relates to the formation of discontinuous chips. Experiments on a range of solids with widely varying toughness/strength ratios generally agree with the analysis.

Relevance:

30.00%

Publisher:

Abstract:

Firms form consortia in order to win contracts. Once a project has been awarded to a consortium, each member then concentrates on his or her own contract with the client. Consortia are therefore marketing devices, which present the impression of teamworking, but the production process is just as fragmented as under conventional procurement methods. In this way, the consortium forms a barrier between the client and the actual construction production process. Firms form consortia not as a simple development of normal ways of working, but because the circumstances of specific projects make such a vehicle necessary. These circumstances include projects that are too large or too complex to undertake alone, or projects that require ongoing services which cannot be provided by the individual firms in-house. It is not a preferred way of working, because participants carry extra risk in the form of liability for the actions of their partners in the consortium. The behaviour of members of consortia is determined by their relative power, based on several factors, including financial commitment and ease of replacement. The level of supply chain visibility to the public sector client and to the industry is reduced by the existence of a consortium, because the consortium forms an additional obstacle between the client and the firms undertaking the actual construction work. Supply chain visibility matters to the client, who otherwise loses control over the process of construction or service provision while remaining accountable for cost overruns. To overcome this separation there is a convincing argument in favour of adopting the approach put forward in the Project Partnering Contract 2000 (PPC2000) Agreement. Members of consortia do not necessarily go on to work in the same consortia again, because members need to respond flexibly to opportunities as and when they arise. Decision-making processes within consortia tend to be ad hoc. Construction risk is taken by the contractor and the construction supply chain, but the reputational risk is carried by all the firms associated with a consortium. There is wide variation in the manner in which consortia are formed, determined by the individual circumstances of each project: its requirements, size and complexity, and the attitude of individual project leaders. However, there are a number of close working relationships based on generic models of consortium-like arrangements for the purpose of building production, such as the Housing Corporation Guidance Notes and the PPC2000.

Relevance:

30.00%

Publisher:

Abstract:

Building energy consumption (BEC) accounting and assessment is fundamental work for building energy efficiency (BEE) development. The existing Chinese statistical yearbooks contain no specific item for BEC accounting; the relevant data are scattered and mixed with the consumption of other industries, so only approximate BEC data can be derived from them. For BEC assessment, the energy accounting and assessment field has traditionally adopted the caloric values of the different energy carriers. This methodology has yielded many useful conclusions for energy efficiency development, but it considers only the quantity of energy and omits the question of energy quality. An exergy methodology is therefore put forward to assess BEC, in which both the quantity and the quality of energy are taken into account. To illustrate BEC accounting and exergy assessment, the case of Chongqing in 2004 is presented. Based on the exergy analysis, BEC in Chongqing in 2004 accounts for 17.3% of total energy consumption, a result very close to that obtained with the traditional methodology. For energy supply efficiency, however, the difference is pronounced: 0.417 under the exergy methodology versus 0.645 under the traditional one.
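
The contrast between quantity-only (caloric) accounting and exergy accounting can be illustrated with a small sketch that weights a hypothetical supply mix by exergy quality factors. The carrier mix, the quality factors and the useful-output figure below are assumptions chosen for illustration; they are not the Chongqing 2004 data.

```python
# Minimal sketch: caloric (energy) accounting versus exergy accounting for a
# hypothetical building energy supply mix. All figures below are illustrative
# assumptions, not the Chongqing 2004 data.

# Delivered energy by carrier, in PJ (assumed values).
delivered = {"electricity": 30.0, "coal": 50.0, "district_heat": 20.0}

# Illustrative exergy quality factors (exergy content per unit of energy).
# Electricity is essentially pure exergy; low-temperature heat carries far less.
quality = {"electricity": 1.00, "coal": 1.06, "district_heat": 0.25}

energy_supply = sum(delivered.values())
exergy_supply = sum(delivered[c] * quality[c] for c in delivered)

# Assume the useful service is low-temperature space heat (quality factor ~0.07
# for heat at ~40 degC against a ~20 degC environment, an assumed figure).
useful_heat = 60.0        # PJ of heat delivered to the conditioned space (assumed)
heat_quality = 0.07

print(f"energy supply efficiency: {useful_heat / energy_supply:.3f}")
print(f"exergy supply efficiency: {useful_heat * heat_quality / exergy_supply:.3f}")
```

Because the same delivered heat is worth much less on an exergy basis, the exergy efficiency comes out lower than the energy efficiency, which is the direction of the 0.417 versus 0.645 comparison reported for Chongqing.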

Relevance:

30.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to propose a process model for knowledge transfer, drawing on theories relating to knowledge communication and knowledge translation. Design/methodology/approach – Most of what is put forward in this paper is based on a research project titled “Procurement for innovation and knowledge transfer (ProFIK)”. The project is funded by a UK government research council, the Engineering and Physical Sciences Research Council (EPSRC). The discussion is grounded mainly in a thorough review of literature carried out as part of the research project. Findings – The process model developed in this paper builds upon the theory of knowledge transfer and the theory of communication. Knowledge transfer, per se, is not a simple handover of knowledge; it involves different stages of knowledge transformation. Depending on its context, knowledge transfer can also be influenced by many factors, some positive and some negative. The model developed here attempts to encapsulate all of these issues in order to create a holistic framework. Originality/value – The paper combines some of the significant theories and findings relating to knowledge transfer, making it an original and valuable contribution.

Relevance:

30.00%

Publisher:

Abstract:

Performance analysis has been used for many applications including providing feedback to coaches and players, media applications, scoring of sports performance and scientific research into sports performance. The current study has used performance analysis to generate knowledge relating to the demands of netball competition which has been used in the development of a Netball Specific Fitness Test (NSFT). A modified version of the Bloomfield movement classification was used to provide a detailed analysis of player movement during netball competition. This was considered during a needs analysis when proposing the structure of the NSFT. A series of pilot versions were tested during an evolutionary prototyping process that resulted in the final version of the NSFT, which was found to be representative of movement in netball competition and it distinguished between recreational club players and players of university first team level or above. The test is incremental and involves forward, backward and sideways movement, jumping, lunging, turning and choice reaction.

Relevance:

30.00%

Publisher:

Abstract:

Posterior cortical atrophy (PCA) is a type of dementia that is characterized by visuo-spatial and memory deficits, dyslexia and dysgraphia, relatively early onset and preserved insight. Language deficits have been reported in some cases of PCA. Using an off-line grammaticality judgement task, processing of wh-questions is investigated in a case of PCA. Other aspects of auditory language are also reported. It is shown that processing of wh-questions is influenced by syntactic structure, a novel finding in this condition. The results are discussed with reference to accounts of wh-questions in aphasia. An uneven profile of other language abilities is reported with deficits in digit span (forward, backward), story retelling ability, comparative questions but intact abilities in following commands, repetition, concept definition, generative naming and discourse comprehension.

Relevance:

30.00%

Publisher:

Abstract:

The 'irrelevant sound effect' in short-term memory is commonly believed to entail a number of direct consequences for cognitive performance in the office and other workplaces (e.g. S. P. Banbury, S. Tremblay, W. J. Macken, & D. M. Jones, 2001). It may also help to identify what types of sound are most suitable as auditory warning signals. However, the conclusions drawn are based primarily upon evidence from a single task (serial recall) and a single population (young adults). This evidence is reconsidered from the standpoint of different worker populations confronted with common workplace tasks and auditory environments. Recommendations are put forward for factors to be considered when assessing the impact of auditory distraction in the workplace. Copyright (c) 2005 John Wiley & Sons, Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. A jackknife parameter estimator, subject to a positivity constraint check, is used to estimate the single parameter introduced at each forward step. As such, the proposed approach is simple to implement and the associated computational cost is very low. An illustrative example demonstrates that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate.
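
A minimal sketch of the general idea is given below: a small subset of kernel centres is selected greedily so that a nonnegative mixture approximates the full Parzen window estimate. The Gaussian kernel, the fixed width, the clipped least-squares weights and the plain residual selection criterion are simplifications made for illustration; the paper's leave-one-out test score and jackknife single-parameter estimator are not reproduced here.

```python
import numpy as np

def gauss(x_eval, centres, h):
    """Gaussian kernels of width h centred on `centres`, evaluated at `x_eval`."""
    d = x_eval[:, None] - np.asarray(centres)[None, :]
    return np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 150), rng.normal(1, 1.0, 150)])  # sample
grid = np.linspace(-5.0, 5.0, 400)
h = 0.4                                      # assumed kernel width

# Target: classical Parzen window estimate using every sample as a centre.
K = gauss(grid, x, h)                        # candidate kernel responses on the grid
pw = K.mean(axis=1)

# Greedy forward selection of a few centres with nonnegative weights.
selected = []
weights = np.zeros(0)
residual = pw.copy()
for _ in range(8):                           # small fixed model size for the sketch
    j = int(np.argmax(K.T @ residual))       # largest inner product with the residual
    if j in selected:
        break
    selected.append(j)
    Ks = K[:, selected]
    w, *_ = np.linalg.lstsq(Ks, pw, rcond=None)
    weights = np.clip(w, 0.0, None)          # crude positivity enforcement
    residual = pw - Ks @ weights

weights /= weights.sum()                     # each Gaussian integrates to one, so
sparse = K[:, selected] @ weights            # unit-sum weights give a proper density
print(f"{len(selected)} kernels, max |PW - sparse| = {np.abs(pw - sparse).max():.4f}")
```

Even this crude criterion usually reproduces the PW curve closely with a handful of kernels, which is the kind of sparsity the abstract describes.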

Relevance:

30.00%

Publisher:

Abstract:

Using the classical Parzen window estimate as the target function, the kernel density estimation is formulated as a regression problem and the orthogonal forward regression technique is adopted to construct sparse kernel density estimates. The proposed algorithm incrementally minimises a leave-one-out test error score to select a sparse kernel model, and a local regularisation method is incorporated into the density construction process to further enforce sparsity. The kernel weights are finally updated using the multiplicative nonnegative quadratic programming algorithm, which has the ability to reduce the model size further. Except for the kernel width, the proposed algorithm has no other parameters that need tuning, and the user is not required to specify any additional criterion to terminate the density construction procedure. Two examples are used to demonstrate the ability of this regression-based approach to effectively construct a sparse kernel density estimate with comparable accuracy to that of the full-sample optimised Parzen window density estimate.
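
The final refinement step mentioned in the abstract is a multiplicative nonnegative quadratic programming (MNQP) update of the kernel weights. The sketch below shows the generic multiplicative update for a problem of the form minimising ½·wᵀBw − cᵀw over w ≥ 0, followed by a unit-sum renormalisation; the Gram matrix, the target vector and the renormalisation are illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

def mnqp_weights(B, c, iters=200, eps=1e-12):
    """Multiplicative updates for min 0.5*w'Bw - c'w subject to w >= 0.

    This simple form is valid when B and c have nonnegative entries, which holds
    for Gaussian kernel Gram matrices and nonnegative target correlations.
    Weakly supported weights are driven towards zero, which is what can shrink
    the kernel model further.
    """
    w = np.full(len(c), 1.0 / len(c))
    for _ in range(iters):
        w = w * c / (B @ w + eps)
    return w / w.sum()               # renormalise so the mixture weights sum to one

# Toy usage with an assumed Gaussian Gram matrix over a handful of kernel centres.
centres = np.array([-1.0, 0.0, 0.5, 2.0])
h = 0.5
B = np.exp(-0.5 * ((centres[:, None] - centres[None, :]) / h) ** 2)
c = np.array([0.90, 1.00, 0.95, 0.20])   # assumed correlations with the target density
print("weights:", np.round(mnqp_weights(B, c), 3))
```

In this toy run the weakly supported fourth kernel receives a much smaller weight, illustrating how the multiplicative update can prune the model beyond the forward selection stage.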