902 results for Distinguishing guise


Relevance:

60.00%

Publisher:

Abstract:

This thesis analyses the notion of functionality. A concept of jurisprudential origin, it tends to maintain the traditional dichotomy between the trade-mark regime and the patent regime. A reading of the judgments rendered on the matter suggests that maintaining such a dichotomy prevents, in particular, the undue extension of a monopoly that has expired under the patent regime through the registration of a trade-mark. This study attempts to better delineate the concept of functionality and, more precisely, to justify its existence. To do so, an in-depth study of the trade-mark and patent regimes allows us to understand that each of these bodies of rules follows a different logic. The functions of trade-marks and patents are indeed distinct, and no overlap appears to be permitted. This situation is, moreover, specific to these particular regimes. An examination of the scope of the notion of functionality shows that other intellectual property rights can coexist. As an example, we believe that an intersection is possible between the industrial design and trade-mark regimes. At the conclusion of this research, we find that the notion of functionality is a well-established jurisprudential principle in Canadian law aimed at preventing any perpetual renewal of a patent through trade-mark law. The existence of this principle seems to us to be justified with respect to trade-marks and patents. This conclusion could nevertheless differ in the context of other intellectual property rights, since the functions of those other regimes appear to allow overlaps.

Relevance:

20.00%

Publisher:

Abstract:

There is increasing agreement that understanding complexity is important for project management because of difficulties associated with decision-making and goal attainment which appear to stem from complexity. However, the current operational definitions of complex projects, based upon size and budget, have been challenged, and questions have been raised about how complexity can be measured in a robust manner that takes account of structural, dynamic and interaction elements. Thematic analysis of data from 25 in-depth interviews of project managers involved with complex projects, together with an exploration of the literature, reveals a wide range of factors that may contribute to project complexity. We argue that these factors contributing to project complexity may be defined in terms of dimensions, or source characteristics, which are in turn subject to a range of severity factors. In addition to investigating definitions and models of complexity from the literature and in the field, this study also explores the problematic issues of ‘measuring’ or assessing complexity. A research agenda is proposed to further the investigation of phenomena reported in this initial study.

Relevance:

20.00%

Publisher:

Abstract:

In many product categories of durable goods such as TV, PC, and DVD player, the largest component of sales is generated by consumers replacing existing units. Aggregate sales models proposed by diffusion of innovation researchers for the replacement component of sales have incorporated several different replacement distributions such as Rayleigh, Weibull, Truncated Normal and Gamma. Although these alternative replacement distributions have been tested using both time series sales data and individual-level actuarial “life-tables” of replacement ages, there is no consensus on which distributions are more appropriate to model replacement behaviour. In the current study we are motivated to develop a new “modified gamma” distribution for two reasons. First, we recognise that replacements have two fundamentally different drivers – those forced by failure and early, discretionary replacements. The replacement distribution for each of these drivers is expected to be quite different. Second, we observed a poor fit of other distributions to our empirical data. We conducted a survey of 8,077 households to empirically examine models of replacement sales for six electronic consumer durables – TVs, VCRs, DVD players, digital cameras, personal and notebook computers. This data allows us to construct individual-level “life-tables” for replacement ages. We demonstrate that the new modified gamma model fits the empirical data better than existing models for all six products, using both a primary and a hold-out sample.
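
As an illustration of the kind of fitting the abstract describes (the paper's “modified gamma” distribution is not reproduced here; the data, parameter values and the two-component gamma mixture used as a stand-in below are all hypothetical), a minimal Python sketch comparing a single gamma fit with a mixture meant to mimic the two replacement drivers named above might look like this:

# Illustrative sketch only: replacement ages fitted with (a) a single gamma
# distribution and (b) a two-component gamma mixture standing in for the two
# drivers named in the abstract (failure-forced vs. early discretionary
# replacement). All data and parameter values below are hypothetical.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
# Hypothetical replacement ages (years): early upgrades plus failure-driven swaps.
ages = np.concatenate([rng.gamma(2.0, 1.5, 300),   # discretionary, early
                       rng.gamma(9.0, 1.0, 700)])  # failure-forced, later

# (a) Single gamma fit by maximum likelihood.
shape, loc, scale = stats.gamma.fit(ages, floc=0)

# (b) Two-component gamma mixture fitted by direct likelihood maximisation.
def neg_log_lik(params):
    w, k1, s1, k2, s2 = params
    pdf = (w * stats.gamma.pdf(ages, k1, scale=s1)
           + (1 - w) * stats.gamma.pdf(ages, k2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

res = optimize.minimize(neg_log_lik, x0=[0.3, 2.0, 1.0, 8.0, 1.0],
                        bounds=[(0.01, 0.99)] + [(0.1, 50.0)] * 4,
                        method="L-BFGS-B")
print("single gamma  logL:", np.sum(stats.gamma.logpdf(ages, shape, loc, scale)))
print("gamma mixture logL:", -res.fun)

Comparing the two log-likelihoods (or fitting on a hold-out sample, as the abstract does) is one simple way to judge whether a two-driver description is warranted.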

Relevance:

20.00%

Publisher:

Abstract:

The "standard" procedure for calibrating the Vesuvio eV neutron spectrometer at the ISIS neutron source, forming the basis for data analysis over at least the last decade, was recently documented in considerable detail by the instrument’s scientists. Additionally, we recently derived analytic expressions of the sensitivity of recoil peak positions with respect to fight-path parameters and presented neutron–proton scattering results that together called in to question the validity of the "standard" calibration. These investigations should contribute significantly to the assessment of the experimental results obtained with Vesuvio. Here we present new results of neutron–deuteron scattering from D2 in the backscattering angular range (theata > 90 degrees) which are accompanied by a striking energy increase that violates the Impulse Approximation, thus leading unequivocally the following dilemma: (A) either the "standard" calibration is correct and then the experimental results represent a novel quantum dynamical effect of D which stands in blatant contradiction of conventional theoretical expectations; (B) or the present "standard" calibration procedure is seriously deficient and leads to artificial outcomes. For Case(A), we allude to the topic of attosecond quantumdynamical phenomena and our recent neutron scattering experiments from H2 molecules. For Case(B),some suggestions as to how the "standard" calibration could be considerably improved are made.

Relevance:

20.00%

Publisher:

Abstract:

Mathematical descriptions of birth-death-movement processes are often calibrated to measurements from cell biology experiments to quantify tissue growth rates. Here we describe and analyze a discrete model of a birth-death-movement process applied to a typical two-dimensional cell biology experiment. We present three different descriptions of the system: (i) a standard mean-field description which neglects correlation effects and clustering; (ii) a moment dynamics description which approximately incorporates correlation and clustering effects; and (iii) averaged data from repeated discrete simulations which directly incorporate correlation and clustering effects. Comparing these three descriptions indicates that the mean-field and moment dynamics approaches are valid only for certain parameter regimes, and that both descriptions fail to make accurate predictions of the system for sufficiently fast birth and death rates, where the effects of spatial correlations and clustering are sufficiently strong. Without any method to distinguish between the parameter regimes where these descriptions are valid, it is possible that either the mean-field or moment dynamics model could be calibrated to experimental data under inappropriate conditions, leading to errors in parameter estimation. In this work we demonstrate that a simple measurement of agent clustering and correlation, based on coordination number data, provides an indirect measure of agent correlation and clustering effects, and can therefore be used to distinguish between the regimes in which the different descriptions of the birth-death-movement process are valid.
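
As a rough illustration of the class of model described above (not the paper's exact algorithm: the lattice size, update order, rates and clustering summary below are hypothetical choices of ours), a minimal Python sketch of a lattice birth-death-movement simulation with crowding, reporting the mean coordination number, could look like this:

# Illustrative sketch only: a minimal lattice birth-death-movement simulation
# with crowding (exclusion), plus the mean coordination number (occupied nearest
# neighbours per agent) as a simple clustering summary. Lattice size, rates and
# update rules are hypothetical, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(1)
L, steps = 100, 200
Pm, Pb, Pd = 1.0, 0.05, 0.01           # movement, birth, death probabilities per step
occ = rng.random((L, L)) < 0.05        # random initial occupancy

def neighbours(i, j):
    return [((i + 1) % L, j), ((i - 1) % L, j), (i, (j + 1) % L), (i, (j - 1) % L)]

for _ in range(steps):
    agents = np.argwhere(occ)
    rng.shuffle(agents)                             # random sequential update order
    for i, j in agents:
        if not occ[i, j]:                           # agent already moved or died this step
            continue
        if rng.random() < Pd:                       # death
            occ[i, j] = False
            continue
        target = neighbours(i, j)[rng.integers(4)]  # a random nearest-neighbour site
        if rng.random() < Pb and not occ[target]:   # birth into an empty neighbour
            occ[target] = True
        elif rng.random() < Pm and not occ[target]: # movement into an empty neighbour
            occ[target] = True
            occ[i, j] = False

# Mean coordination number: average number of occupied nearest neighbours per agent.
total = sum(sum(occ[n] for n in neighbours(i, j)) for i, j in np.argwhere(occ))
print("density:", occ.mean(), "mean coordination number:", total / max(occ.sum(), 1))

Running such a simulation at slow versus fast birth/death rates and comparing the coordination number against the value expected for randomly placed agents gives a simple, indirect check on how strong the clustering and correlation effects are.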

Relevance:

20.00%

Publisher:

Abstract:

Many cell types form clumps or aggregates when cultured in vitro through a variety of mechanisms, including rapid cell proliferation, chemotaxis, or direct cell-to-cell contact. In this paper we develop an agent-based model to explore the formation of aggregates in cultures where cells are initially distributed uniformly, at random, on a two-dimensional substrate. Our model includes unbiased random cell motion, together with two mechanisms which can produce cell aggregates: (i) rapid cell proliferation, and (ii) a biased cell motility mechanism where cells can sense other cells within a finite range, and will tend to move towards areas with higher numbers of cells. We then introduce a pair-correlation function which allows us to quantify aspects of the spatial patterns produced by our agent-based model. In particular, these pair-correlation functions are able to detect differences between domains populated uniformly at random (i.e. at the exclusion complete spatial randomness (ECSR) state) and those where the proliferation and biased motion rules have been employed, even when such differences are not obvious to the naked eye. The pair-correlation function can also detect the emergence of a characteristic inter-aggregate distance which occurs when the biased motion mechanism is dominant, and is not observed when cell proliferation is the main mechanism of aggregate formation. This suggests that applying the pair-correlation function to experimental images of cell aggregates may provide information about the mechanism associated with observed aggregates. As a proof of concept, we perform such analysis for images of cancer cell aggregates, which are known to be associated with rapid proliferation. The results of our analysis are consistent with the predictions of the proliferation-based simulations, which supports the potential usefulness of pair-correlation functions for providing insight into the mechanisms of aggregate formation.
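
As an illustration of the quantification step described above (the binning, normalisation and periodic-distance conventions below are our own assumptions, not necessarily those of the paper), a minimal Python sketch of a planar pair-correlation function for point data might look like this:

# Illustrative sketch only: a basic planar pair-correlation function for point
# data on a periodic square domain, normalised so that complete spatial
# randomness gives g(r) close to 1 at all separations.
import numpy as np

def pair_correlation(points, domain_size, dr, r_max):
    n = len(points)
    density = n / domain_size**2
    edges = np.arange(0.0, r_max + dr, dr)
    counts = np.zeros(len(edges) - 1)
    for i in range(n):
        d = np.abs(points - points[i])               # pairwise offsets to point i
        d = np.minimum(d, domain_size - d)           # periodic (wrapped) distances
        r = np.hypot(d[:, 0], d[:, 1])
        r = np.delete(r, i)                          # exclude the self-pair
        counts += np.histogram(r, bins=edges)[0]
    shell_areas = np.pi * (edges[1:]**2 - edges[:-1]**2)
    expected = density * shell_areas * n             # expected pair counts per shell under randomness
    r_mid = 0.5 * (edges[1:] + edges[:-1])
    return r_mid, counts / expected

# Example: uniformly random points should give g(r) close to 1 at all r,
# whereas aggregated points give g(r) > 1 at short range.
rng = np.random.default_rng(2)
pts = rng.random((500, 2)) * 50.0
r, g = pair_correlation(pts, domain_size=50.0, dr=1.0, r_max=10.0)
print(np.round(g, 2))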

Relevance:

20.00%

Publisher:

Abstract:

Aim. This paper is a report of the development and validation of a new job performance scale based on an established job performance model. Background. Previous measures of nursing quality are atheoretical and fail to incorporate the complete range of behaviours performed. Thus, an up-to-date measure of job performance is required for assessing nursing quality. Methods. Test construction involved systematic generation of test items using focus groups, a literature review, and an expert review of test items. A pilot study was conducted to determine the multidimensional nature of the taxonomy and its psychometric properties. All data were collected in 2005. Findings. The final version of the nursing performance taxonomy included 41 behaviours across eight dimensions of job performance. Results from preliminary psychometric investigations suggest that the nursing performance scale has good internal consistency, good convergent validity and good criterion validity. Conclusion. The findings give preliminary support for the new job performance scale as a reliable and valid tool for assessing nursing quality. However, further research using a larger sample and nurses from a broader geographical region is required to cross-validate the measure. This scale may be used to guide hospital managers regarding the quality of nursing care within units and to guide future research in the area.

Relevance:

20.00%

Publisher:

Abstract:

We present a distinguishing attack against SOBER-128 with linear masking. We found a linear approximation which has a bias of 2^−8.8 for the non-linear filter. The attack applies the observation made by Ekdahl and Johansson that there is a sequence of clocks for which the linear combination of some states vanishes. This linear dependency allows the linear masking method to be applied. We also show that the bias of the distinguisher can be improved (or estimated more precisely) by considering quadratic terms of the approximation. The probability bias of the quadratic approximation used in the distinguisher is estimated to be O(2^−51.8), so we claim that SOBER-128 is distinguishable from a truly random cipher by observing O(2^103.6) keystream words.
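
For context (a standard rule of thumb, not stated in the abstract): a linear distinguisher needs roughly 1/ε^2 samples to detect a bias of ε, so with ε = O(2^−51.8) the data complexity is about 1/ε^2 = 2^(2 × 51.8) = 2^103.6 keystream words, which is the figure claimed above.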

Relevance:

20.00%

Publisher:

Abstract:

Kimberlite drill core from the Muskox pipe (Northern Slave Province, Nunavut, Canada) highlights the difficulties in distinguishing coherent from fragmental kimberlite and assessing the volcanological implications of the apparent gradational contact between the two facies. Using field log data, petrography, and several methods to quantify crystal and xenolith sizes and abundances, the pipe is divided into two main facies, dark-coloured massive kimberlite (DMK) and light-coloured fragmental kimberlite (LFK). DMK is massive and homogeneous, containing country-rock lithic clasts (~ 10%) and olivine macrocrysts (~ 15%) set in a dark, typically well crystallised, interstitial medium containing abundant microphenocrysts of olivine (~ 15%), opaques and locally monticellite, all of which are enclosed mostly by serpentine. In general, LFK is also massive and structureless, containing ~ 20% country-rock lithic clasts and ~ 12% olivine macrocrysts. These framework components are supported in a matrix of serpentinized olivine microphenocrysts (10%), microlites of clinopyroxene, and phlogopite, all of which are enclosed by serpentine. The contact between DMK and LFK facies is rarely sharp, and more commonly is gradational (from 5 cm to ~ 10 m). The contact divides the pipe roughly in half and is sub-vertical with an irregular shape, locally placing DMK facies both above and below the fragmental rocks. Most features of DMK are consistent with a fragmental origin, particularly the crystal- and xenolith-rich nature (~ 55-65%), but there are some similarities with rocks described as coherent kimberlite in the literature. We discuss possible origins of gradational contacts and consider the significance for understanding the origin of the DMK facies, with an emphasis on the complications of alteration overprinting of primary textures.

Relevance:

20.00%

Publisher:

Abstract:

We first discuss a method of measuring tau polarisation at the ILC using the 1-prong hadronic decays of the tau. We then show in this contribution how a study of the stau sector, and particularly the use of tau decay polarisation, can offer a very good handle for distinguishing between mSUGRA and a SUSY-GUTs scenario, both of which can give rise to appropriate Dark Matter.

Relevance:

20.00%

Publisher:

Abstract:

The NO2···I supramolecular synthon is a halogen-bonded recognition pattern that is present in the crystal structures of many compounds that contain these functional groups. These synthons have previously been distinguished as P, Q, and R types using topological and geometrical criteria. A five-step IR spectroscopic sequence is proposed here to distinguish between these synthon types in solid samples. Sets of known compounds that contain the P, Q, and R synthons are first used to develop IR spectroscopic identifiers for them. The identifiers are then used to create graded IR filters that sieve the synthons. These filters contain signatures of the individual NO2···I synthons and may be applied to distinguish between the P, Q, and R synthon varieties. They are also useful for identifying synthons that are of a borderline character, synthons in disordered structures wherein the crystal structure in itself is not sufficient to distinguish synthon types, and NO2···I synthons in compounds with unknown crystal structures. This study establishes clear differences between the three geometries P, Q, and R, and in the chemical differences in the intermolecular interactions contained in the synthons. Our IR method can be conveniently employed when single crystals are not readily available, and also in high-throughput analysis. It is possible that such identification may also be adopted as an input for crystal structure prediction analysis of compounds with unknown crystal structures.
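
As a sketch of the sieving idea only (the descriptor names, band shifts and thresholds below are entirely hypothetical placeholders; the actual five-step identifiers are defined in the paper and not reproduced here), a graded-filter cascade might be organised like this in Python:

# Illustrative sketch only: a "graded filter" cascade that assigns a sample to
# the P, Q or R synthon type from a few IR-derived descriptors, or flags it as
# borderline. All descriptor names and threshold values are hypothetical.
from dataclasses import dataclass

@dataclass
class IRDescriptors:
    no2_asym_shift: float   # hypothetical: shift of an NO2 asymmetric-stretch band (cm^-1)
    no2_sym_shift: float    # hypothetical: shift of an NO2 symmetric-stretch band (cm^-1)

def classify_synthon(d: IRDescriptors) -> str:
    # Filters are applied in sequence; a sample passing no filter is "borderline".
    if d.no2_asym_shift > 10.0 and d.no2_sym_shift > 5.0:   # hypothetical P signature
        return "P"
    if d.no2_asym_shift > 10.0:                             # hypothetical Q signature
        return "Q"
    if d.no2_sym_shift > 5.0:                               # hypothetical R signature
        return "R"
    return "borderline"

print(classify_synthon(IRDescriptors(no2_asym_shift=12.0, no2_sym_shift=2.0)))  # -> Q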

Relevance:

20.00%

Publisher:

Abstract:

Two biotypes of hydrilla [Hydrilla verticillata (L.f.) Royle] occur in the United States, a dioecious type centered in the southeast and a monoecious type in the central Atlantic and northeastern states. Ecosystem managers need tools to distinguish the types as the ranges of each type expand and begin to overlap. A molecular tool using the randomly amplified polymorphic DNA (RAPD) procedure is available, but its use is limited by the need for reference samples. We describe an alternative molecular tool which uses “universal primers” to sequence the trnL intron and trnL-F intergenic spacer of the chloroplast genome. This sequence yields three differences between the biotypes (two gaps and one single nucleotide polymorphism). A primer has been designed which ends in a gap that occurs only in the dioecious plant. A polymerase chain reaction (PCR) using this primer produces a product for the monoecious but not the dioecious plant.