644 results for locational disadvantage


Relevance: 10.00%

Abstract:

Progress in the Doha Round is assessed against the changes to the Common Agricultural Policy (CAP) brought about by the Fischler reforms of 2003-2004, and the reform proposed for sugar. An elimination of export subsidies could place EU exports of processed foods at a competitive disadvantage because of high sugar and milk prices. Provided the single payment scheme falls within the green box, the likely new limits on domestic support should not be problematic for the post-Fischler CAP. However, an ambitious market access package could open up EU markets and bring pressure for further reform. If there is no Doha agreement, existing provisions will continue to apply, but without the protection of the Peace Clause, and increased litigation is likely. Further CAP reform is to be expected.

Relevance: 10.00%

Abstract:

The international construction market is very complex and requires strong structure and strategy from companies wanting to operate overseas. This article investigates the characteristics of international construction. The international operations of Brazilian contractors are explored through a qualitative study based on case studies of ten large Brazilian contractors, six of which operate abroad. The study identified the patterns of international operation and the competitive advantages of these contractors, as well as the difficulties they face in the international construction market. Four of the contractors studied operate only in the domestic market; these cases revealed both obstacles and motivations for future international operations. The study revealed that, despite their existing competitive advantages, Brazilian contractors' presence in the international market is limited. The main reason is probably their chief competitive disadvantage: the lack of financial support from the government.

Relevance: 10.00%

Abstract:

Hypothesis: The aim of this study was to measure the mass loading effect of an active middle-ear implant (the Vibrant Soundbridge) in cadaver temporal bones. Background: Implantable middle ear hearing devices such as the Vibrant Soundbridge have been used as an alternative to conventional hearing aids for the rehabilitation of sensorineural hearing loss. Beyond the obvious disadvantage of requiring middle ear implantation surgery, such a device also applies a direct weight on the ossicular chain which, in turn, may have an impact on residual hearing. Previous studies have shown that applying a mass directly on the ossicular chain has a damping effect on its response to sound. However, little has been done to investigate the magnitude and the frequency characteristics of the mass loading effect in devices such as the Vibrant Soundbridge. Methods: Five fresh cadaver temporal bones were used. The stapes displacement was measured using laser Doppler vibrometry before and after the placement of a Vibrant Soundbridge floating mass transducer. The effects of mass and attachment site were compared with the unloaded response. Measurements were obtained at frequencies between 0.1 and 10 kHz at an acoustic input level of 100 dB sound pressure level. Each temporal bone acted as its own control. Results: Placement of the floating mass transducer caused a reduction of the stapes displacement, with variations between the bones: the reduction in stapes displacement varied from 0 to 28 dB. The effect was more prominent at frequencies above 1,000 Hz. Placing the floating mass transducer close to the incudostapedial joint reduced the mass loading effect. Conclusion: The floating mass transducer produces a measurable reduction of the stapes displacement in the temporal bone model. The effect is more prominent at high frequencies.
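The dB figures above presumably express the ratio of unloaded to loaded stapes displacement on a logarithmic scale; a minimal sketch of that conversion (our reading of the reported numbers, not stated in the abstract):

```python
import math

# Hypothetical helper: express a drop in stapes displacement in dB,
# assuming the abstract's figures are 20*log10 of the displacement ratio.
def reduction_db(unloaded, loaded):
    """Reduction in dB for displacement amplitudes before/after mass loading."""
    return 20 * math.log10(unloaded / loaded)
```

On this reading, a 28 dB reduction corresponds to the loaded displacement falling to about 4% of the unloaded value.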

Relevance: 10.00%

Abstract:

Research has identified associations between indicators of social disadvantage and the presence of child sleep problems. We examined the longitudinal development of infant sleep in families experiencing high (n = 58) or low (n = 64) levels of psychosocial adversity, and the contributions of neonatal self-regulatory capacities and maternal settling strategies to this development. Assessments of infant sleep at 4, 7, and 12 weeks postpartum indicated no differences in sleeping difficulties between high- and low-adversity groups. However, more infant sleep difficulties were reported in the high- versus low-adversity groups at 12- and 18-month follow-ups. Neonatal self-regulatory capacities were not related to the presence or absence of adversity, or to subsequent infant sleep quality. However, there were group differences in maternal settling strategies that did predict subsequent infant sleep difficulties. The pattern of sleep disturbance observed in association with maternal psychosocial adversity at 18 months was consistent with risk for broader impairments in child functioning.

Relevance: 10.00%

Abstract:

In 1997, the United Kingdom started the world's first commercial digital terrestrial television service. The system used was the European Digital Video Broadcasting - Terrestrial (DVB-T) standard but, due to technological constraints at the time, the mode chosen was the 2K system - a system that uses 1705 carriers to convey the digital television services through a hostile terrestrial environment. Today, these constraints are no longer applicable, but in order to maintain backwards compatibility with the older set top boxes, the 2K system is still used. The 2K system has the disadvantage of excluding the possibility of employing a Single Frequency Network (SFN) - something that can help minimise the required bandwidth for television services. This paper demonstrates a computationally inexpensive soft-decision Quadrature Amplitude Modulation technique that can reject multipath interference.
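The paper's demapper itself is not reproduced here, but the general idea of inexpensive soft-decision demapping can be sketched for Gray-mapped QPSK, where the max-log bit metrics reduce to scaled projections of the received symbol onto the I and Q axes (the constellation scaling and noise model below are our assumptions, not the paper's algorithm):

```python
import numpy as np

# Illustrative max-log soft-decision demapping for Gray-mapped QPSK
# (a generic sketch, not the paper's technique). For Gray-mapped QPSK the
# two bits are carried independently on the I and Q axes, so each bit's
# log-likelihood ratio is just a scaled projection of the received symbol.
def qpsk_llrs(received, noise_var):
    """Per-bit LLRs for unit-energy QPSK symbols (+/-1 +/- 1j)/sqrt(2)."""
    scale = 2 * np.sqrt(2) / noise_var
    return np.stack([scale * received.real, scale * received.imag], axis=-1)
```

A hard decision simply takes the sign of each LLR; a convolutional or LDPC decoder would consume the soft values directly, which is where the multipath rejection gain comes from.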

Relevance: 10.00%

Abstract:

We consider problems of splitting and connectivity augmentation in hypergraphs. In a hypergraph G = (V + s, E), to split two edges su and sv is to replace them with a single edge uv. We are interested in doing this in such a way as to preserve a defined level of connectivity in V. The splitting technique is often used as a way of adding new edges into a graph or hypergraph, so as to augment the connectivity to some prescribed level. We begin by providing a short history of work done in this area. Then several preliminary results are given in a general form so that they may be used to tackle several problems. We then analyse the hypergraphs G = (V + s, E) for which there is no split preserving the local-edge-connectivity present in V. We provide two structural theorems, one of which implies a slight extension to Mader's classical splitting theorem. We also provide a characterisation of the hypergraphs for which there is no such "good" split and a splitting result concerned with a specialisation of the local-connectivity function. We then use our splitting results to provide an upper bound on the smallest number of size-two edges we must add to any given hypergraph to ensure that in the resulting hypergraph we have λ(x, y) ≥ r(x, y) for all x, y in V, where r is an integer-valued, symmetric requirement function on V × V. This is the so-called "local-edge-connectivity augmentation problem" for hypergraphs. We also provide an extension to a theorem of Szigeti, about augmenting to satisfy a requirement r, but using hyperedges. Next, in a result born of collaborative work with Zoltán Király from Budapest, we show that the local-connectivity augmentation problem is NP-complete for hypergraphs. Lastly we concern ourselves with an augmentation problem that includes a locational constraint.
The premise is that we are given a hypergraph H = (V,E) with a bipartition P = {P1, P2} of V and asked to augment it with size-two edges, so that the result is k-edge-connected, and has no new edge contained in some Pi. We consider the splitting technique and describe the obstacles that prevent us forming "good" splits. From this we deduce results about which hypergraphs have a complete Pk-split. This leads to a minimax result on the optimal number of edges required and a polynomial algorithm to provide an optimal augmentation.
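The basic splitting operation can be sketched on ordinary size-two edges as follows (a toy illustration with invented names; the hypergraph generality and the connectivity-preservation checks, which are the substance of the thesis, are omitted):

```python
from collections import Counter

# Toy sketch of the splitting-off operation described above, restricted to
# multigraphs with size-two edges. It mechanically replaces su and sv by uv;
# choosing a split that PRESERVES connectivity is the hard part and is not
# checked here.
def split_off(edges, s, u, v):
    """Replace one copy each of edges su and sv with the single edge uv."""
    assert u != v and u != s and v != s, "split must create a real edge uv"
    bag = Counter(frozenset(e) for e in edges)
    su, sv = frozenset((s, u)), frozenset((s, v))
    assert bag[su] > 0 and bag[sv] > 0, "both edges at s must exist"
    bag[su] -= 1
    bag[sv] -= 1
    bag[frozenset((u, v))] += 1
    return [tuple(sorted(e)) for e, mult in bag.items() for _ in range(mult)]
```

Repeatedly splitting off all edges at s and then deleting s is the standard route from a splitting theorem to an augmentation algorithm.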

Relevance: 10.00%

Abstract:

This paper first points out the important fact that the rectangle formulas for discretizing a continuous convolution, which were widely used in conventional digital deconvolution algorithms, can result in a zero-time error. An improved digital deconvolution equation is then suggested, equivalent to the trapezoid formulas for discretizing the continuous convolution, which overcomes the disadvantage of the conventional equation satisfactorily. Finally, a computer simulation is presented, confirming the theoretical result.
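The zero-time error can be illustrated numerically: discretizing the convolution of h(t) = e^(-t) with x(t) = 1, whose exact result is y(t) = 1 - e^(-t), the rectangle rule gives a spurious nonzero value at t = 0 while the trapezoid rule does not (the example functions and step size are our own, not the paper's equations):

```python
import numpy as np

# Rectangle vs trapezoid discretization of y = h * x, illustrating the
# zero-time error. Exact continuous result: y(t) = 1 - e^{-t}, so y(0) = 0.
T = 0.01
t = np.arange(0, 1 + T / 2, T)
h = np.exp(-t)          # impulse response h(t) = e^{-t}
x = np.ones_like(t)     # input x(t) = 1

def conv_rect(h, x, T):
    # y[n] = T * sum_{k=0..n} h[k] x[n-k]: nonzero at n = 0 (zero-time error)
    return T * np.convolve(h, x)[: len(h)]

def conv_trap(h, x, T):
    # trapezoid rule: half-weight the k = 0 and k = n endpoint terms
    rect = T * np.convolve(h, x)[: len(h)]
    return rect - 0.5 * T * (h[0] * x + h * x[0])

exact = 1 - np.exp(-t)
```

At t = 0 the rectangle rule returns T·h[0]·x[0] = T instead of 0; the trapezoid version is exact there and markedly more accurate over the whole interval, which is the point the paper makes for deconvolution.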

Relevance: 10.00%

Abstract:

Researchers in the rehabilitation engineering community have been designing and developing a variety of passive/active devices to help persons with limited upper extremity function to perform essential daily manipulations. Devices range from low-end tools such as head/mouth sticks to sophisticated robots using vision and speech input. While almost all of the high-end equipment developed to date relies on visual feedback alone to guide the user, providing no tactile or proprioceptive cues, the "low-tech" head/mouth sticks deliver better "feel" because of the inherent force feedback through physical contact with the user's body. However, the disadvantage of a conventional head/mouth stick is that it can only function in a limited workspace and its performance is limited by the user's strength. It therefore seems reasonable to attempt to develop a system that combines the advantages of the two approaches: the power and flexibility of robotic systems with the sensory feedback of a headstick. The system presented in this paper reflects this design philosophy. It contains a pair of master-slave robots, with the master operated by the user's head and the slave acting as a telestick. Described in this paper are the design, control strategies, implementation and performance evaluation of the head-controlled force-reflecting telestick system.
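A force-reflecting master-slave loop of the kind described can be sketched in its simplest position-error form (the gains and structure below are illustrative assumptions, not the authors' controller):

```python
# Minimal position-error bilateral teleoperation step (an illustrative
# sketch, not the paper's control law): the slave servos toward the master
# position, and the same error is reflected back to the user as force on
# the headstick, restoring the "feel" a passive stick provides.
def bilateral_step(x_master, x_slave, kp=50.0, kf=1.0):
    """Return (force commanded to slave, force reflected to master)."""
    error = x_master - x_slave
    f_slave = kp * error           # drive the slave toward the master
    f_master = -kf * kp * error    # oppose the user until the slave catches up
    return f_slave, f_master
```

When the slave contacts an object and stops tracking, the error grows and the user feels a rising resistive force, which is the force-reflection behaviour the telestick is built around.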

Relevance: 10.00%

Abstract:

The reduction of portfolio risk is important to all investors, but particularly to real estate investors, as most property portfolios are small. As a consequence, portfolios are vulnerable to a significant risk of under-performing the market or a target rate of return, and so investors may be exposing themselves to greater risk than necessary. Given the potentially higher risk of under-performance from owning only a few properties, we follow the approach of Vassal (2001) and examine the benefits of holding more properties in a real estate portfolio. Using Monte Carlo simulation and the returns from 1,728 properties in the IPD database, held over the 10-year period from 1995 to 2004, the results show that an increase in portfolio size offers the possibility of a more stable and less volatile return pattern over time, i.e. down-side risk is diminished with increasing portfolio size. Nonetheless, increasing portfolio size has the disadvantage of restricting the probability of out-performing the benchmark index by a significant amount. In other words, although increasing portfolio size reduces the down-side risk in a portfolio, it also decreases its up-side potential. Be that as it may, the results provide further evidence that portfolios with large numbers of properties are always preferable to portfolios of a smaller size.
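The two-sided effect described above can be reproduced with a toy Monte Carlo experiment (synthetic i.i.d. returns with invented parameters, not the IPD data): as the number of properties grows, both the probability of badly under-performing the market and the probability of strongly out-performing it shrink.

```python
import numpy as np

# Toy Monte Carlo illustration of the diversification trade-off (synthetic
# returns, assumed mean/volatility; not the paper's IPD-based simulation).
rng = np.random.default_rng(0)
MEAN, SIGMA, BAND = 0.08, 0.15, 0.05  # market mean, asset volatility, +/-5% band

def tail_probs(n_properties, sims=20_000):
    """P(trail the market by >5%), P(beat the market by >5%), for an
    equally weighted portfolio of n_properties i.i.d. assets."""
    port = rng.normal(MEAN, SIGMA, size=(sims, n_properties)).mean(axis=1)
    return (port < MEAN - BAND).mean(), (port > MEAN + BAND).mean()
```

With one property both tail probabilities are large; with 50 properties both collapse toward zero, mirroring the paper's finding that diversification trims down-side risk and up-side potential together.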

Relevance: 10.00%

Abstract:

Background: Efficient gene expression involves a trade-off between (i) premature termination of protein synthesis; and (ii) readthrough, where the ribosome fails to dissociate at the terminal stop. Sense codons that are similar in sequence to stop codons are more susceptible to nonsense mutation, and are also likely to be more susceptible to transcriptional or translational errors causing premature termination. We therefore expect this trade-off to be influenced by the number of stop codons in the genetic code. Although genetic codes are highly constrained, stop codon number appears to be their most volatile feature. Results: In the human genome, codons readily mutable to stops are underrepresented in coding sequences. We construct a simple mathematical model based on the relative likelihoods of premature termination and readthrough. When readthrough occurs, the resultant protein has a tail of amino acid residues incorrectly added to the C-terminus. Our results depend strongly on the number of stop codons in the genetic code. When the code has more stop codons, premature termination is relatively more likely, particularly for longer genes. When the code has fewer stop codons, the tail added by readthrough will, on average, be longer, and thus more deleterious. Comparative analysis of taxa with a range of stop codon numbers suggests that genomes whose code includes more stop codons have shorter coding sequences. Conclusions: We suggest that the differing trade-offs presented by alternative genetic codes may result in differences in genome structure. More speculatively, multiple stop codons may mitigate readthrough, counteracting the disadvantage of a higher rate of nonsense mutation. This could help explain the puzzling overrepresentation of stop codons in the canonical genetic code and most variants.
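The two sides of the trade-off can be captured in a toy calculation (our own illustration with invented rates, not the paper's model): more stop codons make an erroneous early stop more likely, especially in long genes, while fewer stop codons lengthen the erroneous tail after readthrough.

```python
# Toy version of the trade-off discussed above (illustrative assumptions:
# a per-codon error rate and random post-stop codons; not the paper's model).
def premature_prob(n_stops, gene_len_codons, per_codon_error=1e-5):
    """Chance that some codon in the gene is erroneously read as a stop."""
    p_stop_error = per_codon_error * n_stops / 64
    return 1 - (1 - p_stop_error) ** gene_len_codons

def mean_tail_codons(n_stops):
    """Expected readthrough tail length if post-stop codons are effectively
    random: the tail is geometric with success probability n_stops/64."""
    return 64 / n_stops
```

Under these assumptions, moving from 2 to 3 stop codons raises the premature-termination risk of every gene (more so for long ones) but cuts the mean readthrough tail from 32 to about 21 codons, which is the shape of the trade-off the abstract describes.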

Relevance: 10.00%

Abstract:

It is well known that gut bacteria contribute significantly to host homeostasis, providing a range of benefits such as immune protection and vitamin synthesis. They also supply the host with a considerable amount of nutrients, making this ecosystem an essential metabolic organ. In the context of increasing evidence of the link between the gut flora and the metabolic syndrome, understanding the metabolic interaction between the host and its gut microbiota is becoming an important challenge of modern biology [1-4]. Colonization (also referred to as the normalization process) designates the establishment of micro-organisms in a formerly germ-free animal. While it is a natural process occurring at birth, it is also used in adult germ-free animals to control the gut floral ecosystem and further determine its impact on the host metabolism. A common procedure to control the colonization process is gavage with a single micro-organism or a mixture of micro-organisms. This method results in very quick colonization but has the disadvantage of being extremely stressful [5]. It is therefore useful to minimize the stress and to obtain a slower colonization process in order to observe gradually the impact of bacterial establishment on the host metabolism. In this manuscript, we describe a procedure to assess the modification of hepatic metabolism during a gradual colonization process using a non-destructive metabolic profiling technique. We propose to monitor gut microbial colonization by assessing the gut microbial metabolic activity reflected in the urinary excretion of microbial co-metabolites, measured by 1H NMR-based metabolic profiling.
This allows an appreciation of the stability of gut microbial activity beyond the stable establishment of the gut microbial ecosystem, which is usually assessed by monitoring fecal bacteria by denaturing gradient gel electrophoresis (DGGE) [6]. The colonization takes place in a conventional open environment and is initiated by dirty litter soiled by conventional animals, which also serve as controls. Since rodents are coprophagous, this ensures a homogeneous colonization, as previously described [7]. Hepatic metabolic profiling is measured directly from an intact liver biopsy using 1H high-resolution magic angle spinning NMR spectroscopy. This semi-quantitative technique offers a quick way to assess, without damaging the cell structure, major metabolites such as triglycerides, glucose and glycogen, in order to further estimate the complex interaction between the colonization process and the hepatic metabolism [7-10]. This method can also be applied to any tissue biopsy [11,12].

Relevance: 10.00%

Abstract:

An essential aspect of school effectiveness theory is the shift from the social to the organisational context, from the macro- to the micro-culture. The school is represented largely as a bounded institution, set apart, but also in a precarious relationship with the broader social context. It is ironic that at a time when social disadvantage appears to be increasing in Britain and elsewhere, school effectiveness theory places less emphasis on poverty, deprivation and social exclusion. Instead, it places more emphasis on organisational factors such as professional leadership, home/school partnerships, the monitoring of academic progress, shared vision and goals. In this article, the authors evaluate the extent to which notions of effectiveness have displaced concerns about equity in theories of educational change. They explore the extent to which the social structures of gender, ethnicity, sexualities, special needs, social class, poverty and other historical forms of inequality have been incorporated into or distorted and excluded from effectiveness thinking.

Relevance: 10.00%

Abstract:

Drawing upon European industry and country case studies, this paper investigates the scope and drivers of cross-border real estate development. It is argued that the real estate development process encompasses a diverse range of activities and actors. It is inherently localised, the production process is complex and ephemeral, and the outputs are heterogeneous. The paper analyses a transactions database of European real estate markets to provide insights into the extent of, and variations in, market penetration by non-domestic real estate developers. The data were consistent with the expectation that non-domestic real estate developers from mature markets would have a high level of market penetration in immature markets. Compared to western European markets, office sales by developers in the CEE real estate markets were dominated by US, Israeli and other EU developers. This pattern is consistent with the argument that non-domestic developers have substantial Dunning-type ownership advantages when entering immature real estate markets. However, the data also suggested some unexpected patterns. Relative to their GDP, Austria, Belgium, Denmark, Sweden, the Netherlands and Israel accounted for large proportions of sales by developers. All (except Israel) are EU countries with small, open, affluent, highly traded economies. Further, the data also indicate that there may be a threshold where locational disadvantages outweigh ownership advantages and deter cross-border real estate development.

Relevance: 10.00%

Abstract:

This paper analyses the appraisal of a specialized form of real estate - data centres - that has a unique blend of locational, physical and technological characteristics that differentiate it from conventional real estate assets. Market immaturity, limited trading and a lack of pricing signals enhance levels of appraisal uncertainty and disagreement relative to conventional real estate assets. Given the problems of applying standard discounted cash flow, an approach to appraisal is proposed that uses pricing signals from traded cash flows that are similar to the cash flows generated from data centres. Based upon ‘the law of one price’, it is assumed that two assets that are expected to generate identical cash flows in the future must have the same value now. It is suggested that the expected cash flow of assets should be analysed over the life cycle of the building. Corporate bond yields are used to provide a proxy for the appropriate discount rates for lease income. Since liabilities are quite diverse, a number of proxies are suggested as discount and capitalisation rates including indexed-linked, fixed interest and zero-coupon bonds.
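The law-of-one-price idea above amounts to discounting each segment of the expected cash flow at the yield of a traded instrument with a similar risk profile; a minimal sketch (the pairing of cash flows with yields, and all figures in the example, are our own illustration, not the paper's worked appraisal):

```python
# Illustrative sketch of the proposed appraisal approach: value each expected
# cash flow at the yield of a traded bond matched to its risk (e.g. a
# corporate bond yield for lease income). Figures below are invented.
def present_value(cash_flows, matched_yields):
    """PV of (year, amount) pairs, each discounted at its matched bond yield."""
    return sum(cf / (1 + y) ** t
               for (t, cf), y in zip(cash_flows, matched_yields))
```

By the law of one price, if the data centre's lease income mimics the coupons of a bond trading at a 5% yield, those coupons should be discounted at 5%; riskier life-cycle cash flows would be matched to higher-yielding proxies.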

Relevance: 10.00%

Abstract:

Adaptive methods which "equidistribute" a given positive weight function are now used fairly widely for selecting discrete meshes. The disadvantage of such schemes is that the resulting mesh may not be smoothly varying. In this paper a technique is developed for equidistributing a function subject to constraints on the ratios of adjacent steps in the mesh. Given a weight function $f \geqq 0$ on an interval $[a,b]$ and constants $c$ and $K$, the method produces a mesh with points $x_0 = a$, $x_{j+1} = x_j + h_j$, $j = 0, 1, \cdots, n-1$, and $x_n = b$ such that \[ \int_{x_j}^{x_{j+1}} f \leqq c \quad \text{and} \quad \frac{1}{K} \leqq \frac{h_{j+1}}{h_j} \leqq K \quad \text{for } j = 0, 1, \cdots, n-1. \] A theoretical analysis of the procedure is presented, and numerical algorithms for implementing the method are given. Examples show that the procedure is effective in practice. Other types of constraints on equidistributing meshes are also discussed. The principal application of the procedure is to the solution of boundary value problems, where the weight function is generally some error indicator, and accuracy and convergence properties may depend on the smoothness of the mesh. Other practical applications include the regrading of statistical data.
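A greedy one-pass version of such constrained equidistribution can be sketched as follows (our own simplification using a rectangle-rule estimate of the integral, not the paper's algorithm; note the final step, clipped at b, may fall below the ratio bound):

```python
# Greedy sketch of ratio-constrained equidistribution: march from a to b,
# keeping the local rectangle-rule estimate f(x_j)*h_j of the integral of f
# below c, and each step ratio h_j/h_{j-1} within [1/K, K]. Illustrative
# only; the paper's method and analysis are more careful than this.
def equidistribute(f, a, b, c, K, h0):
    mesh = [a]
    h_prev = h0
    while mesh[-1] < b:
        h = h_prev * K                      # largest step the ratio bound allows
        while f(mesh[-1]) * h > c and h > h_prev / K:
            h /= 2                          # shrink to respect the integral bound
        h = max(h, h_prev / K)              # never violate the lower ratio bound
        mesh.append(min(mesh[-1] + h, b))   # final step is clipped at b
        h_prev = h
    return mesh
```

For a constant weight f = 1 with c = 0.05 this settles into uniform steps of at most 0.05, growing from the initial step no faster than the factor K permits.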