73 results for "Dependency parsing"


Relevance: 10.00%

Abstract:

Improvements in the resolution of satellite imagery have enabled extraction of water surface elevations at the margins of a flood. Comparison between modelled and observed water surface elevations provides a new means of calibrating and validating flood inundation models; however, the uncertainty in these observed data has yet to be addressed. Here a flood inundation model is calibrated using a probabilistic treatment of the observed data. A LiDAR-guided snake algorithm is used to determine the outline of a 2006 flood event on the River Dee, North Wales, UK, from a 12.5 m ERS-1 image. Points at approximately 100 m intervals along this outline are selected, and the water surface elevation is recorded as the LiDAR DEM elevation at each point. Using a planar water surface interpolated between the gauged upstream and downstream water elevations as an approximation, the water surface elevations at points along the flooded extent are compared to their 'expected' values. The errors between the two are roughly normally distributed, but when plotted against coordinates they show obvious spatial autocorrelation. The source of this spatial dependency is investigated by comparing the errors to the slope gradient and aspect of the LiDAR DEM. A LISFLOOD-FP model of the flood event is set up to investigate the effect of observed-data uncertainty on the calibration of flood inundation models. Multiple simulations are run using different combinations of friction parameters, from which the optimum parameter set is selected. For each simulation a t-test is used to quantify the fit between modelled and observed water surface elevations. The points used in this t-test are selected based on their error; the selection criteria enable evaluation of the sensitivity of the choice of optimum parameter set to uncertainty in the observed data. This work explores the observed data in detail and highlights possible causes of error. The identification of significant error (RMSE = 0.8 m) between the approximate expected and actual observed elevations from the remotely sensed data emphasises the limitations of using these data in a deterministic manner within the calibration process. These limitations are addressed by developing a new probabilistic approach to using the observed data.
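The comparison step described above can be sketched numerically. This is a minimal illustration with synthetic data: the reach length, gauge elevations and noise level are invented placeholders, not the River Dee values; only the ~100 m point spacing and the ~0.8 m error magnitude follow the abstract.

```python
import numpy as np

def planar_expected_elevation(chainage, us_elev, ds_elev, reach_length):
    """Planar water surface interpolated linearly between the gauged
    upstream and downstream elevations (hypothetical helper)."""
    return us_elev + (ds_elev - us_elev) * chainage / reach_length

# Synthetic shoreline points at ~100 m intervals; the noise standard
# deviation mimics an RMSE of ~0.8 m.
rng = np.random.default_rng(0)
reach_length = 5000.0                                  # m, illustrative
chainage = np.arange(0.0, reach_length, 100.0)         # 50 points
expected = planar_expected_elevation(chainage, 12.0, 10.0, reach_length)
observed = expected + rng.normal(0.0, 0.8, size=chainage.size)

errors = observed - expected
rmse = np.sqrt(np.mean(errors ** 2))                   # of order 0.8 m
```

In the study itself the "observed" values come from the LiDAR DEM at snake-outline points rather than being simulated; the residuals would then be checked for normality and spatial autocorrelation.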

Relevance: 10.00%

Abstract:

This article explores how data envelopment analysis (DEA), combined with a smoothed bootstrap method, can be used in applied analysis to obtain more reliable efficiency rankings for farms. The main focus is the smoothed homogeneous bootstrap procedure introduced by Simar and Wilson (1998) for statistical inference on the original efficiency point estimates. Two main model specifications, constant and variable returns to scale, are investigated, along with various choices regarding data aggregation. The coefficient of separation (CoS), a statistic that indicates the degree of statistical differentiation within the sample, is used to demonstrate the findings. The CoS suggests a substantive dependency of the results on the methodology and assumptions employed. Accordingly, some observations are made on how to conduct DEA so as to obtain more reliable efficiency rankings, depending on the purpose for which they are to be used. In addition, attention is drawn to the ability of the SLICE MODEL, implemented in GAMS, to enable researchers to overcome the computational burdens of conducting DEA with bootstrapping.
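The core DEA efficiency computation can be sketched as a small linear programme. The following is an illustrative input-oriented, constant-returns-to-scale implementation with invented three-farm data; it is not the SLICE MODEL GAMS code or the Simar-Wilson bootstrap referred to in the abstract.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, o):
    """Input-oriented CCR (constant returns to scale) efficiency of DMU o.
    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs)."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_j <= theta * x_o
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])
    # Outputs: sum_j lambda_j * y_j >= y_o  (written as <= with sign flip)
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Three hypothetical farms, one input and one output; farm 0 defines the
# frontier (best output/input ratio), so its efficiency is 1.
X = np.array([[2.0], [4.0], [8.0]])
Y = np.array([[2.0], [3.0], [4.0]])
scores = [dea_ccr_efficiency(X, Y, o) for o in range(3)]
```

The bootstrap procedure would then resample (smoothed) efficiency scores and re-solve this programme many times to build confidence intervals around each point estimate.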

Relevance: 10.00%

Abstract:

There is consensus worldwide that the artisanal and small-scale mining (ASM) sector comprises individuals trapped in a vicious cycle of poverty, lacking the financial and technological means to improve their standards of living. Minimal work, however, has been undertaken to identify the factors behind miners' plight, which inevitably vary from country to country. This paper uses a case study of Ghana to argue that an increased dependence upon mercury for amalgamation in artisanal gold-mining communities is one such, albeit overlooked, "agent of poverty". There is mounting empirical evidence that dealings with the monopsonistic middlemen who supply mercury, purchases of costly medicines to remedy ailments caused by mercury poisoning, and a lack of appropriate safeguards and alternatives to amalgamation are preventing gold miners from improving their practices and livelihoods. The solution lies in breaking this cycle of dependency, which can be achieved by providing miners with robust support services, mercury-free technologies and education. (c) 2006 Elsevier Ltd. All rights reserved.

Relevance: 10.00%

Abstract:

A mathematical model describing the main mechanistic processes involved in keratinocyte response to chromium and nickel has been developed and compared to experimental in vitro data. Accounting for the interactions between the metal ions and the keratinocytes, the law of mass action was used to generate ordinary differential equations that predict the time evolution and ion-concentration dependency of keratinocyte viability, the amount of metal associated with the keratinocytes, and the release of cytokines by the keratinocytes. Good agreement between model predictions and existing experimental data for these endpoints was observed, supporting the use of this model to explore the physicochemical parameters that influence the toxicological response of keratinocytes to these two metals.
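A mass-action ODE system of this general kind can be sketched as follows. The states, rate constants and time span are hypothetical placeholders chosen for illustration, not the paper's fitted model: free metal ion M binds cell sites K to give cell-associated metal C, which in turn degrades viability V.

```python
from scipy.integrate import solve_ivp

# Hypothetical rate constants (illustrative only, not fitted values).
k_on, k_off, k_tox = 0.5, 0.05, 0.2

def rhs(t, y):
    """Law-of-mass-action kinetics: M + K <-> C, with C driving
    first-order loss of viability V."""
    M, K, C, V = y
    bind = k_on * M * K - k_off * C
    return [-bind, -bind, bind, -k_tox * C * V]

# Initial state: unit metal dose, unit binding capacity, full viability.
sol = solve_ivp(rhs, (0.0, 24.0), [1.0, 1.0, 0.0, 1.0])
M, K, C, V = sol.y[:, -1]
```

Note that total metal M + C is conserved by construction, which provides a convenient sanity check when fitting such a model to measured cell-associated metal and viability time courses.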

Relevance: 10.00%

Abstract:

During fatigue tests of cortical bone specimens, non-zero strains occur at the unloaded (zero-stress) portion of the cycle and progressively accumulate as the test progresses. This non-zero strain is hypothesised to be mostly, if not entirely, describable as creep. This work examines the rate of accumulation of this strain and quantifies its stress dependency. A published relationship determined from creep tests of cortical bone (Journal of Biomechanics 21 (1988) 623) is combined with knowledge of the stress history during fatigue testing to derive an expression for the amount of creep strain in fatigue tests. Fatigue tests on 31 bone samples from four individuals showed strong correlations between creep strain rate and both stress and “normalised stress” (σ/E) during tensile fatigue testing (0–T). Combined results were good (r² = 0.78), and differences between the various individuals, in particular, vanished when effects were examined against normalised stress values. Constants of the regression were equivalent to constants derived in creep tests. The universality of the results across four individuals of both sexes shows great promise for use in computational models of fatigue in bone structures.
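The regression described amounts to fitting a power law between creep strain rate and normalised stress. A sketch with synthetic data follows; the constants A and b are invented for illustration and are not the values reported in the study.

```python
import numpy as np

# Synthetic creep-rate data obeying d(eps)/dt = A * (sigma/E)^b with
# multiplicative noise; A_true and b_true are illustrative placeholders.
rng = np.random.default_rng(1)
norm_stress = np.linspace(0.004, 0.010, 31)        # sigma/E, 31 samples
A_true, b_true = 1.0e9, 4.0
creep_rate = A_true * norm_stress ** b_true * np.exp(rng.normal(0.0, 0.1, 31))

# A linear fit in log-log space recovers the power-law constants:
# log(rate) = log(A) + b * log(sigma/E).
b_fit, logA_fit = np.polyfit(np.log(norm_stress), np.log(creep_rate), 1)
```

Normalising by the modulus E before fitting is what collapses the between-individual differences in the study: specimens of different stiffness fall on a single curve in (σ/E, strain-rate) space.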

Relevance: 10.00%

Abstract:

Strategy is a contested concept. The generic literature is characterized by a diverse range of competing theories and alternative perspectives. Traditional models of the competitive strategy of construction firms have tended to focus on exogenous factors. In contrast, the resource-based view of strategic management emphasizes the importance of endogenous factors. The more recently espoused concept of dynamic capabilities extends consideration beyond static resources to focus on the ability of firms to reconfigure their operating routines, enabling responses to changing environments. The relevance of the dynamic capabilities framework to the construction sector is investigated through an exploratory case study of a regional contractor. The focus on how firms continuously adapt to changing environments provides new insights into competitive strategy in the construction sector. Strong support is found for the importance of path dependency in shaping strategic choice. The case study further suggests that strategy is a collective endeavour enacted by a loosely defined group of individual actors. Dynamic capabilities are characterized by an empirical elusiveness and as such are best construed as situated practices embedded within a social and physical context.

Relevance: 10.00%

Abstract:

Libya, with its strategic location and natural resources, stands as a crucial link between the Arab world, Europe and Africa. The people of Libya have an optimistic outlook on the Libyan economy following the suspension in 1999 of the United Nations sanctions imposed in 1992, as well as the government's recent emphasis on privatization. Since then, local and foreign investors have been encouraged to take a more prominent role in privatizing some of the state-run industries; this attention to privatization is intended to support Libya's economic growth and reduce its heavy dependency on oil revenues. In economic terms Libya is a rich country, but it needs to modernize: it needs more and better infrastructure, non-oil-based financing, and a financial model for development and investment from the private sector. Although the Libyan government is working to make the business environment more attractive to foreign investors as a step towards privatization, it has ignored some of the challenges that privatization will face in Libya. Privatization cannot be implemented overnight, yet this has been taken for granted without careful consideration of its challenges. This paper investigates and discusses the challenges that need to be taken into account before privatization of infrastructure projects can be introduced in Libya. It is based on interviews with senior technical officials in the government.

Relevance: 10.00%

Abstract:

Time-dependent gas hold-up generated in 0.3 and 0.6 m diameter vessels using high-viscosity castor oil and carboxymethyl cellulose (CMC) solution was compared on the basis of impeller speed (N) and gas velocity (V_G). Two types of hold-up were distinguished: the hold-up due to tiny bubbles (ε_ft) and the total hold-up (ε_f), which included both large and tiny bubbles. It was noted that vessel diameter (i.e. the scale of operation) significantly influences (i) the trends and values of ε_f and ε_ft, and (ii) the value of τ (a constant reflecting the time dependency of hold-up). The results showed that a scale-independent correlation for gas hold-up of the form ε_f or ε_ft = A (N or P_G/V)^a (V_G)^b, where a and b are positive constants, is not appropriate for viscous liquids. This warrants further investigation into the effect of vessel diameter on gas hold-up in impeller-agitated high-viscosity liquids (μ or μ_a > 0.4 Pa s). (C) 2003 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

In this study we explore the impact of a morphological deficit on syntactic comprehension. A self-paced listening task was designed to investigate passive sentence processing in typically developing (TD) children and children with Grammatical-Specific Language Impairment (G-SLI). Participants had to judge whether the sentence they heard matched a picture they were shown. Working within the framework of the Computational Grammatical Complexity Hypothesis, which stresses how different components of the grammar interact, we tested whether children were able to use phonotactic cues to parse reversible passive sentences of the form "the X was verbed by Y". We predicted that TD children would be able to use phonotactics to parse a form like touched or hugged as a participle, and hence interpret passive sentences correctly. This cue is predicted not to be used by G-SLI children, because they have difficulty building complex morphological representations. We demonstrate that TD children, but not G-SLI children, are indeed able to use phonotactic cues in parsing passive sentences. (C) 2006 Elsevier B.V. All rights reserved.

Relevance: 10.00%

Abstract:

Background: The computational grammatical complexity (CGC) hypothesis claims that children with G(rammatical)-specific language impairment (SLI) have a domain-specific deficit in the computational system affecting syntactic dependencies involving 'movement'. One type of such syntactic dependency is the filler-gap dependency. In contrast, the Generalized Slowing Hypothesis claims that SLI children have a domain-general deficit affecting processing speed and capacity. Aims: To test these contrasting accounts of SLI, we investigate the processing of syntactic (filler-gap) dependencies in wh-questions. Methods & Procedures: Fourteen G-SLI children (aged 10;2-17;2), 14 age-matched and 17 vocabulary-matched controls were studied using the cross-modal picture-priming paradigm. Outcomes & Results: G-SLI children's processing speed was significantly slower than that of the age controls, but not of the younger vocabulary controls. The G-SLI children and vocabulary controls did not differ on memory span. However, the typically developing and G-SLI children showed qualitatively different processing patterns. The age and vocabulary controls showed priming at the gap, indicating that they process wh-questions through syntactic filler-gap dependencies. In contrast, G-SLI children showed priming only at the verb. Conclusions: The findings indicate that G-SLI children fail to establish reliably a syntactic filler-gap dependency and instead interpret wh-questions via lexical thematic information. These data challenge the Generalized Slowing Hypothesis account, but support the CGC hypothesis, according to which G-SLI children have a particular deficit in the computational system affecting syntactic dependencies involving 'movement'. As effective remediation often depends on aetiological insight, the discovery of the nature of the syntactic deficit, alongside a possible compensatory use of semantics to facilitate sentence processing, can be used to direct therapy. However, the therapeutic strategy to be used, and whether similar strengths and weaknesses within the language system are found in other SLI subgroups, are empirical issues that warrant further research.

Relevance: 10.00%

Abstract:

We explored the dependency of the saccadic remote distractor effect (RDE) on the spatial frequency content of target and distractor Gabor patches. A robust RDE was obtained with low-to-medium spatial frequency distractors, regardless of the spatial frequency of the target. High spatial frequency distractors interfered to a similar extent when the target was of the same spatial frequency. We developed a quantitative model based on lateral inhibition within an oculomotor decision unit. This lateral inhibition mechanism cannot account for the interaction observed between target and distractor spatial frequency, pointing to the existence of channel interactions at an earlier level. (C) 2004 Elsevier Ltd. All rights reserved.
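A lateral-inhibition decision model of the general kind described can be sketched with two mutually inhibiting leaky accumulators, one driven by the target and one by the distractor; the saccade is triggered when the target unit reaches threshold. All parameters and the specific formulation here are illustrative assumptions, not the paper's fitted model.

```python
def race(target_input, distractor_input, w_inhib=0.4, thresh=1.0,
         dt=0.001, tau=0.05, n_steps=5000):
    """Euler-integrate two leaky accumulators coupled by lateral
    inhibition; return the time (s) at which the target unit reaches
    threshold, or None if it never does."""
    a_t = a_d = 0.0
    for step in range(n_steps):
        da_t = dt / tau * (target_input - a_t - w_inhib * a_d)
        da_d = dt / tau * (distractor_input - a_d - w_inhib * a_t)
        a_t = max(a_t + da_t, 0.0)   # firing rates stay non-negative
        a_d = max(a_d + da_d, 0.0)
        if a_t >= thresh:
            return (step + 1) * dt
    return None

lat_alone = race(1.5, 0.0)    # target only
lat_remote = race(1.5, 1.0)   # remote distractor delays the saccade
```

In this scheme the distractor unit's activity subtracts from the target unit's drive, so saccade latency increases with distractor strength, reproducing the basic RDE but not, as the abstract notes for the real model, the spatial-frequency interaction.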

Relevance: 10.00%

Abstract:

In the forecasting of binary events, verification measures that are “equitable” were defined by Gandin and Murphy to satisfy two requirements: 1) they award all random forecasting systems, including those that always issue the same forecast, the same expected score (typically zero), and 2) they are expressible as the linear weighted sum of the elements of the contingency table, where the weights are independent of the entries in the table, apart from the base rate. The authors demonstrate that the widely used “equitable threat score” (ETS), as well as numerous others, satisfies neither of these requirements and only satisfies the first requirement in the limit of an infinite sample size. Such measures are referred to as “asymptotically equitable.” In the case of ETS, the expected score of a random forecasting system is always positive and only falls below 0.01 when the number of samples is greater than around 30. Two other asymptotically equitable measures are the odds ratio skill score and the symmetric extreme dependency score, which are more strongly inequitable than ETS, particularly for rare events; for example, when the base rate is 2% and the sample size is 1000, random but unbiased forecasting systems yield an expected score of around −0.5, reducing in magnitude to −0.01 or smaller only for sample sizes exceeding 25 000. This presents a problem since these nonlinear measures have other desirable properties, in particular being reliable indicators of skill for rare events (provided that the sample size is large enough). A potential way to reconcile these properties with equitability is to recognize that Gandin and Murphy’s two requirements are independent, and the second can be safely discarded without losing the key advantages of equitability that are embodied in the first. 
This enables inequitable and asymptotically equitable measures to be scaled to make them equitable, while retaining their nonlinearity and other properties such as being reliable indicators of skill for rare events. It also opens up the possibility of designing new equitable verification measures.
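The behaviour described, a positive expected ETS for random forecasts that shrinks as the sample grows, can be checked directly by Monte Carlo. The sample sizes and base rate below are illustrative choices, not the specific cases computed in the paper.

```python
import numpy as np

def ets(a, b, c, d):
    """Equitable threat score from a 2x2 contingency table
    (a=hits, b=false alarms, c=misses, d=correct rejections)."""
    n = a + b + c + d
    a_random = (a + b) * (a + c) / n          # hits expected by chance
    denom = a + b + c - a_random
    return (a - a_random) / denom if denom != 0 else 0.0

def expected_random_ets(n, base_rate, n_trials=10000, seed=0):
    """Mean ETS of random, unbiased forecasts over many trials."""
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_trials):
        obs = rng.random(n) < base_rate
        fcst = rng.random(n) < base_rate      # random, unbiased forecasts
        a = np.sum(fcst & obs)
        b = np.sum(fcst & ~obs)
        c = np.sum(~fcst & obs)
        d = np.sum(~fcst & ~obs)
        scores.append(ets(a, b, c, d))
    return float(np.mean(scores))

small = expected_random_ets(n=10, base_rate=0.3)     # clearly positive
large = expected_random_ets(n=1000, base_rate=0.3)   # near zero
```

The nonlinearity of ETS in the table entries is what produces the positive bias at small n: each individual table yields a score whose expectation differs from the score of the expected table.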

Relevance: 10.00%

Abstract:

Cloud radar and lidar can be used to evaluate the skill of numerical weather prediction models in forecasting the timing and placement of clouds, but care must be taken in choosing the appropriate metric of skill due to the non-Gaussian nature of cloud-fraction distributions. We compare the properties of a number of different verification measures and conclude that, of the existing measures, the Log of Odds Ratio is the most suitable for cloud fraction. We also propose a new measure, the Symmetric Extreme Dependency Score, which has very attractive properties: it is equitable (for large samples), difficult to hedge, and independent of the frequency of occurrence of the quantity being verified. We then use data from five European ground-based sites and seven forecast models, processed using the 'Cloudnet' analysis system, to investigate the dependence of forecast skill on cloud-fraction threshold (for binary skill scores), height, horizontal scale and (for the Met Office and German Weather Service models) forecast lead time. The models are found to be least skilful at predicting the timing and placement of boundary-layer clouds and most skilful at predicting mid-level clouds, although in the latter case they tend to underestimate mean cloud fraction when cloud is present. Skill is found to decrease approximately inverse-exponentially with forecast lead time, enabling a forecast 'half-life' to be estimated; when considering the skill of instantaneous model snapshots, we find typical values between 2.5 and 4.5 days. Copyright © 2009 Royal Meteorological Society
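The Symmetric Extreme Dependency Score for a 2x2 contingency table can be computed from one standard form of the score, SEDS = [ln((a+b)/n) + ln((a+c)/n)] / ln(a/n) - 1; the example tables below are illustrative, not Cloudnet data.

```python
import math

def seds(a, b, c, d):
    """Symmetric Extreme Dependency Score for a 2x2 contingency table
    (a=hits, b=false alarms, c=misses, d=correct rejections)."""
    n = a + b + c + d
    return (math.log((a + b) / n) + math.log((a + c) / n)) \
        / math.log(a / n) - 1

# A perfect forecast (no misses, no false alarms) scores 1 ...
perfect = seds(50, 0, 0, 950)
# ... and a table whose hits equal the chance expectation
# a = (a+b)(a+c)/n scores 0.
random_like = seds(10, 90, 90, 810)
```

Because the score depends on the table entries only through the logarithms of the marginal and hit frequencies, it is largely insensitive to the base rate, which is the "independent of the frequency of occurrence" property claimed above.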

Relevance: 10.00%

Abstract:

Purpose – While Freeman's stakeholder management approach has attracted much attention from both scholars and practitioners, little empirical work has considered the interconnectedness of organisational and stakeholder perspectives. The purpose of this paper is to respond to this gap by developing and empirically testing a bi-directional model of organisation/stakeholder relationships. Design/methodology/approach – A conceptual framework is developed that integrates how stakeholders are affected by organisations with how they affect organisations. Quantitative data relating to both sides of the relationship are obtained from 700 customers of a European service organisation and analysed using the partial least squares structural equation modelling technique. Findings – The findings provide empirical support for the notion of mutual dependency between organisations and stakeholders advocated by stakeholder theorists. The results suggest that the way stakeholders relate to organisations is dependent on how organisations relate to stakeholders. Originality/value – The study is original on two fronts: first, it provides a framework and process that researchers can use to model bi-directional research with other stakeholder groups and in different contexts; second, it presents an example application of bi-directional research by empirically linking organisational and stakeholder expectations in the case of customers of a UK service organisation.

Relevance: 10.00%

Abstract:

PSNCBAM-1 has recently been described as a cannabinoid CB1 receptor allosteric antagonist associated with hypophagic effects in vivo; however, the effects of PSNCBAM-1 on CB1 ligand-mediated modulation of neuronal excitability remain unknown. Here, we investigate the actions of PSNCBAM-1 on CB1 receptor-stimulated [35S]GTPγS binding in cerebellar membranes and on CB1 ligand modulation of presynaptic CB1 receptors at inhibitory interneurone-Purkinje cell (IN-PC) synapses in the cerebellum, using whole-cell electrophysiology. PSNCBAM-1 caused non-competitive antagonism in [35S]GTPγS binding studies, with higher potency against the CB1 receptor agonist CP55940 than against WIN55,212-2 (WIN55). In electrophysiological studies, WIN55 and CP55940 reduced miniature inhibitory postsynaptic current (mIPSC) frequency, but not amplitude. PSNCBAM-1 application alone had no effect on mIPSCs; however, PSNCBAM-1 pre-treatment revealed agonist-dependent functional antagonism, abolishing CP55940-induced reductions in mIPSC frequency but having no clear effect on WIN55 actions. The CB1 antagonist/inverse agonist AM251 increased mIPSC frequency above control levels, an effect that was reversed by PSNCBAM-1; PSNCBAM-1 pre-treatment also attenuated AM251 effects. Thus, PSNCBAM-1 reduced the functional efficacy of CB1 receptor ligands in the cerebellum. The differential effect of PSNCBAM-1 on CP55940 versus WIN55 actions in [35S]GTPγS binding and electrophysiological studies, and the attenuation of AM251 effects, are consistent with the ligand dependency associated with allosteric modulation. These data provide the first description of functional PSNCBAM-1 allosteric antagonist effects on neuronal excitability in the mammalian CNS. PSNCBAM-1 allosteric antagonism may provide a viable therapeutic alternative to orthosteric CB1 antagonists/inverse agonists in the treatment of CNS disease.