984 results for Mark-up pricing


Relevance: 80.00%

Abstract:

This paper gives an overview of the INEX 2008 Ad Hoc Track. The main goals of the Ad Hoc Track were twofold. The first goal was to investigate the value of the internal document structure (as provided by the XML mark-up) for retrieving relevant information. This is a continuation of INEX 2007 and, for this reason, the retrieval results are liberalized to arbitrary passages, and measures were chosen to fairly compare systems retrieving elements, ranges of elements, and arbitrary passages. The second goal was to compare focused retrieval to article retrieval more directly than in earlier years. For this reason, standard document retrieval rankings have been derived from all runs and evaluated with standard measures. In addition, a set of queries targeting Wikipedia has been derived from a proxy log, and the runs are also evaluated against the clicked Wikipedia pages. The INEX 2008 Ad Hoc Track featured three tasks. For the Focused Task, a ranked list of non-overlapping results (elements or passages) was needed. For the Relevant in Context Task, non-overlapping results (elements or passages) were returned grouped by the article from which they came. For the Best in Context Task, a single starting point (element start tag or passage start) for each article was needed. We discuss the results for the three tasks and examine the relative effectiveness of element and passage retrieval. This is examined in the context of content-only (CO, or keyword) search as well as content-and-structure (CAS, or structured) search. Finally, we look at the ability of focused retrieval techniques to rank articles using standard document retrieval techniques, against both the judged topics and the queries and clicks from a proxy log.
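Deriving an article ranking from a focused run can be sketched simply: one natural rule (the track's exact derivation is not spelled out in the abstract above) ranks each article by the first position at which any of its elements or passages appears. A minimal sketch with an invented run format:

```python
# Collapse a focused (element/passage) run into an article ranking by
# first occurrence. The run format and ids below are invented for illustration.

def article_ranking(element_run):
    """Rank distinct articles by the first position at which any of
    their elements appears in the focused run."""
    seen, articles = set(), []
    for article_id, _path in element_run:
        if article_id not in seen:
            seen.add(article_id)
            articles.append(article_id)
    return articles

run = [("a12", "/article[1]/sec[2]"), ("a07", "/article[1]/p[3]"),
       ("a12", "/article[1]/sec[4]"), ("a33", "/article[1]")]
print(article_ranking(run))  # ['a12', 'a07', 'a33']
```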

Relevance: 80.00%

Abstract:

The number of bidders, N, involved in a construction procurement auction is known to have an important effect on the value of the lowest bid and the mark-up applied by bidders. In practice, for example, it is important for a bidder to have a good estimate of N when bidding for a current contract. One approach, instigated by Friedman in 1956, is to make such an estimate by statistical analysis and modelling. Since then, however, finding a suitable model for N has been an enduring problem for researchers and, despite intensive research activity in the subsequent thirty years, little progress has been made, due principally to the absence of new ideas and perspectives. This paper resumes the debate by checking old assumptions, providing new evidence relating to concomitant variables and proposing a new model. In doing this, and in order to ensure universality, a novel approach is developed and tested using a unique set of twelve construction tender databases from four continents. This shows that the new model provides a significant advance on previous versions. Several new research questions are also posed and other approaches identified for future study.
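The effect of N on the mark-up can be illustrated with a toy expected-profit model in Friedman's spirit. This is an illustrative sketch, not the paper's model, and the uniform distribution of rival mark-ups is an assumption made only for the example:

```python
# Toy trade-off between mark-up m and number of rival bidders N, in the
# spirit of Friedman (1956); not the paper's model.
# Assumption: rival mark-ups are i.i.d. uniform on [0, 0.3].

def win_probability(m, n_rivals, lo=0.0, hi=0.3):
    """P(all n_rivals bid with a higher mark-up than m), uniform assumption."""
    if m <= lo:
        return 1.0
    if m >= hi:
        return 0.0
    return ((hi - m) / (hi - lo)) ** n_rivals

def best_markup(n_rivals, grid=1000, hi=0.3):
    """Grid-search the mark-up maximising expected profit m * P(win)."""
    candidates = [hi * i / grid for i in range(grid + 1)]
    return max(candidates, key=lambda m: m * win_probability(m, n_rivals))

# More rivals push the optimal mark-up down (hi / (N + 1) in closed form):
print(best_markup(2), best_markup(10))
```

This is why a good estimate of N matters to a bidder: the profit-maximising mark-up shrinks roughly in proportion to 1/(N + 1).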

Relevance: 80.00%

Abstract:

Credence goods markets suffer from inefficiencies caused by sellers’ superior information about the surplus-maximising quality. While standard theory predicts that equal mark-up prices solve the credence goods problem if customers can verify the quality received, experimental evidence indicates the opposite. We identify a lack of robustness with respect to heterogeneity in social preferences as a possible cause, and conduct new experiments that allow for parsimonious identification of sellers’ social preference types. Our results confirm the assumed heterogeneity in social preferences and provide strong support for our explanation of the failure of verifiability to increase efficiency.

Relevance: 80.00%

Abstract:

To interpret temporal information in texts, a mark-up language is needed to encode that information so that it can be processed automatically. The most widely used mark-up language is TimeML (Pustejovsky et al., 2003), which has also been chosen for Basque. In these guidelines we present the Basque version of ISO-TimeML (ISO-TimeML working group, 2008). After analysing the tags, attributes and values created for English, we describe the most appropriate ones to represent the information in Basque temporal structures.
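TimeML encodes temporal information with inline tags such as EVENT and TIMEX3. A minimal illustrative fragment follows (an English sentence with invented ids and attribute values, not an example from the Basque guidelines):

```python
# Minimal illustration of TimeML-style inline mark-up (Pustejovsky et al., 2003).
# The sentence, ids and attribute values are invented for the example.
import xml.etree.ElementTree as ET

snippet = (
    '<s>The company <EVENT eid="e1" class="OCCURRENCE">announced</EVENT> '
    'the results on <TIMEX3 tid="t1" type="DATE" value="2008-05-12">'
    'May 12, 2008</TIMEX3>.</s>'
)

root = ET.fromstring(snippet)
# Collect each temporal expression with its normalised ISO value.
timexes = [(t.text, t.get("value")) for t in root.iter("TIMEX3")]
print(timexes)  # [('May 12, 2008', '2008-05-12')]
```

The value attribute holds the normalised (ISO 8601) form of the expression, which is what makes the temporal information automatically processable.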

Relevance: 80.00%

Abstract:

The mobile cloud computing paradigm can offer relevant and useful services to the users of smart mobile devices. Such public services already exist on the web and in cloud deployments by implementing common web service standards. However, these services are described by mark-up languages, such as XML, that cannot be comprehended by non-specialists. Furthermore, the lack of common interfaces for related services makes discovery and consumption difficult for both users and software. The problem of service description, discovery and consumption for the mobile cloud must be addressed to allow users to benefit from these services on mobile devices. This paper introduces our work on a mobile cloud service discovery solution, which is utilised by our mobile cloud middleware, Context Aware Mobile Cloud Services (CAMCS). The aim of our approach is to remove complex mark-up languages from the description and discovery process. By means of the Cloud Personal Assistant (CPA) assigned to each user of CAMCS, relevant mobile cloud services can be discovered and consumed easily by the end user from the mobile device. We present the discovery process, the architecture of our service registry, and the service description structure. CAMCS allows services to be used from the mobile device through a user's CPA, by means of user-defined tasks. We present the task model of the CPA enabled by our solution, including automatic tasks, which can perform work for the user without an explicit request.

Relevance: 80.00%

Abstract:

Objective: Molecular pathology relies on identifying anomalies using PCR or analysis of DNA/RNA. This is important in solid tumours, where molecular stratification of patients defines targeted treatment. These molecular biomarkers rely on examination of the tumour, annotation for possible macrodissection/tumour-cell enrichment, and estimation of the percentage of tumour. Manually marking up tumour is error-prone. Method: We have developed a method, TissueMark®, for automated tumour mark-up and percentage-cell calculation using image analysis based on texture analysis, for lung, colorectal and breast tissue (cases = 245, 100 and 100 respectively). Pathologists marked slides for tumour and reviewed the automated analysis. A subset of slides was manually counted for tumour cells to provide a benchmark for the automated image analysis. Results: There was strong concordance between pathological and automated mark-up (100% acceptance rate for macrodissection). We also showed strong concordance between manually and automatically drawn boundaries (median exclusion/inclusion error of 91.70%/89%). EGFR mutation analysis was precisely the same for manual and automated annotation-based macrodissection. The annotation accuracy rates in breast and colorectal cancer were 83% and 80% respectively. Finally, region-based estimations of tumour percentage using image analysis showed significant correlation with actual cell counts. Conclusion: Image analysis can be used for macrodissection to (i) annotate tissue for tumour and (ii) estimate the percentage of tumour cells, and represents an approach to standardising/improving molecular diagnostics.
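The region-based tumour-percentage step reduces to a ratio of cell counts inside the annotated boundary. A toy sketch of that calculation (not TissueMark®'s algorithm; the grid, mask and counts are invented for illustration):

```python
# Toy sketch of region-based tumour-percentage estimation: given a binary
# annotation mask and per-tile cell counts, report the % of tumour cells
# inside the marked-up region. All values below are invented.

def tumour_percentage(mask, tumour_cells, all_cells):
    """% of cells inside the annotated region that are tumour cells."""
    inside = [(r, c) for r, row in enumerate(mask)
              for c, flag in enumerate(row) if flag]
    total = sum(all_cells[r][c] for r, c in inside)
    tumour = sum(tumour_cells[r][c] for r, c in inside)
    return 100.0 * tumour / total if total else 0.0

mask         = [[1, 1], [0, 1]]   # annotated (marked-up) tiles
tumour_cells = [[3, 1], [9, 2]]   # tumour-cell counts per tile
all_cells    = [[4, 2], [9, 4]]   # total cell counts per tile

print(tumour_percentage(mask, tumour_cells, all_cells))  # 60.0
```

Note how the unannotated tile (with 9 of 9 tumour cells) is excluded: the estimate depends directly on the quality of the annotation, which is why automated mark-up accuracy matters.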

Relevance: 80.00%

Abstract:

This Editorial presents the focus, scope and policies of the inaugural issue of Nature Conservation, a new open-access, peer-reviewed journal bridging natural sciences, social sciences and hands-on applications in conservation management. The journal covers all aspects of nature conservation and aims particularly at facilitating better interaction between scientists and practitioners. The journal will impose no restrictions on manuscript size or the use of colour. We will use an XML-based editorial workflow and several cutting-edge innovations in publishing and information dissemination. These include semantic mark-up of, and enhancements to, the published text and data, and extensive cross-linking within the journal and to external sources. We believe the journal will make an important contribution to better linking science and practice, offering rapid, peer-reviewed and flexible publication for authors and unrestricted access to content.

Relevance: 80.00%

Abstract:

The number of bidders, N, involved in a construction procurement auction is known to have an important effect on the value of the lowest bid and the mark-up applied by bidders. In practice, for example, it is important for a bidder to have a good estimate of N when bidding for a current contract. One approach, instigated by Friedman in 1956, is to make such an estimate by statistical analysis and modelling. Since then, however, finding a suitable model for N has been an enduring problem for researchers and, despite intensive research activity in the subsequent 30 years, little progress has been made, due principally to the absence of new ideas and perspectives. The debate is resumed by checking old assumptions, providing new evidence relating to concomitant variables and proposing a new model. In doing this and in order to ensure universality, a novel approach is developed and tested by using a unique set of 12 construction tender databases from four continents. This shows the new model provides a significant advancement on previous versions. Several new research questions are also posed and other approaches identified for future study.

Relevance: 80.00%

Abstract:

This doctoral thesis comprises four essays in macroeconometrics and finance, with applications to trade liberalisation, the welfare cost of business cycles, and interest rates. In the first essay we analyse the behaviour of the manufacturing industry after the reforms implemented in the 1990s. We examine whether the liberalisation process generated increases in the average productivity of the manufacturing industry. In addition, we estimate the mark-up of different industrial sectors and test whether it changes after trade liberalisation. The estimation results indicate a significant increase in industrial productivity in most of the sectors studied. The channel for this productivity increase is apparently not increased competition, since there is no statistical evidence of a reduction in mark-ups. This is perhaps the most surprising result of the essay: the fact that mark-ups do not change significantly after trade liberalisation. The sectors estimated as non-competitive before liberalisation remained so afterwards. Access to imported inputs and the use of new technologies are possible channels for the productivity increase. This result disagrees with Moreira (1999), who constructs mark-up measures directly from the data. In the second essay we test the Rational Expectations Hypothesis (REH) for the Brazilian term structure. We examine several combinations of maturities between 1 day and 1 year, following the methodology adopted by the Central Bank of Brazil, for the period from July 1996 to December 2001. We show that: (i) the estimated coefficients of the yield spreads between long and short rates, in the equations for short-run changes in the long rate and in the equations for long-run changes in the short rate, are imprecise and unable to reject the REH; and (ii) yield spreads highly correlated with the rational-expectations forecasts of future changes in short rates, but significantly more volatile than the latter, suggest rejection of the REH. The alternative hypothesis of overreaction of the yield spread to expected future changes in the short rate seems a reasonable explanation for the evidence, with implications for monetary policy and investment management. In the third essay we study the welfare cost of business cycles. Robert Lucas (1987) showed a surprising result for the business-cycle literature: the welfare cost he calculated is very small (US$ 8.50 per year). We model preferences with constant-elasticity-of-substitution functions and a reasonable reduced form for consumption. We construct secular data for the American economy and compute the welfare cost for two distinct periods, pre- and post-World War II, using three alternative forms of trend-cycle decomposition, focusing on the Beveridge-Nelson decomposition. The post-war period was calm, with a welfare cost that rarely exceeds 1% of per capita consumption (US$ 200.00 per year). For the pre-war period there is a drastic change in the results: using the Beveridge-Nelson decomposition we find a compensation of 5% of per capita consumption (US$ 1,000.00 per year) with reasonable preference and intertemporal discount parameters. Even for alternative methods, such as the linear-trend model, we find a welfare cost of 2% of per capita consumption (US$ 400.00 per year). From this study we can conclude: (i) looking at post-war data, the marginal welfare cost of business cycles is small, which argues against intensifying countercyclical policies, whereas from the point of view of the pre-war consumer this cost is considerable; and (ii) the welfare cost of business cycles fell from 5% to 0.3% of per capita consumption from the pre-war to the post-war period; if this reduction is the result of countercyclical policies, those policies were very successful. Finally, in the fourth essay we analyse the behaviour of the risk-free interest rate, the cupom cambial (onshore dollar rate), in the Brazilian economy for the period from 20 January 1999 to 30 July 2003. We identify the short- and long-run components of three measures of rates of return, which were submitted to the econometric treatments proposed in Vahid and Engle (1993) and Proietti (1997). The results suggest convergence of the rates of return to a long-run equilibrium. We identify the dominance of the long-run component in determining the path of the C-BOND premium, and of the short-run component in the case of the FX swap premium. For the uncovered interest premium we could not identify the dominance of either component. Associating the long-run component with economic fundamentals and the short-run components with nominal shocks, we could say that, in relative terms, the C-BOND premium is more strongly tied to fundamentals and the FX swap premium to nominal shocks.
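The Lucas-style welfare cost discussed in the third essay has a textbook closed form under CRRA utility and i.i.d. lognormal consumption shocks: the compensation is approximately half the risk-aversion coefficient times the variance of consumption. A minimal sketch with illustrative parameters (not the thesis's estimates or its CES specification):

```python
# Lucas (1987) welfare cost of business cycles, textbook CRRA approximation:
# lambda ~ 0.5 * gamma * sigma**2, the share of consumption a consumer would
# give up to eliminate cyclical fluctuations. Parameter values are illustrative.

def lucas_cost(gamma, sigma):
    """Approximate compensation as a fraction of consumption."""
    return 0.5 * gamma * sigma ** 2

# Risk aversion 1 and post-war-style consumption volatility of ~3%:
print(round(100 * lucas_cost(gamma=1.0, sigma=0.03), 3))  # 0.045 (% of consumption)
```

The tiny magnitude for calm post-war volatility is exactly what makes the pre-war figures (5% of per capita consumption) so striking by contrast.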

Relevance: 80.00%

Abstract:

This work addresses the labour share of value added in the Brazilian manufacturing industry. Starting from Kalecki's theory of distribution, several hypotheses are raised and tested empirically. It seeks to demonstrate that, in the period under examination, the mark-up, the ratio of raw-material costs to wages, and the industrial composition determined the evolution of the labour share.
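Kalecki's degree-of-monopoly formula makes the claimed determinants explicit: the labour share falls as the mark-up k or the materials-to-wages ratio j rises. A minimal sketch with illustrative numbers (not the thesis's data):

```python
# Kalecki's formula for the labour share of value added:
#   share = 1 / (1 + (k - 1) * (j + 1)),
# where k is the mark-up over prime costs (wages + materials) and j is the
# ratio of raw-material costs to wages. Values below are illustrative.

def labour_share(k, j):
    """Wage share of value added given mark-up k and materials/wage ratio j."""
    return 1.0 / (1.0 + (k - 1.0) * (j + 1.0))

print(round(labour_share(k=1.25, j=1.0), 3))  # 0.667
```

A higher mark-up or a higher materials-to-wages ratio mechanically lowers the share, which is the channel the work tests empirically.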

Relevance: 80.00%

Abstract:

This work aims to analyse the interaction and effects of administered prices in the economy, through a DSGE model and the derivation of optimal monetary policies. The model used is a standard New Keynesian DSGE model of a closed economy with two sectors of firms. In the first sector, with free prices, there is a continuum of firms; in the second, with administered prices, there is a single firm. In addition, the model has positive trend inflation in the steady state. The model results suggest that price movements in either sector will have an impact on both, for two reasons. First, price dispersion lowers productivity. Since price dispersion is a change in a sector's price relative to the general price level, when a price movement in one sector is not followed by the other, their relative weights change, affecting productivity in both sectors. Second, the path followed by the administered-price sector enters future inflation expectations, which firms in the free sector use to set their optimal prices. When this path leads to an expectation of higher inflation, free-sector firms will choose a higher mark-up to accommodate it, leading to higher trend inflation when there is imperfect competition in the free sector. Finally, the analysis of optimal policies was inconclusive: it indicates that the adjustment rule for administered prices influences the definition of optimal monetary policy, but a quantitative study is needed to determine the degree of impact.
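In this standard New Keynesian setting, a free-sector firm's desired price is a mark-up over nominal marginal cost, with the frictionless mark-up pinned down by the Dixit-Stiglitz demand elasticity. A minimal sketch (illustrative values, not the paper's calibration):

```python
# Desired price of a monopolistically competitive firm: p* = mu * mc, with
# mu = eps / (eps - 1) under Dixit-Stiglitz demand elasticity eps > 1.
# The numbers below are illustrative, not the paper's calibration.

def desired_markup(eps):
    """Frictionless mark-up implied by demand elasticity eps > 1."""
    return eps / (eps - 1.0)

def desired_price(mc, eps):
    """Optimal price as a mark-up over nominal marginal cost mc."""
    return desired_markup(eps) * mc

print(round(desired_markup(6.0), 2))       # 1.2  (a 20% mark-up)
print(round(desired_price(10.0, 6.0), 1))  # 12.0
```

Expected future inflation enters through mc and the reset-price condition; a firm expecting higher inflation effectively chooses a higher mark-up today, which is the mechanism the abstract describes.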

Relevance: 80.00%

Abstract:

This paper uses a theoretical Schumpeterian and Kaleckian framework to analyse the dynamic effects of innovation on competitiveness and the sectoral functional distribution of income. By affecting the mark-up and market power, successful innovations widen the asymmetries between firms, intensifying competition and generating mismatches in the sectoral distribution of income between wages and profits.

Relevance: 80.00%

Abstract:

The aim of this study is the creation of a Historical GIS that spatially references data retrieved from Italian and Catalan historical sources and records. The generation of this metasource was achieved through the integral acquisition of source-oriented records and the insertion of mark-up fields, while maintaining, where possible, the original encoding of the source documents. In order to standardise the information contained in the original documents, and thus allow queries to the database, additional fields were introduced. Once the initial phase of data research and analysis was concluded, the new virtual source was published online within an open-source WebGIS. In conclusion, we have created a dynamic, spatially referenced database of geo-historical information. The configuration of this new source is such as to guarantee the best possible accessibility.