947 results for Specifications


Relevance: 10.00%

Abstract:

By antonomasia, the knight is errant. This characteristic makes him a traveller who crosses real and fictional geographies, and in this way the chivalric romance weaves its web with travel literature. One might think that, in this genre, geography is subordinated to the world of the fantastic; this paper, however, reviews the spaces traversed by the protagonist of the Four Books of Amadís de Gaula in order to determine whether the work contains a geography of symbolic value that helps build a more realistic setting. A series of specific details in the description of space would thus allow correspondences between the literary world and the cultural imagination of the reader, making the Amadís a work in which Rodríguez de Montalvo presents a spatial setting familiar to his readers as the starting point of one of the most successful literary genres of the sixteenth century.

Relevance: 10.00%

Abstract:

Ms. 17.806 of the Biblioteca Nacional de Madrid, titled Descripción y destrucion de la ciudad y templo de Jerusalem. Los viajes y caminos que hizieron los Santos Patriarcas, Profetas, Reyes, Cristo Señor Nuestro, su Madre Santissima y los Apostoles mencionados en la Sagrada Escritura; con una breue declaracion de los pesos, medidas y monedas antiguas hebreas, griegas y romanas reduzidas a las nuestras, is a curious travel book that remains unpublished and about which almost nothing is known. Here we present the general characteristics of the text, the particulars of the edition we are preparing, and its possible underlying literary models.

Relevance: 10.00%

Abstract:

This paper estimates a standard version of the New Keynesian monetary (NKM) model under alternative specifications of the monetary policy rule using U.S. and Eurozone data. The estimation procedure implemented is a classical method based on the indirect inference principle. An unrestricted VAR is considered as the auxiliary model. On the one hand, the estimation method proposed overcomes some of the shortcomings of using a structural VAR as the auxiliary model in order to identify the impulse response that defines the minimum distance estimator implemented in the literature. On the other hand, by following a classical approach we can further assess the estimation results found in recent papers that follow a maximum-likelihood Bayesian approach. The estimation results show that some structural parameter estimates are quite sensitive to the specification of monetary policy. Moreover, the estimation results in the U.S. show that the fit of the NKM model under an optimal monetary plan is much worse than the fit of the NKM model assuming a forward-looking Taylor rule. In contrast to the U.S. case, in the Eurozone the best fit is obtained assuming a backward-looking Taylor rule, but the improvement is rather small with respect to assuming either a forward-looking Taylor rule or an optimal plan.
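
As a hedged illustration of the indirect inference principle described above (the function and parameter names below are hypothetical, not the authors' code), the sketch simulates data from a candidate structural model, fits the same unrestricted VAR to both observed and simulated series, and minimizes the distance between the two sets of auxiliary coefficients:

```python
import numpy as np
from scipy.optimize import minimize

def fit_var1(y):
    """OLS estimate of an unrestricted VAR(1); returns stacked coefficients."""
    X, Y = y[:-1], y[1:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B.ravel()

def indirect_inference(theta0, simulate, y_obs, n_sim=10):
    """simulate(theta, T, seed) must return a T x k array from the structural model."""
    beta_obs = fit_var1(y_obs)
    T = len(y_obs)

    def distance(theta):
        # minimum-distance criterion: match the auxiliary VAR coefficients
        betas = [fit_var1(simulate(theta, T, seed=s)) for s in range(n_sim)]
        return float(np.sum((np.mean(betas, axis=0) - beta_obs) ** 2))

    return minimize(distance, theta0, method="Nelder-Mead")
```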

Relevance: 10.00%

Abstract:

Revised: 2006-07

Relevance: 10.00%

Abstract:

This paper proposes a GARCH-type model allowing for time-varying volatility, skewness and kurtosis. The model is estimated assuming a Gram-Charlier series expansion of the normal density function for the error term, which is easier to estimate than the non-central t distribution proposed by Harvey and Siddique (1999). Moreover, this approach accounts for time-varying skewness and kurtosis while the approach by Harvey and Siddique (1999) only accounts for nonnormal skewness. We apply this method to daily returns of a variety of stock indices and exchange rates. Our results indicate a significant presence of conditional skewness and kurtosis. It is also found that specifications allowing for time-varying skewness and kurtosis outperform specifications with constant third and fourth moments.
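
For intuition, the density behind such an estimation can be sketched as follows. This is a minimal illustration using the squared-polynomial Gram-Charlier variant, which keeps the density positive; whether the paper uses this exact positivity correction is an assumption here. The GARCH-type recursions that make variance, skewness and kurtosis time-varying would supply z, skew and kurt at each date:

```python
import numpy as np

def gram_charlier_logpdf(z, skew, kurt):
    """Log-density of a Gram-Charlier expansion around the standard normal.

    z is the standardized residual; H3 and H4 are probabilists' Hermite
    polynomials; the polynomial is squared so the density stays positive,
    and gamma is the corresponding normalizing constant.
    """
    h3 = z**3 - 3 * z
    h4 = z**4 - 6 * z**2 + 3
    poly = 1 + skew / 6 * h3 + (kurt - 3) / 24 * h4
    gamma = 1 + skew**2 / 6 + (kurt - 3) ** 2 / 24
    log_norm = -0.5 * (np.log(2 * np.pi) + z**2)
    return log_norm + np.log(poly**2) - np.log(gamma)
```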

Relevance: 10.00%

Abstract:

In this paper we study the effect of population age distribution upon private consumption expenditure in Spain from 1964 to 1997 using aggregate data. We obtain four main results. First, changes in the population pyramid have substantial effects upon the behaviour of private consumption. Second, the pattern of the coefficients of the demographic variables is not consistent with the simplest version of the life cycle hypothesis. Third, we estimate the impact of the demographic transition upon consumption and find positive values associated with episodes in which the shares of groups of individuals with expenditure levels higher (lower) than the mean increased (decreased). Fourth, the results are robust to alternative specifications for the population age distribution.
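
A minimal sketch of the kind of aggregate regression involved (the variable names are illustrative assumptions, not the paper's specification): consumption growth is regressed on income growth and the population shares of stylized age groups, with one share dropped to avoid collinearity with the intercept:

```python
import numpy as np

def demographic_regression(dlog_c, dlog_y, age_shares):
    """OLS of consumption growth on income growth and age-group shares.

    age_shares: T x G matrix of population shares (rows sum to one);
    the first share is dropped to avoid perfect collinearity.
    """
    X = np.column_stack([np.ones(len(dlog_c)), dlog_y, age_shares[:, 1:]])
    coef, *_ = np.linalg.lstsq(X, dlog_c, rcond=None)
    return coef
```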

Relevance: 10.00%

Abstract:

This paper models the mean and volatility spillovers of prices within the integrated Iberian and the interconnected Spanish and French electricity markets. Using the constant (CCC) and dynamic conditional correlation (DCC) bivariate models with three different specifications of the univariate variance processes, we study the extent to which increasing interconnection and harmonization in regulation have favoured price convergence. The data consist of daily prices calculated as the arithmetic mean of the hourly prices over a span from July 1st 2007 until February 29th 2012. The DCC model in which the variances of the univariate processes are specified with a VARMA(1,1) fits the data best for the integrated MIBEL, whereas a CCC model with a GARCH(1,1) specification for the univariate variance processes is selected to model the price series in Spain and France. Results show that there are significant mean and volatility spillovers in the MIBEL, indicating strong interdependence between the two markets, while there is weaker evidence of integration between the Spanish and French markets. We provide new evidence that the EU target of achieving a single electricity market largely depends on increasing trade between countries and homogeneous rules of market functioning.
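
For a rough sense of the CCC idea, here is a minimal sketch using the Python arch package: fit univariate GARCH(1,1) models to the two return series and correlate the standardized residuals. The mean spillover terms and the VARMA(1,1) variance specification used in the paper are deliberately omitted:

```python
import numpy as np
from arch import arch_model

def ccc_correlation(prices_a, prices_b):
    """Constant conditional correlation from two univariate GARCH(1,1) fits."""
    std_resid = []
    for p in (prices_a, prices_b):
        r = 100 * np.diff(np.log(p))  # daily log returns, in percent
        res = arch_model(r, vol="GARCH", p=1, q=1).fit(disp="off")
        std_resid.append(res.resid / res.conditional_volatility)
    return np.corrcoef(std_resid[0], std_resid[1])[0, 1]
```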

Relevance: 10.00%

Abstract:

The California Fish and Game Commission (Commission) has the authority to require one or any combination of Bycatch Reduction Device (BRD) types in the trawl fishery within California waters for Pacific ocean shrimp (Pandalus jordani), most commonly referred to as pink shrimp. The purpose of this report is to provide the Commission with the best available information about the BRDs used in the pink shrimp trawl fishery. The mandatory requirement for BRDs occurred in California in 2002, and in Oregon and Washington in 2003, resulting from an effort to minimize bycatch of overfished and quota managed groundfish species. Three types of BRDs currently satisfy the requirement for this device in the California fishery: 1) the Nordmøre grate (rigid-grate excluder); 2) the soft-panel excluder; and 3) the fisheye excluder; however, the design, specifications, and efficacy differ by BRD type. Although no data have been collected on BRDs directly from the California pink shrimp fishery, extensive research on the efficacy and differences among BRD types has been conducted by the Oregon Department of Fish and Wildlife (ODFW) since the mid-1990s. Rigid-grate excluders are widely considered to be the most effective of the three BRD types at reducing groundfish bycatch. Over 90 percent of the Oregon pink shrimp fleet use rigid-grate excluders. The majority of the current California pink shrimp fleet also uses rigid-grate excluders, according to a telephone survey conducted by the California Department of Fish and Game (Department) in 2007-2008 of pink shrimp fishermen who have been active in the California fishery in recent years. Hinged rigid-grate excluders have been developed in recent years to reduce the bending of the BRD on vessels that employ net reels to stow and deploy their trawl nets, and they have been used successfully on both single- and double-rig vessels in Oregon. Soft-panel excluders have been demonstrated to be effective at reducing groundfish bycatch, although excessive shrimp loss and other problems have also been associated with this design. Fisheye excluders have been used in the California fishery in the past, but they were disapproved in Oregon and Washington in 2003 because they were found to be less effective at reducing groundfish bycatch than other designs. The reputation of the United States west coast pink shrimp fishery as one of the cleanest shrimp fisheries in the world is largely attributed to the effectiveness of BRDs at reducing groundfish bycatch. Nevertheless, BRD research and development is still a relatively new field and additional modifications and methods may further reduce bycatch rates in the pink shrimp fishery. (PDF contains 12 pages.)

Relevance: 10.00%

Abstract:

This study was undertaken by UKOLN on behalf of the Joint Information Systems Committee (JISC) in the period April to September 2008. Application profiles are metadata schemata which consist of data elements drawn from one or more namespaces, optimized for a particular local application. They offer a way for particular communities to base the interoperability specifications they create and use for their digital material on established open standards. This offers the potential for digital materials to be accessed, used and curated effectively both within and beyond the communities in which they were created. The JISC recognized the need to undertake a scoping study to investigate metadata application profile requirements for scientific data in relation to digital repositories, and specifically concerning descriptive metadata to support resource discovery and other functions such as preservation. This followed on from the development of the Scholarly Works Application Profile (SWAP) undertaken within the JISC Digital Repositories Programme and led by Andy Powell (Eduserv Foundation) and Julie Allinson (RRT UKOLN) on behalf of the JISC.

Aims and Objectives

1. To assess whether a single metadata AP for research data, or a small number thereof, would improve resource discovery or discovery-to-delivery in any useful or significant way.
2. If so, then to: a) assess whether the development of such AP(s) is practical and, if so, how much effort it would take; b) scope a community uptake strategy that is likely to be successful, identifying the main barriers and key stakeholders.
3. Otherwise, to investigate how best to improve cross-discipline, cross-community discovery-to-delivery for research data, and make recommendations to the JISC and others as appropriate.

Approach

The Study used a broad conception of what constitutes scientific data, namely data gathered, collated, structured and analysed using a recognizably scientific method, with a bias towards quantitative methods. The approach taken was to map out the landscape of existing data centres, repositories and associated projects, and conduct a survey of the discovery-to-delivery metadata they use or have defined, alongside any insights they have gained from working with this metadata. This was followed up by a series of unstructured interviews, discussing use cases for a Scientific Data Application Profile, and how widely a single profile might be applied. On the latter point, matters of granularity, the experimental/measurement contrast, the quantitative/qualitative contrast, the raw/derived data contrast, and the homogeneous/heterogeneous data collection contrast were discussed. The Study report was loosely structured according to the Singapore Framework for Dublin Core Application Profiles, and in turn considered: the possible use cases for a Scientific Data Application Profile; existing domain models that could either be used or adapted for use within such a profile; and a comparison of existing metadata profiles and standards to identify candidate elements for inclusion in the description set profile for scientific data. The report also considered how the application profile might be implemented, its relationship to other application profiles, the alternatives to constructing a Scientific Data Application Profile, the development effort required, and what could be done to encourage uptake in the community. The conclusions of the Study were validated through a reference group of stakeholders.
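
As a toy illustration of the application-profile idea defined at the start of this abstract (the record values and profile URL below are invented, not drawn from SWAP or any published profile), a description mixes elements from an existing namespace, here Dublin Core terms, constrained for a local scientific-data use case:

```python
# Illustrative only: element names come from the Dublin Core terms
# namespace ("dcterms"); the values and the profile URL are invented.
record = {
    "dcterms:title": "Sea-surface temperature grid, North Atlantic",
    "dcterms:creator": "Example Ocean Observatory",
    "dcterms:type": "Dataset",
    "dcterms:spatial": "North Atlantic",
    "dcterms:temporal": "2007-07/2012-02",
    "dcterms:conformsTo": "http://example.org/profiles/scientific-data",
}

# An application profile would additionally constrain which elements are
# mandatory, repeatable, or bound to controlled vocabularies.
for element, value in record.items():
    print(f"{element}: {value}")
```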

Relevance: 10.00%

Abstract:

This list does not include the actual frameworks, specifications, standards etc reviewed for the Jisc digital capabilities programme. These are secondary resources - articles, reports, research outcomes and professional reviews - which are sometimes linked to specific frameworks. They were used to help plan the frameworks review, construct the new Jisc digital capabilities framework and to write the accompanying reports. Further down you will find a list of web sites, blog posts and professional resources which provide useful additional information and materials, not necessarily evidence-based and not always drawn on directly for this project.

Relevance: 10.00%

Abstract:

During the period extending from late August to early October 1958, the United States Navy Hydrographic Office (now the United States Navy Oceanographic Office) undertook a program of current observations in the western part of Panama Bay. The specifications of the survey called for half-hourly monitoring of currents at three depths at each of six locations, one for thirty days and five for five days. Although not all these objectives were realized because of instrument malfunctions and failures, sufficient data were collected at five stations to provide a fairly detailed description of the current pattern as it existed at the times of observation. This report is concerned, first, with a discussion of those data and the procedures used to reduce them to their tidal and net current components and, second, with the effects of tidal amplitude, bottom topography and bottom friction on the current pattern.
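
A hedged sketch of one way to reduce such records to tidal and net components: fit a single M2 harmonic by least squares, so the intercept is the net current and the sinusoid is the tidal oscillation. A full reduction would use several tidal constituents; this is not necessarily the report's procedure:

```python
import numpy as np

M2_PERIOD_H = 12.4206  # principal lunar semidiurnal constituent, hours

def tidal_and_net(u, dt_hours=0.5):
    """Least-squares fit of an M2 harmonic to half-hourly current records.

    Returns (net, amplitude, phase): the mean (net) current plus the
    amplitude and phase of the fitted tidal oscillation.
    """
    t = np.arange(len(u)) * dt_hours
    w = 2 * np.pi / M2_PERIOD_H
    A = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
    (net, a, b), *_ = np.linalg.lstsq(A, u, rcond=None)
    return net, np.hypot(a, b), np.arctan2(b, a)
```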

Relevance: 10.00%

Abstract:

During April 8th-10th, 2008, the Alliance for Coastal Technology (ACT) partner institutions, the University of Alaska Fairbanks (UAF), the Alaska SeaLife Center (ASLC), and the Oil Spill Recovery Institute (OSRI), hosted a workshop entitled "Hydrocarbon sensors for oil spill prevention and response" in Seward, Alaska. The main focus was to bring together the 29 workshop participants, representing resource managers, scientists, and technology developers, to discuss current and future hydrocarbon in-situ, laboratory, and remote sensors as they apply to oil spill prevention and response. [PDF contains 28 pages]

Hydrocarbons and their derivatives still remain one of the most important energy sources in the world. To effectively manage these energy sources, proper protocol must be implemented to ensure prevention of and response to oil spills, as there are significant economic and environmental costs when oil spills occur. Hydrocarbon sensors provide the means to detect and monitor oil spills before, during, and after they occur. Capitalizing on the properties of oil, developers have designed in-situ, laboratory, and remote sensors that exploit the absorption or reflection of electromagnetic energy at different spectral bands. Workshop participants identified current hydrocarbon sensors (in-situ, laboratory, and remote sensors) and their overall performance. To achieve the most comprehensive understanding of oil spills, multiple sensors will be needed to gather oil spill extent, location, movement, thickness, condition, and classification; no single hydrocarbon sensor has the capability to collect all this information. Participants therefore suggested the development of means to combine sensor equipment to effectively and rapidly establish a spill response. As the exploration of oil continues at polar latitudes, sensor equipment must be developed to withstand harsh arctic climates, be able to detect oil under ice, and reduce the need for ground teams, because ice extent is far too large an area to cover. Participants also recognized the need for the U.S. to adopt multi-agency cooperation for oil spill response, as the majority of issues surrounding oil spill response concern not the hydrocarbon sensors but an effective contingency plan adopted by all agencies. It was recommended that the U.S. model its contingency planning on that of other nations, such as Germany and Norway.

Workshop participants were asked to make recommendations at the conclusion of the workshop; these are summarized below without prioritization:

* Outreach materials must be delivered to funding sources and Congressional delegates regarding the importance of oil spill prevention and response and the development of proper sensors to achieve effective response.
* Develop protocols for training resource managers as new sensors become available.
* Develop or adopt standard instrument specifications and testing protocols to assist manufacturers in further developing new sensor technology.
* As oil exploration continues at polar latitudes, more research and development should be allocated to develop a suite of instruments that are applicable to oil detection under ice.

Relevance: 10.00%

Abstract:

To improve the cod stocks in the Baltic Sea, a number of regulations have recently been established by the International Baltic Sea Fisheries Commission (IBSFC) and the European Commission. According to these, fishermen are obliged to use nets with escape windows (BACOMA nets) with a mesh size of the escape window of 120 mm until the end of September 2003. These nets, however, retain only fish much larger than the legal minimum landing size would allow. Given the present stock structure, however, few such large fish exist. As a consequence, fishermen use a legal alternative net: a conventional trawl with a cod-end of 130 mm diamond-shaped meshes (IBSFC rules of 1st April 2002), to be increased to 140 mm on 1st September 2003, according to the mentioned IBSFC rule. Due to legal alterations of the net by the fishermen (e.g. the use of extra-stiff net material), these nets have acquired extremely low selective properties, i.e. they catch very small fish and produce great amounts of discards. Due to the increase of the minimum landing size for cod in the Baltic from 35 to 38 cm, the amount of discards has even increased since the beginning of 2003. Experiments have now been carried out with the BACOMA net on German and Swedish commercial and research vessels, since arguments were brought forward that the BACOMA net was not yet sufficiently tested on commercial vessels. The results of all experiments conducted so far are compiled and evaluated here. As a result of the Swedish, Danish and German initiative and research, the European Commission reacted in June 2003 and rejected the increase of the diamond-meshed non-BACOMA cod-end from 130 mm to 140 mm in September 2003. To protect the cod stocks in the Baltic Sea more effectively, the use of traditional diamond-meshed cod-ends without an escape window is prohibited in community waters without derogation, becoming effective 1st September 2003. To enable more effective and simplified control of the bottom trawl fishery in the Baltic Sea, the principle of a "One-Net-Rule" is enforced. This is going to be the BACOMA net, with the meshes of the escape window being 110 mm for the time being. The description of the BACOMA net as given in the IBSFC rules no. 10 (revision of the 28th session, Berlin 2002) concentrates on the cod-end and the escape window, but only to a lesser extent on the design and mesh composition of the remaining parts of the net, such as the belly and funnel, and on many details. Thus, the present description is not complete and leaves, according to fishermen, ample opportunity for manipulation. An initiative has been started in Germany, with joint effort from scientists and the fishery, to better describe the entire net and to produce a proposal for a more comprehensive description, leaving less space for manipulation. A proposal in this direction is given here and shall be seen as a starting point for discussion and development towards an internationally uniform net, which is agreed amongst the fishery, scientists and politicians. The Baltic Sea fishery is invited to comment on this proposal, and recommendations for further improvement and specifications are welcome. Once the design is agreed by the Baltic Fishermen Association, it shall be proposed to the IBSFC and the European Commission via the Baltic Fishermen Association.

Relevance: 10.00%

Abstract:

A three-day workshop on turbidity measurements was held at the Hawaii Institute of Marine Biology from August 31 to September 2, 2005. The workshop was attended by 30 participants from industry, coastal management agencies, and academic institutions. All groups recognized common issues regarding the definition of turbidity, limitations of consistent calibration, and the large variety of instrumentation that nominally measures "turbidity." The major recommendations, in order of importance for the coastal monitoring community, are listed below:

1. The community of users in coastal ecosystems should tighten instrument design configurations to minimize inter-instrument variability, choosing a set of specifications that are best suited for coastal waters. The ISO 7027 design standard is not tight enough. Advice on these design criteria should be solicited through the ASTM as well as Federal and State regulatory agencies representing the majority of turbidity sensor end users. Parties interested in making turbidity measurements in coastal waters should develop design specifications for these water types rather than relying on design standards made for the analysis of drinking water.

2. The coastal observing groups should assemble a community database relating the output of specific sensors to different environmental parameters, so that the entire community of users can benefit from shared information. This would include an unbiased, parallel study of different turbidity sensors, employing a variety of designs and configurations in the broadest range of coastal environments.

3. Turbidity should be used as a measure of relative change in water quality rather than an absolute measure of water quality. This is thus a recommendation for managers to develop their own local calibrations; see the next recommendation.

4. If the end user specifically wants to use a turbidity sensor to measure a specific water quality parameter, such as suspended particle concentration, then direct measurement of that water quality parameter is necessary to correlate with 'turbidity' for a particular environment. These correlations, however, will be specific to the environment in which they are measured. This works because there are many environments in which water composition is relatively stable but varies in magnitude or concentration.

(pdf contains 22 pages)

Relevance: 10.00%

Abstract:

Modern robots are increasingly expected to function in uncertain and dynamically challenging environments, often in proximity with humans. In addition, wide-scale adoption of robots requires on-the-fly adaptability of software for diverse applications. These requirements strongly suggest the need to adopt formal representations of high-level goals and safety specifications, especially as temporal logic formulas. This approach allows for the use of formal verification techniques for controller synthesis that can give guarantees of safety and performance. Robots operating in unstructured environments also face limited sensing capability, so correctly inferring a robot's progress toward a high-level goal can be challenging.

This thesis develops new algorithms for synthesizing discrete controllers in partially known environments under specifications represented as linear temporal logic (LTL) formulas. It is inspired by recent developments in finite abstraction techniques for hybrid systems and motion planning problems. The robot and its environment are assumed to have a finite abstraction as a Partially Observable Markov Decision Process (POMDP), which is a powerful model class capable of representing a wide variety of problems. However, synthesizing controllers that satisfy LTL goals over POMDPs is a challenging problem which has received only limited attention.

This thesis proposes tractable, approximate algorithms for the control synthesis problem using Finite State Controllers (FSCs). The use of FSCs to control finite POMDPs allows the closed system to be analyzed as a finite global Markov chain. The thesis explicitly shows how the transient and steady-state behavior of the global Markov chain can be related to two different criteria for the satisfaction of LTL formulas. First, the maximization of the probability of LTL satisfaction is related to an optimization problem over a parametrization of the FSC. Analytic computations of the gradients are derived, which allows the use of first-order optimization techniques.
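
To make the "finite global Markov chain" construction concrete, here is a minimal sketch (the array shapes and names are assumptions for illustration, not the thesis's code) that composes a finite POMDP with an FSC into the transition matrix of the joint (state, memory node) chain:

```python
import numpy as np

def global_chain(T, O, omega, kappa):
    """Transition matrix of the Markov chain induced by an FSC on a POMDP.

    T[a]     : |S| x |S| state transition matrix for action a
    O        : |S| x |Z| observation probabilities, conditioned on next state
    omega[g] : |Z| x |G| FSC node transitions given current node g and observation
    kappa    : |G| x |A| action distribution per FSC node
    The global state is the pair (s, g); loops are written for clarity, not speed.
    """
    S, Z = O.shape
    G, A = kappa.shape
    P = np.zeros((S * G, S * G))
    for s in range(S):
        for g in range(G):
            for a in range(A):
                for s2 in range(S):
                    for z in range(Z):
                        for g2 in range(G):
                            P[s * G + g, s2 * G + g2] += (
                                kappa[g, a] * T[a][s, s2] * O[s2, z] * omega[g][z, g2]
                            )
    return P
```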

The second criterion encourages rapid and frequent visits to a restricted set of states over infinite executions. It is formulated as a constrained optimization problem with a discounted long-term reward objective through the novel utilization of a fundamental equation for Markov chains, the Poisson equation. A new constrained policy iteration technique is proposed to solve the resulting dynamic program, which also provides a way to escape local maxima.
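
A small numerical sketch of the Poisson equation mentioned above, for an ergodic finite chain with transition matrix P and reward vector r: solve (I - P)h = r - g·1 with the normalization πh = 0, where g is the stationary average reward and h the bias. This illustrates the equation itself, not the thesis's constrained policy iteration:

```python
import numpy as np

def poisson_equation(P, r):
    """Solve (I - P) h = r - g*1 for an ergodic Markov chain.

    g is the stationary average reward; h is the bias (relative value),
    pinned down by the normalization pi @ h = 0.
    """
    n = len(P)
    # stationary distribution: left eigenvector of P for eigenvalue 1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])
    pi = pi / pi.sum()
    g = pi @ r
    # append the normalization row pi @ h = 0 and solve in least squares
    A = np.vstack([np.eye(n) - P, pi])
    b = np.append(r - g, 0.0)
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return g, h
```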

The algorithms proposed in the thesis are applied to the task planning and execution challenges faced during the DARPA Autonomous Robotic Manipulation - Software challenge.