29 results for Approval of Calendar 2005-2006


Relevance:

100.00%

Abstract:

Land cover plays a key role in global to regional monitoring and modeling because it affects, and is affected by, climate change, and it has therefore become one of the essential variables for climate change studies. National and international organizations require timely and accurate land cover information for reporting and management actions. The North American Land Change Monitoring System (NALCMS) is an international cooperation of organizations and entities in Canada, the United States, and Mexico to map land cover change across North America's changing environment. This paper presents the methodology used to derive the land cover map of Mexico for the year 2005, which was integrated into the NALCMS continental map. The map is based on a time series of 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) data and an extensive sample database; the complexity of the Mexican landscape required a specific approach to reflect land cover heterogeneity. To estimate the proportion of each land cover class for every pixel, several decision tree classifications were combined to obtain class membership maps, which were finally converted to a discrete map accompanied by a confidence estimate. The map yielded an overall accuracy of 82.5% (Kappa of 0.79) for pixels with at least 50% map confidence (71.3% of the data). An additional assessment with 780 randomly stratified samples, using primary and alternative calls in the reference data to account for ambiguity, indicated 83.4% overall accuracy (Kappa of 0.80). A high agreement of 83.6% for all pixels, and 92.6% for pixels with a map confidence of more than 50%, was found in the comparison between the land cover maps of 2005 and 2006. Further wall-to-wall comparisons with related land cover maps resulted in 56.6% agreement with the MODIS land cover product and a congruence of 49.5% with Globcover.
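
For readers unfamiliar with the accuracy metrics quoted here, the sketch below shows how overall accuracy and Cohen's kappa are computed from a confusion matrix; the matrix values are invented for illustration and are not the NALCMS validation data.

import numpy as np

# Hypothetical 3-class confusion matrix: rows = reference, columns = map class.
# The numbers are illustrative only, not the study's assessment data.
cm = np.array([[50,  4,  2],
               [ 6, 70,  5],
               [ 3,  8, 40]], dtype=float)

n = cm.sum()
p_o = np.trace(cm) / n                                  # overall accuracy (observed agreement)
p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2    # expected chance agreement
kappa = (p_o - p_e) / (1.0 - p_e)

print(f"overall accuracy = {p_o:.3f}, kappa = {kappa:.3f}")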

Relevance:

100.00%

Abstract:

This paper describes the main changes introduced by the Commons Act 2006 for the registration of land as a town or village green. The purpose of the Commons Act 2006 is to protect common land and promote sustainable farming, public access to the countryside and the interests of wildlife. The changes under s15 of the Commons Act 2006 include an additional two-year grace period for applications, the discounting of statutory periods of closure, the correction of mistakes in registers, the disallowing of severance of rights, voluntary registration, the replacement of land in exchange, and some other provisions. The transitional provision contained in s15(4) of the Commons Act 2006 is a particular cause of controversy: DEFRA has indicated that where development has gone ahead and a subsequent application to register the land as a green is successful, buildings will have to be taken down, obliging the developer to return the land to a condition consistent with the exercise by local people of recreational rights. In short, it will be harder in future to develop land which has the potential to be registered as a town or village green.

Relevance:

100.00%

Abstract:

The International Conference on Disability, Virtual Reality and Associated Technologies (ICDVRAT) series this year held its sixth biennial conference, celebrating ten years of research and development in this field. A total of 220 papers have been presented at the first six conferences, addressing the potential, development, exploration and examination of how these technologies can be applied in disability research and practice. The research community is broad and multi-disciplined, comprising a variety of scientific and medical researchers, rehabilitation therapists, educators and practitioners. Likewise, the technologies, their applications and target user populations are also broad, ranging from sensors positioned on real-world objects to fully immersive interactive simulated environments. A common factor is the desire to identify what the technologies have to offer and how they can add value to existing methods of assessment, rehabilitation and support for individuals with disabilities. This paper presents a brief review of the first decade of research and development in the ICDVRAT community, defining the technologies, applications and target user populations served.

Relevance:

100.00%

Abstract:

The global radiation balance of the atmosphere is still poorly observed, particularly at the surface. We investigate the observed radiation balance at (1) the surface using the ARM Mobile Facility in Niamey, Niger, and (2) the top of the atmosphere (TOA) over West Africa using data from the Geostationary Earth Radiation Budget (GERB) instrument on board Meteosat-8. Observed radiative fluxes are compared with predictions from the global numerical weather prediction (NWP) version of the Met Office Unified Model (MetUM). The evaluation points to major shortcomings in the NWP model's radiative fluxes during the dry season (December 2005 to April 2006) arising from (1) a lack of absorbing aerosol in the model (mineral dust and biomass burning aerosol) and (2) a poor specification of the surface albedo. A case study of the major Saharan dust outbreak of 6–12 March 2006 is used to evaluate a parameterization of mineral dust for use in the NWP models. The model shows good predictability of the large-scale flow out to 4–5 days with the dust parameterization providing reasonable dust uplift, spatial distribution, and temporal evolution for this strongly forced dust event. The direct radiative impact of the dust reduces net downward shortwave (SW) flux at the surface (TOA) by a maximum of 200 W m−2 (150 W m−2), with a SW heating of the atmospheric column. The impacts of dust on terrestrial radiation are smaller. Comparisons of TOA (surface) radiation balance with GERB (ARM) show the “dusty” forecasts reduce biases in the radiative fluxes and improve surface temperatures and vertical thermodynamic structure.
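
A simple flux-budget check, using only the peak values quoted above, shows why the dust warms the column in the shortwave (an illustrative calculation with the quoted maxima, not a diagnostic from the paper):

\[
\Delta F^{\mathrm{SW}}_{\mathrm{atm}}
= \Delta F^{\mathrm{SW,net}}_{\mathrm{TOA}} - \Delta F^{\mathrm{SW,net}}_{\mathrm{sfc}}
\approx \left[(-150) - (-200)\right]\ \mathrm{W\,m^{-2}}
= +50\ \mathrm{W\,m^{-2}},
\]

i.e. roughly 50 W m⁻² of additional shortwave absorption within the dusty atmosphere, consistent with the reported SW heating of the atmospheric column.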

Relevance:

100.00%

Abstract:

A positive salinity anomaly of 0.2 PSU was observed between 50 and 200 m during 2000–2001 across the Mozambique Channel at a section at 17°S that was reoccupied in 2003, 2005, 2006, and 2008; in addition, a moored array was maintained from 2003 to 2008. The anomaly, which was most distinct at these depths, showed interannual but nonseasonal variation. Its possible origin is investigated using output from three ocean general circulation models (Estimating the Circulation and Climate of the Ocean, Ocean Circulation and Climate Advanced Modeling, and Parallel Ocean Program). The most probable mechanism for the salinity anomaly is an anomalous inflow of subtropical waters caused by a weakening of the northern part of the South Equatorial Current under weaker trade winds. This mechanism was found in all three numerical models. In addition, the models indicate a possible salinization of one of the source water masses of the Mozambique Channel as a further cause of the anomaly. The anomaly propagated southward into the Agulhas Current and northward along the African coast.

Relevance:

100.00%

Abstract:

Recently, major processor manufacturers have announced a dramatic shift in their paradigm for increasing computing power over the coming years. Instead of focusing on faster clock speeds and more powerful single-core CPUs, the trend clearly goes towards multi-core systems. This will also result in a paradigm shift in the development of algorithms for computationally expensive tasks, such as data mining applications. Obviously, work on parallel algorithms is not new per se, but concentrated efforts in the many application domains are still missing. Multi-core systems, but also clusters of workstations and even large-scale distributed computing infrastructures, provide new opportunities and pose new challenges for the design of parallel and distributed algorithms. Since data mining and machine learning systems rely on high-performance computing, research on the corresponding algorithms must be at the forefront of parallel algorithm research in order to keep pushing data mining and machine learning applications to be more powerful and, especially for the former, interactive. To bring together researchers and practitioners working in this exciting field, a workshop on parallel data mining was organized as part of PKDD/ECML 2006 (Berlin, Germany). The six contributions selected for the program describe various aspects of data mining and machine learning approaches featuring low to high degrees of parallelism. The first contribution addresses the classic problem of distributed association rule mining and focuses on communication efficiency to improve the state of the art. After this, a parallelization technique for speeding up decision tree construction by means of thread-level parallelism for shared-memory systems is presented. The next paper discusses the design of a parallel approach to the frequent subgraph mining problem for distributed-memory systems; this approach is based on a hierarchical communication topology to address issues related to multi-domain computational environments. The fourth paper describes the combined use and customization of software packages to facilitate top-down parallelism in the tuning of Support Vector Machines (SVM), and the next contribution presents an interesting idea concerning parallel training of Conditional Random Fields (CRFs) and motivates their use in labeling sequential data. The last contribution focuses on very efficient feature selection, describing a parallel algorithm for feature selection from random subsets. Selecting the papers included in this volume would not have been possible without the help of an international Program Committee that provided detailed reviews for each paper. We would also like to thank Matthew Otey, who helped with publicity for the workshop.
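
As a generic illustration of the thread-level parallelism mentioned for decision-tree construction, the sketch below scores candidate split attributes concurrently and then selects the best split; it is not the workshop paper's algorithm, and all names here are illustrative.

from concurrent.futures import ThreadPoolExecutor
import numpy as np

def gini_of_split(x, y, threshold):
    # Weighted Gini impurity of splitting feature values x at `threshold`.
    left, right = y[x <= threshold], y[x > threshold]
    def gini(labels):
        if labels.size == 0:
            return 0.0
        _, counts = np.unique(labels, return_counts=True)
        p = counts / labels.size
        return 1.0 - np.sum(p ** 2)
    n = y.size
    return (left.size / n) * gini(left) + (right.size / n) * gini(right)

def best_split_for_feature(X, y, j):
    # Score every midpoint between sorted unique values of feature j.
    values = np.unique(X[:, j])
    thresholds = (values[:-1] + values[1:]) / 2.0
    if thresholds.size == 0:
        return (np.inf, j, None)
    scores = [gini_of_split(X[:, j], y, t) for t in thresholds]
    k = int(np.argmin(scores))
    return (scores[k], j, thresholds[k])

def best_split(X, y, n_threads=4):
    # Evaluate candidate features in parallel threads; keep the lowest impurity.
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        results = pool.map(lambda j: best_split_for_feature(X, y, j),
                           range(X.shape[1]))
        return min(results, key=lambda r: r[0])

# Tiny demonstration with synthetic data.
X = np.array([[2.0, 1.0], [3.0, 1.5], [10.0, 1.2], [11.0, 0.9]])
y = np.array([0, 0, 1, 1])
print(best_split(X, y))   # best (impurity, feature index, threshold); feature 0 split at 6.5

In CPython a process pool may parallelize pure-Python scoring more effectively; threads are shown here simply to mirror the shared-memory, thread-level setting described above.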

Relevance:

100.00%

Abstract:

High-resolution surface wind fields covering the global ocean, estimated from remotely sensed wind data and ECMWF wind analyses, have been available since 2005 with a spatial resolution of 0.25 degrees in longitude and latitude and a temporal resolution of 6 h. Their quality is investigated through comparisons with surface wind vectors from 190 buoys moored in various oceanic basins, from research vessels, and from QuikSCAT scatterometer data taken during 2005-2006. The NCEP/NCAR and NCDC blended wind products are also considered. The comparisons performed for January-December 2005 show that speeds and directions compare well with the in-situ observations, including those from moored buoys and ships, as well as with the remotely sensed data. The root-mean-squared differences of wind speed and direction for the new blended wind data are lower than 2 m/s and 30 degrees, respectively. These values are similar to those found in comparisons of hourly buoy measurements with QuikSCAT near-real-time retrievals. At the global scale, the new products compare well with the wind speeds and wind vector components observed by QuikSCAT, and no significant dependencies on QuikSCAT wind speed or on the oceanic region considered are evident.
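
Root-mean-squared differences such as those quoted above can be computed from collocated pairs as in the sketch below (a generic calculation, not the authors' processing code); note that direction differences must be wrapped into [-180°, 180°] before squaring.

import numpy as np

def rms_speed_diff(speed_a, speed_b):
    # RMS difference of collocated wind speeds (m/s).
    d = np.asarray(speed_a) - np.asarray(speed_b)
    return float(np.sqrt(np.mean(d ** 2)))

def rms_direction_diff(dir_a, dir_b):
    # RMS difference of wind directions (degrees), wrapped to [-180, 180].
    d = (np.asarray(dir_a) - np.asarray(dir_b) + 180.0) % 360.0 - 180.0
    return float(np.sqrt(np.mean(d ** 2)))

# Illustrative collocated values only, not data from the study.
print(rms_speed_diff([5.0, 7.2, 9.8], [5.4, 6.9, 10.3]))
print(rms_direction_diff([350.0, 10.0, 180.0], [5.0, 355.0, 170.0]))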

Relevance:

100.00%

Abstract:

In 2003 the CAP underwent a significant reform. Despite the seemingly endless turmoil of CAP reform, in 2005 the British government pressed for a new reform debate, and in the European Council meeting of December 2005 it secured a commitment for the Commission “to undertake a full, wide ranging review covering all aspects of EU spending, including the CAP, ...” But the initiative petered out, and the CAP ‘reform’ package proposed by the Commission, and then adopted by the European Parliament and the Council of Ministers in 2013, fell well short of the UK’s initial ambition. The chapter explores the reasons behind the UK’s failed policy initiative.

Relevance:

100.00%

Abstract:

A new method of clear-air turbulence (CAT) forecasting based on the Lighthill–Ford theory of spontaneous imbalance and emission of inertia–gravity waves has been derived and applied on episodic and seasonal time scales. A scale analysis of this shallow-water theory for midlatitude synoptic-scale flows identifies advection of relative vorticity as the leading-order source term. Examination of the leading- and second-order terms elucidates previous, more empirically inspired CAT forecast diagnostics. Application of the Lighthill–Ford theory to the Upper Mississippi and Ohio Valleys CAT outbreak of 9 March 2006 results in good agreement with pilot reports of turbulence. Application of Lighthill–Ford theory to CAT forecasting for the 3 November 2005–26 March 2006 period, using 1-h forecasts from the 1500 UTC run of the Rapid Update Cycle 2 (RUC2) model, leads to superior forecasts compared to the current operational version of the Graphical Turbulence Guidance (GTG1) algorithm, the most skillful operational CAT forecasting method in existence. The results suggest that major improvements in CAT forecasting could result if the methods presented herein become operational.
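
For reference, the leading-order term identified by the scale analysis can be written schematically as follows (a shorthand for the named term only, not the full Lighthill–Ford source):

\[
\mathbf{V}_{h}\cdot\nabla\zeta,
\qquad
\zeta = \frac{\partial v}{\partial x} - \frac{\partial u}{\partial y},
\]

where \(\mathbf{V}_{h}\) is the horizontal wind and \(\zeta\) the relative vorticity.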

Relevance:

100.00%

Abstract:

Three gypsiferous-calcareous soils from the Al-Hassa Oasis in Saudi Arabia were examined to determine the conditions under which dissolution of gypsum could be hindered by the formation of calcite coatings during leaching. Batch extraction with water of a sandy clay loam, a sandy clay and a sandy loam containing 40, 26 and 5% gypsum and 14, 12 and 13% calcite, respectively, was followed by chemical analysis of the extracts, SEM examination, and XRD and EDX microprobe analysis. Extraction in closed centrifuge tubes for 1 h or 5 h showed that gypsum initially dissolved to give solutions near equilibrium, but then in the sandy clay loam between one quarter and one third of the gypsum could not dissolve. In the sandy clay about one fifth of the gypsum could not dissolve, whereas in the sandy loam none remained undissolved. All the extracts were close to equilibrium with calcite. SEM and EDX examination showed that coatings of calcite had formed on the gypsum particles. The sandy clay loam was also extracted using an open system in which either air or air + 1% CO2 was bubbled through the suspensions for 1 h with stirring. Under these conditions the gypsum dissolved more rapidly and completely. Thus, where the rate of dissolution of gypsum was rapid, calcite did not manage to cover the gypsum surfaces, probably because the surface was being continuously removed. Slower leaching conditions in the field are likely to be conducive to the formation of coatings and to less dissolution of gypsum.
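
The coating mechanism can be summarized with two standard equilibria (illustrative textbook reactions, not equations taken from the paper):

\[
\mathrm{CaSO_4\!\cdot\!2H_2O \;\rightleftharpoons\; Ca^{2+} + SO_4^{2-} + 2H_2O}
\qquad\qquad
\mathrm{Ca^{2+} + CO_3^{2-} \;\rightleftharpoons\; CaCO_3}
\]

In a solution already close to calcite equilibrium, the Ca²⁺ released by dissolving gypsum pushes the second reaction towards precipitation at the gypsum surface (a common-ion effect), which is consistent with the calcite coatings observed by SEM and EDX.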

Relevance:

100.00%

Abstract:

The conceptual and parameter uncertainty of the semi-distributed INCA-N (Integrated Nutrients in Catchments-Nitrogen) model was studied using the GLUE (Generalized Likelihood Uncertainty Estimation) methodology combined with quantitative experimental knowledge, a concept known as 'soft data'. Cumulative inorganic N leaching, annual plant N uptake and annual mineralization proved to be useful soft data for constraining the parameter space. The INCA-N model was able to simulate the seasonal and inter-annual variations in stream-water nitrate concentrations, although the lowest concentrations during the growing season were not reproduced. This suggested that there were retention processes or losses, either in peatland/wetland areas or in the river, which are not included in the INCA-N model. The results suggest that soft data offer a way to reduce parameter equifinality, and that the calibration and testing of distributed hydrological and nutrient leaching models should be based on runoff and/or nutrient concentration data together with the qualitative knowledge of experimentalists.
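
A minimal sketch of a GLUE procedure with soft-data constraints is given below; the toy model, parameter names, thresholds and acceptance ranges are placeholders standing in for the INCA-N setup, not values from the study.

import numpy as np

rng = np.random.default_rng(42)

# Synthetic "observed" weekly stream nitrate series; stands in for real data.
obs = 2.0 + np.sin(np.linspace(0.0, 2.0 * np.pi, 52)) + rng.normal(0.0, 0.2, 52)

def run_model(leach_coef, uptake_coef):
    # Toy stand-in for an INCA-N run: returns a nitrate series plus the annual
    # totals used as soft data (N leaching, plant N uptake, mineralization).
    nitrate = leach_coef * (2.0 + np.sin(np.linspace(0.0, 2.0 * np.pi, 52)))
    soft = {"leaching": 20.0 * leach_coef,
            "uptake": 80.0 * uptake_coef,
            "mineralization": 50.0 * (leach_coef + uptake_coef)}
    return nitrate, soft

# Hypothetical soft-data acceptance ranges (e.g. kg N/ha/yr).
soft_ranges = {"leaching": (10.0, 30.0),
               "uptake": (60.0, 100.0),
               "mineralization": (40.0, 110.0)}

behavioural = []
for _ in range(5000):
    leach_coef, uptake_coef = rng.uniform(0.5, 1.5, size=2)   # Monte Carlo sampling
    sim, soft = run_model(leach_coef, uptake_coef)
    # Hard criterion: Nash-Sutcliffe efficiency against observed nitrate.
    ns = 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)
    # Soft criterion: all soft-data outputs fall within their accepted ranges.
    soft_ok = all(lo <= soft[k] <= hi for k, (lo, hi) in soft_ranges.items())
    if ns > 0.5 and soft_ok:
        behavioural.append({"leach_coef": leach_coef, "uptake_coef": uptake_coef, "NS": ns})

print(f"{len(behavioural)} behavioural parameter sets retained")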

Relevance:

100.00%

Abstract:

An efficient method is described for the approximate calculation of the intensity of multiply scattered lidar returns. It divides the outgoing photons into three populations, representing those that have experienced zero, one, and more than one forward-scattering event. Each population is parameterized at each range gate by its total energy, its spatial variance, the variance of photon direction, and the covariance of photon direction and position. The result is that for an N-point profile the calculation is O(N²) efficient and implicitly includes up to Nth-order scattering, making it ideal for use in iterative retrieval algorithms for which speed is crucial. In contrast, models that explicitly consider each scattering order separately are at best O(N^m / m!) efficient for mth-order scattering and often cannot be taken beyond the third or fourth order in retrieval algorithms. For typical cloud profiles and a wide range of lidar fields of view, the new algorithm is as accurate as an explicit calculation truncated at the fifth or sixth order but faster by several orders of magnitude.
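
To make the efficiency comparison concrete (an illustrative operation count, not a figure from the paper), consider a profile of N = 50 range gates and an explicit treatment of scattering up to order m = 5:

\[
\frac{N^{m}}{m!} = \frac{50^{5}}{120} \approx 2.6\times10^{6}
\qquad\text{versus}\qquad
N^{2} = 2.5\times10^{3},
\]

about three orders of magnitude more work for the explicit calculation, while the O(N²) scheme implicitly includes all scattering orders up to N.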

Relevance:

100.00%

Abstract:

An outdoor experiment was conducted to increase understanding of apical leaf necrosis in the presence of pathogen infection. Holcus lanatus seeds and Puccinia coronata spores were collected from two adjacent and otherwise similar habitats with differing long-term N fertilization levels. After inoculation, disease and necrosis dynamics were observed during the plant growing seasons of 2003 and 2006. In both years high nutrient availability resulted in earlier disease onset, a higher pathogen population growth rate, earlier physiological apical leaf necrosis onset and a reduced time between disease onset and apical leaf necrosis onset. Necrosis rate was shown to be independent of nutrient availability. The results showed that in these nutrient-rich habitats H. lanatus plants adopted necrosis mechanisms which wasted more nutrients. There was some indication that these necrosis mechanisms were subject to local selection pressures, but these results were not conclusive. The findings of this study are consistent with apical leaf necrosis being an evolved defence mechanism.