939 results for INTEGRATED ANALYSIS
Abstract:
An algorithm for solving nonlinear discrete-time optimal control problems with model-reality differences is presented. The technique uses Dynamic Integrated System Optimization and Parameter Estimation (DISOPE), which achieves the correct optimal solution in spite of deficiencies in the mathematical model employed in the optimization procedure. A version of the algorithm with a linear-quadratic model-based problem, implemented in the C++ programming language, is developed and applied to illustrative simulation examples. An analysis of the optimality and convergence properties of the algorithm is also presented.
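Since the paper's C++ implementation is not reproduced here, the following is a rough skeleton (written in Python) of the kind of iteration a DISOPE-style algorithm performs: alternate between estimating model parameters so the model matches the real system along the current trajectory, and re-solving the modified linear-quadratic model-based problem, with relaxation on the control update. Every function name is a placeholder for a problem-specific piece, not part of the original algorithm statement.

```python
def disope(u0, real_system, estimate_parameters, solve_model_based,
           tol=1e-6, max_iter=100, relax=0.5):
    """Skeleton of a DISOPE-style iteration (all callables are placeholders)."""
    u = list(u0)
    for _ in range(max_iter):
        x = real_system(u)                      # measure/simulate the real dynamics
        theta = estimate_parameters(x, u)       # fit the model to reality here
        u_new = solve_model_based(theta, x, u)  # modified LQ model-based step
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            return u_new                        # model-based and real solutions agree
        u = [relax * n + (1 - relax) * o for n, o in zip(u_new, u)]  # relaxed update
    return u
```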
Abstract:
Two so-called “integrated” polarimetric rate estimation techniques, ZPHI (Testud et al., 2000) and ZZDR (Illingworth and Thompson, 2005), are evaluated using 12 episodes of the year 2005 observed by the French C-band operational Trappes radar, located near Paris. The term “integrated” means that the concentration parameter of the drop size distribution is assumed to be constant over some area and the algorithms retrieve it using the polarimetric variables in that area. The evaluation is carried out in ideal conditions (no partial beam blocking, no ground-clutter contamination, no bright-band contamination, a posteriori calibration of the radar variables ZH and ZDR) using hourly rain gauges located within 60 km of the radar. Also included in the comparison, for the sake of benchmarking, is a conventional Z = 282R^1.66 estimator, with and without attenuation correction and with and without adjustment by rain gauges as currently done operationally at Météo France. Under those ideal conditions, the two polarimetric algorithms, which rely solely on radar data, appear to perform as well as, if not better than, the conventional algorithms, depending on the measurement conditions (attenuation, rain rates, …), even when the latter take rain gauges into account through the adjustment scheme. ZZDR with attenuation correction is the best estimator for hourly rain gauge accumulations below 5 mm h−1 and ZPHI is the best one above that threshold. A perturbation analysis has been conducted to assess the sensitivity of the various estimators to biases on ZH and ZDR, taking into account the typical accuracy and stability that can reasonably be achieved with modern operational radars (1 dB on ZH and 0.2 dB on ZDR). A +1 dB bias on ZH (radar too hot) results in a +14% overestimation of the rain rate with the conventional estimator used in this study (Z = 282R^1.66), a −19% underestimation with ZPHI and a +23% overestimation with ZZDR. Additionally, a +0.2 dB bias on ZDR results in a typical rain rate underestimation of 15% by ZZDR.
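As a cross-check on the quoted sensitivity, the +14% figure for the conventional estimator follows directly from inverting the Z-R relation; a sketch of the arithmetic (not taken from the paper itself), writing the relation as Z = aR^b with b = 1.66:

$$R \propto Z^{1/b}, \qquad \frac{R'}{R} = 10^{\,\Delta Z_{\mathrm{dB}}/(10b)} = 10^{\,1/16.6} \approx 1.15,$$

i.e., a +1 dB bias on ZH inflates the retrieved rain rate by roughly 15%, consistent with the reported +14%.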
Abstract:
Integrated Arable Farming Systems (IAFS), which involve a reduction in the use of off-farm inputs, are attracting considerable research interest in the UK. The objective of these systems experiments is to compare their financial performance with that of conventional or current farming practice. To date, this comparison has taken little account of any environmental benefits (or disbenefits) of the two systems. The objective of this paper is to review the assessment methodologies available for the analysis of environmental impacts. To illustrate the results of this exercise, the methodology and environmental indicators chosen are then applied to data from one of the LINK Integrated Farming Systems experimental sites. Data from the Pathhead site in southern Scotland are used to evaluate the use of invertebrates and nitrate loss as environmental indicators within IAFS. The results suggest that between 1992 and 1995 the biomass of earthworms fell by 28 kg per hectare on the integrated rotation and rose by 31 kg per hectare on the conventional system. This led to environmental costs ranging between £2.24 and £13.44 per hectare for the integrated system and gains of between £2.48 and £14.88 per hectare for the conventional system. In terms of nitrate, the integrated system had an estimated loss of £72.21 per hectare, compared with £149.40 per hectare on the conventional system. Conclusions are drawn about the advantages and disadvantages of this type of analytical framework. Keywords: Farming systems; IAFS; Environmental valuation; Economics; Earthworms; Nitrates; Soil fauna
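The earthworm cost and gain ranges quoted above are consistent with a single implied valuation rate for earthworm biomass; the back-calculation (inferred from the quoted figures, not stated explicitly in the abstract) is:

$$\pounds 2.24 / 28\ \mathrm{kg} = \pounds 0.08\ \mathrm{kg}^{-1}, \qquad \pounds 13.44 / 28\ \mathrm{kg} = \pounds 0.48\ \mathrm{kg}^{-1},$$

and applying the same £0.08-£0.48 per kg range to the 31 kg per hectare gain on the conventional system reproduces the quoted £2.48-£14.88 per hectare.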
Abstract:
Tourism is the world's largest employer, accounting for 10% of jobs worldwide (WTO, 1999). There are over 30,000 protected areas around the world, covering about 10% of the land surface (IUCN, 2002). Protected area management is moving towards a more integrated form of management, which recognises the social and economic needs of the world's finest areas and seeks to provide long-term income streams and support social cohesion through active but sustainable use of resources. Ecotourism, 'responsible travel to natural areas that conserves the environment and improves the well-being of local people' (The Ecotourism Society, 1991), is often cited as a panacea for incorporating the principles of sustainable development in protected area management. However, few examples exist worldwide to substantiate this claim. In reality, ecotourism struggles to provide social and economic empowerment locally and fails to secure proper protection of the local and global environment. Current analysis of ecotourism provides a useful checklist of interconnected principles for more successful initiatives, but no overall framework of analysis or theory. This paper argues that applying common property theory to the application of ecotourism can help to establish a more rigorous, multi-layered analysis that identifies the institutional demands of community-based ecotourism (CBE). The paper draws on the existing literature on ecotourism and several new case studies from developed and developing countries around the world. It focuses on the governance of CBE initiatives, particularly the interaction between local stakeholders and government and the role that third-party non-governmental organisations can play in brokering appropriate institutional arrangements. The paper concludes by offering future research directions.
Abstract:
We propose a new satellite mission to deliver high-quality measurements of upper-air water vapour. The concept centres on a LiDAR in limb-sounding occultation geometry, designed to operate as a very long-path system for differential absorption measurements. We present a preliminary performance analysis with a system sized to send 75 mJ pulses at 25 Hz at four wavelengths close to 935 nm to up to 5 microsatellites in a counter-rotating orbit, carrying retroreflectors characterized by a reflected beam divergence of roughly twice the emitted laser beam divergence of 15 µrad. This provides water vapour profiles with a vertical sampling of 110 m; preliminary calculations suggest that the system could detect concentrations of less than 5 ppm. A secondary payload of a fairly conventional medium-resolution multispectral radiometer allows wide-swath cloud and aerosol imaging. The total weight and power of the system are estimated at 3 tons and 2,700 W respectively. This novel concept presents significant challenges, including the performance of the lasers in space, the tracking between the main spacecraft and the retroreflectors, the refractive effects of turbulence, and the design of the telescopes to achieve a high signal-to-noise ratio for the high-precision measurements. The mission concept was conceived at the Alpbach Summer School 2010.
Abstract:
Hardcore, or long-term derelict and vacant, brownfield sites, which are often contaminated, form a significant proportion of brownfield land in many cities, not only in the UK but also in other countries. The recent economic recession has placed the economic viability of such sites in jeopardy. This paper compares the approaches for bringing hardcore brownfield sites back into use in England and Japan by focusing on ten case studies in Manchester and Osaka, using an 'agency'-based framework. The findings are set in the context of (i) national brownfield and related policy agendas; (ii) recent trends in land and property markets in both England and Japan; and (iii) city-level comparisons of brownfields in Manchester and Osaka. The research, which was conducted during 2009–10, suggests that hardcore brownfield sites have been badly affected by the recent recession in both Manchester and Osaka. Despite this, not only is there evidence that hardcore sites have been successfully regenerated in both cities, but also that the critical success factors (CSFs) operating in bringing sites back into use share a large degree of commonality. These CSFs include the presence of strong potential markets, seeing the recession as an opportunity, long-term vision, strong branding, strong partnerships, integrated development, and getting infrastructure into place. Finally, the paper outlines the policy implications of the research.
Abstract:
We describe a one-port de-embedding technique suitable for the quasi-optical characterization of terahertz integrated components at frequencies beyond the operational range of most vector network analyzers. The technique is also suitable when precision terminations cannot be manufactured to sufficiently fine tolerances for a TRL de-embedding technique to be applied. It is based on vector reflection measurements of a series of easily realizable test pieces. A theoretical analysis is presented for the precision of the technique when implemented using a quasi-optical null-balanced bridge reflectometer. The analysis takes into account quantization effects in the linear and angular encoders associated with the balancing procedure, as well as source power and detector noise equivalent power. The precision in measuring waveguide characteristic impedance and attenuation using this de-embedding technique is further analyzed after taking into account changes in the coupled power due to axial, rotational, and lateral alignment errors between the device under test and the instrument's test port. The analysis is based on the propagation of errors after assuming imperfect coupling of two fundamental Gaussian beams. The required precision in repositioning the samples at the instrument's test port is discussed. Quasi-optical measurements using the de-embedding process for a WR-8 adjustable precision short at 125 GHz are presented. The de-embedding methodology may be extended to allow the determination of S-parameters of arbitrary two-port junctions. The proposed measurement technique should prove most useful above 325 GHz, where there is a lack of measurement standards.
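For context, one-port de-embedding of this kind typically rests on the standard three-term error model (a textbook relation, not reproduced from the paper), in which the raw reflection measurement Γ_m is related to the reflection coefficient Γ_DUT at the device plane by

$$\Gamma_m = e_{00} + \frac{e_{01}e_{10}\,\Gamma_{\mathrm{DUT}}}{1 - e_{11}\,\Gamma_{\mathrm{DUT}}},$$

where the complex error terms e_{00}, e_{11} and the product e_{01}e_{10} are determined from reflection measurements of known test pieces, after which the equation is inverted for Γ_DUT.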
Abstract:
Data from various stations with different measurement record periods between 1988 and 2007 are analyzed to investigate surface ozone concentrations, long-term trends, and seasonal changes in and around Ireland. Time series statistical analysis is performed on the monthly mean data using seasonal and trend decomposition procedures and the Box-Jenkins approach (autoregressive integrated moving average). In general, ozone concentrations in the Irish region are found to have a negative trend at all sites except the coastal sites of Mace Head and Valentia. Data from the most polluted Dublin city site show a very strong negative trend of −0.33 ppb/yr with a 95% confidence limit of 0.17 ppb/yr (i.e., −0.33 ± 0.17) for the period 2002−2007, and for the site near the city of Cork the trend is found to be −0.20 ± 0.11 ppb/yr over the same period. The negative trend at other sites is more pronounced when the data span considered runs from around the year 2000 to 2007. The rural sites of Wexford and Monaghan also show very strong negative trends of −0.99 ± 0.13 and −0.58 ± 0.12 ppb/yr, respectively, for the period 2000−2007. Mace Head, a site representative of ozone changes in air advected from the Atlantic to Europe in the marine planetary boundary layer, shows a positive trend of about +0.16 ± 0.04 ppb per annum over the entire period 1988−2007, but this positive trend has weakened in recent years (e.g., in the period 2001−2007). Cluster analyses of back trajectories are performed for the stations with long data records, Mace Head and Lough Navar. For Mace Head, the northern and western clean-air sectors show a similar positive trend (+0.17 ± 0.02 ppb/yr for the northern sector and +0.18 ± 0.02 ppb/yr for the western sector) over the whole period, but partial analysis for the clean western sector at Mace Head shows different trends over different time periods, with a decrease in the positive trend since 1988 indicating a deceleration in the ozone trend for Atlantic air masses entering Europe.
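A minimal sketch of the trend-estimation approach described above: an STL-style seasonal decomposition of the monthly means, followed by a least-squares linear trend on the deseasonalized series, with the Box-Jenkins model left as a residual check. The file name and column names are assumptions for illustration, not from the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Monthly mean surface ozone (ppb); 'ozone_monthly.csv' is hypothetical.
series = (pd.read_csv("ozone_monthly.csv", parse_dates=["month"],
                      index_col="month")["o3_ppb"]
            .asfreq("MS").interpolate().dropna())

# Remove the 12-month seasonal cycle with an STL decomposition.
stl = STL(series, period=12).fit()
deseasonalized = series - stl.seasonal

# Linear trend (ppb/yr) on the deseasonalized series.
t_years = (deseasonalized.index - deseasonalized.index[0]).days / 365.25
slope, intercept = np.polyfit(t_years, deseasonalized.values, 1)
print(f"trend: {slope:+.2f} ppb/yr")
```

An ARIMA fit to the detrended, deseasonalized residuals (e.g. a low-order autoregressive model from statsmodels) then supplies the Box-Jenkins error structure used to attach confidence limits to the slope.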
Abstract:
Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e., data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties, the sill and the mean length scale metric, provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R² = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R² = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions. The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
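A minimal sketch of the variogram diagnostics used above: estimate an empirical semivariogram from an NDVI image and fit a variogram model to recover the two properties compared in the paper, the sill and the mean length scale. The array name, the single-direction lag computation, and the choice of an exponential model are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def empirical_variogram(img, max_lag):
    """Semivariance along the x-axis for lags 1..max_lag pixels
    (a simple stand-in for a full isotropic estimator)."""
    lags = np.arange(1, max_lag + 1)
    gamma = [0.5 * np.nanmean((img[:, h:] - img[:, :-h]) ** 2) for h in lags]
    return lags, np.array(gamma)

def exponential_model(h, sill, length_scale, nugget):
    return nugget + sill * (1.0 - np.exp(-h / length_scale))

ndvi = np.load("ndvi_tile.npy")              # hypothetical NDVI array
lags, gamma = empirical_variogram(ndvi, max_lag=50)
(sill, length_scale, nugget), _ = curve_fit(
    exponential_model, lags, gamma, p0=[gamma.max(), 10.0, 0.0])
print(f"sill={sill:.4f}, mean length scale={length_scale:.1f} px")
```

Comparing the fitted sill and length scale across products resampled to a common grid mirrors the paper's conjunctive analysis of the two variogram properties.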
Abstract:
This paper examines the evolution of knowledge management from the initial knowledge migration stage, through adaptation and creation, to the reverse knowledge migration stage in international joint ventures (IJVs). While many studies have analyzed these stages (mostly focusing on knowledge transfer), we investigate the path-dependent nature of knowledge flow in IJVs. The results of an empirical analysis based on a survey of 136 Korean parent companies of IJVs reveal that knowledge management in IJVs follows a sequential, multi-stage process, and that the knowledge transferred from parents to IJVs must first be adapted to its new environment before it reaches the creation stage. We also find that only created knowledge is transferred back to the parents.
Abstract:
Land cover plays a key role in global to regional monitoring and modeling because it both affects and is affected by climate change, and it has thus become one of the essential variables for climate change studies. National and international organizations require timely and accurate land cover information for reporting and management actions. The North American Land Change Monitoring System (NALCMS) is an international cooperation of organizations and entities from Canada, the United States, and Mexico to map land cover change across North America's changing environment. This paper presents the methodology used to derive the land cover map of Mexico for the year 2005, which was integrated into the NALCMS continental map. The classification is based on a time series of 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) data and an extensive sample database; the complexity of the Mexican landscape required a specific approach to reflect land cover heterogeneity. To estimate the proportion of each land cover class for every pixel, several decision tree classifications were combined to obtain class membership maps, which were finally converted to a discrete map accompanied by a confidence estimate. The map yielded an overall accuracy of 82.5% (Kappa of 0.79) for pixels with at least 50% map confidence (71.3% of the data). An additional assessment with 780 randomly stratified samples, using primary and alternative calls in the reference data to account for ambiguity, indicated 83.4% overall accuracy (Kappa of 0.80). A high agreement of 83.6% for all pixels and 92.6% for pixels with a map confidence of more than 50% was found in the comparison between the land cover maps of 2005 and 2006. Further wall-to-wall comparisons to related land cover maps resulted in 56.6% agreement with the MODIS land cover product and 49.5% agreement with GlobCover.
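A minimal sketch of the membership-map logic described above; a random forest stands in for the combination of several decision-tree classifications, and the file names and arrays are illustrative assumptions rather than the NALCMS production chain.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Reference samples (features and land cover labels); the .npy files
# are hypothetical placeholders for the extensive sample database.
X_train = np.load("train_features.npy")   # (n_samples, n_features)
y_train = np.load("train_labels.npy")     # (n_samples,)
X_scene = np.load("scene_features.npy")   # (n_pixels, n_features)

# An ensemble of decision trees trained on the reference data.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# Averaged tree votes act as per-class membership maps ...
membership = clf.predict_proba(X_scene)            # (n_pixels, n_classes)
# ... from which the discrete map and a confidence layer are derived.
label = clf.classes_[membership.argmax(axis=1)]    # discrete land cover map
confidence = membership.max(axis=1)                # per-pixel confidence
high_conf = confidence >= 0.5    # pixels with at least 50% map confidence
```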
Abstract:
The network paradigm has been highly influential in spatial analysis in the globalisation era. As economies across the world have become increasingly integrated, so-called global cities have come to play a growing role as central nodes in the networked global economy. The idea that a city's position in global networks benefits its economic performance has resulted in a competitive policy focus on promoting the economic growth of cities by improving their network connectivity. However, in spite of the attention given to boosting city connectivity, little is known about whether it directly translates into improved city economic performance and, if so, how well connected a city needs to be in order to benefit. In this paper we test the relationship between network connectivity and economic performance over the period 2000–2008 for cities with over 500,000 inhabitants in Europe and the USA, in order to inform European policy.
Abstract:
Human brain imaging techniques, such as Magnetic Resonance Imaging (MRI) and Diffusion Tensor Imaging (DTI), are established scientific and diagnostic tools, and their adoption continues to grow. Statistical methods, machine learning and data mining algorithms have been adopted successfully to extract predictive and descriptive models from neuroimage data. However, the knowledge discovery process typically also requires pre-processing, post-processing and visualisation techniques within complex data workflows. Currently, a main obstacle to the integrated preprocessing and mining of MRI data is the lack of comprehensive platforms able to avoid the manual invocation of preprocessing and mining tools, which makes the process error-prone and inefficient. In this work we present K-Surfer, a novel plug-in for the Konstanz Information Miner (KNIME) workbench that automates the preprocessing of brain images and leverages the mining capabilities of KNIME in an integrated way. K-Surfer supports the importing, filtering, merging and pre-processing of neuroimage data from FreeSurfer, a tool for human brain MRI feature extraction and interpretation. K-Surfer automates the steps for importing FreeSurfer data, reducing time costs, eliminating human errors and enabling the design of complex analytics workflows for neuroimage data by leveraging the rich functionalities available in the KNIME workbench.
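A minimal sketch of the kind of FreeSurfer import step K-Surfer automates: reading a FreeSurfer 'aseg.stats' table (per-structure volumes) into a simple mapping for downstream mining. The file path and column positions follow the common aseg.stats layout but are assumptions here, not K-Surfer code.

```python
from pathlib import Path

def read_aseg_stats(path):
    """Return {structure name: volume in mm^3} from a FreeSurfer aseg.stats file."""
    volumes = {}
    for line in Path(path).read_text().splitlines():
        if not line.strip() or line.startswith("#"):
            continue  # skip header and comment lines
        fields = line.split()
        # Typical row layout: Index SegId NVoxels Volume_mm3 StructName ...
        volumes[fields[4]] = float(fields[3])
    return volumes

vols = read_aseg_stats("subject01/stats/aseg.stats")  # hypothetical path
print(vols.get("Left-Hippocampus"))
```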
Abstract:
Prior literature shows that the Felder-Silverman learning styles model (FSLSM) has been widely adopted to cater to individual styles of learners, whether in traditional or Technology Enhanced Learning (TEL) settings. To infer this model, the Index of Learning Styles (ILS) instrument was proposed. This research aims to analyse the soundness of the instrument in an Arabic sample. Data were integrated from different courses and years. A total of 259 engineering students participated voluntarily in the study. Reliability was analysed using internal construct reliability, inter-scale correlation, and total item correlation. Construct validity was also examined using factor analysis. The overall results indicate that the reliability and validity of the perception and input dimensions were moderately supported, whereas the processing and understanding dimensions showed low internal-construct consistency and their items loaded weakly on the associated constructs. Overall, the instrument needs further work to improve its soundness. However, given the consistency of the results for engineering students irrespective of cross-cultural differences, it can be adopted to diagnose learning styles.
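A minimal sketch of the internal-consistency check described above: Cronbach's alpha for one ILS dimension (the standard ILS has 11 items per dimension; the data file and its shape are hypothetical).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha; items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

responses = np.load("ils_processing_items.npy")   # hypothetical (259, 11) array
print(f"alpha = {cronbach_alpha(responses):.2f}")
```

Running the same computation per dimension, alongside inter-scale and item-total correlations and a factor analysis, reproduces the reliability workflow the abstract describes.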
Abstract:
Complex products such as manufacturing equipment have always needed maintenance and repair services. Increasingly, leading manufacturers are integrating products and services to generate increased revenues and achieve customer satisfaction. Designing integrated products and services requires a different approach to new product development and a clear understanding of how customers perceive the value they obtain from actual usage of products and services—so-called value-in-use. However, there is a lack of research on integrated products and services and how they impact customer satisfaction. An exploratory study was undertaken to understand customers’ views on integrated products and services and the value-in-use derived from such offerings. As value-in-use and its impacts are complicated concepts, a technique from psychology—Repertory Grid Technique—was used to gather data in 33 interviews. The interviews allowed a deep understanding of customer views on integrated products and services to be obtained, and a systematic analysis identified the key attributes of value-in-use. In order to probe further, the data were then analyzed using Honey’s procedure, which identified the impact of the attributes of value-in-use on customer satisfaction. Two key attributes—relational dynamic and access—were found to have the most influence on customer satisfaction. This paper contributes to the innovation field by identifying customer needs for integrated products and services and how these impact customer satisfaction. These are key points and need to be fully considered by managers during new product and service development. Similarly, the paper identifies a number of important areas for further research.