918 results for Points and lines
Abstract:
An assessment of changes in precipitation (P) as a function of percentiles of surface temperature (T) and 500 hPa vertical velocity (ω) is presented, considering present-day simulations and observational estimates from the Global Precipitation Climatology Project (GPCP) combined with the European Centre for Medium-Range Weather Forecasts Interim reanalysis (ERA-Interim). There is a tendency for models to overestimate P in the warm, subsiding regimes compared to GPCP, in some cases by more than 100%, while many models underestimate P in the moderate temperature regimes. Considering climate change projections between 1980–1999 and 2080–2099, responses in P are characterised by dP/dT ≥ 4%/K over the coldest 10–20% of land points and over warm, ascending ocean points, while P declines over the warmest, descending regimes (dP/dT ∼ −4%/K for model ensemble means). The reduced Walker circulation limits this contrasting dP/dT response in the tropical wet and dry regimes only marginally. Around 70% of the global surface area exhibits a consistent sign for dP/dT in at least 6 out of a 7-member model ensemble when P composites are considered in terms of dynamic regime.
Abstract:
The performance of flood inundation models is often assessed using satellite-observed data; however, these data have inherent uncertainty. In this study we assess the impact of this uncertainty when calibrating a flood inundation model (LISFLOOD-FP) for a flood event in December 2006 on the River Dee, North Wales, UK. The flood extent is delineated from an ERS-2 SAR image of the event using an active contour model (snake), and water levels at the flood margin are calculated through intersection of the shoreline vector with LiDAR topographic data. Gauged water levels are used to create a reference water surface slope for comparison with the satellite-derived water levels. Residuals between the satellite-observed data points and those from the reference line are spatially clustered into groups of similar values. We show that model calibration achieved using pattern matching of observed and predicted flood extent is negatively influenced by this spatial dependency in the data. By contrast, model calibration using water elevations produces realistic calibrated optimum friction parameters even when spatial dependency is present. To test the impact of removing spatial dependency, a new method of evaluating flood inundation model performance is developed using multiple random subsamples of the water surface elevation data points. By testing for spatial dependency using Moran’s I, multiple subsamples of water elevations that have no significant spatial dependency are selected. The model is then calibrated against these data and the results averaged. This gives a near-identical result to calibration using spatially dependent data, but has the advantage of being a statistically robust assessment of model performance in which we can have more confidence. Moreover, by using the variations found in the subsamples of the observed data, it is possible to assess the effects of observational uncertainty on the assessment of flooding risk.
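The subsample-and-test idea in this abstract can be sketched in a few lines. The snippet below computes Moran's I for residuals at sample-point locations and draws random subsamples until one shows no appreciable spatial dependency; the inverse-distance weighting scheme, the fixed threshold, and the function names are illustrative assumptions, not the authors' implementation (the paper tests I for statistical significance).

```python
import math
import random

def morans_i(coords, values):
    """Moran's I with inverse-distance weights (w_ij = 1/d_ij, w_ii = 0).
    Values near +1 indicate clustering of similar residuals, values
    near -1 indicate alternation, values near 0 indicate no dependency."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    num = wsum = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            w = 1.0 / math.dist(coords[i], coords[j])
            wsum += w
            num += w * z[i] * z[j]
    return (n / wsum) * num / sum(zi * zi for zi in z)

def uncorrelated_subsample(coords, values, k, threshold=0.1, tries=1000):
    """Draw random subsamples of k points until one has |I| below a
    chosen threshold; return its indices, or None if none is found."""
    idx = list(range(len(values)))
    for _ in range(tries):
        s = random.sample(idx, k)
        if abs(morans_i([coords[i] for i in s],
                        [values[i] for i in s])) < threshold:
            return s
    return None
```

In the paper many such subsamples are drawn, the model is calibrated against each, and the calibration scores are averaged.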
Abstract:
The task of this paper is to develop a Time-Domain Probe Method for the reconstruction of impenetrable scatterers. The basic idea of the method is to use pulses in the time domain and the time-dependent response of the scatterer to reconstruct its location and shape. The method is based on the basic causality principle of time-dependent scattering. The method is independent of the boundary condition and is applicable to limited-aperture scattering data. In particular, we discuss the reconstruction of the shape of a rough surface in three dimensions from time-domain measurements of the scattered field. In practice, measurement data are collected where the incident field is given by a pulse. We formulate the time-domain field reconstruction problem equivalently via frequency-domain integral equations or via a retarded boundary integral equation based on results of Bamberger, Ha-Duong, and Lubich. In contrast to pure frequency-domain methods, here we use a time-domain characterization of the unknown shape for its reconstruction. Our paper describes the Time-Domain Probe Method and relates it to previous frequency-domain approaches on sampling and probe methods by Colton, Kirsch, Ikehata, Potthast, Luke, Sylvester et al. The approach significantly extends recent work of Chandler-Wilde and Lines (2005) and Luke and Potthast (2006) on the time-domain point source method. We provide a complete convergence analysis for the method in the rough surface scattering case and provide numerical simulations and examples.
Abstract:
The nature of private commercial real estate markets presents difficulties for monitoring market performance. Assets are heterogeneous and spatially dispersed, trading is infrequent and there is no central marketplace in which prices and cash flows of properties can be easily observed. Appraisal-based indices represent one response to these issues. However, these have been criticised on a number of grounds: that they may understate volatility, lag turning points and be affected by client influence issues. Thus, this paper reports econometrically derived transaction-based indices of the UK commercial real estate market using Investment Property Databank (IPD) data, comparing them with published appraisal-based indices. The method is similar to that presented by Fisher, Geltner, and Pollakowski (2007) and used by the Massachusetts Institute of Technology (MIT) on National Council of Real Estate Investment Fiduciaries (NCREIF) data, although it employs value rather than equal weighting. The results show stronger growth from the transaction-based indices in the run-up to the peak in the UK market in 2007. They also show that returns from these series are more volatile and less autocorrelated than their appraisal-based counterparts, but, surprisingly, differences in turning points were not found. The conclusion then discusses the applications and limitations of these series as measures of market performance.
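The value-weighting choice mentioned in the abstract is simple to state concretely. A minimal sketch (illustrative only; the actual transaction-based indices are built from an econometric model of transaction prices, not a plain weighted mean):

```python
def value_weighted_return(returns, values):
    """Period return of a value-weighted index: each asset's return is
    weighted by its capital value, so large properties move the index
    more than small ones. An equal-weighted index would use the plain
    mean of `returns` instead."""
    total = sum(values)
    return sum(r * v for r, v in zip(returns, values)) / total
```

For example, a 10% return on a 900 GBP-valued asset and 2% on a 100-valued asset gives a value-weighted return of 9.2%, against an equal-weighted 6%.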
Abstract:
This dataset is an evolving collection of chess endgame record scenarios illustrating the extremes of the game, including the deepest positions in various metrics. Optimal lines in consonant strategies are given and annotated. The two attached files are (a) a pgn file of the chess positions and lines, and (b) an annotated version of the pgn file.
Abstract:
Imagery registration is a fundamental step that greatly affects later processes in image mosaicking, multi-spectral image fusion, digital surface modelling, etc., where the final solution requires blending pixel information from more than one image. It is highly desirable to identify registration regions among input stereo image pairs with high accuracy, particularly in remote sensing applications in which ground control points (GCPs) are not always available, such as when selecting a landing zone on another planet. In this paper, a framework for localization in image registration is developed. It strengthens local registration accuracy in two respects: lower reprojection error and better feature point distribution. The affine scale-invariant feature transform (ASIFT) is used to acquire feature points and correspondences on the input images. Then, a homography matrix is estimated as the transformation model by an improved random sample consensus (IM-RANSAC) algorithm. In order to identify a registration region with a better spatial distribution of feature points, the Euclidean distance between the feature points is applied (named the S criterion). Finally, the parameters of the homography matrix are optimized by the Levenberg–Marquardt (LM) algorithm with selected feature points from the chosen registration region. In the experiments, Chang’E-2 satellite remote sensing imagery is used to evaluate the performance of the proposed method. The results demonstrate that the proposed method can automatically locate a specific region with high registration accuracy between input images, achieving lower root mean square error (RMSE) and a better distribution of feature points.
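The idea of preferring a registration region whose feature points are well spread can be illustrated directly. The abstract does not give the S criterion's formula, so the mean pairwise Euclidean distance below is an assumed stand-in, and `best_region` is a hypothetical helper; the full pipeline (ASIFT matching, IM-RANSAC homography, LM refinement) is not reproduced here.

```python
import math
from itertools import combinations

def spread_score(points):
    """Mean pairwise Euclidean distance of feature points: a stand-in
    for the paper's S criterion (exact definition assumed). Higher
    scores mean the points are more evenly spread over the region,
    which stabilizes the homography estimate."""
    pairs = list(combinations(points, 2))
    if not pairs:
        return 0.0
    return sum(math.dist(p, q) for p, q in pairs) / len(pairs)

def best_region(candidate_regions):
    """Pick the candidate registration region (dicts with a 'points'
    list of (x, y) feature locations) with the best point spread."""
    return max(candidate_regions, key=lambda r: spread_score(r["points"]))
```

A tightly clustered match set scores low even if every match is inlier-accurate, which is why the reprojection-error and distribution criteria are applied together.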
Abstract:
Technical actions performed by two groups of judokas who won medals at World Championships and Olympic Games during the period 1995-2001 were analyzed. The Super Elite group (n = 17) comprised the best athletes in each weight category. The Elite group (n = 16) comprised medal winners who were not champions and did not win more than three medals. Super Elite judokas used a greater number of throwing techniques that resulted in scores, even when expressed relative to the total number of matches performed, and applied these techniques in more directions than Elite judokas did. Further, the number of different throwing techniques and the variability of the directions in which techniques were applied were significantly correlated with the number of wins and the number of points and ippon scored. Thus, a greater number of throwing techniques and a greater variety of attack directions seem to be important in increasing unpredictability during judo matches.
Abstract:
The importance of the HSO2 system in atmospheric and combustion chemistry has motivated several works dedicated to the study of the associated structures and chemical reactions. Nevertheless, controversy still exists in connection with the reaction SH + O2 -> H + SO2 and with the role of the HSOO isomers in the potential energy surface (PES). Here we report high-level ab initio calculations for the electronic ground state of the HSO2 system. Energetic, geometric, and frequency properties for the major stationary states of the PES are reported at the same level of calculation: CASPT2/aug-cc-pV(T+d)Z. This study introduces three new stationary points (two saddle points and one minimum). These structures allow the connection of the skewed HSOO structures and the HSO2 minima, defining new reaction paths for SH + O2 -> H + SO2 and SH + O2 -> OH + SO. In addition, the location of the HSOO isomers in the reaction pathways has been clarified.
Abstract:
The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface with coherence of angular properties (i.e. slope, aspect) between neighbouring pixels, which is an important characteristic when dealing with terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in the western USA, including different adjustment schemes (changes in nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with National Elevation Dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighting (IDW)).
The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, owing to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid punctual artifacts generated by predicted points with the same location as original data points; and (iii) using a small nugget effect value, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines agree well with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
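Concepts (i)-(iii) above can be sketched as a minimal ordinary-kriging predictor. The exponential variogram model, all parameter values, and the function names below are illustrative assumptions; the paper fits its variogram to the actual SRTM data.

```python
import numpy as np

def exp_variogram(h, nugget, sill, rng):
    """Exponential variogram: gamma(h) = nugget + (sill - nugget) * (1 - e^(-3h/range))."""
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

def krige(xy, z, target, nugget=1e-4, sill=1.0, rng=500.0,
          max_dist=300.0, jitter=1e-3, seed=0):
    """Ordinary-kriging prediction of elevation at `target`, using:
    (i)   only neighbours within max_dist of the predicted point;
    (ii)  a tiny random jitter on data coordinates, so a data point
          coinciding with the prediction location causes no artifact;
    (iii) a very small nugget, to avoid smoothing terrain features."""
    rs = np.random.default_rng(seed)
    xy = np.asarray(xy, float) + rs.normal(0.0, jitter, np.shape(xy))
    z = np.asarray(z, float)
    d = np.linalg.norm(xy - np.asarray(target, float), axis=1)
    keep = d <= max_dist
    xy, z, d = xy[keep], z[keep], d[keep]
    n = len(z)
    h = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    # Ordinary kriging system: [Gamma 1; 1^T 0] [w; mu] = [gamma_0; 1]
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(h, nugget, sill, rng)
    np.fill_diagonal(A[:n, :n], 0.0)   # gamma(0) = 0 by definition
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(d, nugget, sill, rng)
    w = np.linalg.solve(A, b)[:n]
    return float(w @ z)
```

With a near-zero nugget the predictor honours the data almost exactly at sample locations, which is the behaviour the authors want when resampling a DEM.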
Abstract:
We study focal points and the Maslov index of a horizontal geodesic γ : I -> M in the total space of a semi-Riemannian submersion π : M -> B by determining an explicit relation with the corresponding objects along the projected geodesic π ∘ γ : I -> B in the base space. We use this result to calculate the focal Maslov index of a (spacelike) geodesic in a stationary spacetime which is orthogonal to a timelike Killing vector field.
Abstract:
Personalized communication is when the marketing message is adapted to each individual by using information from a database and utilizing it in the various media channels available today. That gives the marketer the possibility to create a campaign that cuts through today’s clutter of marketing messages and gets the recipient’s attention. PODi is a non-profit organization that was started with the aim of contributing knowledge in the field of digital printing technologies. It has created a database of case studies showing companies that have successfully implemented personalized communication in their marketing campaigns. The purpose of the project was therefore to analyze PODi case studies with the main objective of finding out whether and how the PODi cases have been successful and what made them so. To collect the data found in the PODi cases, the authors did a content analysis with a sample size of 140 PODi cases from 2008 to 2010. The study was carried out by analyzing the cases' measurable indicators of success: response rate, conversion rate, visited PURLs (personalized URLs) and ROI (return on investment). In order to find out whether there were any relationships between the measurable results and the type of industry, campaign objective and media vehicle used in the campaign, the authors posed different research questions to explore this. After clustering and merging the collected data, the results were found to be quite spread out, but show that the averages of response rates, visited PURLs and conversion rates were consistently very high. In the study the authors also collected and summarized what the companies themselves claim to be the reasons for success with their marketing campaigns. The results show that the creation of a personalized campaign is complex and dependent on many different variables.
It is, for instance, of great importance to have a well-thought-out plan for the campaign and to have good data and insights about the customer in order to perform creative personalization. It is also important to make it easy for the recipient to reply, to use several media vehicles for multiple touch points, and to have an attractive and clever design.
Abstract:
There are over 6000 natural resource drilling platforms in the Gulf of Mexico, all of which will become obsolete once their deposits are extracted. This study examined one possible alternative use for these platforms: wind power generation. Using ArcGIS, the set of platforms was narrowed by weighting their distance from National Data Buoy Center wind speed collection points and their water depth. Calculations were done to assess the optimal sites remaining, as well as to provide an estimate of the energy potential of each site. Data for this project were obtained from the Minerals Management Service (MMS), the United States Geological Survey (USGS), and the National Data Buoy Center (NDBC). A major limitation of this project was the scarcity of NDBC wind speed buoys, creating large data gaps and excluding many oil rigs that otherwise have high energy potential.
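The weighting and energy-estimate steps can be sketched as follows. The weights, distance and depth cut-offs, and turbine parameters below are illustrative assumptions; the study's actual weighting was performed in ArcGIS on MMS/USGS/NDBC data.

```python
import math

def site_score(platform_lonlat, depth_m, buoys_lonlat,
               w_dist=0.6, w_depth=0.4, max_dist_km=100.0, max_depth_m=300.0):
    """Score a platform by (a) proximity to the nearest NDBC buoy, since
    wind-speed estimates are only trustworthy near a buoy, and
    (b) shallow water depth, since deep water raises foundation and
    transmission costs. Returns a value in [0, 1]."""
    d_km = min(math.dist(platform_lonlat, b) for b in buoys_lonlat) * 111.0  # rough deg -> km
    proximity = max(0.0, 1.0 - d_km / max_dist_km)
    shallowness = max(0.0, 1.0 - depth_m / max_depth_m)
    return w_dist * proximity + w_depth * shallowness

def turbine_power_w(v_ms, rotor_diam_m=80.0, rho=1.225, cp=0.35):
    """Power captured from a steady wind speed v: P = cp * 0.5 * rho * A * v^3."""
    area = math.pi * (rotor_diam_m / 2.0) ** 2
    return cp * 0.5 * rho * area * v_ms ** 3
```

Because power grows with the cube of wind speed, even modest errors in the buoy-interpolated wind speed dominate the energy estimate, which is why the data gaps noted above matter so much.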
Abstract:
The objective of this dissertation is to analyze the feasibility of applying the Management Contract (Contrato de Gestão) in Ecuador's public administration and to simulate such a contract in Ecuador's public enterprises. The first chapter presents a historical analysis, the current "critical points", and a strategic analysis of Ecuador's public administration, based on the qualifications of strategic thinking. The second chapter describes the Ecuadorian public enterprise sector. The third chapter presents a theoretical framework covering Strategic Management, Management by Objectives, and the Management Contract. The fourth chapter contains a model for implementing the Management Contract in Ecuadorian public enterprises, including the legal instrument consistent with the proposed model. The work concludes that Ecuador's public administration lacks modern management instruments such as the Management Contract and, at the same time, possesses the legal and institutional conditions for its application.
Abstract:
The prevailing aim of this work was to analyze the capacity of BANRISUL, the Bank of the State of Rio Grande do Sul, as a bank under public control, to share in the continued increase in the profitability of the Brazilian banking sector while remaining a technologically up-to-date financial institution, managerially structured through decision-making mechanisms that are kept current and permanently revised so as to remain compatible with the increasing instabilities imposed by the fierce competition of the banking sector and with macroeconomic volatility. On the basis of the strongly positive performance in the period analyzed, 1997 to 2007, the commercial strategy remains focused on the constant improvement of the services provided and on growth in credit to individuals and to small and medium-sized companies. The breadth of its service points and its broad customer base are advantages added to an exclusive differential: Banricompras, the largest private-label card in Latin America. Stronger and more profitable, with adequate and transparent management, BANRISUL pursues its purpose of gaining a position of prominence on the national economic and financial scene.
Abstract:
This work was developed in a financial institution, with the goal of identifying and analyzing the perception of the employees of the areas defined as the resource focus regarding the formal dimension of the control program currently implemented in the institution, with the purpose of exploring the vulnerable and conflicting points related to the performance of the employees' activities, as well as new tools, concepts and case studies. The work was grounded in the existing base of theories and concepts on organizational controls, following approaches such as Etzioni's (1964), Amat and Gomes's (2001) and Sturdy, Knights and Willmott's (1992). The research is characterized as descriptive, because it aims to describe the perceptions, expectations and profiles of the employees in the organization studied; as field research, because it involved conducting interviews and collecting primary data; and as documental, because it also included analysis of the organization's internal documents. The research is a case study with a sectional cut, predominantly qualitative but supported by quantitative techniques for the initial tabulation of data, which were afterwards analysed in interpretative form. The control program of the financial institution researched is formed predominantly of post-bureaucratic mechanisms focused on results, in a hegemonically expanded way, of the utilitarian type, with strong alienating influence on the employees and with low incentive power with respect to increasing the employees' commitment. Thus, the control program is perceived by the employees as a mechanism for monitoring actions and results, developed only to increase the institution's profits, regardless of the impacts on physical and emotional aspects, intuitively increasing the levels of internal dissatisfaction.