922 results for best-possible bounds


Relevance: 80.00%

Abstract:

This paper presents hierarchical clustering algorithms for the land cover mapping problem using multi-spectral satellite images. In unsupervised techniques, the automatic determination of the number of clusters and their centers for a huge database has not been exploited to its full potential. Hence, a hierarchical clustering algorithm that uses splitting and merging techniques is proposed. Initially, the splitting method searches for the best possible number of clusters and their centers using Mean Shift Clustering (MSC), Niche Particle Swarm Optimization (NPSO), and Glowworm Swarm Optimization (GSO). Using these clusters and their centers, the merging method groups the data points with a parametric method (the k-means algorithm). A performance comparison of the proposed hierarchical clustering algorithms (MSC, NPSO, and GSO) is presented using two typical multi-spectral satellite images, Landsat 7 Thematic Mapper and QuickBird. From the results obtained, we conclude that the proposed GSO-based hierarchical clustering algorithm is the most accurate and robust.
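
A minimal split-then-merge sketch of this idea, not the authors' implementation: Mean Shift stands in for the splitting stage (it proposes the number of clusters and their centers), and k-means seeded with those centers plays the role of the parametric merging stage; per-pixel band values are assumed as features.

```python
# Split-then-merge clustering sketch (assumptions: MeanShift as the splitting
# stage, k-means as the merging stage, pixels described by their band values).
import numpy as np
from sklearn.cluster import MeanShift, KMeans

def split_then_merge(pixels: np.ndarray) -> np.ndarray:
    """pixels: (n_samples, n_bands) array of multi-spectral values."""
    # Splitting: let Mean Shift propose the cluster centers (and thus their number).
    split = MeanShift().fit(pixels)
    centers = split.cluster_centers_

    # Merging: k-means initialised at those centers regroups the data points.
    merge = KMeans(n_clusters=len(centers), init=centers, n_init=1).fit(pixels)
    return merge.labels_

# Example on synthetic "pixels" with 4 spectral bands.
rng = np.random.default_rng(0)
demo = np.vstack([rng.normal(m, 0.3, size=(100, 4)) for m in (0.0, 2.0, 4.0)])
print(np.unique(split_then_merge(demo)))
```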

Relevance: 80.00%

Abstract:

Online remote visualization and steering of critical weather applications like cyclone tracking are essential for effective and timely analysis by the geographically distributed climate science community. A steering framework for controlling high-performance simulations of critical weather events needs to take into account both the steering inputs of the scientists and the criticality needs of the application, including a minimum progress rate for the simulations and continuous visualization of significant events. In this work, we have developed INST, an integrated user-driven and automated steering framework for simulations, online remote visualization, and analysis of critical weather applications. INST gives the user control over various application parameters, including the region of interest, the simulation resolution, and the frequency of data output for visualization. Unlike existing efforts, our framework considers both the steering inputs and the criticality of the application, namely the minimum progress rate needed by the application, together with resource constraints such as storage space and network bandwidth, to decide the best possible parameter values for simulation and visualization.
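
A hypothetical sketch of the kind of decision such a framework automates: choose the finest simulation and output settings that still meet a minimum progress rate and a bandwidth budget. The parameter names, cost models, and numbers below are illustrative assumptions, not INST's API.

```python
# Toy parameter-selection sketch under progress-rate and bandwidth constraints.
from dataclasses import dataclass
from itertools import product

@dataclass
class Choice:
    grid_points: int        # simulation resolution (cells per side), assumed
    output_every: int       # steps between visualization outputs, assumed

def pick_parameters(min_steps_per_hour: float, bandwidth_mb_s: float) -> Choice:
    candidates = []
    for grid, freq in product((128, 256, 512), (1, 5, 10)):
        # Invented cost models: progress falls with resolution, traffic with output spacing.
        steps_per_hour = 3.6e6 / grid**2            # assumed progress model
        mb_per_second = 8e-4 * grid**2 / freq       # assumed output-size model
        if steps_per_hour >= min_steps_per_hour and mb_per_second <= bandwidth_mb_s:
            candidates.append(Choice(grid, freq))
    # Prefer the finest resolution, then the most frequent output, among feasible choices.
    return max(candidates, key=lambda c: (c.grid_points, -c.output_every))

print(pick_parameters(min_steps_per_hour=20.0, bandwidth_mb_s=60.0))
```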

Relevance: 80.00%

Abstract:

An updated catalog of earthquakes has been prepared for the Andaman-Nicobar and adjoining regions. The catalog was homogenized to a unified magnitude scale, and declustering was performed to remove aftershocks and foreshocks. Eleven regional source zones were identified in the study area to account for local variability in seismicity characteristics. The seismicity parameters were estimated for each of these source zones, and the seismic hazard evaluation of the Andaman-Nicobar region was performed using different source models and attenuation relations. Probabilistic seismic hazard analysis was performed with the currently available data and their best possible scientific interpretation, using an appropriate instrument such as a logic tree to explicitly account for epistemic uncertainty by considering alternative models (source models, maximum magnitude, and attenuation relationships). Hazard maps for different periods have been produced for horizontal ground motion at the bedrock level.
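
A minimal sketch of how a logic tree folds epistemic uncertainty into a single hazard curve: alternative branches (for example, different attenuation relations) each yield an exceedance curve, and the branch weights combine them. The curves and weights below are made-up placeholders, not values from the study.

```python
# Weighted logic-tree combination of hazard curves (illustrative numbers only).
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])          # peak ground acceleration (g)

# Annual probability of exceedance per branch (hypothetical attenuation relations).
branches = {
    "gmpe_A": np.array([2e-2, 8e-3, 2e-3, 4e-4, 5e-5]),
    "gmpe_B": np.array([3e-2, 1e-2, 3e-3, 6e-4, 8e-5]),
}
weights = {"gmpe_A": 0.6, "gmpe_B": 0.4}             # logic-tree branch weights, sum to 1

mean_curve = sum(weights[b] * curve for b, curve in branches.items())
for a, p in zip(pga, mean_curve):
    print(f"PGA {a:.2f} g: weighted annual exceedance probability {p:.2e}")
```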


Relevance: 80.00%

Abstract:

In the design of practical web page classification systems, one often encounters a situation in which the labeled training set is created by choosing some examples from each class, but the class proportions in this set are not the same as those in the test distribution to which the classifier will actually be applied. The problem is made worse when the amount of training data is also small. In this paper we explore and adapt binary SVM methods that make use of unlabeled data from the test distribution, viz., Transductive SVMs (TSVMs) and expectation regularization/constraint (ER/EC) methods, to deal with this situation. We empirically show that, when the labeled training data is small, a TSVM designed with the class ratio tuned by minimizing the loss on the labeled set yields the best performance; its performance is good even when the deviation between the class ratios of the labeled training set and the test set is quite large. When the labeled training data is sufficiently large, an unsupervised Gaussian mixture model can be used to get a very good estimate of the class ratio in the test set; when this estimate is used, both TSVM and ER/EC give their best possible performance, with TSVM coming out superior. The ideas in the paper can easily be extended to multi-class SVMs and MaxEnt models.
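
A hedged sketch of one ingredient described above: estimating the test-set class ratio with an unsupervised two-component Gaussian mixture fitted to unlabeled test features. This illustrates the idea only, not the paper's pipeline, and uses synthetic data.

```python
# Class-ratio estimation from unlabeled data via a two-component Gaussian mixture.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Unlabeled "test" features with a true positive-class proportion of 0.3 (synthetic).
pos = rng.normal(loc=+2.0, scale=1.0, size=(300, 5))
neg = rng.normal(loc=-2.0, scale=1.0, size=(700, 5))
unlabeled_test = np.vstack([pos, neg])

gmm = GaussianMixture(n_components=2, random_state=0).fit(unlabeled_test)
# Mixture weights estimate the class proportions (up to which component is "positive").
ratio_estimate = sorted(gmm.weights_)
print("estimated class proportions:", np.round(ratio_estimate, 3))
```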

Relevance: 80.00%

Abstract:

This paper presents an improved hierarchical clustering algorithm for the land cover mapping problem using quasi-random distributions. Initially, Niche Particle Swarm Optimization (NPSO) with a pseudo/quasi-random distribution is used to split the data into a number of cluster centers while satisfying the Bayesian Information Criterion (BIC). The main objective is to search for and locate the best possible number of clusters and their centers. NPSO, which depends strongly on the initial distribution of particles in the search space, has not been exploited to its full potential. In this study, we compare the more uniformly distributed quasi-random initialization with pseudo-random initialization of NPSO for splitting the data set. Here, the Faure method is used to generate the quasi-random distribution. The performance of previously proposed methods, namely k-means, Mean Shift Clustering (MSC), and NPSO with pseudo-random initialization, is compared with the proposed approach, NPSO with the quasi-random (Faure) distribution. These algorithms are evaluated on a synthetic data set and a multi-spectral satellite image (Landsat 7 Thematic Mapper). From the results obtained, we conclude that using a quasi-random sequence with NPSO in the hierarchical clustering algorithm yields more accurate data classification.
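
A small sketch of why a quasi-random (low-discrepancy) start helps: the points cover the search space more evenly than pseudo-random ones. SciPy ships Halton and Sobol generators; the Faure sequence used in the paper is not in SciPy, so Halton stands in here purely for illustration.

```python
# Compare the uniformity of pseudo-random vs low-discrepancy particle initialisations.
import numpy as np
from scipy.stats import qmc

dim, n = 2, 64
pseudo = np.random.default_rng(0).random((n, dim))   # pseudo-random particles
quasi = qmc.Halton(d=dim, seed=0).random(n)          # low-discrepancy particles (Faure stand-in)

# Discrepancy: lower means more uniform coverage of the unit square.
print("pseudo-random discrepancy:", qmc.discrepancy(pseudo))
print("quasi-random  discrepancy:", qmc.discrepancy(quasi))
```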

Relevance: 80.00%

Abstract:

Contrary to the actual nonlinear Glauber model, the linear Glauber model (LGM) is exactly solvable, although the detailed balance condition is not generally satisfied. This motivates us to address the issue of writing the transition rate in the best possible linear form such that the mean squared error in satisfying the detailed balance condition is least. The advantage of this approach is that, by studying the LGM analytically, we are able to anticipate how the kinetic properties of an arbitrary Ising system depend on the temperature and the coupling constants. The analytical expressions for the optimal values of the parameters involved in the linear form are obtained using a simple Moore-Penrose pseudoinverse matrix. This approach is quite general, is in principle applicable to any system, and reproduces the exact results for the one-dimensional Ising system. In the continuum limit, we obtain a linear time-dependent Ginzburg-Landau equation from Glauber's microscopic model of non-conservative dynamics. We analyze the critical and dynamic properties of the model, and show that most of the important results obtained in different studies can be reproduced by our new mathematical approach. We also show that the effect of a magnetic field can easily be studied within our approach; in particular, we show that the inverse of the relaxation time changes quadratically with a (weak) magnetic field and that the fluctuation-dissipation theorem is valid for our model.
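
A hedged illustration of the least-squares idea: approximate the nonlinear Glauber flip rate w = (1/2)[1 - s·tanh(βJh)], with h the sum of neighbour spins, by the best linear form w ≈ a + b·s·h over all spin configurations, using the Moore-Penrose pseudoinverse. A square-lattice neighbourhood of four spins is assumed for illustration; the paper's exact parameterisation may differ.

```python
# Least-squares linear approximation of the Glauber rate via the pseudoinverse.
import numpy as np
from itertools import product

beta_J = 0.4
configs = list(product([-1, 1], repeat=5))        # (central spin, 4 neighbours)

rows, targets = [], []
for s, *nbrs in configs:
    h = sum(nbrs)
    rows.append([1.0, s * h])                     # features of the linear ansatz
    targets.append(0.5 * (1.0 - s * np.tanh(beta_J * h)))

A, w = np.array(rows), np.array(targets)
a, b = np.linalg.pinv(A) @ w                      # least-squares optimal coefficients
print(f"w_lin(s, h) ≈ {a:.4f} + ({b:.4f}) * s * h")
print("rms error of the linear rate:", np.sqrt(np.mean((A @ [a, b] - w) ** 2)))
```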

Relevance: 80.00%

Abstract:

In optical character recognition of very old books, the recognition accuracy drops mainly due to the merging or breaking of characters. In this paper, we propose the first algorithm to segment merged Kannada characters by using a hypothesis to select the positions to be cut. The method searches for the best possible positions to segment by taking into account the support vector machine classifier's recognition score and the validity of the aspect ratio (width-to-height ratio) of the segments between every pair of cut positions. The hypothesis for selecting the cut position is based on the fact that a concave surface exists above and below the touching portion. These concave regions are located by tracing the valleys in the top contour of the image, and similarly for the image rotated upside down. The cut positions are then derived as closely matching valleys of the original and rotated images. The proposed segmentation algorithm handles different font styles, shapes, and sizes better than the existing vertical-projection-profile-based segmentation. It has been tested on 1125 different word images, each containing multiple merged characters, from an old Kannada book; 89.6% correct segmentation is achieved, and the character recognition accuracy for merged words is 91.2%. A few merge points are still missed because no matching valley exists, owing to the specific shapes of the characters that meet at the merge.
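
A rough sketch of the valley-matching hypothesis, assuming a binarised word image (foreground = 1): for every column take the topmost ink row (the top contour); dips in that contour are candidate valleys, and a cut is proposed where a valley of the original image lies close to a valley of the upside-down image. The thresholds and helper names here are illustrative, not the paper's exact procedure.

```python
# Candidate cut positions from matching top-contour valleys of a word image.
import numpy as np
from scipy.signal import find_peaks

def top_contour(img: np.ndarray) -> np.ndarray:
    """Row index of the topmost foreground pixel per column (image height if empty)."""
    has_ink = img.any(axis=0)
    first = img.argmax(axis=0)
    return np.where(has_ink, first, img.shape[0])

def candidate_cuts(img: np.ndarray, tolerance: int = 3) -> list[int]:
    # Valleys of the top contour = local maxima of the topmost-row index.
    top_valleys, _ = find_peaks(top_contour(img))
    # Rotate 180°, take its top contour, and map it back to original column order.
    bottom_valleys, _ = find_peaks(top_contour(img[::-1, ::-1])[::-1])
    # Keep columns where a top valley and a bottom valley roughly coincide.
    return [c for c in top_valleys
            if np.any(np.abs(bottom_valleys - c) <= tolerance)]
```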

Relevance: 80.00%

Abstract:

Advance directives must be distinguished from informed consent, since they are broader: the person's declaration is not limited to accepting what the physician has proposed at a given moment. It seems logical that decisions should be made on the basis of knowledge and understanding of the medical data, the prognosis of the disease, and the goals discussed between patient and professional, and not under the pressure of particular circumstances. In this context, decisions about "advance directives" have a particular bioethical status, which must be evaluated in order to make the most correct decision.

Relevance: 80.00%

Abstract:

This document describes the analytical methods used to quantify core organic chemicals in tissue and sediment collected as part of NOAA's National Status and Trends Program (NS&T) for the years 2000-2006. Organic contaminant analytical methods used during the early years of the program are described in NOAA Technical Memoranda NOS ORCA 71 and 130 (Lauenstein and Cantillo, 1993; Lauenstein and Cantillo, 1998) for the years 1984-1992 and 1993-1996, respectively. These reports are available from our website (http://www.ccma.nos.gov). The methods detailed in this document were used by the Mussel Watch Project and the Bioeffects Project, which are both part of the NS&T Program. The Mussel Watch Project has been monitoring contaminants in bivalves and sediments since 1986 and is the longest active national contaminant monitoring program operating in U.S. coastal waters. Approximately 280 Mussel Watch sites are sampled on biennial and decadal timescales for bivalve tissue and sediment, respectively. Similarly, the Bioeffects Assessment Project began in 1986 to characterize estuaries and near-coastal environs. Using the sediment quality triad approach, which measures (1) levels of contaminants in sediments, (2) incidence and severity of toxicity, and (3) benthic macrofaunal communities, the Bioeffects Project describes the spatial extent of sediment toxicity. Contaminant assessment is a core function of both projects. These methods, while discussed here in the context of sediment and bivalve tissue, were also used with other matrices, including fish fillet, fish liver, nepheloid layer, and suspended particulate matter. The methods described herein are for the core organic contaminants monitored in the NS&T Program and include polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), butyltins, and organochlorines that have been analyzed consistently over the past 15-20 years. Organic contaminants such as dioxins, perfluoro compounds, and polybrominated diphenyl ethers (PBDEs) were analyzed periodically in special studies of the NS&T Program and will be described in another document. All of the analytical techniques described in this document were used by B&B Laboratories, Inc., an affiliate of TDI-Brooks International, Inc., in College Station, Texas, under contract to NOAA. The NS&T Program uses a performance-based system approach to obtain the best possible data quality and comparability, and requires laboratories to demonstrate precision, accuracy, and sensitivity to ensure that results-based performance goals and measures are met. (PDF contains 75 pages)

Relevance: 80.00%

Abstract:

This workshop was organized because of the increase between 1978 and 1980 in coastwide landings of widow rockfish, from less than 1,000 mt to more than 20,000 mt, and because of scientists' concern about the lack of knowledge of both the fishery and the biology of the species. Most scientists active in research on Pacific groundfish, as well as some members of the fishing industry and fishery managers, attended the workshop. These proceedings contain the report of the workshop discussion panel, status reports on the California, Oregon, and Washington fisheries through 1980, and a collection of seven papers presented at the workshop. The status reports provide a historical perspective on the development of an important fishery. The papers present a fairly complete survey of the biological knowledge of widow rockfish, the economic status of the fishery, and fishery-independent methods for estimating abundance. The papers also contain some information developed after the workshop. Since the workshop, the fishery has matured. The largest landings were made in 1981, when more than 28,000 mt were landed. Maximum sustainable yield (MSY) is estimated to be slightly less than 10,000 mt, and the stock appeared to be at about the MSY level in 1985. The Pacific Fishery Management Council and National Marine Fisheries Service have implemented regulations that have maintained landings since 1983 at approximately the maximum sustainable yield level. Fishery-dependent stock assessments are being made on an annual basis for the Pacific Fishery Management Council. While these assessments are considered to be the best possible with the available data, the scientists responsible for them have chosen to delay their publication in the formal scientific literature until more data are obtained. However, the stock assessment reports are available from the Pacific Fishery Management Council. In addition to the papers in this collection, three papers have been published on widow rockfish since 1980: Boehlert, Barss, and Lamberson (1982) estimate the fecundity of the species off Oregon; Gunderson (1984) describes the fishery and management actions; and Laroche and Richardson (1981) describe the morphology and distribution of juvenile widow rockfish off Oregon. During the past decade, the fishery for widow rockfish has developed from a minor fishery to one of the more important on the Pacific Coast. Our knowledge of the biology and dynamics of the species has progressed from minimal to relatively extensive for a groundfish species. Our intention in preparing this collection of papers is to make this knowledge readily available to the scientific community. (PDF file contains 63 pages.)

Relevance: 80.00%

Abstract:

An examination is made of the requirements for the commercial propagation of carp and tilapia in Nigeria. It is concluded that the operation of a successful fish hatchery and fry production system will depend on the following factors: 1) correct initial planning for the intended production (both species and intended numbers); 2) design of the appropriate facilities to enable the required production; 3) selection of top-calibre, dedicated, and experienced hatchery staff; and 4) the ethical responsibility taken by the hatchery management to produce only the highest quality seed under the best possible conditions. Purchasing farmers are dependent on this.

Relevance: 80.00%

Abstract:

An economic air pollution control model, which determines the least cost of reaching various air quality levels, is formulated. The model takes the form of a general, nonlinear, mathematical programming problem. Primary contaminant emission levels are the independent variables. The objective function is the cost of attaining various emission levels and is to be minimized subject to constraints that given air quality levels be attained.

The model is applied to a simplified statement of the photochemical smog problem in Los Angeles County in 1975, with emissions specified by a two-dimensional vector: total reactive hydrocarbon (RHC) and nitrogen oxide (NOx) emissions. Air quality, also two-dimensional, is measured by the expected number of days per year that nitrogen dioxide (NO2) and mid-day ozone (O3) exceed standards in Central Los Angeles.

The minimum cost of reaching various emission levels is found by a linear programming model. The base or "uncontrolled" emission levels are those that will exist in 1975 with the present new car control program and with the degree of stationary source control existing in 1971. Controls, basically "add-on devices", are considered here for used cars, aircraft, and existing stationary sources. It is found that with these added controls, Los Angeles County emission levels [(1300 tons/day RHC, 1000 tons/day NOx) in 1969] and [(670 tons/day RHC, 790 tons/day NOx) at the base 1975 level] can be reduced to 260 tons/day RHC (minimum RHC program) and 460 tons/day NOx (minimum NOx program).
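
An illustrative least-cost sketch in the spirit of the linear program described above. The control measures, costs, and reduction figures below are invented for illustration and are not the study's data: each control has an annualized cost and removes a fixed amount of RHC and NOx, and we ask for the cheapest mix that reaches target emission levels; the decision variables are the fractions of each control applied.

```python
# Least-cost control mix for two pollutants, solved as a linear program.
from scipy.optimize import linprog

# Hypothetical controls: (annual cost $M, RHC removed t/day, NOx removed t/day).
controls = {
    "used_car_devices":  (60.0, 180.0,  90.0),
    "aircraft_controls": (15.0,  40.0,  20.0),
    "stationary_source": (90.0, 190.0, 220.0),
}
base_rhc, base_nox = 670.0, 790.0          # assumed 1975 base emissions (t/day)
target_rhc, target_nox = 400.0, 600.0      # assumed air-quality-driven targets (t/day)

costs = [c for c, _, _ in controls.values()]
# Constraints: total removal of each pollutant must reach base - target.
A_ub = [[-r for _, r, _ in controls.values()],
        [-n for _, _, n in controls.values()]]
b_ub = [-(base_rhc - target_rhc), -(base_nox - target_nox)]

res = linprog(costs, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * len(controls))
print("minimum annualized cost ($M):", round(res.fun, 1))
print("control fractions:", dict(zip(controls, res.x.round(2))))
```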

"Phenomenological" or statistical air quality models provide the relationship between air quality and emissions. These models estimate the relationship by using atmospheric monitoring data taken at one (yearly) emission level and by using certain simple physical assumptions, (e. g., that emissions are reduced proportionately at all points in space and time). For NO2, (concentrations assumed proportional to NOx emissions), it is found that standard violations in Central Los Angeles, (55 in 1969), can be reduced to 25, 5, and 0 days per year by controlling emissions to 800, 550, and 300 tons /day, respectively. A probabilistic model reveals that RHC control is much more effective than NOx control in reducing Central Los Angeles ozone. The 150 days per year ozone violations in 1969 can be reduced to 75, 30, 10, and 0 days per year by abating RHC emissions to 700, 450, 300, and 150 tons/day, respectively, (at the 1969 NOx emission level).

The control cost-emission level and air quality-emission level relationships are combined in a graphical solution of the complete model to find the cost of various air quality levels. Best possible air quality levels with the controls considered here are 8 O3 and 10 NO2 violations per year (minimum ozone program) or 25 O3 and 3 NO2 violations per year (minimum NO2 program) with an annualized cost of $230,000,000 (above the estimated $150,000,000 per year for the new car control program for Los Angeles County motor vehicles in 1975).

Relevance: 80.00%

Abstract:

Part I: The mobilities of photo-generated electrons and holes in orthorhombic sulfur are determined by drift mobility techniques. At room temperature, electron mobilities between 0.4 cm²/V-sec and 4.8 cm²/V-sec and hole mobilities of about 5.0 cm²/V-sec are reported. The temperature dependence of the electron mobility is attributed to a level of traps whose effective depth is about 0.12 eV. This value is further supported by both the voltage dependence of the space-charge-limited DC photocurrents and the photocurrent versus photon energy measurements.
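
One standard way to read the quoted trap depth is the textbook shallow-trapping relation, offered here for context and not necessarily the exact model used in this work: the measured drift mobility is the trap-free mobility reduced by the fraction of time a carrier spends free,

```latex
\mu_{\mathrm{drift}} \;=\; \frac{\mu_0}{\,1 + \frac{N_t}{N_c}\exp\!\left(\frac{E_t}{k_B T}\right)},
\qquad E_t \approx 0.12~\mathrm{eV},
```

where \mu_0 is the trap-free mobility, N_t the trap density, N_c the effective density of states in the transport band, and E_t the trap depth; the exponential factor produces the thermally activated temperature dependence described above.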

As the field is increased from 10 kV/cm to 30 kV/cm, a second mechanism for electron transport becomes appreciable and eventually dominates. Evidence that this is due to impurity band conduction at an appreciably lower mobility (4×10⁻⁴ cm²/V-sec) is presented. No low-mobility hole current could be detected. When fields exceeding 30 kV/cm for electron transport and 35 kV/cm for hole transport are applied, avalanche phenomena are observed. The results obtained are consistent with recent energy gap studies in sulfur.

The theory of the transport of photo-generated carriers is modified to include the case of appreciable thermal regeneration from the traps within one transit time.

Part II: An explicit formula for the electric field E necessary to accelerate an electron to a steady-state velocity v in a polarizable crystal at arbitrary temperature is determined via two methods utilizing Feynman path integrals. No approximation is made regarding the magnitude of the velocity or the strength of the field. However, the actual electron-lattice Coulombic interaction is approximated by a distribution of harmonic oscillator potentials. One may be able to find the "best possible" distribution of oscillators using a variational principle, but we have not been able to find the expected criterion. However, our result is relatively insensitive to the actual distribution of oscillators used, and our E-v relationship exhibits the physical behavior expected for the polaron. Threshold fields for ejecting the electron from the polaron state are calculated for several substances using numerical results for a simple oscillator distribution.

Relevance: 80.00%

Abstract:

The aim of this work is to design a set of footpegs and clip-on handlebars for a racing motorcycle that meet a series of geometric requirements and guarantee the best possible performance in competition, above all the safety and comfort of the rider. The most suitable manufacturing processes for these parts are then selected, taking into account cost and quality requirements, in order to bring to market a product that is as competitive as possible. The study is based mainly on the analysis of the different alternatives that can be adopted for the design of the products and their subsequent manufacture; to that end, the objectives and conditions to be met are examined. It is worth noting that the idea for this project arose within a larger one, the MotoStudent competition. MotoStudent is an international competition among universities from around the world in which student teams face the challenge of designing and developing a racing motorcycle prototype similar to the world-championship Moto3 category.