955 results for Generalization
Abstract:
Frank Ramsey (1931) established certain conditions that should be met in order to evaluate conditional propositions, known today as the Ramsey Test (RT). This paper shows that the theories of counterfactual conditionals of Chisholm, Stalnaker and D. Lewis satisfy the RT, and examines the incompatibility of the RT with the theory of belief revision (AGM). The final section analyzes the behaviour of the RT in the proposal of G. Crocco and L. Fariñas del Cerro, which is based on a generalization of the sequent calculus but introduces the novelty of auxiliary sequents whose notion of consequence is non-monotonic.
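For orientation, the Ramsey Test is usually stated in belief-revision terms as follows (Gärdenfors' formulation, given here only for reference; the paper's own wording may differ), where K is a belief set, * the AGM revision operator, and > the conditional connective:

\[ (\mathrm{RT}) \qquad A > B \in K \iff B \in K * A \]

Read: the conditional "if A then B" is accepted in the belief state K exactly when B belongs to the result of revising K by A. It is under this reading that the well-known tension between the RT and the AGM revision postulates (Gärdenfors' triviality result) arises.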
Abstract:
Cinema as a historiographical source came into use through the "battle" that Marc Ferro waged inside Annales. At a time when it was difficult for sources other than written ones to be accepted, this historian managed to make his way until he became director of Annales in the 1970s. The discourse of the image, or audiovisual discourse, represents with greater verisimilitude what written discourse fails to express. Nevertheless, synecdoche and generalization meet halfway in a representational interplay, complementing each other when it comes to understanding and reconstructing the past. The contribution of cinema can be taken as a reflection or social representation, and also as a discursive alternative. In this sense, it is Rosenstone, supported by Hayden White and his analysis of "historiophoty", who ventures into this new dimension, namely the audiovisual narration of history.
Abstract:
Specialized criticism recognizes two dominant lines in the Argentine intellectual field arising from the production of the sixties and its extensions into the seventies, although both would fall within a genealogy involving two different ways of reading the art/life dichotomy, one of the tensions in the aesthetics of the historical avant-gardes. One of them, begun in the fifties around the famous magazine Poesía Buenos Aires, edited by Raúl Gustavo Aguirre, emerges mainly as a consequence of the working-through of the French avant-garde [especially Surrealism] and could be characterized as poetic transcendentalism [Calabrese: 2001 and 2002], an aspect of the ideology of art which, represented in full by the so-called "accursed" poets, conceives of poetry as a way of life and of the poet as a vates, a singular being who is constituted in poetry itself and who glimpses a universe different from that of everyday life. The second of the lines mentioned, which emerged around 1955 with Leónidas Lamborghini's "El solicitante descolocado" and extended into the '70s, and which is known as that of the "sesentistas", establishes itself in polemic against the gratuitous and aestheticizing character of art. The aim is to bring the text closer to what was called the context, alluding to the socio-historical; to that end, an imaginary is created that establishes a continuity between writing and a particular way of seeing the world; literature, thus inscribed in the Sartrean imperative, must identify itself with political praxis and with social discourses of referential origin [Dalmaroni: 1993 and 1998]. Elements drawn from popular culture [for example, the tango] and from urban slang, which construct a simulacrum of orality, thereby enter as materials to be inscribed in poetry, emphasizing its "antiliterary" character: such elements, displaced from their context of origin, generate an estrangement effect suited to stimulating critical consciousness and to unveiling the prevailing ideologemes. Against this background, the aim is to problematize the generalization of this dichotomous periodization through the study of the works of the three poets mentioned, works situated in a space of crossings and hybridizations between these mandates and the experimental or more intellectual current, so that they would not answer to the dominant features that would allow them to be placed at either extreme of the field thus outlined.
Abstract:
The aim of this paper is to analyze the growth of retail commerce in Brazil in view of the expansion of the domestic consumer market, which has been driven by the expansion of the credit system and by state policies of income redistribution and tax exemption. The expansion of the market implies a profound restructuring at the productive level, encompassing the industrial, commercial and service sectors, carried out through constant spatial reorganization and the gradual transformation of consumption patterns. One of the main characteristics of the growth of the domestic consumer market is the tendency to exacerbate the centre-periphery contradictions generated by the consolidation of processes of spatial concentration and centralization of economic activities in certain regions of Brazilian territory.
Abstract:
The author analyzes the evolution of Argentine academic production on migrations, placing the emphasis on the advances recorded since the contributions made by local history and microhistory. These new works made it possible to identify thematic and methodological diversity within a historiographical climate of shared elements, such as the role of social networks examined through personal sources, which make it possible to establish, in a comparative perspective on the work carried out in the field, a new starting point for a generalization that does not exclude the lives of individuals from history.
Abstract:
The Håkon Mosby Mud Volcano is a natural laboratory for studying geological, geochemical, and ecological processes related to deep-water mud volcanism. High-resolution bathymetry of the Håkon Mosby Mud Volcano was recorded during RV Polarstern expedition ARK-XIX/3 using the multibeam system Hydrosweep DS-2. Dense spacing of the survey lines and a slow ship speed (5 knots) provided the point density necessary to generate a regular 10 m grid. Generalization was applied to preserve and represent morphological structures appropriately. Contour lines were derived showing detailed topography at the centre of the Håkon Mosby Mud Volcano and generalized contours in its vicinity. We provide a brief introduction to the Håkon Mosby Mud Volcano area and describe in detail the data recording and processing methods, as well as the morphology of the area. An accuracy assessment was made to evaluate the reliability of the 10 m resolution terrain model. Multibeam sidescan data were recorded along with the depth measurements and show reflectivity variations from light grey values at the centre of the Håkon Mosby Mud Volcano to dark grey (less reflective) values in the surrounding moat.
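As an illustration of the gridding step described above, the sketch below bins irregular soundings onto a regular 10 m grid by simple cell averaging. It is a minimal example under stated assumptions (NumPy-based averaging, illustrative function name); it is not the Hydrosweep DS-2 processing chain actually used for the survey.

import numpy as np

def grid_soundings(x, y, z, cell=10.0):
    # Bin irregular multibeam soundings (x, y in metres, z = depth) onto a
    # regular grid by averaging all points that fall into each cell.
    # Illustrative sketch only, not the expedition's processing chain.
    x0, y0 = x.min(), y.min()
    nx = int(np.ceil((x.max() - x0) / cell)) + 1
    ny = int(np.ceil((y.max() - y0) / cell)) + 1
    ix = ((x - x0) / cell).astype(int)
    iy = ((y - y0) / cell).astype(int)
    grid_sum = np.zeros((ny, nx))
    grid_cnt = np.zeros((ny, nx))
    np.add.at(grid_sum, (iy, ix), z)      # accumulate depths per cell
    np.add.at(grid_cnt, (iy, ix), 1)      # count soundings per cell
    with np.errstate(invalid="ignore"):
        return grid_sum / grid_cnt        # NaN where a cell received no soundings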
Abstract:
To deliver sample estimates provided with the necessary probability foundation to permit generalization from the sample data subset to the whole target population being sampled, probability sampling strategies are required to satisfy three necessary but not sufficient conditions: (i) all inclusion probabilities must be greater than zero in the target population to be sampled; if some sampling units have an inclusion probability of zero, then a map accuracy assessment does not represent the entire target region depicted in the map to be assessed. (ii) The inclusion probabilities must be (a) knowable for nonsampled units and (b) known for those units selected in the sample: since the inclusion probability determines the weight attached to each sampling unit in the accuracy estimation formulas, if the inclusion probabilities are unknown, so are the estimation weights. This original work presents a novel (to the best of these authors' knowledge, the first) probability sampling protocol for quality assessment and comparison of thematic maps generated from spaceborne/airborne Very High Resolution (VHR) images, where: (I) an original Categorical Variable Pair Similarity Index (CVPSI, proposed in two different formulations) is estimated as a fuzzy degree of match between a reference and a test semantic vocabulary, which may not coincide, and (II) both symbolic pixel-based thematic quality indicators (TQIs) and sub-symbolic object-based spatial quality indicators (SQIs) are estimated with a degree of uncertainty in measurement, in compliance with the well-known Quality Assurance Framework for Earth Observation (QA4EO) guidelines. Like a decision tree, any protocol (guidelines for best practice) comprises a set of rules, equivalent to structural knowledge, and an order of presentation of the rule set, known as procedural knowledge. The combination of these two levels of knowledge makes an original protocol worth more than the sum of its parts. The several degrees of novelty of the proposed probability sampling protocol are highlighted in this paper, at the levels of understanding of both structural and procedural knowledge, in comparison with related multi-disciplinary works selected from the existing literature. In the experimental session the proposed protocol is tested for accuracy validation of preliminary classification maps automatically generated by the Satellite Image Automatic Mapper™ (SIAM™) software product from two WorldView-2 images and one QuickBird-2 image provided by DigitalGlobe for testing purposes. In these experiments, the collected TQIs and SQIs are statistically valid, statistically significant, consistent across maps, and in agreement with theoretical expectations, visual (qualitative) evidence, and the quantitative quality indexes of operativeness (OQIs) claimed for SIAM™ in related papers. As a subsidiary conclusion, the statistically consistent and statistically significant accuracy validation of the SIAM™ pre-classification maps proposed in this contribution, together with the OQIs claimed for SIAM™ in related works, make the operational (automatic, accurate, near real-time, robust, scalable) SIAM™ software product eligible for opening up new inter-disciplinary research and market opportunities in accordance with the visionary goal of the Global Earth Observation System of Systems (GEOSS) initiative and the QA4EO international guidelines.
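Condition (ii) is what turns inclusion probabilities into estimation weights. The sketch below shows, under assumptions made only for this illustration (the function name, a Horvitz-Thompson-style estimate of overall proportion correct), how known, strictly positive inclusion probabilities enter an accuracy estimate; the paper's full TQI/SQI protocol is considerably richer than this.

import numpy as np

def weighted_accuracy(correct, pi):
    # `correct` is 1 where the map label matches the reference, 0 otherwise;
    # `pi` holds the inclusion probability of each sampled unit.
    # Weights are inverse inclusion probabilities (Horvitz-Thompson style),
    # which is why every pi must be known and strictly positive.
    correct = np.asarray(correct, dtype=float)
    pi = np.asarray(pi, dtype=float)
    if np.any(pi <= 0):
        raise ValueError("all inclusion probabilities must be > 0")
    w = 1.0 / pi
    return np.sum(w * correct) / np.sum(w)

# Example: a stratified sample in which rare classes were oversampled.
acc = weighted_accuracy(correct=[1, 1, 0, 1], pi=[0.5, 0.5, 0.1, 0.1])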
Abstract:
Introduction: The chemical composition of water determines its physical properties and the character of the processes proceeding in it: freezing temperature, volume of evaporation, density, color, transparency, filtration capacity, etc. The presence of chemical elements in water solution confers on waters special physical properties that exert a significant influence on their circulation, creates the necessary conditions for the development and habitation of flora and fauna, and imparts to ocean waters some chemical features that radically distinguish them from land waters (Alekin & Liakhin, 1984). Hydrochemical information helps to determine elements of water circulation and convection depth, makes it easier to distinguish water masses, and gives additional knowledge of the climatic variability of ocean conditions. Hydrochemical information is a necessary part of biological research. Water chemical composition can be the governing characteristic determining the possibility and limits of use of marine objects, both stationary and moving, in sea water. The subject of hydrochemistry is the study of the dynamics of chemical composition, i.e. the processes of its formation and the hydrochemical conditions of water bodies (Alekin & Liakhin, 1984). The hydrochemical processes of the Arctic Ocean are the least known. Some information on these processes can be found in scattered publications. A generalizing study of hydrochemical conditions in the Arctic Ocean, based on expeditions conducted in the years 1948-1975, was carried out by Rusanov et al. (1979). The "Atlas of the World Ocean: the Arctic Ocean" contains a special section "Hydrochemistry" (Gorshkov, 1980). Typical vertical profiles, transects and maps for different depths (0, 100, 300, 500, 1000, 2000, 3000 m) are given in that section for the following parameters: dissolved oxygen, phosphate, silicate, pH and the alkaline-chlorine coefficient. The maps were constructed using the data of expeditions conducted in the years 1948-1975. The illustrations reflect the main features of the distribution of the hydrochemical elements over a multi-year period and represent a static image of hydrochemical conditions. The distribution of the hydrochemical elements at the ocean surface is given for two seasons, winter and summer; for the other depths mean annual fields are given. The aim of the present Atlas is to describe hydrochemical conditions in the Arctic Ocean on the basis of a larger body of hydrochemical information, covering the years 1948-2000, and using up-to-date methods of analysis and electronic forms of presentation of hydrochemical information. The most widespread characteristics determined in water samples were used as hydrochemical indices: dissolved oxygen, phosphate, silicate, pH, total alkalinity, nitrite and nitrate. An important characteristic of the water salt composition, salinity, has been considered in the Oceanographic Atlas of the Arctic Ocean (1997, 1998). The presentation of the hydrochemical characteristics in this Hydrochemical Atlas is wider than that of the former Atlas (Gorshkov, 1980). Maps of the climatic distribution of the hydrochemical elements were constructed for all the standard depths, and the seasonal variability of the hydrochemical parameters is given not only for the surface but also for the underlying standard depths up to and including 400 m. Statistical characteristics of the hydrochemical elements are given for the first time. Detailed accuracy estimates of the initial data and of the map construction are also given in the Atlas.
Calculated values of root-mean-square deviations and of the maximum and minimum values of the parameters demonstrate the limits of their variability over the analyzed period of observations. Therefore, not only investigations of chemical statics are summarized in the Atlas, but some elements of chemical dynamics are also demonstrated. Digital arrays of the hydrochemical elements obtained at the nodes of a regular grid are a new form of presentation of characteristics in the Atlas. It should be mentioned that the same grid and the same boxes were used in this Atlas as those used in the creation of the US-Russian climatic Oceanographic Atlas, which makes it possible to combine the hydrochemical and oceanographic information of these Atlases. The first block of digital arrays contains climatic characteristics calculated using direct observational data. These climatic characteristics were not calculated for regions without observations, and the information arrays for these regions have gaps. The other block of climatic information in gridded form was obtained with the help of an objective analysis of the observational data. The objective analysis procedure allowed us to obtain climatic estimates of the hydrochemical characteristics for the whole water area of the Arctic Ocean, including the regions not covered by observations. The data of the objective analysis can be widely used, in particular in hydrobiological investigations and in modeling the hydrochemical conditions of the Arctic Ocean. The array of initial measurements is a separate block. It includes all the available materials of hydrochemical observations in the form in which they were presented in the different sources. While keeping in mind that this array contains some amount of distorted information, the authors of the Atlas considered it necessary to store this information in its primary form. Methods of data quality control can be developed in the future as hydrochemical information accumulates, and it can be supposed that attitudes toward the data rejected according to the procedure adopted in the Atlas may change in the future. The Hydrochemical Atlas of the Arctic Ocean is the first specialized electronic generalization of hydrochemical observations in the Arctic Ocean and completes the program of joint efforts of Russian and US specialists in the preparation of a number of atlases for the Arctic. The published Oceanographic Atlas (1997, 1998), the Atlas of Arctic Meteorology and Climate (2000), the Ice Atlas of the Arctic Ocean prepared for publication, and the Hydrochemical Atlas of the Arctic Ocean represent a unified series of fundamental generalizations of empirical knowledge of the nature of the Arctic Ocean at the climatic level. The Hydrochemical Atlas of the Arctic Ocean was elaborated as the result of the joint efforts of the SRC of the RF AARI and IARC. Dr. Ye. Nikiforov was the scientific supervisor of the Atlas, Dr. R. Colony was the manager on behalf of the USA, and Dr. L. Timokhov on behalf of Russia.
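The abstract does not specify which objective analysis scheme the Atlas used, so the sketch below only illustrates the generic idea behind such gridding: scattered observations are spread onto regular grid nodes with distance-dependent (here Gaussian, Barnes-type) weights. The function name and the length scale are assumptions made for the example.

import numpy as np

def objective_analysis(obs_xy, obs_val, grid_xy, length_scale=200.0):
    # Fill regular grid nodes from scattered observations using
    # Gaussian (Barnes-type) distance weighting; `length_scale` is an
    # assumed influence radius in the same units as the coordinates.
    obs_xy = np.asarray(obs_xy, dtype=float)
    obs_val = np.asarray(obs_val, dtype=float)
    grid_xy = np.asarray(grid_xy, dtype=float)
    # pairwise distances between every grid node and every observation
    d = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    w = np.exp(-(d / length_scale) ** 2)
    return (w @ obs_val) / w.sum(axis=1)   # weighted mean at each node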
Abstract:
An aerodynamic optimization of the train aerodynamic characteristics in terms of sensitivity to front wind action is carried out in this paper. In particular, a genetic algorithm (GA) is used to perform a shape optimization study of a high-speed train nose. The nose is parametrically defined via Bézier curves, thus including a wider range of geometries in the design space as possible optimal solutions. When using a GA, the main disadvantage to deal with is the large number of evaluations needed before finding such an optimum. Here the use of metamodels to replace the Navier-Stokes solver is proposed. Among all the possibilities, Response Surface Models and Artificial Neural Networks (ANN) are considered. The best prediction and generalization results are obtained with the ANN, and this metamodel is applied in the GA code. The paper shows the feasibility of using a GA in combination with an ANN for this problem, and the solutions achieved are included.
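The abstract gives no implementation details, so the following is only a minimal sketch of the surrogate-assisted loop it describes: an ANN metamodel is trained on a small sample of expensive evaluations and then stands in for the Navier-Stokes solver inside a plain GA. The design dimension, the analytic stand-in objective, the population sizes and the scikit-learn MLPRegressor surrogate are all assumptions made for this illustration.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def expensive_objective(x):
    # Stand-in for the Navier-Stokes evaluation of a Bézier-parametrized
    # nose shape; here just a cheap analytic test function (assumption).
    return np.sum((x - 0.3) ** 2, axis=-1)

# 1) Train the ANN metamodel on a small design-of-experiments sample.
dim, n_train = 4, 80                                # 4 shape parameters (assumed)
X_train = rng.random((n_train, dim))
y_train = expensive_objective(X_train)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X_train, y_train)

# 2) Run a plain GA, evaluating fitness on the surrogate instead of the solver.
pop = rng.random((60, dim))
for gen in range(50):
    fitness = surrogate.predict(pop)                # lower predicted cost = better
    parents = pop[np.argsort(fitness)[:20]]
    idx = rng.integers(0, 20, size=(60, 2))         # pick random parent pairs
    alpha = rng.random((60, 1))
    pop = alpha * parents[idx[:, 0]] + (1 - alpha) * parents[idx[:, 1]]  # blend crossover
    pop = np.clip(pop + rng.normal(0.0, 0.02, pop.shape), 0.0, 1.0)      # mutation

best = pop[np.argmin(surrogate.predict(pop))]
print(best, expensive_objective(best))              # re-check the candidate on the true objective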