947 results for Bayesian spatial analysis, dengue, socioecological factors


Relevance: 100.00%

Abstract:

The archaeological evidence from Late Bronze Age Nuzi has received little attention since the publication of R.F.S. Starr's final report in 1939, leaving the interpretation of the inner structure of this extraordinarily extensively excavated settlement to a thriving philological research tradition. This paper presents a macroscopic spatial analysis of the mobile inventories in the domestic areas. Based on a comparison with stationary installations and the formal architectural structure, a revised socio-topography is proposed. Combining this with the evidence from investigations of the private archives demonstrates the great potential of considering multiple approaches in future research on the function, meaning, and sociology of spaces in Near Eastern archaeology.

Relevance: 100.00%

Abstract:

Context. On 12 November 2014 the European Rosetta mission succeeded in delivering a lander, named Philae, onto the surface of one of the smallest, lowest-gravity, and most primitive bodies of the solar system, comet 67P/Churyumov-Gerasimenko (67P). Aims. The aim of this paper is to provide a comprehensive geomorphological and spectrophotometric analysis of Philae's landing site (Agilkia) to give an essential framework for the interpretation of its in situ measurements. Methods. OSIRIS images, coupled with gravitational slopes derived from the 3D shape model based on stereo-photogrammetry, were used to interpret the geomorphology of the site. We adopted the Hapke model, using previously derived parameters, to photometrically correct the images in the orange filter (649.2 nm). The best approximation to the Hapke model, given by the parameter-less Akimov function, was used to correct the reflectance for the effects of viewing and illumination conditions in the other filters. Spectral analyses of coregistered color cubes were used to retrieve spectrophotometric properties. Results. The landing site shows an average normal albedo of 6.7% in the orange filter, with variations of ~15%, and a globally featureless spectrum with an average red spectral slope of 15.2%/100 nm between 480.7 nm (blue filter) and 882.1 nm (near-IR filter). The spatial analysis shows a well-established correlation between the geomorphological units and the photometric characteristics of the surface. In particular, smooth deposits have the highest reflectance and a bluer spectrum than the outcropping material across the area. Conclusions. The featureless spectrum and the redness of the material are compatible with results from other instruments that have suggested an organic composition. The observed small spectral variegation could be due to grain-size effects. However, the combination of photometric and spectral variegation suggests that a compositional differentiation is more likely. This might be tentatively interpreted as the effect of the efficient dust-transport processes acting on 67P. High-activity regions might be the original sources of the smooth fine-grained materials that subsequently covered Agilkia through airfall of residual material. More observations performed by OSIRIS as the comet approaches the Sun will help in interpreting the processes that shape the landing site and the overall nucleus.
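As an illustration of the spectral-slope figure quoted above, the sketch below computes a red slope in %/100 nm between the blue (480.7 nm) and near-IR (882.1 nm) filters. Normalizing by the shorter-wavelength reflectance is one common convention in cometary photometry, not necessarily the exact definition used by the OSIRIS team, and the reflectance values are invented for illustration.

```python
import numpy as np

def red_spectral_slope(r_blue, r_nir, lam_blue=480.7, lam_nir=882.1):
    """Spectral slope in %/100 nm between two filters.

    Uses one common convention: the reflectance difference is normalized by
    the shorter-wavelength reflectance and by the wavelength interval, then
    scaled to percent per 100 nm.  Filter wavelengths default to the OSIRIS
    blue and near-IR filters quoted in the abstract; the normalization
    choice is an assumption, not taken from the paper itself.
    """
    return (r_nir - r_blue) / (r_blue * (lam_nir - lam_blue)) * 1.0e4

# Example with hypothetical, photometrically corrected reflectances per pixel
r_blue = np.array([0.060, 0.067, 0.072])   # invented 480.7 nm reflectances
r_nir = np.array([0.097, 0.108, 0.113])    # invented 882.1 nm reflectances
print(red_spectral_slope(r_blue, r_nir))   # roughly 14-15 %/100 nm
```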

Relevance: 100.00%

Abstract:

We examine the time-series relationship between housing prices in eight Southern California metropolitan statistical areas (MSAs). First, we perform cointegration tests of the housing price indexes for the MSAs, finding seven cointegrating vectors. Thus, the evidence suggests that one common trend links the housing prices in these eight MSAs, a purchasing-power-parity finding for housing prices in Southern California. Second, we perform temporal Granger causality tests, revealing intertwined temporal relationships. The Santa Ana MSA leads the pack in temporally causing housing prices in six of the other seven MSAs, excluding only the San Luis Obispo MSA. The Oxnard MSA experienced the largest number of temporal effects from other MSAs, six of the seven, excluding only Los Angeles. The Santa Barbara MSA proved the most isolated in that it temporally caused housing prices in only two other MSAs (Los Angeles and Oxnard), while housing prices in the Santa Ana MSA temporally caused prices in Santa Barbara. Third, we calculate out-of-sample forecasts in each MSA, using various vector autoregressive (VAR) and vector error-correction (VEC) models, as well as Bayesian, spatial, and causality versions of these models with various priors. Different specifications provide superior forecasts in the different MSAs. Finally, we consider the ability of these time-series models to provide accurate out-of-sample predictions of the turning points in housing prices that occurred in 2006:Q4. Recursive forecasts, where the sample is updated each quarter, provide reasonably good forecasts of turning points.
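A minimal sketch of the two test families mentioned in this abstract, Johansen cointegration and pairwise Granger causality, using statsmodels. The file name, column names, lag choices, and significance level are assumptions for illustration, not the authors' specification.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen
from statsmodels.tsa.stattools import grangercausalitytests

# 'prices' is assumed to be a quarterly DataFrame of log house-price indexes,
# one column per MSA (e.g. 'los_angeles', 'santa_ana', ...); not the study's data.
prices = pd.read_csv("socal_hpi.csv", index_col=0, parse_dates=True)  # hypothetical file

# Johansen test: count how many trace statistics exceed the 5% critical values.
jres = coint_johansen(prices, det_order=0, k_ar_diff=4)
n_coint = int(np.sum(jres.lr1 > jres.cvt[:, 1]))
print(f"cointegrating vectors at 5%: {n_coint}")   # 7 vectors imply 1 common trend for 8 MSAs

# Pairwise Granger causality on differenced series: does 'cause' help predict 'effect'?
def granger_p(effect, cause, maxlag=4):
    data = prices[[effect, cause]].diff().dropna()
    res = grangercausalitytests(data, maxlag=maxlag, verbose=False)
    return min(r[0]["ssr_ftest"][1] for r in res.values())  # smallest p-value across lags

print(granger_p("oxnard", "santa_ana"))
```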

Relevance: 100.00%

Abstract:

We examine the time-series relationship between housing prices in Los Angeles, Las Vegas, and Phoenix. First, temporal Granger causality tests reveal that Los Angeles housing prices cause housing prices in Las Vegas (directly) and Phoenix (indirectly). In addition, Las Vegas housing prices cause housing prices in Phoenix. Los Angeles housing prices prove exogenous in a temporal sense, and Phoenix housing prices do not cause prices in the other two markets. Second, we calculate out-of-sample forecasts in each market, using various vector autoregressive (VAR) and vector error-correction (VEC) models, as well as Bayesian, spatial, and causality versions of these models with various priors. Different specifications provide superior forecasts in the different cities. Finally, we consider the ability of these time-series models to provide accurate out-of-sample predictions of the turning points in housing prices that occurred in 2006:Q4. Recursive forecasts, where the sample is updated each quarter, provide reasonably good forecasts of turning points.
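The recursive forecasting scheme described here (re-estimating the model each quarter and forecasting one step ahead) can be sketched as follows with a plain VAR in statsmodels. The data file, column names, lag order, and evaluation window are placeholders; the study's Bayesian, spatial, and VEC variants are not reproduced.

```python
import pandas as pd
from statsmodels.tsa.api import VAR

# 'prices' assumed: quarterly log house-price indexes for the three markets;
# the CSV and column names are placeholders, not the study's data.
prices = pd.read_csv("lv_la_phx_hpi.csv", index_col=0, parse_dates=True)
growth = prices.diff().dropna()

# Recursive one-step-ahead forecasts: re-estimate the VAR each quarter on all
# data available up to that point, then forecast the next quarter.
forecasts = []
for t in range(40, len(growth)):          # hold out everything after an initial window
    fit = VAR(growth.iloc[:t]).fit(4)     # fixed lag order of 4 for simplicity
    fc = fit.forecast(growth.values[t - fit.k_ar:t], steps=1)[0]
    forecasts.append(pd.Series(fc, index=growth.columns, name=growth.index[t]))

fc_df = pd.DataFrame(forecasts)
rmse = ((fc_df - growth.loc[fc_df.index]) ** 2).mean() ** 0.5
print(rmse)   # compare across specifications (VAR, VEC, Bayesian priors, ...)
```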

Relevance: 100.00%

Abstract:

Background/significance. The scarcity of reliable and valid Spanish-language instruments for health-related research has hindered research with the Hispanic population. Research suggests that fatalistic attitudes are related to poor cancer screening behaviors and may be one reason for the low participation of Mexican-Americans in cancer screening. This problem is of major concern because Mexican-Americans constitute the largest Hispanic subgroup in the U.S. Purpose. The purposes of this study were: (1) to translate the Powe Fatalism Inventory (PFI) into Spanish and culturally adapt the instrument to the Mexican-American culture as found along the U.S.-Mexico border, and (2) to test the equivalence between the Spanish-translated, culturally adapted version of the PFI (SPFI) and the English version of the PFI with respect to clarity, content validity, reading level, and reliability. Design. Descriptive, cross-sectional. Methods. The Spanish-language translation used a translation model that incorporates a cultural adaptation process. The SPFI was administered to 175 bilingual participants residing in a midsize U.S.-Mexico border city. Data analysis included estimation of Cronbach's alpha, factor analysis, paired-samples t-test comparison, and multiple regression analysis using SPSS software, as well as measurement of the content validity and reading level of the SPFI. Findings. The reliability estimate using Cronbach's alpha coefficient was 0.81 for the SPFI compared to 0.80 for the PFI in this study. Factor analysis extracted four factors, which explained 59% of the variance. Paired t-test comparison revealed no statistically significant differences between the SPFI and PFI total or individual item scores. The content validity index was determined to be 1.0. The reading level was assessed to be below a 6th-grade reading level. The correlation coefficient between the SPFI and PFI was 0.95. Conclusions. This study provided strong psychometric evidence that the Spanish-translated, culturally adapted SPFI is an equivalent tool to the English version of the PFI in measuring cancer fatalism. This indicates that the two forms of the instrument can be used interchangeably in a single study to accommodate the reading and speaking abilities of respondents.
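A hedged sketch of two of the analyses listed under Methods, Cronbach's alpha and a paired-samples t-test, in Python rather than SPSS. The item counts and simulated responses are purely illustrative and stand in for the actual SPFI/PFI data.

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical data: each row is one bilingual respondent's item scores on the
# English (pfi) and Spanish (spfi) forms; shapes and values are illustrative only.
rng = np.random.default_rng(0)
pfi = rng.integers(1, 6, size=(175, 15)).astype(float)
spfi = np.clip(pfi + rng.normal(0, 0.5, pfi.shape).round(), 1, 5)

print("alpha (SPFI):", round(cronbach_alpha(spfi), 2))
t, p = stats.ttest_rel(spfi.sum(axis=1), pfi.sum(axis=1))   # paired comparison of totals
print("paired t-test on total scores: t=%.2f, p=%.3f" % (t, p))
```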

Relevance: 100.00%

Abstract:

Floods are the leading cause of fatalities related to natural disasters in Texas, and Texas leads the nation in flash flood fatalities. From 1959 through 2009 there were more than three times as many flood fatalities in Texas (840) as in the next-ranked state, Pennsylvania (265). Texas also leads the nation in flood-related injuries (7,753). Flood fatalities in Texas therefore represent a serious public health problem. This study addresses several objectives of Healthy People 2010, including reducing deaths from motor vehicle accidents (Objective 15-15), reducing nonfatal motor vehicle injuries (Objective 15-17), and reducing drownings (Objective 15-29). The study examined flood fatalities that occurred in Texas between 1959 and 2008. Flood fatality statistics were extracted from three sources: flood fatality databases from the National Climatic Data Center, the Spatial Hazard Event and Loss Database for the United States, and the Texas Department of State Health Services. The data collected for flood fatalities include the date, time, gender, age, location, and type of flood. Inconsistencies among the three databases were identified and discussed. Analysis reveals that most fatalities result from driving into flood water (77%). Spatial analysis indicates that more fatalities occurred in counties containing major urban centers: some of the Flash Flood Alley counties (Bexar, Dallas, Travis, and Tarrant), Harris County (Houston), and Val Verde County (Del Rio). An intervention strategy targeting the behavior of driving into flood water is proposed, based on the Health Belief Model. The main recommendation of the study is that flood fatalities in Texas can be reduced through a combination of improved hydrometeorological forecasting, educational programs aimed at enhancing public awareness of flood risk and the seriousness of flood warnings, and timely and appropriate action by local emergency and safety authorities.
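For readers combining the same three fatality databases, a minimal pandas sketch of the kind of merging and aggregation implied above. All file and column names are assumptions; the actual databases use their own schemas and require more careful record matching.

```python
import pandas as pd

# Hypothetical harmonized extracts from the three sources named in the abstract
# (NCDC, SHELDUS, Texas DSHS); the column names are assumptions for illustration.
cols = ["date", "county", "age", "gender", "flood_type", "circumstance", "source"]
ncdc = pd.read_csv("ncdc_tx_flood_fatalities.csv")[cols]
sheldus = pd.read_csv("sheldus_tx_flood_fatalities.csv")[cols]
dshs = pd.read_csv("dshs_tx_flood_fatalities.csv")[cols]

fatalities = pd.concat([ncdc, sheldus, dshs], ignore_index=True)
# Flag records reported by more than one source before de-duplicating,
# so cross-source inconsistencies can be reviewed rather than silently dropped.
dupes = fatalities.duplicated(subset=["date", "county", "age", "gender"], keep=False)
print("records appearing in multiple sources:", int(dupes.sum()))

deduped = fatalities.drop_duplicates(subset=["date", "county", "age", "gender"])
print(deduped["circumstance"].value_counts(normalize=True))   # share driving into floodwater
print(deduped.groupby("county").size().sort_values(ascending=False).head(10))
```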

Relevance: 100.00%

Abstract:

The three articles that comprise this dissertation describe how small area estimation and geographic information systems (GIS) technologies can be integrated to provide useful information about the number of uninsured and where they are located. Comprehensive data about the numbers and characteristics of the uninsured are typically available only from surveys. Utilization and administrative data are poor proxies from which to develop this information: those who cannot access services are unlikely to be fully captured, either by health care provider utilization data or by state and local administrative data. In the absence of direct measures, a well-developed estimate of the local uninsured count or rate can prove valuable when assessing the unmet health service needs of this population. However, the fact that these are "estimates" increases the chances that results will be rejected or, at best, treated with suspicion. The visual impact and spatial analysis capabilities afforded by GIS technology can strengthen the likelihood that area estimates will be accepted by those most likely to benefit from the information, including health planners and policy makers. The first article describes how uninsured estimates are currently being performed in the Houston metropolitan region. It details the synthetic model used to calculate numbers and percentages of uninsured, and how the resulting estimates are integrated into a GIS. The second article compares the estimation method of the first article with one currently used by the Texas State Data Center to estimate the number of uninsured for all Texas counties. Estimates are developed for census tracts in Harris County, using both models with the same data sets, and the results are statistically compared. The third article describes a new, revised synthetic method that is being tested to provide uninsured estimates at sub-county levels for eight counties in the Houston metropolitan area. It is designed to replicate the same categorical results provided by a current U.S. Census Bureau estimation method. The estimates calculated by this revised model are compared to the most recent U.S. Census Bureau estimates, using the same areas and population categories.
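A generic synthetic small-area estimator of the sort described in the first article can be sketched as follows: survey-derived uninsured rates for demographic groups are applied to local counts of those groups and summed by tract. The group definitions, rates, and file names are placeholders, not the dissertation's model.

```python
import pandas as pd

# Survey-derived uninsured rates by demographic group (values are invented).
rates = pd.Series({
    "18-34_low_income": 0.42,
    "18-34_other": 0.18,
    "35-64_low_income": 0.33,
    "35-64_other": 0.12,
})

# Hypothetical tract-level population counts, one column per group above.
tracts = pd.read_csv("harris_county_tract_pops.csv", index_col="tract")

# Synthetic estimate: rate x local group population, summed across groups.
uninsured = (tracts[rates.index] * rates).sum(axis=1)
tracts["uninsured_est"] = uninsured.round()
tracts["uninsured_rate"] = uninsured / tracts[rates.index].sum(axis=1)

# Export for joining to tract polygons in a GIS for mapping.
tracts[["uninsured_est", "uninsured_rate"]].to_csv("tract_uninsured_estimates.csv")
```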

Relevance: 100.00%

Abstract:

The nonresidual concentrations of five trace metals were determined for 322 sediments that were the product of a systematic sampling program of the entire Galveston Bay system. The nonresidual component of the trace metal concentration (i.e., the fraction of the metals that can be relatively easily removed from the sediments without complete destruction of the sediment particle) was considered to be more indicative of the anthropogenic metal pollution that has impacted the Galveston Bay ecosystem. For spatial analysis of the metal concentrations, the Galveston Bay system was divided into nine bay areas, based on easily definable geological and geographical characteristics. Isopleth mapping of these metal concentrations indicated a direct relationship with the <63 μm fraction of the sediment (%FINE) in all of the bay areas. Covariate regression analyses indicated that the position of the sediment within the Galveston Bay system (i.e., bay area) was a better predictor of metal concentration than %FINE. Analysis of variance of the metals versus the bay areas indicated that the five metals maintained a relatively constant order and magnitude of concentration across all the bay areas. The major shipping channels of the Galveston Bay system, with their associated vessels and transported materials, are a likely source of metal pollution. However, these channels were not depositional corridors of high metal concentration. All metal concentration highs were found to be located away from the channels and associated with %FINE highs in the deeper portions of the bay areas. Disturbance of the sediments by the proposed widening and deepening of these channels is not predicted to remobilize the trace metals. A more likely adverse effect on the health of the Galveston Bay ecosystem would come from the increase in turbidity of the water due to dredging and from an extension of the salt-water wedge farther north into the bay system.
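A minimal sketch of the covariate regression described above, testing whether bay-area membership adds explanatory power beyond %FINE, using the statsmodels formula interface. The metal, column names, and data file are hypothetical; the study's actual model specification may differ.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical table of nonresidual metal concentrations: one row per sediment
# sample with columns such as 'zn' (mg/kg), 'fine' (%FINE), and 'bay_area' (1-9).
sed = pd.read_csv("galveston_sediments.csv")

# Covariate regression: does bay area explain concentration beyond %FINE?
model = smf.ols("zn ~ fine + C(bay_area)", data=sed).fit()
print(anova_lm(model, typ=2))   # compare the sums of squares for fine vs. bay_area
print(model.rsquared)
```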

Relevance: 100.00%

Abstract:

Accurate quantitative estimation of exposure using retrospective data has been one of the most challenging tasks in the exposure assessment field. To improve these estimates, some models have been developed using published exposure databases with their corresponding exposure determinants. These models are designed to be applied to reported exposure determinants obtained from study subjects or to exposure levels assigned by an industrial hygienist, so that quantitative exposure estimates can be obtained. In an effort to improve the prediction accuracy and generalizability of these models, and taking into account that the limitations encountered in previous studies might be due to limitations in the applicability of traditional statistical methods and concepts, the use of computer science-derived data analysis methods, predominantly machine learning approaches, was proposed and explored in this study. The goal of this study was to develop a set of models using decision tree/ensemble and neural network methods to predict occupational exposure outcomes based on literature-derived databases, and to compare, using cross-validation and data-splitting techniques, the resulting prediction capacity to that of traditional regression models. Two cases were addressed: the categorical case, where the exposure level was measured as an exposure rating following the American Industrial Hygiene Association guidelines, and the continuous case, where the result of the exposure is expressed as a concentration value. Previously developed literature-based exposure databases for 1,1,1-trichloroethane, methylene dichloride, and trichloroethylene were used. When compared to regression estimates, the results showed better accuracy of decision tree/ensemble techniques for the categorical case, while neural networks were better for the estimation of continuous exposure values. Overrepresentation of classes and overfitting were the main causes of poor neural network performance and accuracy. Estimates based on literature-based databases using machine learning techniques might provide an advantage when applied within methodologies that combine 'expert inputs' with current exposure measurements, such as the Bayesian Decision Analysis tool. The use of machine learning techniques to more accurately estimate exposures from literature-based exposure databases might represent a starting point toward independence from expert judgment.
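A compact sketch of the model comparison described here for the continuous case, using scikit-learn: a regression baseline, an ensemble (random forest), and a small neural network evaluated by cross-validation. The database file, feature layout, and hyperparameters are assumptions, not those of the study.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

# Hypothetical literature-derived database: exposure determinants as numeric
# features and a measured concentration as the target; names are placeholders.
db = pd.read_csv("tce_exposure_database.csv")
X = db.drop(columns=["concentration"])
y = db["concentration"]

models = {
    "regression": make_pipeline(StandardScaler(), LinearRegression()),
    "ensemble": RandomForestRegressor(n_estimators=300, random_state=0),
    "neural_net": make_pipeline(StandardScaler(),
                                MLPRegressor(hidden_layer_sizes=(32,),
                                             max_iter=2000, random_state=0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R^2 = {scores.mean():.2f}")
```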

Relevance: 100.00%

Abstract:

Insecurity is one of the greatest challenges facing governments in Latin America. The treatment of this problem has moved from a sectoral approach in the 1980s toward a cross-cutting approach from the 1990s onward. This implies an evolution of the concept itself: from being considered a matter of state security under police and military jurisdiction toward "human security", a multidimensional concept that encompasses human development and the satisfaction of needs. In Argentina, insecurity has worsened since the social and economic crisis and is part of the political agenda due to constant demands from society. Nevertheless, over the years the inability of successive governments to address it has become evident. For this reason, this paper proposes a relationship between human security and territorial planning (Ordenamiento Territorial) through the evaluation of habitability, an approach that makes it possible to operationalize the concept of security in an integrative and cross-cutting way. The work is developed around a case study: the piedmont of Greater Mendoza (Gran Mendoza). It starts from the construction of an analytical methodology that allows the data to be spatialized, and from a system of variables and indicators to measure habitability in terms of human security.

Relevance: 100.00%

Abstract:

In recent decades the wine sector has undergone a rapid process of change and new dynamics that are affecting the performance and strategies of firms in the sector: growing internationalization, declining domestic markets in the traditionally producing countries, the entry of outside capital, and so on. The skill an organization shows in adapting to the new situation is reflected in its profitability, the basic indicator for judging the efficiency of business management. In this study, based on a representative sample of firms from Castilla-La Mancha, the Spanish region with the largest wine-producing area in the world, a novel econometric model was formulated, composed of performance variables defined using the principal components technique. The results indicate that firms' profitability derives from: (a) their ownership structure (higher for investor-owned firms than for cooperatives), (b) their size (better performance at larger sizes, exploiting economies of scale), and (c) their financial structure (higher profitability when equity and liquidity predominate in its composition). Conversely, the lack of permanent financing to cover fixed assets and a commercial strategy oriented more toward low-priced bulk wine sales significantly reduce profitability ratios.
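A hedged sketch of the general approach described: principal components summarizing correlated performance indicators, followed by a regression of a profitability ratio on the retained components. Variable names, the number of components, and the data file are illustrative assumptions, not the study's model.

```python
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical firm-level table: a profitability ratio plus several performance
# indicators (size, equity share, liquidity, bulk-wine share, ...); placeholder names.
firms = pd.read_csv("clm_wineries.csv")
y = firms["roa"]
X_raw = firms.drop(columns=["roa"])

# Summarize correlated performance indicators as principal components,
# then regress profitability on the retained components.
Z = StandardScaler().fit_transform(X_raw)
pca = PCA(n_components=3)
components = pd.DataFrame(pca.fit_transform(Z), columns=["pc1", "pc2", "pc3"])
print(pca.explained_variance_ratio_)

ols = sm.OLS(y, sm.add_constant(components)).fit()
print(ols.summary())
```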

Relevance: 100.00%

Abstract:

Coastal managers require reliable spatial data on the extent and timing of potential coastal inundation, particularly in a changing climate. Most sea level rise (SLR) vulnerability assessments are undertaken using the easily implemented bathtub approach, where areas adjacent to the sea and below a given elevation are mapped using a deterministic line dividing potentially inundated from dry areas. This method only requires elevation data, usually in the form of a digital elevation model (DEM). However, inherent errors in the DEM and in the spatial analysis of the bathtub model propagate into the inundation mapping. The aim of this study was to assess the impacts of spatially variable and spatially correlated elevation errors in high-spatial-resolution DEMs on mapping coastal inundation. Elevation errors were best modelled using regression-kriging. This geostatistical model takes the spatial correlation in elevation errors into account, which has a significant impact on analyses that include spatial interactions, such as inundation modelling. The spatial variability of elevation errors was partially explained by land cover and terrain variables. Elevation errors were simulated using sequential Gaussian simulation, a Monte Carlo probabilistic approach. One thousand error simulations were added to the original DEM and reclassified using a hydrologically correct bathtub method. The probability of inundation under a scenario combining a 1-in-100-year storm event with a 1 m SLR was calculated by counting the proportion of times, out of the 1,000 simulations, that a location was inundated. This probabilistic approach can be used in a risk-averse decision-making process by planning for scenarios with different probabilities of occurrence. For example, the results showed that at a 1% exceedance probability, the inundated area was approximately 11% larger than that mapped using the deterministic bathtub approach. The probabilistic approach provides visually intuitive maps that convey the uncertainties inherent to spatial data and analysis.
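The probabilistic workflow described above can be sketched in a few lines: add a simulated error surface to the DEM in each Monte Carlo run, reclassify against the water level, and accumulate a per-cell inundation probability. This sketch uses uncorrelated random errors and omits hydrological connectivity; the study used regression-kriging and sequential Gaussian simulation to produce spatially correlated error realizations.

```python
import numpy as np

# Minimal Monte Carlo bathtub sketch.  'dem' stands in for a high-resolution
# elevation grid (m); the water level combines 1 m of SLR with a hypothetical
# 1-in-100-year storm surge.  All numbers here are placeholders.
rng = np.random.default_rng(1)
dem = rng.uniform(0.0, 3.0, size=(200, 200))      # placeholder elevation grid
water_level = 1.0 + 1.3                           # SLR + assumed surge height (m)

n_sims = 1000
count_inundated = np.zeros_like(dem)
for _ in range(n_sims):
    # A full workflow would draw a spatially correlated error surface here
    # (regression-kriging residuals + sequential Gaussian simulation).
    error = rng.normal(0.0, 0.15, size=dem.shape)
    # Hydrological connectivity to the sea is omitted in this sketch.
    count_inundated += ((dem + error) <= water_level)

prob_inundation = count_inundated / n_sims        # per-cell probability over 1,000 runs
deterministic = dem <= water_level                # classic deterministic bathtub map
print("deterministic inundated cells:", int(deterministic.sum()))
print("cells with >=1% inundation probability:", int((prob_inundation >= 0.01).sum()))
```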

Relevance: 100.00%

Abstract:

To evaluate these factors, the Scale of Perceived Influences on the Choice of a Specialized Plan of Studies (IPEP) was constructed, and in this study its factorial structure and reliability were evaluated. The instrument was administered to 115 students, selected by quota sampling, from a subsidized private secondary school in the city of Chillán, Chile. An exploratory factor analysis identified eight factors in the IPEP scale: academic projection, personal development, maintenance of the social environment, academic requirements, satisfaction of others' expectations, vocational information, image of the plan, and family pressure. The results present an initial factorial structure that is empirically and theoretically adequate, whose factors were reliable and conceptually useful. These factors distinguish psychogenic and sociogenic aspects of the vocational process and make it possible to initiate diagnostic and research actions concerning this early vocational choice that takes place in secondary schools of the Chilean educational system.
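An exploratory factor analysis of the kind reported here could be run as sketched below with scikit-learn, retaining eight factors with a varimax rotation. The simulated item responses and item count are placeholders for the actual IPEP data.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical item responses: 115 respondents by 32 Likert-type IPEP items;
# the data and item names are placeholders, not the study's dataset.
rng = np.random.default_rng(3)
items = pd.DataFrame(rng.integers(1, 6, size=(115, 32)),
                     columns=[f"item_{i+1}" for i in range(32)])

# Exploratory factor analysis with varimax rotation, retaining eight factors as
# reported in the abstract; in practice the number of factors would be chosen
# from eigenvalues or a scree plot.
fa = FactorAnalysis(n_components=8, rotation="varimax")
fa.fit(items)
loadings = pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=[f"factor_{j+1}" for j in range(8)])
print(loadings.round(2).head())
```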

Relevance: 100.00%

Abstract:

Founded in 1962 by Sam Walton, a small American retail company quickly became the world's largest firm. From Bentonville, Arkansas, to the interior of China, Wal-Mart has followed a fairly classic path that runs from the company's headquarters to a proliferation of its stores in some fifteen countries. With a turnover of 405 billion dollars in 2008, the world's number-one retailer was overtaken in April 2009 by the oil company Exxon Mobil in the ranking of the world's five hundred largest companies. But unlike other large transnationals such as Exxon Mobil or Microsoft, Wal-Mart, the world's leading retail company, has to adapt to the local environment in order to attract as many consumers as possible. This article proposes a multi-scale approach to the development strategies of this international firm. Spatial analysis is thus well suited to clarifying the links between territories and the strategies of actors at different geographic scales.

Relevance: 100.00%

Abstract:

This paper aims to identify and characterize the spatio-territorial units of analysis (census fractions) of intensive primary activity, specifically horticulture, and to examine the relative weight of the activity of the Partido de La Plata in the provincial and national context. The variables analyzed concern labor, total area, area cultivated in the open field and under cover, land tenure regime, and technological components. The 1998 Horticultural Census of the Province of Buenos Aires and the 2002 National Agricultural Census (CNA02) are examined and compared. The results are presented in tables, graphs, and thematic maps processed in ArcView GIS.